Monday, March 22, 2010

onward

Fifth Generation" redirects here. For other uses, see Fifth Generation (disambiguation).
The Fifth Generation Computer Systems project (FGCS) was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a "fifth generation computer" (see history of computing hardware) that was to perform large amounts of computation using massively parallel processing. It was to be the end result of a massive government/industry research project in Japan during the 1980s. It aimed to create an "epoch-making computer" with supercomputer-like performance and to provide a platform for future developments in artificial intelligence.[1]
The term fifth generation was intended to convey the system as being a leap beyond existing machines. Computers using vacuum tubes were called the first generation; transistors and diodes, the second; integrated circuits, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance.
The project was to create the computer over a ten-year period, after which it was considered ended and investment in a new Sixth Generation project began. Opinions about its outcome are divided: either it was a failure, or it was ahead of its time.



Microprocessors
(The Fourth Generation)
After the integrated circuit, the only place to go was down - in size, that is. Large-scale integration (LSI) could fit hundreds of components onto one chip. By the 1980s, very-large-scale integration (VLSI) squeezed hundreds of thousands of components onto a chip. The ability to fit so much onto an area about half the size of a U.S. dime helped diminish the size and price of computers. It also increased their power, efficiency and reliability. Marcian Hoff invented the microprocessor, a single device that could replace several of the components of earlier computers and perform all of the functions of a computer's central processing unit. The microprocessor is the defining feature of fourth-generation computers.

The reduced size, reduced cost, and increased speed of the microprocessor led to the creation of the first personal computers. Until then, computers had been almost exclusively the domain of universities, business and government. In 1976, Steve Jobs and Steve Wozniak built their first Apple computer in a California garage; the Apple II, one of the first successful mass-market personal computers, followed in 1977. Then, in 1981, IBM introduced its first personal computer. The personal computer was such a revolutionary concept, and was expected to have such an impact on society, that Time magazine set aside its annual "Man of the Year" for 1982 and named the computer its "Machine of the Year".

The other feature of the microprocessor is its versatility. Whereas previously an integrated circuit had to be manufactured to fit a special purpose, now one microprocessor could be manufactured and then programmed to meet any number of demands. Soon everyday household items such as microwave ovens, television sets and automobiles with electronic fuel injection incorporated microprocessors. The 1980s saw computer use expand at home, at work and in schools as clones of the IBM PC made the personal computer even more affordable. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were being used. Computers continued their trend toward a smaller size, working their way down from desktop to laptop computers (which could fit inside a briefcase) to palmtops (able to fit inside a breast pocket).

Monday, March 8, 2010

ONWARD

THE INTEGRATED CIRCUIT


babbage and the countess

Charles Babbage, FRS (26 December 1791 – 18 October 1871)[2] was an English mathematician, philosopher, inventor and mechanical engineer who originated the concept of a programmable computer[3]. Parts of his uncompleted mechanisms are on display in the London Science Museum. In 1991, a perfectly functioning difference engine was constructed from Babbage's original plans. Built to tolerances achievable in the 19th century, the success of the finished engine indicated that Babbage's machine would have worked. Nine years later, the Science Museum completed the printer Babbage had designed for the difference engine, an astonishingly complex device for the 19th century. Considered a "father of the computer",[4] Babbage is credited with inventing the first mechanical computer that eventually led to more complex designs.

second generation-the transistor

A transistor computer was a computer that used transistors instead of vacuum tubes. The "first generation" of electronic computers used vacuum tubes, which generated large amounts of heat, were bulky, and were unreliable. A "second generation" of computers, built through the late 1950s and 1960s, featured boards filled with individual transistors and magnetic memory cores.

FIRST GENERATION-THE VACUUM TUBE

In electronics, a vacuum tube, electron tube (in North America), thermionic valve, or valve (elsewhere, especially in Britain) is a device used to amplify, switch, otherwise modify, or create an electrical signal by controlling the movement of electrons in a low-pressure space. Some special-function vacuum tubes are filled with low-pressure gas: these are so-called "soft" tubes, as distinct from the "hard" vacuum type, in which the internal gas pressure is reduced as far as possible. Almost all tubes depend on the thermionic emission of electrons.

ENIAC


ENIAC (pronounced [ˈɛniæk]), short for Electronic Numerical Integrator And Computer,[1][2] was the first general-purpose electronic computer. It was a Turing-complete, digital computer capable of being reprogrammed to solve a full range of computing problems.[3] ENIAC was designed to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory, but its first use was in calculations for the hydrogen bomb.[4][5] When ENIAC was announced in 1946 it was heralded in the press as a "Giant Brain". It boasted speeds one thousand times faster than electro-mechanical machines, a leap in computing power that no single machine has since matched. This mathematical power, coupled with general-purpose programmability, excited scientists and industrialists. The inventors promoted the spread of these new ideas by teaching a series of lectures on computer architecture.
The ENIAC's design and construction were financed by the United States Army during World War II. The construction contract was signed on June 5, 1943, and work on the computer was begun in secret by the University of Pennsylvania's Moore School of Electrical Engineering starting the following month under the code name "Project PX". The completed machine was unveiled on February 14, 1946 at the University of Pennsylvania, having cost almost $500,000 (nearly $6 m in 2008, adjusted for inflation). It was formally accepted by the U.S. Army Ordnance Corps in July 1946. ENIAC was shut down on November 9, 1946 for a refurbishment and a memory upgrade, and was transferred to Aberdeen Proving Ground, Maryland in 1947. There, on July 29, 1947, it was turned on and was in continuous operation until 11:45 p.m. on October 2, 1955.
ENIAC was conceived and designed by John Mauchly and J. Presper Eckert of the University of Pennsylvania.[6] The team of design engineers assisting the development included Robert F. Shaw (function tables), Chuan Chu (divider/square-rooter), Thomas Kite Sharpless (master programmer), Arthur Burks (multiplier), Harry Huskey (reader/printer), Jack Davis (accumulators) and Iredell Eachus Jr.[7]

THE modern era



Herman Hollerith (1860-1929), Columbia University School of Mines EM 1879, Columbia University PhD 1890. Photo: IBM.
Herman Hollerith is widely regarded as the father of modern automatic computation. He chose the punched card as the basis for storing and processing information and he built the first punched-card tabulating and sorting machines as well as the first key punch, and he founded the company that was to become IBM. Hollerith's designs dominated the computing landscape for almost 100 years.
After receiving his Engineer of Mines (EM) degree at age 19, Hollerith worked on the 1880 US census, a laborious and error-prone operation that cried out for mechanization. After some initial trials with paper tape, he settled on punched cards (pioneered in the Jacquard loom) to record information, and designed special equipment -- a tabulator and sorter -- to tally the results. His designs won the competition for the 1890 US census, chosen for their ability to count combined facts. These machines reduced a ten-year job to three months (different sources give different numbers, ranging from six weeks to three years), saved the 1890 taxpayers five million dollars, and earned him an 1890 Columbia PhD¹. This was the first wholly successful information processing system to replace pen and paper. Hollerith's machines were also used for censuses in Russia, Austria, Canada, France, Norway, Puerto Rico, Cuba, and the Philippines, and again in the US census of 1900. In 1911 Hollerith's company merged with two others to form the Computing-Tabulating-Recording Company (CTR), which changed its name to International Business Machines Corporation (IBM) in 1924.
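
The "combined facts" idea - counting how many records share a particular combination of attributes - is easy to picture in modern terms. Below is a minimal Python sketch of that kind of tally; the field names and records are invented purely for illustration and do not reflect Hollerith's actual 1890 card layout or equipment.

from collections import Counter

# Each "card" is one census record; fields are invented for illustration.
cards = [
    {"state": "NY", "sex": "M", "occupation": "farmer"},
    {"state": "NY", "sex": "F", "occupation": "teacher"},
    {"state": "PA", "sex": "M", "occupation": "farmer"},
    {"state": "PA", "sex": "M", "occupation": "miner"},
]

# "Counting combined facts": tally how many records share a combination
# of field values, much as the tabulator counted combinations of holes.
combined = Counter((card["state"], card["sex"]) for card in cards)

for (state, sex), count in sorted(combined.items()):
    print(f"{state} {sex}: {count}")

Run on the sample records above, this prints one line per state/sex combination with its count - the same kind of cross-tabulation the 1890 census needed, only done by machine instead of by hand.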