HISTORY OF COMPUTERS

The history of computers is often described in terms of the different generations of computing devices. A generation refers to a stage of improvement in the development of a product, and the term is also used to mark major advances in computer technology. With each new generation, the circuitry has become smaller and more advanced than in the generation before it. As a result of this miniaturization, speed, power and memory have all increased proportionally. New discoveries that affect the way we live, work and play are constantly being made.
Each generation of computers is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, more efficient and more reliable devices. There are five generations of computers, namely:

1. First generation
2. Second generation
3. Third generation
4. Fourth generation
5. Fifth generation

Computer Generations

First generation (1940–1956): vacuum tubes
Second generation (1956–1963): transistors
Third generation (1964–1971): integrated circuits
Fourth generation (1971–present): microprocessors
Fifth generation (present and beyond): artificial intelligence

The First Generation 

•The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions. First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.

The Second Generation

•Transistors replaced vacuum tubes and ushered in the second generation of computers. One transistor replaced the equivalent of 40 vacuum tubes, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable, though they still generated a great deal of heat that could damage the computer. Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. They still relied on punched cards for input and printouts for output. These were also the first computers to store their instructions in memory, which moved from magnetic drum to magnetic core technology.

The Third Generation

•The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Third-generation computers were much smaller and cheaper than second-generation computers and could carry out instructions in billionths of a second. Users interacted with them through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. For the first time, computers became accessible to a mass audience because they were smaller and cheaper than their predecessors.

The Fourth Generation


•The microprocessor brought in the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of graphical user interfaces (GUIs), the mouse and handheld devices.

The Fifth Generation

•Fifth-generation computing is based on artificial intelligence (AI) and is still in development. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. The goal is to develop devices that respond to natural language input and are capable of learning and self-organization. Some applications, such as voice recognition, are already in use today.
The term artificial intelligence was coined in 1956 by John McCarthy at Dartmouth College. Artificial intelligence includes:

1. Game playing
2. Expert systems
3. Natural language processing
4. Neural networks
5. Robotics

ASSIGNMENT

1. In how many ways can we classify computers? List them.
2. Explain the basic classification of computers.
