Generations of Computers
A computer generation refers to a stage in the development of electronic computing technology. Five generations of computers are commonly identified, with a sixth generation possibly emerging in the early twenty-first century.
Each generation has advanced markedly over its predecessor, with major changes in size, technology, and capability.
1st generation of computers (1940 – 1956)
The earliest computers were based on vacuum tube technology; the vacuum tube (diode valve) was invented in 1904 by John Ambrose Fleming. Vacuum tubes were the most important components in these machines, used for computation, storage, and control.
To perform operations, first-generation computers relied on the lowest-level machine language, and they could solve only one problem at a time.
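To give a sense of what "lowest-level machine language" means, the sketch below simulates a toy accumulator machine in which a program is nothing but numeric opcodes and operands. The opcodes are invented for this illustration; they do not correspond to any real first-generation instruction set.

```python
# Toy illustration of raw machine-language programming:
# instructions are just numbers, with no mnemonics or compiler.
# These opcodes are hypothetical, invented for this sketch.

LOAD, ADD, HALT = 0x1, 0x2, 0xF  # hypothetical opcode values

program = [
    (LOAD, 7),   # accumulator = 7
    (ADD, 5),    # accumulator += 5
    (HALT, 0),   # stop execution
]

def run(program):
    """Execute one program to completion, one instruction at a time."""
    acc = 0
    for opcode, operand in program:
        if opcode == LOAD:
            acc = operand
        elif opcode == ADD:
            acc += operand
        elif opcode == HALT:
            break
    return acc

print(run(program))  # 12
```

Like a first-generation machine, this interpreter runs a single program from start to finish; there is no notion of running two problems at once.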
Memory in these computers took the form of magnetic drums, which were very slow. Punched cards and magnetic tape were used for the computer's input and output, with results displayed on printers, even though those results were not always completely accurate.
(Image: magnetic and punched tapes)
Note that first-generation computers predated the microprocessor entirely; 8-bit microprocessors would not appear until the fourth generation.
The downsides of first-generation computers were their massive size and weight (each contained thousands of vacuum tubes), the large amount of space they occupied, and the difficulty of moving them once installed. Another disadvantage was their use of a decimal number system and a large number of switches and wires.
The initial generation of computers had the advantage of being able to calculate in milliseconds (about five thousand additions per second).
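The "milliseconds" claim can be checked against the article's own figure of roughly five thousand additions per second with a quick calculation:

```python
# At roughly 5,000 additions per second, each addition takes
# a fraction of a millisecond.
ops_per_second = 5_000
ms_per_operation = 1_000 / ops_per_second
print(ms_per_operation)  # 0.2 ms per addition
```

So each operation takes about a fifth of a millisecond, consistent with the claim.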
2nd generation of computers (1956 – 1964)
Second-generation computers replaced vacuum tubes with transistors, and they were the first to use magnetic-core memory instead of a magnetic drum to store instructions. During this period, the first computer game, "Spacewar!", was played on a PDP-1 computer.
During the second generation, the central processing unit (CPU), multiprogramming operating systems, programming languages, memory, and input/output (I/O) units were developed.
The main problems of second-generation computers were that they still used punched cards for input and hard copies for output, that they were difficult to move because of their massive size, and that some of them required air conditioning.
The atomic energy industry, nuclear power plants, and other commercial fields were among the first to employ this second generation of computers.
3rd generation of computers (1964 – 1971)
The third generation introduced integrated circuits (invented by Jack Kilby in 1958 and developed through the early 1960s). An integrated circuit (IC) consists of many tiny transistors fabricated on a single semiconductor chip.
Multiprogramming, in which several executable programs are kept in memory at once, was developed during the same period and helped reduce computing costs. IBM introduced the term "computer architecture" in the mid-1960s. Minicomputers appeared at the end of the 1960s.
This ground-breaking innovation allowed machines to increase their processing capability and memory.
The third generation of computers was the first step toward miniaturisation, and it significantly expanded their capabilities: process control, automation of scientific experiments, data transmission, and so on. The same integrated-circuit technology was also used in the production of radios, televisions, and similar products.
4th generation of computers (1971 – 2010)
The fourth generation began when thousands of integrated circuits, equivalent to millions of transistors, were combined onto a single small chip: the microprocessor. This chip contained the entire central processing unit and other fundamental elements of the machine, and was mounted in the CPU socket.
This generation of computers employed operating systems with a graphical user interface (GUI), making it extremely simple to perform mathematical and logical operations.
High-speed memory systems built from integrated circuits, with capacities of several megabytes, began to be used in computers. Computer speed improved substantially, to hundreds of millions of operations per second.
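Comparing this with the first generation's roughly five thousand operations per second gives a sense of the scale of the improvement. The calculation below uses the article's own approximate figures, taking 200 million operations per second as an illustrative value for "hundreds of millions":

```python
# Rough speedup from first-generation (~5,000 ops/s) to
# fourth-generation (~hundreds of millions of ops/s) machines.
first_gen_ops_per_s = 5_000
fourth_gen_ops_per_s = 200_000_000  # illustrative value

speedup = fourth_gen_ops_per_s / first_gen_ops_per_s
print(f"{speedup:,.0f}x faster")  # 40,000x faster
```

Even with these rough numbers, the fourth generation comes out tens of thousands of times faster.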
Personal computers were invented during this period, a concept that still exists today. This was also the generation of the DEC (Digital Equipment Corporation) minicomputers.
5th generation of computers (2010 – now)
ULSI (Ultra Large Scale Integration) is the process of integrating millions of transistors on a single silicon microchip. The fifth and most recent generation of computers, built on ULSI technology, is associated with artificial intelligence.
The main goal of fifth-generation computing, and of ongoing research effort, is to make computers intelligent by implementing artificial intelligence, so that devices can respond to natural-language input and are capable of learning and self-organising.
This new information technology has greatly increased the capability of the microprocessor, resulting in the widespread use of computers in fields such as entertainment, accounting, education, filmmaking, traffic control, business applications, and hospitals, as well as engineering, research, and defence.
As a result, the AI (Artificial Intelligence) generation of computers is also known as the 5th generation of computers.