Five Generations of Computers
We’ve come a long way since the first generation of computers, with
each new generation bringing significant advances in speed and
power to computing tasks. Today we will learn about each of the
five generations of computers and the major technology
developments that have led to the computer technology we use today.
The history of computer development is often described in terms of
the different generations of computing devices. Each computer
generation is characterized by a major technological development
that fundamentally changed the way computers operate.
Each major development, from the 1940s to the present day’s fifth
generation of computers, has introduced smaller, cheaper, more
powerful, and more efficient computing machines. This technology
has shrunk storage hardware and increased portability.
What Are the 5 Generations of Computers?
Our journey through the five generations of computers starts in the
1940s with vacuum tube circuitry and goes to the present day and
beyond with artificial intelligence (AI) systems and devices.
5 Generations of Computers
Getting Started: Key Terms to Know
First Generation: Vacuum Tubes
Second Generation: Transistors
Third Generation: Integrated Circuits
Fourth Generation: Microprocessors
Fifth Generation: Artificial Intelligence (AI)
Getting Started: Key Terms to Know
The following technology definitions will help you to better
understand the five generations of computing:
Computer – an electronic machine that is used
for storing, organizing, and finding words, numbers,
and pictures, for doing calculations, and
for controlling other machines.
Integrated circuit (IC) – also known as a microchip,
computer chip, or simply chip, is a small electronic device made
up of multiple interconnected electronic components such as
transistors, resistors, and capacitors.
Microprocessor – a computer processor in which the data
processing logic and control are included on a single integrated
circuit (IC), or a small number of ICs.
Magnetic drum – a direct-access, or random-access, storage
device. A magnetic drum, also referred to simply as a drum, is a
metal cylinder coated with magnetic iron-oxide material on which
data and programs can be stored. Magnetic drums were once used
as primary storage devices but have since served as auxiliary
storage devices.
Binary – using a number system that has only two digits,
0 and 1.
Semiconductor – a material that is neither a good conductor
of electricity (like copper) nor a good insulator (like rubber).
The most common semiconductor materials are silicon and
germanium. These materials are then doped to create an excess or
lack of electrons.
Nanotechnology – a field of science whose goal is to control
individual atoms and molecules to create computer chips and other
devices that are thousands of times smaller than current
technologies permit.
Machine language – the lowest-level programming language
(except for computers that utilize programmable microcode).
Machine languages are the only languages understood by computers,
as they consist only of numbers.
Assembly language – a representation of processor
instructions in human-readable form.
Artificial Intelligence (AI) – in its broadest sense,
intelligence exhibited by machines, particularly computer systems.
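The binary definition above can be made concrete with a short Python sketch showing the same value written in decimal and in binary:

```python
# Every value a computer stores is ultimately a pattern of 0s and 1s.
n = 42

# Convert decimal to a binary string (Python prefixes it with '0b').
print(bin(n))            # '0b101010'

# Convert a binary string back to a decimal integer.
print(int("101010", 2))  # 42
```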
First Generation: Vacuum Tubes (1940–1956)
The first generation of computer systems used vacuum tubes for
circuitry and magnetic drums for main memory, and they were often
enormous, taking up entire rooms. These computers were very
expensive to operate, and in addition to using a great deal of
electricity, the first computers generated a lot of heat, which was
often the cause of malfunctions. The maximum internal storage
capacity was 20,000 characters.
First-generation computers relied on machine language, the lowest-
level programming language understood by computers, to perform
operations, and they could only solve one problem at a time. It would
take operators days or even weeks to set up a new problem. Input
was based on punched cards and paper tape, and output was
displayed on printouts.
It was in this generation that the von Neumann architecture was
introduced, which describes the design of an electronic digital
computer. Later, the ENIAC and UNIVAC computers, built by
J. Presper Eckert and John Mauchly, became examples of first-
generation computer technology. The UNIVAC was the first
commercial computer; its first unit was delivered to the
U.S. Census Bureau in 1951.
Second Generation: Transistors (1956–1963)
The world would see transistors replace vacuum tubes in the second
generation of computers. The transistor was invented at Bell Labs in
1947 but did not see widespread use in computers until the late
1950s. This generation of computers also included hardware
advances like magnetic core memory, magnetic tape, and the
magnetic disk.
The transistor was far superior to the vacuum tube, allowing
computers to become smaller, faster, cheaper, more energy-efficient,
and more reliable than their first-generation predecessors. Though
the transistor still generated a great deal of heat, which could
damage the computer, it was a vast improvement over the vacuum
tube. A second-generation computer still relied on punched cards
for input and printouts for output.
When Did Computers Start Using Assembly Languages?
Second-generation computers moved from cryptic binary language to
symbolic, or assembly, languages, which allowed programmers to
specify instructions in words. High-level programming languages were
also being developed at this time, such as early versions
of COBOL and FORTRAN. These were also the first computers that
stored their instructions in their memory, which moved from a
magnetic drum to magnetic core technology.
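To make the step from machine language to assembly language concrete, here is a minimal sketch in Python of what an assembler does: it translates symbolic mnemonics into the numeric opcodes a machine actually executes. The instruction set here (LOAD, ADD, STORE, HALT and their opcodes) is invented purely for illustration and does not correspond to any real processor:

```python
# Hypothetical opcodes for a toy machine (invented for this example).
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(program):
    """Translate lines like 'LOAD 7' into (opcode, operand) number pairs."""
    machine_code = []
    for line in program:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        machine_code.append((OPCODES[mnemonic], operand))
    return machine_code

# A programmer writes words; the machine runs numbers.
source = ["LOAD 7", "ADD 5", "STORE 0", "HALT"]
print(assemble(source))  # [(1, 7), (2, 5), (3, 0), (255, 0)]
```

The point of the sketch is the mapping itself: before assembly languages, programmers had to write the numeric pairs on the right directly.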
The first computers of this generation were developed for the atomic
energy industry.
Third Generation: Integrated Circuits (1964–1971)
The development of the integrated circuit was the hallmark of the
third generation of computers. Transistors were miniaturized and
placed on silicon chips made of semiconductor material, which
drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with a
third-generation computer through keyboards, monitors, and an
operating system, which allowed the device to run many different
applications at one time with a central program that
monitored the memory. Computers, for the first time, became
accessible to a mass audience because they were smaller and
cheaper than their predecessors.
Did You Know… ? Integrated circuit (IC) chips are small electronic
devices made out of semiconductor material. The first integrated
circuits were developed independently in the late 1950s by Jack
Kilby of Texas Instruments and Robert Noyce of Fairchild
Semiconductor.
Fourth Generation: Microprocessors (1971–Present)
The microprocessor ushered in the fourth generation of computers, as
thousands of integrated circuits were built onto a single silicon chip.
The technology in the first generation that filled an entire room could
now fit in the palm of the hand. The Intel 4004 chip, developed in
1971, integrated all the components of the computer, from
the central processing unit and memory to input/output controls, on a
single chip.
In 1981, IBM introduced its first personal computer for the home user,
and in 1984 Apple introduced the Macintosh. Microprocessors also
moved out of the realm of desktop computers and into many areas of
life as more and more everyday products began to use the
microprocessor chip.
As these small computers became more powerful, they could be
linked together to form networks, which eventually led to the
development of the Internet. The fourth generation also saw the
development of GUIs, the mouse, and handheld technology.
Fifth Generation: Artificial Intelligence (Present and
Beyond)
The fifth generation of computer technology, based on artificial
intelligence, is still in development. However, there are some
applications, such as voice recognition, that are being used today.
The use of parallel processing and superconductors is helping to
make artificial intelligence a reality. This generation has also
gone furthest in packing large amounts of storage into compact,
portable devices.
Quantum computation and molecular and nanotechnology will
radically change the face of computers in years to come. The goal of
fifth-generation computing is to develop devices that will respond
to natural language input and are capable of learning and self-
organization.