Computer
From Wikipedia, the free encyclopedia
Computers and computing devices from different eras
A computer is a machine that can be instructed to carry
out sequences of arithmetic or logical operations automatically via computer
programming. Modern computers can follow generalized sets of operations, called
programs, which enable them to perform an extremely wide range of tasks. A
"complete" computer, including the hardware, the operating system (main software),
and the peripheral equipment required for "full" operation, can be referred to as
a computer system. The term may also be used for a group of computers that are
connected and work together, in particular a computer network or computer cluster.
Computers are used as control systems for a wide variety of industrial and consumer
devices. These range from simple special-purpose devices such as microwave
ovens and remote controls, through factory devices such as industrial robots and
computer-aided design systems, to general-purpose devices such as personal
computers and mobile devices such as smartphones. The Internet runs on computers
and connects hundreds of millions of other computers and their users.
Early computers were conceived only as calculating devices. Since ancient times,
simple manual devices like the abacus aided people in doing calculations. Early in
the Industrial Revolution, some mechanical devices were built to automate long tedious
tasks, such as guiding patterns for looms. More sophisticated electrical machines did
specialized analog calculations in the early 20th century. The first digital electronic
calculating machines were developed during World War II. The
first semiconductor transistors in the late 1940s were followed by the silicon-
based MOSFET (MOS transistor) and monolithic integrated circuit (IC) chip
technologies in the late 1950s, leading to the microprocessor and the microcomputer
revolution in the 1970s. The speed, power and versatility of computers have been
increasing dramatically ever since then, with MOS transistor counts increasing at a
rapid pace (as predicted by Moore's law), leading to the Digital Revolution during the
late 20th to early 21st centuries.
Conventionally, a modern computer consists of at least one processing element,
typically a central processing unit (CPU) in the form of a metal-oxide-
semiconductor (MOS) microprocessor, along with some type of computer memory,
typically MOS semiconductor memory chips. The processing element carries out
arithmetic and logical operations, and a sequencing and control unit can change the
order of operations in response to stored information. Peripheral devices include input
devices (keyboards, mice, joysticks, etc.), output devices (monitor screens,
printers, etc.), and input/output devices that perform both functions (e.g., the
2000s-era touchscreen). Peripheral devices allow information to be retrieved from
an external source, and they enable the results of operations to be saved and
retrieved.
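The stored-program design described above — a processing element executing arithmetic instructions while a control unit changes the order of operations based on stored information — can be sketched as a toy interpreter. The opcode names, instruction format, and sample program below are invented for illustration and do not model any real instruction set:

```python
# A toy stored-program machine illustrating the fetch-decode-execute cycle.
# The opcodes (LOAD, ADD, JNZ, HALT) are hypothetical, chosen only to show
# the roles of the ALU (arithmetic) and the control unit (sequencing).

def run(program):
    """Execute a list of (opcode, operand) pairs and return the accumulator."""
    acc = 0   # accumulator: holds arithmetic results (the ALU's workspace)
    pc = 0    # program counter: the control unit's current position
    while pc < len(program):
        op, arg = program[pc]   # fetch and decode the next instruction
        if op == "LOAD":        # place a constant in the accumulator
            acc = arg
        elif op == "ADD":       # an arithmetic operation
            acc += arg
        elif op == "JNZ":       # jump if the accumulator is non-zero: the
            if acc != 0:        # control unit changes the order of operations
                pc = arg        # in response to stored information
                continue
        elif op == "HALT":      # stop execution
            break
        pc += 1                 # otherwise proceed to the next instruction
    return acc

# Count down from 3 to 0 using a conditional jump back to the ADD instruction.
countdown = [("LOAD", 3), ("ADD", -1), ("JNZ", 1), ("HALT", None)]
print(run(countdown))  # 0
```

Because both the data and the program itself live in memory, changing the list of instructions changes the machine's behavior without changing the machine — the essence of the stored-program concept.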
Contents
1 Etymology
2 History
  2.1 Pre-20th century
  2.2 First computing device
  2.3 Analog computers
  2.4 Digital computers
  2.5 Modern computers
  2.6 Mobile computers
3 Types
  3.1 By architecture
  3.2 By size and form factor
4 Hardware
  4.1 History of computing hardware
  4.2 Other hardware topics
  4.3 Input devices
  4.4 Output devices
  4.5 Control unit
  4.6 Central processing unit (CPU)
  4.7 Arithmetic logic unit (ALU)
  4.8 Memory
  4.9 Input/output (I/O)
  4.10 Multitasking
  4.11 Multiprocessing
5 Software
  5.1 Languages
  5.2 Programs
6 Networking and the Internet
7 Unconventional computers
8 Future
  8.1 Computer architecture paradigms
  8.2 Artificial intelligence
9 Professions and organizations
10 See also
11 References
12 Notes
13 External links
Etymology
A female computer, with microscope and calculator, 1952
According to the Oxford English Dictionary, the first known use of the word "computer"
was in 1613 in a book called The Yong Mans Gleanings by English writer Richard
Braithwait: "I haue [sic] read the truest computer of Times, and the best Arithmetician
that euer [sic] breathed, and he reduceth thy dayes into a short number." This usage of
the term referred to a human computer, a person who carried out calculations or
computations. The word continued with the same meaning until the middle of the 20th
century. During the latter part of this period women were often hired as computers
because they could be paid less than their male counterparts.[1] By 1943, most human
computers were women.[2]
The Online Etymology Dictionary gives the first attested use of "computer" in the 1640s,
meaning "one who calculates"; this is an "agent noun from compute (v.)". The same
source states that the use of the term to mean "'calculating machine' (of any type) is
from 1897", and that the "modern use" of the term, meaning "programmable digital
electronic computer", dates from "1945 under this name; [in a] theoretical [sense]
from 1937, as Turing machine".[3]