History of Computing

The history of computing traces significant milestones from Joseph Marie Jacquard's punched card loom in 1801 to the emergence of modern technologies like Cloud Computing and AI from 2018 onwards. Key developments include the invention of the transistor in 1947, the introduction of personal computers in the 1970s, and the foundation of the World Wide Web in 1990. This timeline highlights the evolution of computing technology and its impact on society and industry.

• 1801: Joseph Marie Jacquard invents a loom that uses punched wooden cards to weave fabric automatically, a precursor to early computer punch card systems.
• 1890: Herman Hollerith designs a punch card system to tabulate the 1890 census, saving the U.S. government an estimated $5 million and founding a company that later becomes IBM.
• 1936: Alan Turing introduces the concept of a universal machine, later known as the Turing machine, which lays the groundwork for modern computing.
• 1941: John Vincent Atanasoff and his graduate student Clifford Berry build a computer capable of solving 29 equations simultaneously, the first machine to store information in its main memory.
 1943-1944: John Mauchly and J. Presper Eckert build the ENIAC (Electrical
Numerical Integrator and Calculator), the first large-scale digital computer,
utilizing 18,000 vacuum tubes.
• 1946: Mauchly and Eckert leave the University of Pennsylvania to develop UNIVAC, the first commercial computer for business and government applications.
• 1947: William Shockley, John Bardeen, and Walter Brattain invent the transistor, a smaller and more efficient alternative to the vacuum tube.
• 1953: Grace Hopper develops an early compiler and programming language for business data processing; her work paves the way for COBOL, the business-oriented language that remains in use today.
• 1954: John Backus and his team at IBM create FORTRAN, a language for scientific computing that makes programming more accessible.
• 1958: Jack Kilby and Robert Noyce independently invent the integrated circuit, commonly known as the computer chip, a crucial development in computer technology.
• 1964: Douglas Engelbart develops a prototype of the modern computer, with a mouse and a graphical user interface (GUI), which he famously demonstrates in 1968.
• 1969: Ken Thompson and Dennis Ritchie at Bell Labs develop the UNIX operating system, later prized for its portability across hardware platforms.
• 1970: Intel releases the Intel 1103, the first commercially available DRAM chip, revolutionizing memory storage.
• 1971: A team of IBM engineers led by Alan Shugart invents the floppy disk, making it easy to share data among computers.
• 1973: Robert Metcalfe, a researcher at Xerox PARC, develops Ethernet for connecting multiple computers over a network.
• 1975: The Altair 8800, an early personal computer, is introduced; Paul Allen and Bill Gates write software for it and go on to found Microsoft.
• 1976: Apple Computer is founded by Steve Jobs and Steve Wozniak, who launch the Apple I.
• 1981: IBM introduces its first personal computer, popularizing the term "PC."
• 1990: Tim Berners-Lee develops HTML, laying the foundation for the World Wide Web.
• 1999: Wi-Fi technology begins to emerge, enabling wireless internet connections.
• 2001: Apple unveils Mac OS X; Microsoft releases Windows XP.
• 2010: Apple introduces the iPad, transforming media consumption.
• 2018-Present: Emerging technologies include Cloud Computing, the Internet of Things (IoT), Artificial Intelligence (AI), Virtual Assistants, Augmented Reality (AR), 3-D Printing, Robotic Process Automation (RPA), and evolving Cybersecurity measures.
