CSE 1100 – Lecture 1
Adapted from Mr. Christopher Muir's lecture notes, UWI
A Brief History of Computing
A long time ago, in a galaxy…
Ancient precursors to modern computing
The Algorithm
The Beginning of Computing Machinery
The Evolution of Programming
Augmenting the Human-Computer Interface
The Personal Computer
The Internet
The First (Known) Math/Computing Device
The Ishango Bone
• Dates back to ~20,000 BCE
• Found in the Congo region of Africa
• Believed to be a tool for assisting in multiplication,
division, prime numbers and base arithmetic
• CNN News report on Ishango Bone
Ancient Computing
Humanity has been using computing devices for millennia
Many ancient cultures built devices to aid them in calculations
Examples
The abacus ~ as early as 2700 BCE, Mesopotamia
The Chinese abacus ~ 200 BCE, which allowed more sophisticated calculations
The Antikythera Mechanism ~ 1st century BCE, Greece
A is for Algorithm
Algorithm comes from the Latinized name of Al-Khwarizmi,
a 9th-century Persian mathematician, astronomer and geographer
His work on the Indian numerals introduced the decimal positional number
system to the Western world
Established the basis for innovation in algebra and trigonometry
Devised systematic solutions of linear and quadratic equations
Grandfather of Computer Science??
The Golden Principle of Al-Khwarizmi
The al-Khwarizmi principle states that all complex problems of science can be solved in five
simple steps:
1. Break each problem down into a number of elemental or ‘atomic’ steps, which cannot
be simplified any further.
2. Arrange all the elements or steps of the problem in an order or sequence.
3. Find a way to solve each of the elements separately.
4. Solve each element of the problem, either one at a time or several at a time, but
in the correct order.
5. When all the steps are solved, the original problem itself has also been solved.
Al-Khwarizmi used his own principle to solve many major problems of Algebra, Geometry,
Astronomy, and other fields of science. A small code sketch applying these steps follows below.
Ref: http://kenatica.wordpress.com/2007/01/30/the-golden-principle-of-al-khwarizmi-algorithm/
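A minimal sketch of the principle in modern code, using the quadratic equation as the "complex problem" (the decomposition and names below are illustrative, not from Al-Khwarizmi):

import math

# Solve a*x^2 + b*x + c = 0 via the five steps (assumes real roots)
def solve_quadratic(a, b, c):
    # Steps 1-2: break the problem into atomic steps, arranged in order
    discriminant = b * b - 4 * a * c      # atomic step: compute b^2 - 4ac
    # Steps 3-4: solve each element separately, in the correct order
    root = math.sqrt(discriminant)        # atomic step: take the square root
    x1 = (-b + root) / (2 * a)            # atomic step: first solution
    x2 = (-b - root) / (2 * a)            # atomic step: second solution
    # Step 5: with every step solved, the original problem is solved
    return x1, x2

print(solve_quadratic(1, -3, 2))          # (2.0, 1.0)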
The Beginnings of Modern Computing
(18th/19th Century)
Charles Babbage: Invented the Difference Engine and the Analytical
Engine
Ada Lovelace: Considered to be the first programmer. Worked on the
Analytical Engine with Babbage
She described an algorithm for the Analytical Engine to compute Bernoulli
numbers, which is considered to be the first algorithm specifically tailored for a
computer
Neither the Difference Engine nor the Analytical Engine was ever
completed…
Until Now - http://www.computerhistory.org/babbage/modernsequel/
The First Electronic Computers
Started being built ~ 1940 - http://www.computerhistory.org/timeline/
World War II accelerated computer research for military purposes
Complex Number Calculator (CNC)
Built by Bell Labs in 1940
Could add, subtract, multiply and divide complex numbers
Demonstrated at a conference at Dartmouth
College in New Hampshire, where George Stibitz remotely
connected to the CNC via telegraph
1st demonstration of remote access computing
Colossus
Early special-purpose electronic
computer
Designed specifically to break Nazi codes
Located at Bletchley Park
Built on codebreaking work by Alan Turing and others
http://www.bbc.com/news/technology-26015436
Alan Turing
British mathematician, logician, cryptanalyst, computer scientist
Considered to be the father of Computer Science
Hero during WW2 for his work in breaking Nazi codes
Designed specification for stored-program computers
Pioneer in field of Artificial Intelligence. Turing Test is named after him
Convicted and discredited for “indecency” when his sexual orientation
became known
Committed suicide soon after
60 years later posthumously pardoned and is recognized as one of the
greatest contributors to modern computing
Harvard Mark I (1944)
• General purpose electro-mechanical
computer
• Was used to assist in the complex calculations
for the Manhattan Project
• 55 feet long, 8 feet high, 5 tonnes
• Designed by Howard Aiken and built by IBM
The Information Age
4 main phases so far:
Institutional Computing (1950 - ): large, expensive mainframes used by
governments, universities and industry
Personal Computing: (1975 - ): smaller, cheaper computers allowed for use in
homes and schools
Interpersonal Computing (1995 - ): the Internet gains popularity due to the web and
people start to go online
Collaborative Computing (2005 - ): More devices are connected to the Internet
and users are now encouraged to contribute content, e.g. Facebook, Wikipedia
NEXT???
Institutional Computing
Stored-program computers
All early computers were fixed-program computers, i.e. they were very
restrictive in what they could do (typically mathematical problems)
Alan Turing defined the basis for stored-program computers (1936-1946).
These computers had an instruction set (list of things computer could do)
You could load a series of these instructions (program) to achieve a goal
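A minimal sketch of the idea: the program is just data in memory, run by a fetch-decode-execute loop (the three-instruction set below is invented for illustration, not any historical machine's):

# Toy stored-program machine: the "program" is just a list loaded into memory
def run(program):
    acc = 0                      # a single accumulator register
    pc = 0                       # program counter
    while pc < len(program):
        op, arg = program[pc]    # fetch and decode the next instruction
        if op == "LOAD":         # execute it
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "PRINT":
            print(acc)
        pc += 1                  # move on to the next instruction

# Loading a different instruction list changes what the machine does,
# with no rewiring -- the essence of the stored-program concept
run([("LOAD", 2), ("ADD", 3), ("PRINT", None)])   # prints 5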
Manchester “Baby” and UNIVAC I
Manchester “Baby”: 1st stored-program computer; built at the University of Manchester; first successful run in 1948
UNIVAC I: 1st commercial computer; built by Remington-Rand in 1951; sold to the US Census Bureau
Integrated Circuit Computers
The Integrated Circuit (IC) was conceived in 1952 by Geoffrey Dummer
All components of an electronic circuit completely integrated and
connected without the use of wires, made out of semiconductors
The 1st IC was built in 1958 at Texas Instruments, using germanium as the semiconductor
Fairchild Semiconductor released its silicon IC a year later
The IC paved the way for today’s computing devices
It allowed computers to be made that were smaller and cheaper than
before
Led to updates in graphics and Human Computer Interaction (HCI)
Eventually led to the PC revolution
Time Sharing
Computers were expensive. Few organizations could afford to own one
Instead people/organizations could lease time on existing computers
Large central computers connected to dumb terminals
Terminals were input/output (I/O) devices
Interactive Graphics
Sketchpad – Developed by Ivan Sutherland at MIT in 1963
Direct image manipulation
Demonstrated the first screen windows, icons and copying
Could display multiple images and embedded images
Had a light pen (stylus) as an input device
https://www.youtube.com/watch?v=YB3saviItTI
Augmentation not Automation
"I tell people: look, you can spend all you want on building smart agents and
smart tools…"
"I'd bet that if you then give those to twenty people with no special training,
and if you let me take twenty people and really condition and train them
especially to learn how to harness the tools…"
"The people with the training will always outdo the people for whom the
computers were supposed to do the work."
--- Doug Engelbart
Doug Engelbart
Founded the Augmentation Research Center (ARC) at
Stanford Research Institute (SRI)
Invented the computer mouse
Oversaw the development of hypertext, networked
computers and graphical user interfaces
These concepts were demonstrated in what is now called
“The Mother of all Demos” in 1968
http://youtu.be/VScVgXM7lQQ
http://youtu.be/hRYnloqYKGY
http://www.dougengelbart.org/firsts/dougs-1968-demo.html
Personal Computing
The dawn of the PC and GUI
Xerox PARC
Xerox PARC (Palo Alto Research Center) founded in 1970
Responsible for the prototypes of many of the developments that make up the
Personal Computing era:
laser printing (1971)
Ethernet (1973)
the graphical user interface/desktop (1973)
PC workstation – The Alto (1973)
Object Oriented Programming (OOP) – Smalltalk (1972)
Goal: “The Paperless Office”
Alan Kay (PARC Researcher): “The best way to predict the future is to invent it”
Alan Kay video
Xerox Alto
One of the first personal computers
Released in 1973
1st computer to use a desktop metaphor
One of first to have a mouse-driven GUI
One of first with Ethernet connections for LAN
Inspiration for Apple Macintosh and MS Windows
Alto ad in 1972
The Evolution of Programming Languages
1st Generation Languages
Programmed in the language of computers – Machine Language
Date from the beginning of computing
What is Machine Language?
Based on binary as early electromechanical computers had switches and
plugs which could be set to an “on” (1) or “off” (0) position
Issues:
Machine Dependent: “programs” were specific to the particular computer
Extremely difficult to locate errors or track code changes
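To make "programming in binary" concrete, here is a toy example in Python (the 8-bit format of a 3-bit opcode plus a 5-bit operand is invented for illustration; real machine languages differ from machine to machine, which is exactly the dependence problem above):

# A tiny "machine language" program: nothing but binary words
program = [0b00100010,   # load the value 2
           0b01000011,   # add the value 3
           0b01100001]   # store the result in cell 1
for word in program:
    print(f"{word:08b}")  # raw bits like these were all a 1GL programmer had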
2nd Generation Languages
Assembly Language: symbolic representation of low-level machine code
that could be used for electronic computers (didn’t need to pull plugs/flick
switches anymore)
From the late 1940s
Easier to code than machine language due to symbolic nature
Still machine dependent
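A rough sketch of what an assembler does, reusing the invented 8-bit format from the previous example (the mnemonics and opcodes are illustrative only):

OPCODES = {"LOAD": 0b001, "ADD": 0b010, "STORE": 0b011}   # invented opcodes

def assemble(source):
    # Translate each symbolic line into a binary machine word
    words = []
    for line in source:
        mnemonic, operand = line.split()
        words.append((OPCODES[mnemonic] << 5) | int(operand))
    return words

for word in assemble(["LOAD 2", "ADD 3", "STORE 1"]):
    print(f"{word:08b}")   # the same program as before, written symbolically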
3rd Generation Languages
High level languages, including procedural languages, functional
languages and object oriented languages
First versions were from the 1950s
Some of the first high level languages were designed by the “mother of computing”,
Adm. Grace Hopper
Adm. Grace Murray Hopper
The “mother of computing”
Designed early high-level programming languages MATH-MATIC
and FLOW-MATIC, which helped define COBOL in 1959
Pioneer in the testing of computer systems and is credited with
popularizing the term “debugging” for fixing computer glitches
Was the oldest active duty US Naval officer and retired
(involuntarily) as a one-star Admiral at age 79
Was a consultant for Digital Equipment Corp until her death at 85
3rd Generation Languages
High level: closer to natural languages
Structured: well-defined syntax
Machine independent: once written, the code could be reused on a
variety of computers with little to no modification
Much more abstracted: the programmer didn’t have to deal with a lot of
low-level details
Support for a variety of data types: strings, arrays, lists, etc.
Much easier to debug and maintain than previous generations
3rd Generation Languages (Paradigms)
Procedural Languages: allowed creation of routines that could be called
at any time within a program. E.g. C, Pascal, COBOL
Functional Languages: computation treated as the evaluation of
mathematical functions. E.g. LISP, Scheme
Object Oriented Languages: introduced the concept that programs could
be made up of “objects”, which consist of “attributes” describing
the object and associated routines known as “methods”. E.g. C++, Java,
Smalltalk
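The same small task in each of the three styles, sketched in Python for brevity (the slide's example languages differ; this only shows the flavour of each paradigm):

from functools import reduce

# Procedural: a routine that can be called at any time within a program
def total(prices):
    result = 0
    for p in prices:
        result += p
    return result

# Functional: computation as the evaluation of mathematical functions
total_fn = lambda prices: reduce(lambda acc, p: acc + p, prices, 0)

# Object oriented: an "object" with "attributes" and "methods"
class Cart:
    def __init__(self, prices):
        self.prices = prices      # attribute describing the object
    def total(self):              # method associated with the object
        return sum(self.prices)

prices = [10, 20, 12]
print(total(prices), total_fn(prices), Cart(prices).total())   # 42 42 42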
Dennis Ritchie
Invented the C programming language (1969-1973)
Created Unix (along with Ken Thompson) (1973)
His death was overshadowed by that of Steve Jobs, who passed away the week before
C was used to write Unix, DOS and Windows, and underpins much of the
software that runs the Internet
Linux, MacOS and Android are all based on or modelled after Unix
Received National Medal of Technology for C and Unix with the citation:
“led to enormous advances in computer hardware, software, and networking
systems and stimulated growth of an entire industry, thereby enhancing
American leadership in the Information Age"
“Ritchie was under the radar. His name was not a household name at all, but...
if you had a microscope and could look in a computer, you'd see his work
everywhere inside”
--- Paul Ceruzzi (Computer Historian)
Adele Goldberg
Head of PARC Systems Concepts Lab
Her team developed the GUI and Smalltalk
One of the primary developers of Smalltalk
Smalltalk-80 was the first popular OO language and the first with GUI programming
Directly influenced Java, JavaScript, C#, Python, PHP and others
Adele presents Smalltalk-80
Early Personal Computers
Apple II (1977)
IBM PC (1981), followed by the XT (1983) and AT (1984)
Mark Dean
Holds three of nine patents for the original IBM PC
Designed the Industry Standard Architecture (ISA) bus
This allowed add-on devices to be connected to the motherboard
This invention put him in the National Inventors Hall of Fame
Designed the Color Graphics Adapter (CGA) for the original IBM PC
Was chief engineer for development of the modern PC:
the IBM PC/AT and the IBM PS/2
PCs today still use almost the same configuration as the PS/2
The Apple tour of Xerox PARC
The Apple II was under pressure from competition
Steve Jobs got a tour of PARC (over the dissent of Adele Goldberg) in
exchange for Xerox being allowed to buy discounted pre-IPO Apple shares
From the tour Apple got access to all of PARC's new technology
Jobs was especially impressed with the GUI and mouse combination
Read “The Creation Myth” by Malcolm Gladwell
Dramatization of the visit from movie “Pirates of Silicon Valley”
Apple Lisa (1983)
Based on the ideas they saw at Xerox
2nd PC to be sold that used a GUI (Xerox Star was first)
Lisa was a commercial failure, partly due to cost (~USD10,000)
And partly due to the Macintosh, which was released a year later
Apple Lisa ad
Apple Macintosh (1984)
Priced much more aggressively (~USD2500)
Upgraded GUI
Better graphics
More 3rd party applications
Apple vs. Microsoft
Microsoft licensed parts of early MacOS to create Windows 1.0
Apple complained that MS had used more of MacOS than was agreed to
Apple sued, and the case went as far as the Supreme Court
Microsoft won
The courts decided that the look and feel of software could not be
copyrighted
4th Generation Languages
Very high level languages, more abstracted than 3GLs
Machine independent
Focus was on rapid development and application generation
Goal was to enable programming by a wider population
Paradigms include
General Programming, e.g. IBM Rational (Rose), Ruby, Python, Perl
Database, e.g. SQL, Ingres, Informix
Report Generators, e.g. Oracle Reports, RPG-II, NATURAL
Data Manipulation/Analysis, e.g. R, SPSS, SAS, PL/SQL
GUI/Screen Builders, e.g. Oracle Forms, SB+
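A small taste of the database 4GL style, driven from Python's bundled sqlite3 module (the table and data are invented examples):

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE staff (name TEXT, dept TEXT, salary REAL)")
db.executemany("INSERT INTO staff VALUES (?, ?, ?)",
               [("Ava", "CS", 900.0), ("Ben", "CS", 750.0), ("Cara", "Math", 800.0)])

# The SQL states *what* result is wanted; a single declarative line
# replaces the explicit loop-and-accumulate a 3GL would need
for dept, avg in db.execute("SELECT dept, AVG(salary) FROM staff GROUP BY dept"):
    print(dept, avg)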
5th Generation Languages
Rise of the Machines
Used primarily for Artificial Intelligence
Very High Level: Greater use of natural language
Machine Independent
Constraint driven: Attempts to mimic human thought
Logic driven: use of logic to express information and make inferences
E.g. Prolog, OPS5, CLIPS, WolframAlpha
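A rough Python sketch of the logic-driven style that Prolog provides natively: state facts and a rule, then infer new facts (the family facts below are invented examples):

# Facts, Prolog-style: parent(alice, bob). parent(bob, carol).
facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}

def grandparents(facts):
    # Rule: grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    return {("grandparent", x, z)
            for (p1, x, y1) in facts if p1 == "parent"
            for (p2, y2, z) in facts if p2 == "parent" and y1 == y2}

print(grandparents(facts))   # {('grandparent', 'alice', 'carol')}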
Intelligent Personal Assistants
e.g. Siri
Intelligent Personal Assistant for Apple devices
Developed at SRI (Remember them?)
Uses speech recognition engine developed by Nuance Communications
(also an SRI spin-off)
Recognizes speech and then uses multiple data sources to provide results:
Uses Bing and Yahoo for web search
Uses Bing Answers and Wolfram Alpha to answer factual queries
Is tied into other data sources such as Maps, Yelp, weather, etc
Main competition: Google Now and Microsoft Cortana
What’s next?