The first computers were people!
That is, electronic
    computers (and the earlier mechanical computers) were given
    this name because they performed the work that had
    previously been assigned to people. "Computer" was
    originally a job title: it was used to describe those human
    beings (predominantly women) whose job it was to perform
    the repetitive calculations required to compute such things as
    navigational tables, tide charts, and planetary positions for
    astronomical almanacs.
   Imagine you had a job where hour after hour, day after
    day, you were to do nothing but compute multiplications.
    Boredom would quickly set in, leading to carelessness,
    leading to mistakes. And even on your best days you
    wouldn't be producing answers very fast. Therefore,
    inventors have been searching for hundreds of years for
    a way to mechanize (that is, find a mechanism that can
    perform) this task.
[Picture: ancient counting tables]
   The abacus was an early aid for
    mathematical computations. Its
    only value is that it aids the
    memory of the human performing
    the calculation. A skilled abacus
    operator can work on addition and
    subtraction problems at the speed
    of a person equipped with a hand
    calculator (multiplication and
    division are slower).
   The abacus is often
    wrongly attributed to
    China. In fact, the oldest
    surviving abacus was
    used in 300 B.C. by the
    Babylonians. The
abacus is still in use today, principally in the Far East.
In 1614 an eccentric (some say mad) Scotsman named John Napier invented logarithms, a technique that allows multiplication to be performed via addition.
Ex: log(a × b) = log(a) + log(b)
   The magic ingredient is the
    logarithm of each operand,
    which was originally
    obtained from a printed
    table. But Napier also
    invented an alternative to
    tables, where the logarithm
    values were carved on ivory
    sticks which are now called
    Napier's Bones.
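As a worked illustration (the numbers here are our own example, not from the original slides): to multiply 23 by 47, look up log 23 ≈ 1.3617 and log 47 ≈ 1.6721 in the table, add them to get 3.0338, and then look up which number has that logarithm. The answer, about 1081, is 23 × 47.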
Napier's invention led directly to the slide rule, first built in England in 1632 and still in use in the 1960s by the NASA engineers of the Mercury, Gemini, and Apollo programs that landed men on the moon.
   Leonardo da Vinci (1452-1519) made drawings of
    gear-driven calculating machines but apparently
    never built any.
The first gear-driven calculating machine to actually be built was probably the calculating clock, so named by its inventor, the German professor Wilhelm Schickard, in 1623. This device got little publicity because Schickard died soon afterward of the bubonic plague.
In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father, who was a tax collector. Pascal built 50 of these gear-driven one-function calculators (they could only add) but couldn't sell many because of their exorbitant cost and because they really weren't that accurate (at that time it was not possible to fabricate gears with the required precision).
Until car dashboards went digital, the odometer portion of a car's speedometer used the very same mechanism as the Pascaline to increment the next wheel after each full revolution of the prior wheel.
   Just a few years after Pascal, the German Gottfried Wilhelm Leibniz (co-
    inventor with Newton of calculus) managed to build a four-function
    (addition, subtraction, multiplication, and division) calculator that he
    called the stepped reckoner because, instead of gears, it employed fluted
    drums having ten flutes arranged around their circumference in a stair-
    step fashion. Although the stepped reckoner employed the decimal
    number system (each drum had 10 flutes), Leibniz was the first to
    advocate use of the binary number system which is fundamental to the
operation of modern computers. Leibniz is considered one of the greatest philosophers, but he died poor and alone.
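As a quick illustration of binary (our own example, not from the original slides): the decimal number 13 is written 1101 in binary, since 1×8 + 1×4 + 0×2 + 1×1 = 13; modern computers represent all of their data with such strings of 1's and 0's.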
   In 1801 the Frenchman Joseph
    Marie Jacquard invented a power
    loom that could base its weave
    (and hence the design on the
    fabric) upon a pattern
    automatically read from punched
    wooden cards, held together in a
long row by rope. Descendants of these punched cards have been in use ever since (remember the "hanging chad" from the Florida presidential ballots of the year 2000?).
   By selecting
    particular cards for
    Jacquard's loom
    you defined the
    woven pattern
[Picture: close-up of a tapestry woven by the loom]
   Jacquard's technology was a real
    boon to mill owners, but put
    many loom operators out of work.
    Angry mobs smashed Jacquard
    looms and once attacked Jacquard
    himself. History is full of
    examples of labor unrest
    following technological
innovation, yet most studies show
    that, overall, technology has
    actually increased the number of
    jobs.
   By 1822 the English
    mathematician Charles
    Babbage was proposing a
steam-driven calculating
    machine the size of a
    room, which he called the
    Difference Engine.
   This machine would be able to compute tables of numbers, such
    as logarithm tables.
   He obtained government funding for this project due to the
    importance of numeric tables in ocean navigation.
   Construction of Babbage's Difference Engine proved exceedingly
    difficult and the project soon became the most expensive
government-funded project up to that point in English history.
   Ten years later the device was still nowhere near complete,
    acrimony abounded between all involved, and funding dried up.
    The device was never finished.
   Babbage was not deterred, and by then was on to his next
    brainstorm, which he called the Analytic Engine.
This device, as large as a house and powered by 6 steam engines, would be programmable, thanks to the punched card technology of Jacquard.
   Babbage saw that the pattern of holes in a punch card could
    be used to represent an abstract idea such as a problem
    statement or the raw data required for that problem's solution.
   Babbage realized that punched paper could be employed as a
    storage mechanism, holding computed numbers for future
    reference.
   Because of the connection to the Jacquard loom, Babbage
    called the two main parts of his Analytic Engine the "Store"
    and the "Mill", as both terms are used in the weaving
    industry.
   The Store was where numbers were held and the Mill was
    where they were "woven" into new results.
   In a modern computer these same parts are called the
    memory unit and the central processing unit (CPU).
   The Analytic Engine also had a key function that
    distinguishes computers from calculators: the
    conditional statement.
   A conditional statement allows a program to
    achieve different results each time it is run.
With a conditional statement, the path the program takes can depend on a situation that is detected at the very moment the program is running.
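A minimal modern sketch of this idea, written in Python rather than anything Babbage could have used (the variable and the threshold are made-up values for illustration):

    reading = 17                      # a value only known while the program runs
    if reading > 10:                  # the conditional statement
        print("large value: take one path")
    else:
        print("small value: take the other path")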
Babbage befriended Ada Byron, the daughter of the famous poet Lord Byron. Though she was only 19, she was fascinated by Babbage's ideas. She began fashioning programs for the Analytic Engine, even though it was still unbuilt. The Analytic Engine remained unbuilt (the British government refused to get involved with this one), but Ada earned her spot in history as the first computer programmer.
   Ada invented the subroutine and was the first to recognize
    the importance of looping.
   The next breakthrough occurred in America. The U.S.
    Constitution states that a census should be taken of all U.S.
    citizens every 10 years in order to determine the
    representation of the states in Congress.
   While the very first census of 1790 had only required 9
    months, by 1880 the U.S. population had grown so much that
    the count for the 1880 census took 7.5 years. Automation was
    clearly needed for the next census.
The Census Bureau offered a prize for an inventor to help with the 1890 census, and this prize was won by Herman Hollerith.
The Hollerith desk consisted of:
a card reader, which sensed the holes in the cards;
a gear-driven mechanism, which could count (similar to Pascal's); and
a large wall of dial indicators to display the results of the count.
   Hollerith's technique was
    successful and the 1890
    census was completed in
    only 3 years at a savings
    of 5 million dollars.
   Hollerith built a company,
    the Tabulating Machine
Company, which, after a
    few buyouts, eventually
    became International
    Business Machines,
    known today as IBM.
   By using punch cards,
    Hollerith created a way to
    store and retrieve
    information.
This was the first type of read-and-write technology.
   The U.S. military desired a mechanical calculator more
    optimized for scientific computation.
   By World War II the U.S. had battleships that could lob shells
    weighing as much as a small car over distances up to 25
    miles.
   Physicists could write the equations that described how
    atmospheric drag, wind, gravity, muzzle velocity, etc. would
    determine the trajectory of the shell, but solving such
    equations was extremely laborious.
   Human computers would compute results of these equations and
    publish them in ballistic "firing tables"
   During World War II the U.S. military scoured the country looking
    for (generally female) math majors to hire for the job of computing
    these tables, but not enough humans could be found to keep up with
    the need for new tables.
   Sometimes artillery pieces had to be delivered to the battlefield
    without the necessary firing tables and this meant they were close to
    useless because they couldn't be aimed properly.
Faced with this situation, the U.S. military was willing to invest in even harebrained schemes to automate this type of computation.
   One early success was the
    Harvard Mark I computer
    which was built as a
    partnership between Harvard
    and IBM in 1944.
   This was the first
    programmable digital
    computer made in the U.S.
   But it was not a purely
    electronic computer. Instead
    the Mark I was constructed out
    of switches, relays, rotating
    shafts, and clutches.
   The machine weighed 5 tons,
    incorporated 500 miles of
    wire, was 8 feet tall and 51 feet
    long, and had a 50 ft rotating
    shaft running its length, turned
    by a 5 horsepower electric
    motor.
   The Mark I ran non-stop for 15
    years, sounding like a roomful
    of ladies knitting.
   One of the primary programmers
    for the Mark I was a woman,
    Grace Hopper.
   Hopper found the first computer
    "bug": a dead moth that had
    gotten into the Mark I
   The word "bug" had been used to
    describe a defect since at least
    1889 but Hopper is credited with
    coining the word "debugging" to
    describe the work to eliminate
    program faults.
   On a humorous note, the principal designer of the
    Mark I, Howard Aiken of Harvard, estimated in
    1947 that six electronic digital computers would
    be sufficient to satisfy the computing needs of the
    entire United States.
   IBM had commissioned this study to determine whether it
    should bother developing this new invention into one of its
    standard products (up until then computers were one-of-a-
    kind items built by special arrangement).
   Aiken's prediction wasn't actually so bad as there were very
    few institutions (principally, the government and military)
    that could afford the cost of what was called a computer in
    1947.
He just didn't foresee the micro-electronics revolution, which would eventually shrink something like the room-sized IBM Stretch computer of 1959 down to a single chip.
The first electronic computer was designed at Iowa State between 1939 and 1942. The Atanasoff-Berry Computer used the binary system (1's and 0's). It contained vacuum tubes and stored numbers for calculations by burning holes in paper.
One of the earliest attempts to build an all-electronic (that is, no gears, cams, belts, shafts, etc.) digital computer was begun in 1937 by J. V. Atanasoff.
    This machine was the first to
    store data as a charge on a
    capacitor, which is how today's
    computers store information in
    their main memory (DRAM or
    dynamic RAM). As far as its
    inventors were aware, it was also
    the first to employ binary
    arithmetic.
The Colossus was built during World War II by Britain for the purpose of breaking the cryptographic codes used by Germany.
Britain led the world in designing and building electronic machines dedicated to code breaking, and was routinely able to read coded German radio transmissions.
Colossus was not, however, a general-purpose, reprogrammable machine.
   The title of forefather of today's all-electronic digital
    computers is usually awarded to ENIAC, which stood for
    Electronic Numerical Integrator and Calculator.
ENIAC was built at the University of Pennsylvania between 1943 and 1945 by two professors, John Mauchly and the 24-year-old J. Presper Eckert, who got funding from the war department after promising they could build a machine that would replace all the human "computers."
   ENIAC filled a 20 by 40 foot room, weighed 30 tons, and
    used more than 18,000 vacuum tubes.
To reprogram the ENIAC you had to rearrange its patch cords and change the settings of 3,000 switches.
   To program a modern computer, you type out a program with
    statements like:
   Circumference = 3.14 * diameter
   To perform this computation on ENIAC you had to rearrange
    a large number of patch cords and then locate three particular
    knobs on that vast wall of knobs and set them to 3, 1, and 4.
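A minimal runnable version of that modern statement, sketched in Python (the diameter value here is just an example we chose):

    diameter = 2.0
    circumference = 3.14 * diameter   # the statement from the slide
    print(circumference)              # prints 6.28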
   The ENIAC used 18,000 vacuum tubes to hold a
    charge
   Vacuum tubes were so notoriously unreliable that
    even twenty years later many neighborhood drug
    stores provided a "tube tester"
In 1945 John von Neumann presented his idea of a stored-program computer: one that keeps its program instructions in memory along with its data. The CPU (Central Processing Unit) consists of the elements that fetch those instructions and control the computer electronically.
The EDVAC, EDSAC, and UNIVAC were among the first computers to use the stored-program concept. They used vacuum tubes, so they were too large and too expensive for households to own.
It took days to change ENIAC's program. Eckert and Mauchly next teamed up with the mathematician John von Neumann to design EDVAC, which pioneered the stored program.
   After ENIAC and EDVAC
    came other computers with
    humorous names such as
    ILLIAC, JOHNNIAC, and,
    of course, MANIAC
In 1947 the transistor was invented. The transistor made computers smaller and less expensive and increased their calculating speed.
Second-generation computers also stored data in a new way: punched cards were replaced with magnetic tape on reel-to-reel machines.
   The UNIVAC computer was
    the first commercial (mass
    produced) computer.
In the 1950s, UNIVAC (a contraction of "Universal Automatic Computer") was the household word for "computer," just as "Kleenex" is for "tissue."
   UNIVAC was also the first
    computer to employ magnetic
    tape.
Transistors were replaced by integrated circuits (ICs). One IC could replace hundreds of transistors.
   This made computers even
    smaller and faster.
In 1971 the Intel Corporation invented the microprocessor: an entire CPU on one chip. This led to microcomputers: computers that fit on a desk.
   If you learned computer
    programming in the
1970s, you dealt with
    what today are called
    mainframe computers,
    such as the IBM 7090
    (shown below), IBM 360,
    or IBM 370.
There were two ways to interact with a mainframe.
   The first was called time
    sharing because the computer
    gave each user a tiny sliver of
    time in a round-robin fashion.
Perhaps 100 users would be simultaneously logged on, each typing on a teletype.
   A teletype was a motorized
    typewriter that could transmit
    your keystrokes to the
    mainframe and then print the
    computer's response on its roll
    of paper.
   You typed a single line of text,
    hit the carriage return button,
    and waited for the teletype to
    begin noisily printing the
    computer's response
The alternative to time sharing was batch mode processing, where the computer gave its full attention to your program.
In exchange for getting the computer's full attention at run time, you had to agree to prepare your program off-line on a keypunch machine, which generated punched cards.
University students in the 1970s bought blank cards a linear foot at a time from the university bookstore.
   Each card could hold only 1 program statement.
   To submit your program to the mainframe, you placed your
    stack of cards in the hopper of a card reader.
   Your program would be run whenever the computer made it
    that far.
   You often submitted your deck and then went to dinner or to
    bed and came back later hoping to see a successful printout
    showing your results
But things changed fast. By the 1990s a university student would typically own his own computer and have exclusive use of it in his dorm room.
   This transformation was a
    result of the invention of the
    microprocessor.
A microprocessor (µP) is a computer that is fabricated on an integrated circuit (IC).
   Computers had been around
    for 20 years before the first
    microprocessor was developed
    at Intel in 1971.
   The micro in the name
    microprocessor refers to
    the physical size.
Intel didn't invent the electronic computer, but they were the first to succeed in cramming an entire computer onto a single chip (IC).
The microelectronics revolution is what allowed the hand-crafted wiring of machines like ENIAC to be mass-produced as an integrated circuit, a small sliver of silicon the size of your thumbnail.
   Integrated circuits and
    microprocessors allowed
    computers to be faster
   This led to a new age of
    computers
The first home-brew computer was called the ALTAIR 8800.
Most of the information for this PowerPoint was obtained from the following web page:
http://www.computersciencelab.com/ComputerHistory/History.htm