COMPUTERS

 

Like most advances in science and technology, the computer has no single inventor or eureka moment of creation. The word computer once described people, predominantly women, who did repetitive mathematical calculations. Only in the 20th century did it come to mean an electronic calculating machine. One of the roots of the modern computer lay in Herman Hollerith's punch card tabulating machine, used to count the 1890 U.S. census. Hollerith's Tabulating Machine Company was consolidated into the Computing-Tabulating-Recording Co. in 1911, which was renamed IBM in 1924.

World War II gave rise to an alliance between the military and academia that marked a turning point for computer development. Harvard scientist Howard Aiken designed an electromechanical computing machine, the Mark I, which IBM built and delivered to Harvard in 1944; U.S. Navy WAVE Grace Hopper became one of its first programmers. The Mark I solved complicated math calculations for the U.S. Navy Bureau of Ships. The ENIAC, developed by John Mauchly and John Presper Eckert at the University of Pennsylvania and completed in 1946, used some 17,000 vacuum tubes to perform calculations a thousand times faster than earlier machines. Military and academic researchers were the primary users of the ENIAC and its successors, the EDVAC and ORDVAC. In the early 1950s, John von Neumann at the Institute for Advanced Study in Princeton led a group of engineers and scientists whose computer design became the basis for the MANIAC, which performed the calculations needed to develop the hydrogen bomb in 1952.

In the 1950s, computers were very large and few in number, but transistors made them smaller and commercialization driven by IBM made them more common in the 1960s. Even so, few would have predicted in 1970 that computers would become a ubiquitous part of the home and office. By the late 1970s, computers had advanced from hobbyist kits to the Apple II and the Radio Shack TRS-80, and within a few years IBM had entered the personal computer market with a machine running Microsoft software. Progressive increases in computer speed and memory made it possible to transform the Internet, a computer network created by the U.S. Defense Department and research universities in the 1970s, into the locus of information and commerce that has transformed our world. As computer microchips have become smaller and faster, computers can now fit in our phones, our eyeglasses and perhaps our bodies. The possibilities seem limitless, but threats to privacy are real, as is the specter of a world in which technology dominates our lives.

 

Learn More

  1. Charles Babbage, a British mathematician and engineer, designed the first automated computing machines in the 19th century.  The Computer History Museum's online exhibit of Babbage's contributions can be found at:
    http://www.computerhistory.org/babbage/
  2. The Charles Babbage Institute at the University of Minnesota holds over 300 oral history interviews with individuals involved in the information technology industry, many of which are digitally accessible.
    http://www.cbi.umn.edu/oh/index.html
  3. Paul Ceruzzi, a curator at the Smithsonian's National Air and Space Museum, is the author of A History of Modern Computing. 2nd edition. Cambridge, MA: The MIT Press, 2003.
    http://kolho3.tiera.ru/Cs_Computer%20science/Ceruzzi%20P.E.%20A%20History%20of%20Modern%20Computing%20(MIT,2003)(ISBN%200262532034)(452s)_Cs_.pdf
  4. Microsoft co-founder Paul Allen created the Living Computer Museum to preserve the early history of electronic computing by restoring vintage computer systems and returning them to operation.
    http://www.livingcomputermuseum.org/default.aspx