
History of Computers



Presentation Transcript


  1. History of Computers For centuries, calculators were the only machines to help us compute. A long lineage of devices stretching from the ancient abacus to today’s digital computer.

  2. Mathematical Tools Throughout the history of calculating we’ve devised ways to add speed and accuracy while subtracting the drudgery. Many solutions used body parts, notably fingers. A 19th century Chinese technique can count to 10 billion using just two hands! Other solutions were mechanical — both general-purpose tools for everyday calculations and specialized instruments for engineering, navigational, or other scientific and technical problems.

  3. The Versatile, Venerable Abacus An American soldier and a Japanese postal worker faced off in Tokyo in 1946. Private Thomas Wood had an electric calculator. Kiyoshi Matsuzaki held a soroban, a Japanese abacus. Each was skilled at operating his device. In four out of five competitive rounds, the abacus won. Perhaps the oldest continuously used calculating tool aside from fingers, the abacus is a masterpiece of power and simplicity. Abacuses were widely used in Asia and Europe for centuries, and remain common today.

  4. The Original "Pocket Calculator" In an era before pencils and ballpoint pens, portability was a key advantage of the abacus. It enabled vendors or tax collectors, for instance, to make calculations anywhere, even standing in a marketplace where jotting down figures was impractical. Abacuses evolved in various forms at different times and places. But all share certain basic characteristics: movable markers (beads, stones, beans, sticks, coins) arranged in columns or rows, with different columns or rows representing different values (ones, fives, tens, etc.). Moving the markers “activates” them, creating different combinations of values.
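
  The place-value idea is easy to model. A minimal Python sketch (the rod layout loosely follows a soroban, with one 5-bead and four 1-beads per rod; the function name is ours):

      # Illustrative model of reading a soroban-style abacus.
      # Each rod is (active 5-beads, active 1-beads); rods run from
      # the highest place value to the lowest.
      def read_abacus(rods):
          value = 0
          for fives, ones in rods:
              value = value * 10 + fives * 5 + ones
          return value

      print(read_abacus([(1, 2), (0, 3), (0, 0)]))  # rods showing 7, 3, 0 -> 730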

  5. Sectors The compass-like sector, developed in the late 1500s, can perform a tremendous range of approximate computations, from basic arithmetic to calculating areas and volumes or converting currencies. Much in demand for tasks like military fortifications, where quick answers mattered more than precision, sectors became commercial products: astronomer Galileo built and sold more than 100.

  6. Slide Rules The principle behind a slide rule is straightforward. Two bars, each marked with scales, slide next to each other. Aligning numbers on different kinds of scales allows different calculations, such as multiplication or trigonometry. Accuracy, however, is limited and depends on the user’s skill. Invented in the 1600s, slide rules were widely used through the 20th century.
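
  The trick is that the scales are logarithmic: adding two lengths adds two logarithms, which multiplies the underlying numbers. A small Python sketch of the principle (the function name and rounding are illustrative):

      import math

      # A slide rule multiplies by adding lengths on log scales:
      # log10(a) + log10(b) = log10(a * b).
      def slide_rule_multiply(a, b):
          length = math.log10(a) + math.log10(b)  # slide one scale along the other
          return round(10 ** length, 3)           # read off ~3 digits, as on a real rule

      print(slide_rule_multiply(2.0, 3.5))  # 7.0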

  7. Mass-Produced Calculators Innovation exploded during the Industrial Revolution, nourished by a self-perpetuating cycle of new markets, new ideas and new technologies. A growing demand for mechanical calculators coincided with a growing capacity to design and build them. This extraordinary confluence of need and ability helped to transform sophisticated calculators from handcrafted tools into mass-produced products.

  8. Introducing the Keyboard It started with a wooden pasta box, meat skewers, and rubber bands. It ended as a prototype for the first commercially successful keyboard adding machine, the Comptometer. Patented in 1887 by 24-year-old Dorr E. Felt, the Comptometer’s prime asset was ease of use. Pressing its keys drove the mechanism, which revealed the sum after all were pressed.

  9. Setting a New Standard Jay Monroe’s goals were not modest: a simple-to-use, portable, powerful four-function calculator with a keyboard and proof of input accuracy. In 1912, he and Frank Baldwin formed Monroe Calculating Machine Company to realize those goals. Baldwin’s electro-mechanical calculators, some with as many as 12,000 parts, revolutionized scientific and technical calculations and stimulated competition.

  10. Electronic Calculators The Electronic Age elbowed out the Mechanical Age in the 1960s. Calculator manufacturers had to adapt or perish.

  11. Power in Your Pocket Hewlett-Packard co-founder Bill Hewlett issued a challenge to his engineers in 1971: Fit all of the features of their desktop scientific calculator into a package small enough for his shirt pocket. They did. More than 100,000 HP-35 pocket calculators sold the first year at $395 each.

  12. Punch Cards People used calculators to manipulate numbers. But how do you make machines that also manipulate words or ideas? Punched cards, a mainstay of early office automation and computing, helped launch the transition from doing math to processing data. Patterns of holes punched in cards can represent any kind of information. Punched cards can preserve data too: just file them away!
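
  A toy Python sketch of the encoding idea (a deliberately simplified scheme, not the actual IBM zone/digit card code):

      # One card column per character; "punch" a hole at the character's
      # position in a fixed alphabet, then read the card back.
      ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ 0123456789"

      def punch(text):
          return [ALPHABET.index(ch) for ch in text]   # hole positions

      def read_card(columns):
          return "".join(ALPHABET[hole] for hole in columns)

      card = punch("HELLO 1946")
      print(card)             # the hole pattern, column by column
      print(read_card(card))  # HELLO 1946 -- filed away, the data survives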

  13. Data Entry Punching data into cards was considered a clerical or secretarial job, which in those days meant “women’s work.” So most keypunch operators were women.

  14. Analog Computers Analog and Digital: Different Ways to Measure and Model the World Our world is a symphony of infinite variations. Long before digital computers existed, engineers built models to simulate those real-world nuances. Analog computers continued this tradition, using mechanical motion or the flow of electricity to model problems and generate answers quickly. They remained the preferred tool until digital computers, based on electronic switches, became fast enough for software to do the same job.

  15. EAI PACE TR-48 EAI was the largest supplier of general-purpose analog computers. Transistorized models like the TR-48 were used for satellite design, chemotherapy studies, chemical reactor simulation, and more.

  16. Birth of the Computer Computation Becomes Electronic World War II acted as midwife to the birth of the modern electronic computer. Unprecedented military demands for calculations—and hefty wartime budgets—spurred innovation. Early electronic computers were one-of-a-kind machines built for specific tasks. But setting them up was cumbersome and time-consuming. The revolutionary innovation of storing programs in memory replaced the switches and wiring with readily changed software.

  17. ENIAC In 1942, physicist John Mauchly proposed an all-electronic calculating machine. The U.S. Army, meanwhile, needed to calculate complex wartime ballistics tables. Proposal met patron. The result was ENIAC (Electronic Numerical Integrator And Computer), built between 1943 and 1945—the first large-scale computer to run at electronic speed without being slowed by any mechanical parts. For a decade, until a 1955 lightning strike, ENIAC may have run more calculations than all mankind had done up to that point.

  18. The First Generation Once the world had seen a stored program computer, the advantages were obvious. Every university, research institute and lab wanted one of its own. But where to get one? There were no commercial manufacturers of electronic, stored-program computers. If you wanted one, you had to build one. Many of these early machines relied on published designs. Others were developed independently.

  19. JOHNNIAC The RAND Corporation’s JOHNNIAC was based on the stored-program computer developed at Princeton’s IAS—and named for John von Neumann, godfather of the IAS project. Used for scientific and engineering calculations, the JOHNNIAC was completed in 1954, though it was repeatedly expanded and improved throughout its 13-year lifespan.

  20. The First Mainframes Big businesses with big needs required big computers. Economies of scale also favored large, consolidated computer systems. This demand for big computers, just when “second generation” transistor-based computers were replacing vacuum-tube machines in the late 1950s, spurred developments in hardware and software. Manufacturers commonly built small numbers of each model, targeting narrowly defined markets.

  21. Memory & Storage How Computers Remember Computers are master jugglers, multitasking as we play music, solve equations, surf the web, and write novels. They also have become vast, searchable libraries of everything from banking records and encyclopedias to grandma’s recipes. These abilities require two kinds of memory: main memory (fast and comparatively expensive) and storage (big, slower, and cheap). Both types have rapidly and continually improved.

  22. Memory & Storage: Different Tasks, Different Technologies

  23. The First Disk Drive: RAMAC 350 The RAMAC 350 storage unit could hold the equivalent of 62,500 punched cards: 5 million characters.

  24. Floppy Disks The 3 1/2" floppy disk format was the last mass-produced format, replacing 5 1/4" floppies by the mid-1990s. It was more durable than previous floppy formats since the packaging was rigid plastic with a sliding metal shutter. It was eventually made obsolete by CDs and flash drives.

  25. How Do Digital Computers “Think”? All digital computers rely on a binary system of ones and zeros, and on rules of logic set out in the 1850s by English mathematician George Boole. A computer can represent the binary digits (bits) zero and one mechanically with wheel or lever positions, or electronically with voltage or current. The underlying math remains the same. Bit sequences can represent numbers or letters.
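
  A quick Python illustration of both points: the same bit pattern read as a number or a letter, and Boole's AND, OR, and NOT applied bit by bit:

      n = 0b01001000                      # one 8-bit pattern...
      print(n, chr(n))                    # ...read as 72, or as the letter 'H'

      a, b = 0b1100, 0b1010
      print(format(a & b, '04b'))         # 1000 (AND)
      print(format(a | b, '04b'))         # 1110 (OR)
      print(format(~a & 0b1111, '04b'))   # 0011 (NOT, kept to 4 bits)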

  26. Analog Integrated Circuits Most modern computers are digital. But they function in a world of continuously varying analog input such as sound, light, and heat. So, they must convert these analog signals into digital ones and zeros for processing.
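
  A minimal sketch of that conversion in Python (the sample count and 4-bit resolution are arbitrary choices for illustration):

      import math

      # Analog-to-digital conversion in miniature: sample a continuously
      # varying signal, then quantize each sample to one of 16 levels.
      def sample_and_quantize(signal, n_samples=8, bits=4):
          levels = 2 ** bits
          return [round((signal(i / n_samples) + 1) / 2 * (levels - 1))
                  for i in range(n_samples)]

      # A sine wave in [-1, 1] becomes a short list of 4-bit codes:
      print(sample_and_quantize(lambda t: math.sin(2 * math.pi * t)))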

  27. Moore's Law The number of transistors and other components on integrated circuits will double every year for the next 10 years. So predicted Gordon Moore, Fairchild Semiconductor’s R&D Director, in 1965.
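
  The arithmetic of that prediction is simple compounding, as this sketch shows (the 1965 starting count of 64 components is approximate):

      # One doubling per year, 1965-1975, as Moore extrapolated.
      components = 64
      for year in range(1965, 1976):
          print(year, components)
          components *= 2
      # The projection reaches 65,536 components per chip by 1975.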

  28. Intel “x86” Family and the Microprocessor Wars More is never enough. As cheaper memory encouraged bigger programs, 8 bits became insufficient. Intel developed the 16-bit 8086 as a stopgap while it worked on a more sophisticated chip. But after IBM adopted the 8088, a low-cost version of the 8086, the stopgap became an industry standard.

  29. 8086 microprocessor, Intel, 1978. Transistor count: 29,000. Minimum feature size: 3.2 µ (microns). 8088 microprocessor, Intel, 1979. Low-cost version of the 8086, with an external 8-bit data bus and internal 16-bit data paths. 80286 microprocessor, Intel, 1982. Transistor count: 134,000. Minimum feature size: 1.5 µ.

  30. Pentium microprocessor, Intel, 1993. Transistor count: 3,100,000. Minimum feature size: 0.6 µ. Pentium Pro microprocessor, Intel, 1995. Transistor count: 5,500,000. Minimum feature size: 0.35 µ. Pentium 4 microprocessor, Intel, 2000. Transistor count: 42,000,000. Minimum feature size: 0.18 µ.

  31. Supercomputers Super is relative. Every era has supercomputers, but the definition shifts as technology advances. Today’s supercomputers may be tomorrow’s PCs. Supercomputers tackle the most calculation-intensive problems, such as predicting weather, codebreaking, and designing nuclear bombs. Early supercomputers were one-of-a-kind machines for the government or military, the only customers who could afford them.

  32. CDC 6600’s Five-Year Reign The CDC 6600 toppled speed records in 1964. Control Data sold about 100 of them, penetrating markets beyond the usual government and military customers. The 6600 executed a dizzying 3,000,000 instructions per second. Designed by supercomputer pioneer Seymour Cray, it wasn’t surpassed until the CDC 7600 in 1969, another Cray design.

  33. The Cray-1 Supercomputer For five years, this was the world’s fastest computer. Each Cray-1 was hand wired and took nearly a year to assemble. Its unique vector-processing design suited it to many critical computational problems, such as cryptography, bomb simulation, and aircraft design.
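
  The vector idea: one instruction drives an operation across a whole array of operands instead of looping element by element (the Cray-1 pipelined such operations over 64-element vector registers). A toy contrast in Python:

      a = [1.0, 2.0, 3.0, 4.0]
      b = [10.0, 20.0, 30.0, 40.0]

      # Scalar style: one multiply at a time.
      c = []
      for i in range(len(a)):
          c.append(a[i] * b[i])

      # Vector style: state the whole operation at once.
      c_vec = [x * y for x, y in zip(a, b)]
      print(c == c_vec)  # True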

  34. Personal Computers Computers evolved primarily for military, scientific, government, and corporate users with substantial needs…and substantial budgets. They populated labs, universities, and big companies. Homes? Small businesses? Not so much. Over time, however, costs dropped. Equally important, computers grew sophisticated enough to hide their complex, technical aspects behind a user-friendly interface. Individuals could now afford and understand computers, which dramatically changed everyday life.

  35. What Was The First PC? The Computer Museum in Boston asked that question in 1986, and held a contest to find the answer. Judges settled on John Blankenbaker’s Kenbak-1 as the first personal computer. Designed in 1971, before microprocessors were invented, the Kenbak-1 had 256 bytes of memory and featured small- and medium-scale integrated circuits on a single circuit board. The title of first personal computer using a microprocessor went to the 1973 Micral. Designed in France by André Truong Trong Thi and François Gernelle, the Micral used the Intel 8008 microprocessor.

  36. Computers for Everybody When people begin building their own products in their garages, entrepreneurs take notice. That’s a good sign of a business opportunity. Companies capitalized on the blossoming computer interest with products requiring little expertise. These included three influential computers introduced in 1977: the Apple II, TRS-80, and Commodore PET. The expanding market also meant more demand for software—a niche many companies eagerly filled.

  37. The Apple II When it debuted in 1977, the Apple II was promoted as an extraordinary computer for ordinary people. The user-friendly design and graphical display made Apple a leader in the first decade of personal computing. Unlike the earlier Apple I, for which users had to supply essential parts such as a case and power supply, the Apple II was a fully realized consumer product. Design and marketing emphasized simplicity: an everyday tool for home, work, or school.

  38. The IBM PC Many companies were dubious. Could small personal computers really be serious business tools? The IBM name was a reassuring seal of approval. IBM introduced its PC in 1981 with a folksy advertising campaign aimed at the general public. Yet the IBM PC had its most profound impact in the corporate world. Companies bought PCs in bulk, revolutionizing the role of computers in the office, and introducing the Microsoft Disk Operating System (MS-DOS) to a vast user community.

  39. Networking Networking has transformed computers from stand-alone data-crunchers into the foundation of an unprecedented global community. Networking rests on a simple concept: getting computers to communicate with each other. This requires a physical connection, like wires or radio links, and a common language (protocol) for exchanging data. Once these are in place comes the layer we see: information systems like the Web.
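
  Those two ingredients, a connection and a protocol, fit in a few lines of Python (the localhost port 9000 and the uppercase "protocol" are invented for this sketch):

      import socket, threading

      srv = socket.socket()
      srv.bind(("127.0.0.1", 9000))   # the connection: a local TCP socket
      srv.listen(1)

      def serve_once():
          conn, _ = srv.accept()
          conn.sendall(conn.recv(1024).upper())   # the agreed protocol: reply in uppercase
          conn.close()

      threading.Thread(target=serve_once, daemon=True).start()

      client = socket.create_connection(("127.0.0.1", 9000))
      client.sendall(b"hello, network")
      print(client.recv(1024))        # b'HELLO, NETWORK'
      client.close()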

  40. What’s the Difference Between the Internet and the Web? The Internet, linking your computer to other computers around the world, is a way of transporting content. The Web is software that lets you use that content…or contribute your own. The Web, running on the mostly invisible Internet, is what you see and click on in your computer’s browser. The Internet’s roots are in the U.S. during the late 1960s. The Web was invented 20 years later by an Englishman working in Switzerland—though it had many predecessors. To keep things “interesting,” many people use the term Internet to refer to both.
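
  The layering is visible in a few lines of Python (assuming network access; example.com is a reserved demonstration domain):

      from urllib.request import urlopen

      # The URL and HTML belong to the Web; the Internet underneath
      # merely transports the bytes.
      with urlopen("https://example.com/") as response:
          html = response.read().decode("utf-8")
      print(html[:60])   # the markup a browser would render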

  41. Connecting Over Phone Lines Until the 1960s, computers mostly used existing phone and telegraph equipment to communicate. Only as they began connecting to other computers did systems optimized for computer communication develop. Some early modems used an “acoustic coupler” to hold the telephone handset, eliminating the need to wire the modem directly to phone lines. Hinged covers and foam padding isolated couplers from room noise.

  42. The Internet Comes From Behind In 1980, the Internet was a medium-sized experiment with 213 computers. As late as 1988, insiders were betting against it. By 1992, it was emerging as the global winner, linking a million computers. In hindsight, the Internet had key advantages, from a growing community of enthusiasts churning out working software and hardware, to free distribution with UNIX. But the decisive factor? Probably money—especially U.S. government support from the National Science Foundation. Besides building infrastructure, NSFNET fueled the viral spread of networking in higher education.

  43. Inventing the Web At the world’s biggest physics laboratory, CERN in Switzerland, English programmer and physicist Tim Berners-Lee submitted two proposals for what became the Web. Neither was approved. He proceeded anyway. With only unofficial support from his boss and interested coworkers, he created “WorldWideWeb” on an advanced NeXT computer in 1990. It featured a server, HTML, URLs, and the first browser. That browser also functioned as an editor, like a word processor connected to the Internet, reflecting his original vision that the Web also incorporate authoring and personal organization tools.

  44. Why Did the Web Win? The Web was one networked information system among many. Why did it triumph? Several factors contributed. First, the Web was designed to spread virally, requiring no big up-front investment of time and money. It worked on different kinds of computers and could handle existing data, both legacies of its birth in the international bazaar of CERN. For many people, its simple, easily implemented hypertext links were a key attraction. The Web also triumphed by absorbing potential rivals, adding support for WAIS and Gopher. Lynx and Viola converted themselves into Web browsers.

  45. Servers: Hidden Engines of the Web We see the Web through browsers, the programs that show us Web pages. Behind the scenes are servers, networked computers that “serve up” those Web pages to our browsers. Any computer can act as a server. Commercial racks, like the Google server rack, are the hidden engines of the Web—its unseen but essential infrastructure. This 1999 Google server rack incorporates many technologies, each the victor in its particular arena from among competing standards. Ethernet ties the rack’s individual boards into a local area network; Internet protocols connect the rack to the larger net; Web server software sends the results to our browsers.
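
  "Any computer can act as a server" is easy to demonstrate. A minimal sketch with Python's standard library (the page text and port 8000 are placeholders); a browser pointed at http://localhost:8000/ receives the page:

      from http.server import BaseHTTPRequestHandler, HTTPServer

      PAGE = b"<html><body><h1>Served!</h1></body></html>"

      class Handler(BaseHTTPRequestHandler):
          def do_GET(self):
              self.send_response(200)
              self.send_header("Content-Type", "text/html")
              self.end_headers()
              self.wfile.write(PAGE)             # serve the page to the browser

      HTTPServer(("", 8000), Handler).serve_forever()   # runs until interrupted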
