
Essay on History of Computer

Students are often asked to write an essay on History of Computer in their schools and colleges. And if you’re also looking for the same, we have created 100-word, 250-word, and 500-word essays on the topic.

Let’s take a look…

100 Words Essay on History of Computer

Early Beginnings

Computers didn’t always look like the laptops or smartphones we use today. One of the earliest computing devices was the abacus, invented around 2400 BC. It used beads to help people calculate.

First Mechanical Computer

In 1822, Charles Babbage, a British mathematician, designed a mechanical computer called the “Difference Engine.” It was supposed to perform mathematical calculations.

The Birth of Modern Computers

The first modern computers were created in the late 1930s and 1940s. They were huge and filled entire rooms. These computers used vacuum tubes to process information.

Personal Computers

In the 1970s, companies like Apple and IBM started making personal computers. This made it possible for people to have computers at home.

Remember, computers have come a long way and continue to evolve!


250 Words Essay on History of Computer

Introduction

The history of computers is a fascinating journey, tracing back several centuries. It illustrates human ingenuity and evolution from primitive calculators to complex computing systems.

Early Computers

The concept of computing dates back to antiquity. The abacus, developed around 2400 BC, is often considered the earliest computing device. In the 19th century, Charles Babbage conceptualized and designed the first general-purpose mechanical computer, the Analytical Engine, which used punched cards for instructions.

Birth of Modern Computers

The 20th century heralded the era of modern computing. The first programmable computer, the Z3, was built by Konrad Zuse in 1941. However, it was the Electronic Numerical Integrator and Computer (ENIAC), completed in 1945, that truly revolutionized computing with its electronic technology.

Personal Computers and the Internet

The 1970s and 1980s saw the advent of personal computers (PCs). The Apple II, introduced in 1977, and IBM’s PC, launched in 1981, brought computers to the masses. The 1990s marked the birth of the internet, transforming computers into communication devices and information gateways.

Present and Future

Today, computers have become an integral part of our lives, from smartphones to supercomputers. They are now moving towards quantum computing, promising unprecedented computational power.

In summary, the history of computers is a testament to human innovation, evolving from simple counting devices to powerful tools that shape our lives. As we look forward to the future, the potential for further advancements in computing technology is limitless.

500 Words Essay on History of Computer

The Dawn of Computing

The history of computers dates back to antiquity with devices like the abacus, used for calculations. However, the concept of a programmable computer was first realized in the 19th century by Charles Babbage, an English mathematician. His design, known as the Analytical Engine, is considered the first general-purpose computer, although it was never built.

The first half of the 20th century saw the development of electro-mechanical computers. The most notable was the Mark I, developed by Howard Aiken at Harvard University in 1944. It was the first machine to automatically execute long computations.

During the same period, the ENIAC (Electronic Numerical Integrator and Computer) was developed by John Mauchly and J. Presper Eckert at the University of Pennsylvania. Completed in 1945, it was the first general-purpose electronic computer. However, it was not programmable in the modern sense.

The Era of Transistors

The late 1940s marked the invention of the transistor, which revolutionized the computer industry. Transistors were faster, smaller, and more reliable than their vacuum tube counterparts. The first transistorized computer was built at the University of Manchester in 1953.

The 1950s and 1960s saw the development of mainframe computers, like IBM’s 700 series, which dominated the computing world for the next two decades. These machines were large and expensive, but they allowed multiple users to access the computer simultaneously through terminals.

Microprocessors and Personal Computers

The invention of the microprocessor in the 1970s marked the beginning of the personal computer era. The Intel 4004, released in 1971, was the first commercially available microprocessor. This development led to the creation of small, relatively inexpensive machines like the Apple II and the IBM PC, which made computing accessible to individuals and small businesses.

The Internet and Beyond

The 1980s and 1990s brought about the rise of the internet and the World Wide Web, expanding the use of computers into every aspect of modern life. The advent of graphical user interfaces, such as Microsoft’s Windows and Apple’s Mac OS, made computers even more user-friendly.

Today, computers have become ubiquitous in our society. They are embedded in everything from our phones to our cars, and they play a critical role in fields ranging from science to entertainment. The history of computers is a story of continuous innovation and progress, and it is clear that this trend will continue into the foreseeable future.

That’s it! I hope the essay helped you.

If you’re looking for more, here are essays on other interesting topics:

  • Essay on Generation of Computer
  • Essay on Computer Technology Good or Bad
  • Essay on Computer Network


Happy studying!


History of computers: A brief timeline

The history of computers began with primitive designs in the early 19th century and went on to change the world during the 20th century.


The history of computers goes back over 200 years. First theorized by mathematicians and entrepreneurs, mechanical calculating machines were designed and built during the 19th century to solve increasingly complex number-crunching challenges. By the early 20th century, advancing technology enabled ever more complex computers, which grew larger and more powerful.

Today, computers are almost unrecognizable from designs of the 19th century, such as Charles Babbage's Analytical Engine — or even from the huge computers of the 20th century that occupied whole rooms, such as the Electronic Numerical Integrator and Calculator.  

Here's a brief history of computers, from their primitive number-crunching origins to the powerful modern-day machines that surf the Internet, run games and stream multimedia. 

19th century

1801: Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1821: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. Funded by the British government, the project, called the "Difference Engine," fails due to the lack of technology at the time, according to the University of Minnesota.

1843: Ada Lovelace, an English mathematician and the daughter of poet Lord Byron, writes the world's first computer program. According to Anna Siffert, a professor of theoretical mathematics at the University of Münster in Germany, Lovelace writes the first program while translating a paper on Babbage's Analytical Engine from French into English. "She also provides her own comments on the text. Her annotations, simply called "notes," turn out to be three times as long as the actual transcript," Siffert wrote in an article for The Max Planck Society. "Lovelace also adds a step-by-step description for computation of Bernoulli numbers with Babbage's machine — basically an algorithm — which, in effect, makes her the world's first computer programmer." Bernoulli numbers are a sequence of rational numbers often used in computation.
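
Lovelace's "note G" amounts to a step-by-step procedure for generating Bernoulli numbers on the Analytical Engine. As a rough modern illustration of the kind of result such an algorithm produces (not a transcription of her actual program), the numbers can be generated from the classical recurrence on binomial coefficients:

```python
# Illustrative sketch (not Lovelace's Analytical Engine program): generate
# Bernoulli numbers from the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0,
# which expresses each B_m in terms of B_0 .. B_{m-1}.
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n as exact fractions."""
    B = [Fraction(1)]                                    # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```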

Babbage's Analytical Engine

1853: Swedish inventor Per Georg Scheutz and his son Edvard design the world's first printing calculator. The machine is significant for being the first to "compute tabular differences and print the results," according to Uta C. Merzbach's book, "Georg Scheutz and the First Printing Calculator" (Smithsonian Institution Press, 1977).

1890: Herman Hollerith designs a punch-card system to help calculate the 1890 U.S. Census. The machine saves the government several years of calculations, and the U.S. taxpayer approximately $5 million, according to Columbia University. Hollerith later establishes a company that will eventually become International Business Machines Corporation (IBM).

Early 20th century

1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, according to Stanford University.

1936: Alan Turing, a British scientist and mathematician, presents the principle of a universal machine, later called the Turing machine, in a paper called "On Computable Numbers…" according to Chris Bernhardt's book "Turing's Vision" (The MIT Press, 2017). Turing machines are capable of computing anything that is computable. The central concept of the modern computer is based on his ideas. Turing is later involved in the development of the Turing-Welchman Bombe, an electro-mechanical device designed to decipher Nazi codes during World War II, according to the UK's National Museum of Computing.

1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, submits a grant proposal to build the first electric-only computer, without using gears, cams, belts or shafts.

original garage where Bill Hewlett and Dave Packard started their business

1939: David Packard and Bill Hewlett found the Hewlett Packard Company in Palo Alto, California. The pair decide the name of their new company by the toss of a coin, and Hewlett-Packard's first headquarters are in Packard's garage, according to MIT.

1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital computer, according to Gerard O'Regan's book "A Brief History of Computing" (Springer, 2021). The machine was destroyed during a bombing raid on Berlin during World War II. Zuse fled the German capital after the defeat of Nazi Germany and later released the world's first commercial digital computer, the Z4, in 1950, according to O'Regan.

1941: Atanasoff and his graduate student, Clifford Berry, design the first digital electronic computer in the U.S., called the Atanasoff-Berry Computer (ABC). This marks the first time a computer is able to store information in its main memory, and it is capable of performing one operation every 15 seconds, according to the book "Birthing the Computer" (Cambridge Scholars Publishing, 2016).

1945: Two professors at the University of Pennsylvania, John Mauchly and J. Presper Eckert, design and build the Electronic Numerical Integrator and Calculator (ENIAC). The machine is the first "automatic, general-purpose, electronic, decimal, digital computer," according to Edwin D. Reilly's book "Milestones in Computer Science and Information Technology" (Greenwood Press, 2003). 

Computer technicians operating the ENIAC

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discover how to make an electric switch with solid materials and without the need for a vacuum.

1949: A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), "the first practical stored-program computer," according to O'Regan. "EDSAC ran its first program in May 1949 when it calculated a table of squares and a list of prime numbers," O'Regan wrote. In November 1949, scientists with the Council for Scientific and Industrial Research (CSIR), now called CSIRO, build Australia's first digital computer, called the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). CSIRAC is the first digital computer in the world to play music, according to O'Regan.

Late 20th century

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL (COmmon Business-Oriented Language), according to the National Museum of American History. Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: John Backus and his team of programmers at IBM publish a paper describing their newly created FORTRAN programming language, an acronym for FORmula TRANslation, according to MIT.

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby is later awarded the Nobel Prize in Physics for his work.

1968: Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer Conference, San Francisco. His presentation, called "A Research Center for Augmenting Human Intellect," includes a live demonstration of his computer, including a mouse and a graphical user interface (GUI), according to the Doug Engelbart Institute. This marks the development of the computer from a specialized machine for academics to a technology that is more accessible to the general public.

The first computer mouse, invented in 1963 by Douglas C. Engelbart

1969: Ken Thompson, Dennis Ritchie and a group of other developers at Bell Labs produce UNIX, an operating system that made "large-scale networking of diverse computing systems — and the internet — practical," according to Bell Labs. The team behind UNIX continued to develop the operating system using the C programming language, which they also optimized.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.

1971: A team of IBM engineers led by Alan Shugart invents the "floppy disk," enabling data to be shared among different computers.

1972: Ralph Baer, a German-American engineer, releases the Magnavox Odyssey, the world's first home game console, in September 1972, according to the Computer Museum of America. Months later, entrepreneur Nolan Bushnell and engineer Al Alcorn of Atari release Pong, the world's first commercially successful video game.

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1975: The magazine cover of the January issue of "Popular Electronics" highlights the Altair 8800 as the "world's first minicomputer kit to rival commercial models." After seeing the magazine issue, two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak co-found Apple Computer on April Fool's Day. They unveil the Apple I, the first computer with a single-circuit board and ROM (Read Only Memory), according to MIT.

Apple I computer 1976

1977: The Commodore Personal Electronic Transactor (PET) is released onto the home computer market, featuring an MOS Technology 8-bit 6502 microprocessor, which controls the screen, keyboard and cassette player. The PET is especially successful in the education market, according to O'Regan.

1977: Radio Shack begins its initial production run of 3,000 TRS-80 Model 1 computers — disparagingly known as the "Trash 80" — priced at $599, according to the National Museum of American History. Within a year, the company takes 250,000 orders for the computer, according to the book "How TRS-80 Enthusiasts Helped Spark the PC Revolution" (The Seeker Books, 2007).

1977: The first West Coast Computer Faire is held in San Francisco. Jobs and Wozniak present the Apple II computer at the Faire, which includes color graphics and features an audio cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program, is introduced.

1979: MicroPro International, founded by software engineer Seymour Rubinstein, releases WordStar, the world's first commercially successful word processor. WordStar is programmed by Rob Barnaby, and includes 137,000 lines of code, according to Matthew G. Kirschenbaum's book "Track Changes: A Literary History of Word Processing" (Harvard University Press, 2016).

1981: "Acorn," IBM's first personal computer, is released onto the market at a price point of $1,565, according to IBM. Acorn uses the MS-DOS operating system from Windows. Optional features include a display, printer, two diskette drives, extra memory, a game adapter and more.

A worker using an Acorn computer by IBM, 1981

1983: The Apple Lisa, standing for "Local Integrated Software Architecture" but also the name of Steve Jobs' daughter, according to the National Museum of American History (NMAH), is the first personal computer to feature a GUI. The machine also includes a drop-down menu and icons. Also this year, the Gavilan SC is released and is the first portable computer with a flip-form design and the very first to be sold as a "laptop."

1984: The Apple Macintosh is announced to the world during a Super Bowl advertisement. The Macintosh is launched with a retail price of $2,500, according to the NMAH.

1985: As a response to the Apple Lisa's GUI, Microsoft releases Windows in November 1985, the Guardian reported. Meanwhile, Commodore announces the Amiga 1000.

1989: Tim Berners-Lee, a British researcher at the European Organization for Nuclear Research (CERN), submits his proposal for what would become the World Wide Web. His paper details his ideas for Hyper Text Markup Language (HTML), the building blocks of the Web.

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which at the time is struggling financially.  This investment ends an ongoing court case in which Apple accused Microsoft of copying its operating system. 

1999: Wi-Fi, the abbreviated term for "wireless fidelity," is developed, initially covering a distance of up to 300 feet (91 meters), Wired reported.

21st century

2001: Mac OS X, later renamed OS X then simply macOS, is released by Apple as the successor to its standard Mac Operating System. OS X goes through 16 different versions, each with "10" as its title, and the first nine iterations are nicknamed after big cats, with the first being codenamed "Cheetah," TechRadar reported.  

2003: AMD's Athlon 64, the first 64-bit processor for personal computers, is released to customers. 

2004: The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challengers to Internet Explorer, owned by Microsoft. During its first five years, Firefox exceeds a billion downloads by users, according to the Web Design Museum.

2005: Google buys Android, a Linux-based mobile phone operating system.

2006: The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-based, dual-core mobile computer.

2009: Microsoft launches Windows 7 on July 22. The new operating system features the ability to pin applications to the taskbar, minimize other windows by shaking one window, easy-to-access jump lists, easier window previews and more, TechRadar reported.

Apple CEO Steve Jobs holds the iPad during the launch of Apple's new tablet computing device in San Francisco

2010: The iPad, Apple's flagship handheld tablet, is unveiled.

2011: Google releases the Chromebook, which runs on Google Chrome OS.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer is created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."

2019: A team at Google becomes the first to demonstrate quantum supremacy — creating a quantum computer that could feasibly outperform the most powerful classical computer — albeit for a very specific problem with no practical real-world application. The team described the computer, dubbed "Sycamore," in a paper published that same year in the journal Nature. Achieving quantum advantage — in which a quantum computer solves a problem with real-world applications faster than the most powerful classical computer — is still a ways off.

2022: Frontier, the first exascale supercomputer and the world's fastest, goes online at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee. Built by Hewlett Packard Enterprise (HPE) at a cost of $600 million, Frontier uses nearly 10,000 AMD EPYC 7453 64-core CPUs alongside nearly 40,000 AMD Radeon Instinct MI250X GPUs. The machine ushers in the era of exascale computing, which refers to systems that can perform more than one exaFLOP (a quintillion floating-point operations per second). Frontier is currently the only machine capable of reaching such levels of performance, and it is being used as a tool to aid scientific discovery.

What is the first computer in history?

Charles Babbage's Difference Engine, designed in the 1820s, is considered the first "mechanical" computer in history, according to the Science Museum in the U.K. Operated by turning a hand crank, the machine calculated a series of values and printed the results in a table.

What are the five generations of computing?

The "five generations of computing" is a framework for assessing the entire history of computing and the key technological advancements throughout it. 

The first generation, spanning the 1940s to the 1950s, covered vacuum tube-based machines. The second incorporated transistor-based computing between the 1950s and the 1960s. In the 1960s and 1970s, the third generation gave rise to integrated circuit-based computing. We are now between the fourth and fifth generations of computing, which are microprocessor-based and AI-based computing, respectively.

What is the most powerful computer in the world?

As of November 2023, the most powerful computer in the world is the Frontier supercomputer. The machine, which can reach a performance level of up to 1.102 exaFLOPS, ushered in the age of exascale computing in 2022 when it went online at Tennessee's Oak Ridge Leadership Computing Facility (OLCF).

There is, however, a potentially more powerful supercomputer waiting in the wings in the form of the Aurora supercomputer, which is housed at the Argonne National Laboratory (ANL) outside of Chicago.  Aurora went online in November 2023. Right now, it lags far behind Frontier, with performance levels of just 585.34 petaFLOPS (roughly half the performance of Frontier), although it's still not finished. When work is completed, the supercomputer is expected to reach performance levels higher than 2 exaFLOPS.

What was the first killer app?

Killer apps are widely understood to be those so essential that they are core to the technology they run on. There have been many through the years, from Word for Windows in 1989 to iTunes in 2001 to social media apps like WhatsApp in more recent years.

Several pieces of software may stake a claim to be the first killer app, but there is a broad consensus that VisiCalc, a spreadsheet program created by VisiCorp and originally released for the Apple II in 1979, holds that title. Steve Jobs even credited this app with propelling the Apple II to become the success it was, according to co-creator Dan Bricklin.

Additional resources

  • Fortune: A Look Back At 40 Years of Apple
  • The New Yorker: The First Windows
  • "A Brief History of Computing" by Gerard O'Regan (Springer, 2021)




Modern (1940’s-present)

66 History of Computers

Chandler Little and Ben Greene

Introduction

Modern technology first started evolving when electricity came into everyday use. One of the biggest inventions of the 20th century was the computer, and it has gone through many changes and improvements since its creation. The last two decades have shown more technological advancement than any previous era. Computers have enhanced almost every aspect of learning in our lives, and it looks like they will only keep having an impact through the coming decades. Computers in today’s society have become a focal point of everyday life and will continue to be so for the foreseeable future. One important company that has shaped the computer industry is Apple, Inc., which was founded in the final quarter of the 20th century. It is one of the primary computer providers in American society, so a history of the company is included in this chapter to contribute specific information about a business that has propelled the growth of computer popularity.

The Evolution of Computers

Computers have come a long way since their creation. The first computer design is generally credited to Charles Babbage, who conceived his mechanical calculating engine in 1822. Early electronic computers, built with banks of vacuum tubes and weighing 700 pounds or more, were far larger than the computers we see today. Computers have vastly decreased in size since the invention of the transistor in 1947, which revolutionized computing by replacing bulky vacuum tubes with smaller components that made computers more compact and more reliable. This led to an era of rapid technological advancement, including the development of integrated circuits, microprocessors, and eventually the smaller, lighter personal computers that have become indispensable in modern society. For example, most laptops today weigh in the range of two to eight pounds. A picture of one of the first computers can be seen in Figure 1. There has also been a great deal of movement in the data storage sector. The very first hard drive was created in 1956; it had a capacity of 5 MB and weighed 550 pounds.

Today hard drives are much smaller, typically weighing from a couple of ounces to a couple of pounds. As files have become more complex, the need for more storage has increased drastically; today we see games take up to 100 GB of storage. For reference, 5 MB is only 0.005 GB, so a 100 GB game is 20,000 times larger than that first 5 MB drive. The hard drives we have today reach sizes of 10 TB and larger, where a terabyte (TB) is 1,000 GB. The evolution of the hard drive can be seen in Figure 2. As the world of computers keeps progressing, the general trend is to make machines smaller while delivering a generational step in performance with each improvement. These improvements shorten the daily tasks of many users (such as teachers, researchers, and doctors), making their work quicker and easier to accomplish. New software is also constantly being developed, and as a result we are seeing strides in staying connected to others through social media, messaging platforms, and other means of communication. The downside to this growing dependence on computers is that technological failures or damage can create major setbacks on any given day.
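
A quick sketch of the arithmetic behind those comparisons, assuming decimal units (1 GB = 1,000 MB and 1 TB = 1,000 GB) and using the chapter’s illustrative figures:

```python
# Back-of-the-envelope storage comparison (decimal units assumed).
FIRST_DRIVE_MB = 5              # the 1956 hard drive: 5 MB
GAME_GB = 100                   # a large modern game: ~100 GB
MODERN_DRIVE_TB = 10            # a large modern hard drive: ~10 TB

game_mb = GAME_GB * 1_000                  # 100 GB -> 100,000 MB
drive_mb = MODERN_DRIVE_TB * 1_000_000     # 10 TB  -> 10,000,000 MB

print(FIRST_DRIVE_MB / 1_000)       # 0.005   -> 5 MB expressed in GB
print(game_mb // FIRST_DRIVE_MB)    # 20000   -> the game is 20,000x the 1956 drive
print(drive_mb // FIRST_DRIVE_MB)   # 2000000 -> the drive is 2,000,000x larger
```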

Relation to STS

The development of computers as a prevalent form of technology has had a profound impact on society as a whole in the United States. Computers are now ubiquitous, playing a crucial role in various aspects of everyday life. They are utilized in most classrooms across the country, facilitating learning and enhancing educational experiences for students of all ages. In the workplace, computers have revolutionized business operations, streamlining processes, increasing efficiency, and enabling remote work capabilities. Additionally, computers have become indispensable tools in households across America. According to the U.S. Census Bureau’s American Community Survey, a staggering 92% of households reported having at least one type of computer (2021). This statistic underscores the widespread integration of computers into the fabric of American life. The impact of computers extends beyond mere accessibility. They have transformed communication, allowing people to connect instantaneously through email, social media, and video conferencing platforms. Additionally, computers have revolutionized entertainment, providing access to a vast array of digital content, from streaming services to video games.

Overall, the pervasive presence of computers underscores their monumental impact on Americans’ lives, shaping how we learn, work, communicate, and entertain ourselves. As technology continues to evolve, the influence of computers on society is expected to grow even further, driving transformative changes across various domains.

Missing Voices in Computer History

The evolution of computers has happened at a fast rate, and when that happens, people’s contributions get left out. The demographic most often left out of computing history is women. Grace Hopper is one of the most influential people in computing, yet her work is rarely shown in the classroom. In the 1950s, Hopper was a senior mathematician on the team behind UNIVAC (the UNIVersal Automatic Computer). There she created the very first compiler (Cassel, 2016). This was a massive accomplishment for anyone in the field of computing because it established the idea that programming languages are not tied to a specific computer but can be used on any computer. This single advance was one of the main driving forces behind computing becoming as robust and powerful as it is today. Grace Hopper’s work needs to be talked about in classrooms, not only in engineering courses but in general classes as well. Students need to hear that a woman was a driving force behind the evolution of computing. Talking about this may encourage more women to join the computing field, because right now only 25% of jobs in the computing sector are held by women (Cassel, 2016). With a more diverse workforce in computing, we can see the creation of new ideas and features that were never thought of before.

During the evolution of computers, many people’s contributions to development and algorithms have been left out. With the push toward gender equality in the coming years, the disparity between the credit given to women and to men should shrink to a negligible amount. As computers continue to evolve, the world of STS will need to evolve with them to adapt to the changes in technology. If it does not, some of the great creations in the computer sector will be neglected; a notable example is VR (virtual reality), with its high entry-level price and the motion sickness that can come with it.

History of Apple, Inc.

In American society today, two primary operating systems dominate: Windows and MacOS. MacOS is the operating system for Apple computers, which were estimated to cover about 16.1% of the U.S. personal computer market in the fourth quarter of 2023, according to a study from Gartner (2024). The company Apple Inc. was founded on April 1, 1976, by Steve Jobs and Steve Wozniak, who wanted to make computers more user-friendly and accessible to individuals. Their vision was to revolutionize the computer industry. They started by building the Apple I in Jobs’ garage, leading to the introduction of the Apple II, which featured color graphics and propelled the company’s growth. However, internal conflicts and the departure of key figures like Jobs and Wozniak led to a period of struggle in the 1980s and early 1990s. Jobs returned to Apple in 1997 and initiated transformative changes, including an alliance with Microsoft and the launch of groundbreaking products like the iBook and iPod. Apple continued to expand its product line, with the introduction of the iPhone in 2007 marking a new era of success and propelling the company to become the second most valuable in the world. Apple has been able to maintain a strong position in the technology market for this entire period by continuously improving its flagship Macintosh line and by adapting to new technological changes.

Image of one of the original Apple computers

Apple’s Macintosh computers have changed quite a lot throughout the company’s history. The Macintosh 128K (Figure 3) was the very first Macintosh, released on January 24, 1984. It had a 9-inch black and white display with 128KB of RAM (computer memory) and ran the original Mac operating system, System 1.0. The next important release was in April 1995 with several variations of the Macintosh Performa, which had 500 MB to 1 GB of storage. Interestingly, the multiple models of this computer ended up competing with each other and were discontinued. This led to the iMac G3 in August 1998, which sported a futuristic design with multiple color options for the back of the computer as well as USB ports, 4 GB of storage, and built-in speakers. The iMac G3 shipped with Mac OS 8. In 2007, the iMac went through a major redesign with melded glass and aluminum as the material and a widescreen display. Newer Macs continue to be built slimmer, with faster processors, better displays, and more storage (Mingis and Moreau, 2021).

Missing Voices Within Apple

In a male-dominant field, it’s very possible for women’s impacts to be drowned out in technological evolutions. Within Apple’s business specifically, several women have made a large difference in its progression. Susan Kare, for example, was the first designer of Apple’s icons, like the stopwatch and paintbrush, which helped Apple establish the Mac. Another woman with a large contribution to Apple was Joanna Hoffman. She was the fifth person to join the Macintosh team in 1980 and “wrote the first draft of the User Interface Guidelines for the Mac and figured out how to pitch the computer at the education markets” in the beginning of Apple’s existence (Evans, 2016).

Throughout this chapter, the importance of computers as a catalyst for advancement in our society is evident. Computers have clearly evolved in multiple ways from their inception up to the present day, and several people from many different backgrounds have played an important part in that evolution.

How has the advancement in technology improved your life?

A brief history of computers – unipi.it. (n.d.). Retrieved November 7, 2022, from https://digitaltools.labcd.unipi.it/wp-content/uploads/2021/05/A-brief-history-of-computers.pdf

Cassel, L. (December 15, 2016). "Op-Ed: 25 Years After Computing Pioneer Grace Hopper’s Death, We Still Have Work to Do". USNEWS.com. Accessed via Nexis Uni database from Clemson University.

Evans, J. (2016, March 8). 10 Women Who Made Apple Great. Computerworld. https://www.computerworld.com/article/3041874/10-women-who-made-apple-great.html

Gartner. (2024, January 10). Gartner Says Worldwide PC Shipments Increased 0.3% in Fourth Quarter of 2023 but Declined 14.8% for the Year. Gartner. https://www.gartner.com/en/newsroom/press-releases/01-10-2024-gartner-says-worldwide-pc-shipments-increased-zero-point-three-percent-in-fourth-quarter-of-2023-but-declined-fourteen-point-eight-percent-for-the-year#:~:text=HP%20maintained%20the%20top%20spot,share%20(see%20Table%202).&text=HP%20Inc.,-4%2C665

Kleiman, K. & Saklayen, N. (2018, April 19). These 6 pioneering women helped create modern computers. ideas.ted.com. Retrieved September 26, 2021, from https://ideas.ted.com/how-i-discovered-six-pioneering-women-who-helped-create-modern-computers-and-why-we-should-never-forget-them/.

Mingis, K. and Moreau, S. (2021, April 28). The Evolution of the Macintosh – and the Imac. Computerworld. https://www.computerworld.com/article/1617841/evolution-of-macintosh-and-imac.html

Richardson, A. (2023, April). The Founding of Apple Computer, Inc. Library of Congress. https://guides.loc.gov/this-month-in-business-history/april/apple-computer-founded

Thompson, C. (2019, June 1). The gendered history of human computers. Smithsonian.com. Retrieved September 26, 2021, from https://www.smithsonianmag.com/science-nature/history-human-computers-180972202/ .

Women in Computing and Women in Engineering honored for promoting girls in STEM.  (May 26, 2017 Friday). US Official News. Accessed via Nexis Uni database from Clemson University.

Zimmermann, K. A. (2017, September 7). History of computers: A brief timeline. LiveScience. Retrieved September 26, 2021, from https://www.livescience.com/20718-computer-history.html .

“Gene Amdahl’s first computer.” by Erik Pitti is licensed under CC BY 2.0

“First hard drives” by gabrielsaldana is licensed under CC BY 2.0

Sailko. (2017). Neo Preistoria Exhibition (Milan 2016). Wikipedia. https://en.wikipedia.org/wiki/Macintosh_128K#/media/File:Computer_macintosh_128k,_1984_(all_about_Apple_onlus).jpg

AI ACKNOWLEDGMENT

I acknowledge the use of ChatGPT to generate additional content for this chapter.

Prompts and uses:

I entered the following prompt: Summarize the history of Apple in 7 sentences based on that prompt [the Library of Congress article].

Use: I modified the output to add more information from the article that I found relevant. I also adjusted the wording to make it fit the style of the rest of the chapter.

I entered the following prompt: The development of computers as a new, prevalent form of technology has majorly impacted society as a whole in the United States, as they are involved in several aspects of everyday life. Computers are used in some form in most classrooms in the country, in most workplaces, and in most households. Specifically, 92% of households in the U.S. Census Bureau’s American Community Survey reported having at least one type of computer. This is a simple statistic that shows the monumental impact of computers in Americans’ lives.

Use: I used the output to expand upon this paragraph; after entering the prompt, ChatGPT added several sentences and reworded some of the previously written content. I then removed some of the added information from ChatGPT and made the output more concise.

I entered the following prompt: Give me 3 more sentences to add to the following prompt. I am trying to talk about the history of computers and how they were invented. The prompt begins now- Computers have come a long way from their creation. This first computer was created in 1822 by Charles Babbage. This computer was created with a series of vacuum tubes and weighed a total of 700 pounds, which is much larger than the computers we see today. For example, most laptops weigh in a range of two to eight pounds.

After getting this output, I wrote this prompt: Give me two more sentences to add to that.

Use: I used the 5 sentences of the output to select the information that I wanted and add it to the content that was already in the book in order to provide more detail to the reduction in sizes of computers over time.

To the extent possible under law, Chandler Little and Ben Greene have waived all copyright and related or neighboring rights to Science Technology and Society a Student Led Exploration , except where otherwise noted.



The Modern History of Computing

Historically, computers were human clerks who calculated in accordance with effective methods. These human computers did the sorts of calculation nowadays carried out by electronic computers, and many thousands of them were employed in commerce, government, and research establishments. The term computing machine , used increasingly from the 1920s, refers to any machine that does the work of a human computer, i.e., any machine that calculates in accordance with effective methods. During the late 1940s and early 1950s, with the advent of electronic computing machines, the phrase ‘computing machine’ gradually gave way simply to ‘computer’, initially usually with the prefix ‘electronic’ or ‘digital’. This entry surveys the history of these machines.

  • Analog Computers
  • The Universal Turing Machine
  • Electromechanical versus Electronic Computation
  • Turing's Automatic Computing Engine
  • The Manchester Machine
  • ENIAC and EDVAC
  • Other Notable Early Computers
  • High-Speed Memory
  • Other Internet Resources
  • Related Entries

Charles Babbage was Lucasian Professor of Mathematics at Cambridge University from 1828 to 1839 (a post formerly held by Isaac Newton). Babbage's proposed Difference Engine was a special-purpose digital computing machine for the automatic production of mathematical tables (such as logarithm tables, tide tables, and astronomical tables). The Difference Engine consisted entirely of mechanical components — brass gear wheels, rods, ratchets, pinions, etc. Numbers were represented in the decimal system by the positions of 10-toothed metal wheels mounted in columns. Babbage exhibited a small working model in 1822. He never completed the full-scale machine that he had designed but did complete several fragments. The largest — one ninth of the complete calculator — is on display in the London Science Museum. Babbage used it to perform serious computational work, calculating various mathematical tables. In 1990, Babbage's Difference Engine No. 2 was finally built from Babbage's designs and is also on display at the London Science Museum.

The Swedes Georg and Edvard Scheutz (father and son) constructed a modified version of Babbage's Difference Engine. Three were made, a prototype and two commercial models, one of these being sold to an observatory in Albany, New York, and the other to the Registrar-General's office in London, where it calculated and printed actuarial tables.

Babbage's proposed Analytical Engine, considerably more ambitious than the Difference Engine, was to have been a general-purpose mechanical digital computer. The Analytical Engine was to have had a memory store and a central processing unit (or ‘mill’) and would have been able to select from among alternative actions consequent upon the outcome of its previous actions (a facility nowadays known as conditional branching). The behaviour of the Analytical Engine would have been controlled by a program of instructions contained on punched cards connected together with ribbons (an idea that Babbage had adopted from the Jacquard weaving loom). Babbage emphasised the generality of the Analytical Engine, saying ‘the conditions which enable a finite machine to make calculations of unlimited extent are fulfilled in the Analytical Engine’ (Babbage [1994], p. 97).

Babbage worked closely with Ada Lovelace, daughter of the poet Byron, after whom the modern programming language ADA is named. Lovelace foresaw the possibility of using the Analytical Engine for non-numeric computation, suggesting that the Engine might even be capable of composing elaborate pieces of music.

A large model of the Analytical Engine was under construction at the time of Babbage's death in 1871 but a full-scale version was never built. Babbage's idea of a general-purpose calculating engine was never forgotten, especially at Cambridge, and was on occasion a lively topic of mealtime discussion at the war-time headquarters of the Government Code and Cypher School, Bletchley Park, Buckinghamshire, birthplace of the electronic digital computer.

Analog computers

The earliest computing machines in wide use were not digital but analog. In analog representation, properties of the representational medium ape (or reflect or model) properties of the represented state-of-affairs. (In obvious contrast, the strings of binary digits employed in digital representation do not represent by means of possessing some physical property — such as length — whose magnitude varies in proportion to the magnitude of the property that is being represented.) Analog representations form a diverse class. Some examples: the longer a line on a road map, the longer the road that the line represents; the greater the number of clear plastic squares in an architect's model, the greater the number of windows in the building represented; the higher the pitch of an acoustic depth meter, the shallower the water. In analog computers, numerical quantities are represented by, for example, the angle of rotation of a shaft or a difference in electrical potential. Thus the output voltage of the machine at a time might represent the momentary speed of the object being modelled.

As the case of the architect's model makes plain, analog representation may be discrete in nature (there is no such thing as a fractional number of windows). Among computer scientists, the term ‘analog’ is sometimes used narrowly, to indicate representation of one continuously-valued quantity by another (e.g., speed by voltage). As Brian Cantwell Smith has remarked:

‘Analog’ should … be a predicate on a representation whose structure corresponds to that of which it represents … That continuous representations should historically have come to be called analog presumably betrays the recognition that, at the levels at which it matters to us, the world is more foundationally continuous than it is discrete. (Smith [1991], p. 271)

James Thomson, brother of Lord Kelvin, invented the mechanical wheel-and-disc integrator that became the foundation of analog computation (Thomson [1876]). The two brothers constructed a device for computing the integral of the product of two given functions, and Kelvin described (although did not construct) general-purpose analog machines for integrating linear differential equations of any order and for solving simultaneous linear equations. Kelvin's most successful analog computer was his tide predicting machine, which remained in use at the port of Liverpool until the 1960s. Mechanical analog devices based on the wheel-and-disc integrator were in use during World War I for gunnery calculations. Following the war, the design of the integrator was considerably improved by Hannibal Ford (Ford [1919]).

Stanley Fifer reports that the first semi-automatic mechanical analog computer was built in England by the Manchester firm of Metropolitan Vickers prior to 1930 (Fifer [1961], p. 29); however, I have so far been unable to verify this claim. In 1931, Vannevar Bush, working at MIT, built the differential analyser, the first large-scale automatic general-purpose mechanical analog computer. Bush's design was based on the wheel and disc integrator. Soon copies of his machine were in use around the world (including, at Cambridge and Manchester Universities in England, differential analysers built out of kit-set Meccano, the once popular engineering toy).

It required a skilled mechanic equipped with a lead hammer to set up Bush's mechanical differential analyser for each new job. Subsequently, Bush and his colleagues replaced the wheel-and-disc integrators and other mechanical components by electromechanical, and finally by electronic, devices.

A differential analyser may be conceptualised as a collection of ‘black boxes’ connected together in such a way as to allow considerable feedback. Each box performs a fundamental process, for example addition, multiplication of a variable by a constant, and integration. In setting up the machine for a given task, boxes are connected together so that the desired set of fundamental processes is executed. In the case of electrical machines, this was done typically by plugging wires into sockets on a patch panel (computing machines whose function is determined in this way are referred to as ‘program-controlled’).

Since all the boxes work in parallel, an electronic differential analyser solves sets of equations very quickly. Against this has to be set the cost of massaging the problem to be solved into the form demanded by the analog machine, and of setting up the hardware to perform the desired computation. A major drawback of analog computation is the higher cost, relative to digital machines, of an increase in precision. During the 1960s and 1970s, there was considerable interest in ‘hybrid’ machines, where an analog section is controlled by and programmed via a digital section. However, such machines are now a rarity.
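
To make the ‘black box’ picture concrete, here is a minimal digital sketch (an assumed illustration, not a model of Bush's hardware): an integrator box is wired so that its own output, scaled by a constant, is fed back as its input, which solves the differential equation dy/dt = -k·y.

```python
# Minimal sketch of the "connected boxes with feedback" idea, simulated digitally.
# A mechanical or electronic integrator is approximated here by simple Euler
# integration; the constants and the step size are illustrative assumptions.

def integrator(initial_value, dt):
    """A 'box' that accumulates its input signal over time."""
    state = initial_value
    def step(rate):
        nonlocal state
        state += rate * dt
        return state
    return step

k, dt = 0.5, 0.01
y_box = integrator(initial_value=1.0, dt=dt)   # set the initial condition y(0) = 1

y = 1.0
for _ in range(int(2.0 / dt)):                 # run the "machine" until t = 2
    y = y_box(-k * y)                          # feedback wire: input = -k * output

print(round(y, 3))   # ~0.367, close to the exact solution exp(-0.5 * 2) ≈ 0.368
```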

In 1936, at Cambridge University, Turing invented the principle of the modern computer. He described an abstract digital computing machine consisting of a limitless memory and a scanner that moves back and forth through the memory, symbol by symbol, reading what it finds and writing further symbols (Turing [1936]). The actions of the scanner are dictated by a program of instructions that is stored in the memory in the form of symbols. This is Turing's stored-program concept, and implicit in it is the possibility of the machine operating on and modifying its own program. (In London in 1947, in the course of what was, so far as is known, the earliest public lecture to mention computer intelligence, Turing said, ‘What we want is a machine that can learn from experience’, adding that the ‘possibility of letting the machine alter its own instructions provides the mechanism for this’ (Turing [1947], p. 393).) Turing's computing machine of 1936 is now known simply as the universal Turing machine. Cambridge mathematician Max Newman remarked that right from the start Turing was interested in the possibility of actually building a computing machine of the sort that he had described (Newman in interview with Christopher Evans in Evans [197?]).
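
The stored-program idea can be made concrete with a few lines of code. The sketch below is an assumed modern illustration (not Turing's own notation): a scanner moves over a tape of symbols while a table of instructions, itself just data held in memory, dictates what the scanner writes and where it moves next.

```python
# Toy Turing machine: the "program" is a table mapping (state, symbol) to
# (symbol to write, direction to move, next state). Illustrative sketch only.

def run_turing_machine(program, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))        # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Example program: flip every binary digit, then halt at the first blank cell.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(flip_bits, "1011"))   # prints 0100_
```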

From the start of the Second World War Turing was a leading cryptanalyst at the Government Code and Cypher School, Bletchley Park. Here he became familiar with Thomas Flowers' work involving large-scale high-speed electronic switching (described below). However, Turing could not turn to the project of building an electronic stored-program computing machine until the cessation of hostilities in Europe in 1945.

During the wartime years Turing did give considerable thought to the question of machine intelligence. Colleagues at Bletchley Park recall numerous off-duty discussions with him on the topic, and at one point Turing circulated a typewritten report (now lost) setting out some of his ideas. One of these colleagues, Donald Michie (who later founded the Department of Machine Intelligence and Perception at the University of Edinburgh), remembers Turing talking often about the possibility of computing machines (1) learning from experience and (2) solving problems by means of searching through the space of possible solutions, guided by rule-of-thumb principles (Michie in interview with Copeland, 1995). The modern term for the latter idea is ‘heuristic search’, a heuristic being any rule-of-thumb principle that cuts down the amount of searching required in order to find a solution to a problem. At Bletchley Park Turing illustrated his ideas on machine intelligence by reference to chess. Michie recalls Turing experimenting with heuristics that later became common in chess programming (in particular minimax and best-first).

Further information about Turing and the computer, including his wartime work on codebreaking and his thinking about artificial intelligence and artificial life, can be found in Copeland 2004.

With some exceptions — including Babbage's purely mechanical engines, and the finger-powered National Accounting Machine - early digital computing machines were electromechanical. That is to say, their basic components were small, electrically-driven, mechanical switches called ‘relays’. These operate relatively slowly, whereas the basic components of an electronic computer — originally vacuum tubes (valves) — have no moving parts save electrons and so operate extremely fast. Electromechanical digital computing machines were built before and during the second world war by (among others) Howard Aiken at Harvard University, George Stibitz at Bell Telephone Laboratories, Turing at Princeton University and Bletchley Park, and Konrad Zuse in Berlin. To Zuse belongs the honour of having built the first working general-purpose program-controlled digital computer. This machine, later called the Z3, was functioning in 1941. (A program-controlled computer, as opposed to a stored-program computer, is set up for a new task by re-routing wires, by means of plugs etc.)

Relays were too slow and unreliable a medium for large-scale general-purpose digital computation (although Aiken made a valiant effort). It was the development of high-speed digital techniques using vacuum tubes that made the modern computer possible.

The earliest extensive use of vacuum tubes for digital data-processing appears to have been by the engineer Thomas Flowers, working in London at the British Post Office Research Station at Dollis Hill. Electronic equipment designed by Flowers in 1934, for controlling the connections between telephone exchanges, went into operation in 1939, and involved between three and four thousand vacuum tubes running continuously. In 1938–1939 Flowers worked on an experimental electronic digital data-processing system, involving a high-speed data store. Flowers' aim, achieved after the war, was that electronic equipment should replace existing, less reliable, systems built from relays and used in telephone exchanges. Flowers did not investigate the idea of using electronic equipment for numerical calculation, but has remarked that at the outbreak of war with Germany in 1939 he was possibly the only person in Britain who realized that vacuum tubes could be used on a large scale for high-speed digital computation. (See Copeland 2006 for more information on Flowers' work.)

The earliest comparable use of vacuum tubes in the U.S. seems to have been by John Atanasoff at what was then Iowa State College (now University). During the period 1937–1942 Atanasoff developed techniques for using vacuum tubes to perform numerical calculations digitally. In 1939, with the assistance of his student Clifford Berry, Atanasoff began building what is sometimes called the Atanasoff-Berry Computer, or ABC, a small-scale special-purpose electronic digital machine for the solution of systems of linear algebraic equations. The machine contained approximately 300 vacuum tubes. Although the electronic part of the machine functioned successfully, the computer as a whole never worked reliably, errors being introduced by the unsatisfactory binary card-reader. Work was discontinued in 1942 when Atanasoff left Iowa State.

The first fully functioning electronic digital computer was Colossus, used by the Bletchley Park cryptanalysts from February 1944.

From very early in the war the Government Code and Cypher School (GC&CS) was successfully deciphering German radio communications encoded by means of the Enigma system, and by early 1942 about 39,000 intercepted messages were being decoded each month, thanks to electromechanical machines known as ‘bombes’. These were designed by Turing and Gordon Welchman (building on earlier work by Polish cryptanalysts).

During the second half of 1941, messages encoded by means of a totally different method began to be intercepted. This new cipher machine, code-named ‘Tunny’ by Bletchley Park, was broken in April 1942 and current traffic was read for the first time in July of that year. Based on binary teleprinter code, Tunny was used in preference to Morse-based Enigma for the encryption of high-level signals, for example messages from Hitler and members of the German High Command.

The need to decipher this vital intelligence as rapidly as possible led Max Newman to propose in November 1942 (shortly after his recruitment to GC&CS from Cambridge University) that key parts of the decryption process be automated, by means of high-speed electronic counting devices. The first machine designed and built to Newman's specification, known as the Heath Robinson, was relay-based with electronic circuits for counting. (The electronic counters were designed by C.E. Wynn-Williams, who had been using thyratron tubes in counting circuits at the Cavendish Laboratory, Cambridge, since 1932 [Wynn-Williams 1932].) Installed in June 1943, Heath Robinson was unreliable and slow, and its high-speed paper tapes were continually breaking, but it proved the worth of Newman's idea. Flowers recommended that an all-electronic machine be built instead, but he received no official encouragement from GC&CS. Working independently at the Post Office Research Station at Dollis Hill, Flowers quietly got on with constructing the world's first large-scale programmable electronic digital computer. Colossus I was delivered to Bletchley Park in January 1944.

By the end of the war there were ten Colossi working round the clock at Bletchley Park. From a cryptanalytic viewpoint, a major difference between the prototype Colossus I and the later machines was the addition of the so-called Special Attachment, following a key discovery by cryptanalysts Donald Michie and Jack Good. This broadened the function of Colossus from ‘wheel setting’ — i.e., determining the settings of the encoding wheels of the Tunny machine for a particular message, given the ‘patterns’ of the wheels — to ‘wheel breaking’, i.e., determining the wheel patterns themselves. The wheel patterns were eventually changed daily by the Germans on each of the numerous links between the German Army High Command and Army Group commanders in the field. By 1945 there were as many as 30 links in total. About ten of these were broken and read regularly.

Colossus I contained approximately 1600 vacuum tubes and each of the subsequent machines approximately 2400 vacuum tubes. Like the smaller ABC, Colossus lacked two important features of modern computers. First, it had no internally stored programs. To set it up for a new task, the operator had to alter the machine's physical wiring, using plugs and switches. Second, Colossus was not a general-purpose machine, being designed for a specific cryptanalytic task involving counting and Boolean operations.

F.H. Hinsley, official historian of GC&CS, has estimated that the war in Europe was shortened by at least two years as a result of the signals intelligence operation carried out at Bletchley Park, in which Colossus played a major role. Most of the Colossi were destroyed once hostilities ceased. Some of the electronic panels ended up at Newman's Computing Machine Laboratory in Manchester (see below), all trace of their original use having been removed. Two Colossi were retained by GC&CS (renamed GCHQ following the end of the war). The last Colossus is believed to have stopped running in 1960.

Those who knew of Colossus were prohibited by the Official Secrets Act from sharing their knowledge. Until the 1970s, few had any idea that electronic computation had been used successfully during the second world war. In 1970 and 1975, respectively, Good and Michie published notes giving the barest outlines of Colossus. By 1983, Flowers had received clearance from the British Government to publish a partial account of the hardware of Colossus I. Details of the later machines and of the Special Attachment, the uses to which the Colossi were put, and the cryptanalytic algorithms that they ran, have only recently been declassified. (For the full account of Colossus and the attack on Tunny see Copeland 2006.)

To those acquainted with the universal Turing machine of 1936, and the associated stored-program concept, Flowers' racks of digital electronic equipment were proof of the feasibility of using large numbers of vacuum tubes to implement a high-speed general-purpose stored-program computer. The war over, Newman lost no time in establishing the Royal Society Computing Machine Laboratory at Manchester University for precisely that purpose. A few months after his arrival at Manchester, Newman wrote as follows to the Princeton mathematician John von Neumann (February 1946):

I am … hoping to embark on a computing machine section here, having got very interested in electronic devices of this kind during the last two or three years. By about eighteen months ago I had decided to try my hand at starting up a machine unit when I got out. … I am of course in close touch with Turing.

Turing and Newman were thinking along similar lines. In 1945 Turing joined the National Physical Laboratory (NPL) in London, his brief to design and develop an electronic stored-program digital computer for scientific work. (Artificial Intelligence was not far from Turing's thoughts: he described himself as ‘building a brain’ and remarked in a letter that he was ‘more interested in the possibility of producing models of the action of the brain than in the practical applications to computing’.) John Womersley, Turing's immediate superior at NPL, christened Turing's proposed machine the Automatic Computing Engine, or ACE, in homage to Babbage's Difference Engine and Analytical Engine.

Turing's 1945 report ‘Proposed Electronic Calculator’ gave the first relatively complete specification of an electronic stored-program general-purpose digital computer. The report is reprinted in full in Copeland 2005.

The first electronic stored-program digital computer to be proposed in the U.S. was the EDVAC (see below). The ‘First Draft of a Report on the EDVAC’ (May 1945), composed by von Neumann, contained little engineering detail, in particular concerning electronic hardware (owing to restrictions in the U.S.). Turing's ‘Proposed Electronic Calculator’, on the other hand, supplied detailed circuit designs and specifications of hardware units, specimen programs in machine code, and even an estimate of the cost of building the machine (£11,200). ACE and EDVAC differed fundamentally from one another; for example, ACE employed distributed processing, while EDVAC had a centralised structure.

Turing saw that speed and memory were the keys to computing. Turing's colleague at NPL, Jim Wilkinson, observed that Turing ‘was obsessed with the idea of speed on the machine’ [Copeland 2005, p. 2]. Turing's design had much in common with today's RISC architectures and it called for a high-speed memory of roughly the same capacity as an early Macintosh computer (enormous by the standards of his day). Had Turing's ACE been built as planned it would have been in a different league from the other early computers. However, progress on Turing's Automatic Computing Engine ran slowly, due to organisational difficulties at NPL, and in 1948 a ‘very fed up’ Turing (Robin Gandy's description, in interview with Copeland, 1995) left NPL for Newman's Computing Machine Laboratory at Manchester University. It was not until May 1950 that a small pilot model of the Automatic Computing Engine, built by Wilkinson, Edward Newman, Mike Woodger, and others, first executed a program. With an operating speed of 1 MHz, the Pilot Model ACE was for some time the fastest computer in the world.

Sales of DEUCE, the production version of the Pilot Model ACE, were buoyant — confounding the suggestion, made in 1946 by the Director of the NPL, Sir Charles Darwin, that ‘it is very possible that … one machine would suffice to solve all the problems that are demanded of it from the whole country’ [Copeland 2005, p. 4]. The fundamentals of Turing's ACE design were employed by Harry Huskey (at Wayne State University, Detroit) in the Bendix G15 computer (Huskey in interview with Copeland, 1998). The G15 was arguably the first personal computer; over 400 were sold worldwide. DEUCE and the G15 remained in use until about 1970. Another computer deriving from Turing's ACE design, the MOSAIC, played a role in Britain's air defences during the Cold War period; other derivatives include the Packard-Bell PB250 (1961). (More information about these early computers is given in [Copeland 2005].)

The earliest general-purpose stored-program electronic digital computer to work was built in Newman's Computing Machine Laboratory at Manchester University. The Manchester ‘Baby’, as it became known, was constructed by the engineers F.C. Williams and Tom Kilburn, and performed its first calculation on 21 June 1948. The tiny program, stored on the face of a cathode ray tube, was just seventeen instructions long. A much enlarged version of the machine, with a programming system designed by Turing, became the world's first commercially available computer, the Ferranti Mark I. The first to be completed was installed at Manchester University in February 1951; in all about ten were sold, in Britain, Canada, Holland and Italy.

The fundamental logico-mathematical contributions by Turing and Newman to the triumph at Manchester have been neglected, and the Manchester machine is nowadays remembered as the work of Williams and Kilburn. Indeed, Newman's role in the development of computers has never been sufficiently emphasised (due perhaps to his thoroughly self-effacing way of relating the relevant events).

It was Newman who, in a lecture in Cambridge in 1935, introduced Turing to the concept that led directly to the Turing machine: Newman defined a constructive process as one that a machine can carry out (Newman in interview with Evans, op. cit.). As a result of his knowledge of Turing's work, Newman became interested in the possibilities of computing machinery in, as he put it, ‘a rather theoretical way’. It was not until Newman joined GC&CS in 1942 that his interest in computing machinery suddenly became practical, with his realisation that the attack on Tunny could be mechanised. During the building of Colossus, Newman tried to interest Flowers in Turing's 1936 paper — birthplace of the stored-program concept — but Flowers did not make much of Turing's arcane notation. There is no doubt that by 1943, Newman had firmly in mind the idea of using electronic technology in order to construct a stored-program general-purpose digital computing machine.

In July of 1946 (the month in which the Royal Society approved Newman's application for funds to found the Computing Machine Laboratory), Freddie Williams, working at the Telecommunications Research Establishment, Malvern, began the series of experiments on cathode ray tube storage that was to lead to the Williams tube memory. Williams, until then a radar engineer, explains how it was that he came to be working on the problem of computer memory:

[O]nce [the German Armies] collapsed … nobody was going to care a toss about radar, and people like me … were going to be in the soup unless we found something else to do. And computers were in the air. Knowing absolutely nothing about them I latched onto the problem of storage and tackled that. (Quoted in Bennett 1976.)

Newman learned of Williams' work, and with the able help of Patrick Blackett, Langworthy Professor of Physics at Manchester and one of the most powerful figures in the University, was instrumental in the appointment of the 35 year old Williams to the recently vacated Chair of Electro-Technics at Manchester. (Both were members of the appointing committee (Kilburn in interview with Copeland, 1997).) Williams immediately had Kilburn, his assistant at Malvern, seconded to Manchester. To take up the story in Williams' own words:

[N]either Tom Kilburn nor I knew the first thing about computers when we arrived in Manchester University. We'd had enough explained to us to understand what the problem of storage was and what we wanted to store, and that we'd achieved, so the point now had been reached when we'd got to find out about computers … Newman explained the whole business of how a computer works to us. (F.C. Williams in interview with Evans [1976])

Elsewhere Williams is explicit concerning Turing's role and gives something of the flavour of the explanation that he and Kilburn received:

Tom Kilburn and I knew nothing about computers, but a lot about circuits. Professor Newman and Mr A.M. Turing … knew a lot about computers and substantially nothing about electronics. They took us by the hand and explained how numbers could live in houses with addresses and how if they did they could be kept track of during a calculation. (Williams [1975], p. 328)

It seems that Newman must have used much the same words with Williams and Kilburn as he did in an address to the Royal Society on 4th March 1948:

Professor Hartree … has recalled that all the essential ideas of the general-purpose calculating machines now being made are to be found in Babbage's plans for his analytical engine. In modern times the idea of a universal calculating machine was independently introduced by Turing … [T]he machines now being made in America and in this country … [are] in certain general respects … all similar. There is provision for storing numbers, say in the scale of 2, so that each number appears as a row of, say, forty 0's and 1's in certain places or "houses" in the machine. … Certain of these numbers, or "words" are read, one after another, as orders. In one possible type of machine an order consists of four numbers, for example 11, 13, 27, 4. The number 4 signifies "add", and when control shifts to this word the "houses" H11 and H13 will be connected to the adder as inputs, and H27 as output. The numbers stored in H11 and H13 pass through the adder, are added, and the sum is passed on to H27. The control then shifts to the next order. In most real machines the process just described would be done by three separate orders, the first bringing [H11] (=content of H11) to a central accumulator, the second adding [H13] into the accumulator, and the third sending the result to H27; thus only one address would be required in each order. … A machine with storage, with this automatic-telephone-exchange arrangement and with the necessary adders, subtractors and so on, is, in a sense, already a universal machine. (Newman [1948], pp. 271–272)

Following this explanation of Turing's three-address concept (source 1, source 2, destination, function) Newman went on to describe program storage (‘the orders shall be in a series of houses X1, X2, …’) and conditional branching. He then summed up:

From this highly simplified account it emerges that the essential internal parts of the machine are, first, a storage for numbers (which may also be orders). … Secondly, adders, multipliers, etc. Thirdly, an "automatic telephone exchange" for selecting "houses", connecting them to the arithmetic organ, and writing the answers in other prescribed houses. Finally, means of moving control at any stage to any chosen order, if a certain condition is satisfied, otherwise passing to the next order in the normal sequence. Besides these there must be ways of setting up the machine at the outset, and extracting the final answer in useable form. (Newman [1948], pp. 273–4)
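Newman's example order (11, 13, 27, 4) can be rendered directly in a few lines of code. The sketch below is only an illustration of the three-address idea; the ‘houses’ are a Python list, and the function code 4 is the only operation implemented, exactly as in Newman's example:

```python
# Illustration of Newman's order format: (source1, source2, destination, function).
# As in his example, function code 4 signifies "add".

store = [0] * 64                # the "houses" H0, H1, H2, ...
store[11], store[13] = 25, 17   # arbitrary contents for the example

def obey(order, store):
    a, b, dest, func = order
    if func == 4:               # 4 signifies "add"
        store[dest] = store[a] + store[b]
    else:
        raise ValueError("only the 'add' function from Newman's example is sketched")

obey((11, 13, 27, 4), store)
print(store[27])                # 42: the contents of H11 and H13, added, placed in H27
```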

In a letter written in 1972 Williams described in some detail what he and Kilburn were told by Newman:

About the middle of the year [1946] the possibility of an appointment at Manchester University arose and I had a talk with Professor Newman who was already interested in the possibility of developing computers and had acquired a grant from the Royal Society of £30,000 for this purpose. Since he understood computers and I understood electronics the possibilities of fruitful collaboration were obvious. I remember Newman giving us a few lectures in which he outlined the organisation of a computer in terms of numbers being identified by the address of the house in which they were placed and in terms of numbers being transferred from this address, one at a time, to an accumulator where each entering number was added to what was already there. At any time the number in the accumulator could be transferred back to an assigned address in the store and the accumulator cleared for further use. The transfers were to be effected by a stored program in which a list of instructions was obeyed sequentially. Ordered progress through the list could be interrupted by a test instruction which examined the sign of the number in the accumulator. Thereafter operation started from a new point in the list of instructions. This was the first information I received about the organisation of computers. … Our first computer was the simplest embodiment of these principles, with the sole difference that it used a subtracting rather than an adding accumulator. (Letter from Williams to Randell, 1972; in Randell [1972], p. 9)
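The single-address organisation Williams describes here (and Newman's remark, quoted above, that ‘in most real machines’ an addition would be done by three separate orders passing through a central accumulator) can likewise be illustrated with a toy interpreter. The instruction names below are invented for the sketch; the Baby's actual order code was different and, as Williams notes, its accumulator subtracted rather than added:

```python
# Toy one-address machine in the spirit of Williams' description.
# Instructions: ("LOAD", n), ("ADD", n), ("STORE", n), ("TEST", n), ("STOP",).
# TEST jumps to instruction n if the accumulator is negative; otherwise
# control passes to the next instruction in the list.

def run(program, store):
    acc, pc = 0, 0
    while True:
        op, *arg = program[pc]
        pc += 1
        if op == "LOAD":                # bring a number from its house
            acc = store[arg[0]]
        elif op == "ADD":               # add another number into the accumulator
            acc += store[arg[0]]
        elif op == "STORE":             # send the result back to an assigned house
            store[arg[0]] = acc
        elif op == "TEST" and acc < 0:  # conditional branch on the sign of acc
            pc = arg[0]
        elif op == "STOP":
            return store

# Newman's addition done as "three separate orders": store[2] = store[0] + store[1]
print(run([("LOAD", 0), ("ADD", 1), ("STORE", 2), ("STOP",)], [25, 17, 0]))
```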

Turing's early input to the developments at Manchester, hinted at by Williams in his above-quoted reference to Turing, may have been via the lectures on computer design that Turing and Wilkinson gave in London during the period December 1946 to February 1947 (Turing and Wilkinson [1946–7]). The lectures were attended by representatives of various organisations planning to use or build an electronic computer. Kilburn was in the audience (Bowker and Giordano [1993]). (Kilburn usually said, when asked from where he obtained his basic knowledge of the computer, that he could not remember (letter from Brian Napper to Copeland, 2002); for example, in a 1992 interview he said: ‘Between early 1945 and early 1947, in that period, somehow or other I knew what a digital computer was … Where I got this knowledge from I've no idea’ (Bowker and Giordano [1993], p. 19).)

Whatever role Turing's lectures may have played in informing Kilburn, there is little doubt that credit for the Manchester computer — called the ‘Newman-Williams machine’ in a contemporary document (Huskey 1947) — belongs not only to Williams and Kilburn but also to Newman, and that the influence on Newman of Turing's 1936 paper was crucial, as was the influence of Flowers' Colossus.

The first working AI program, a draughts (checkers) player written by Christopher Strachey, ran on the Ferranti Mark I in the Manchester Computing Machine Laboratory. Strachey (at the time a teacher at Harrow School and an amateur programmer) wrote the program with Turing's encouragement and utilising the latter's recently completed Programmers' Handbook for the Ferranti. (Strachey later became Director of the Programming Research Group at Oxford University.) By the summer of 1952, the program could, Strachey reported, ‘play a complete game of draughts at a reasonable speed’. (Strachey's program formed the basis for Arthur Samuel's well-known checkers program.) The first chess-playing program, also, was written for the Manchester Ferranti, by Dietrich Prinz; the program first ran in November 1951. Designed for solving simple problems of the mate-in-two variety, the program would examine every possible move until a solution was found. Turing started to program his ‘Turochamp’ chess-player on the Ferranti Mark I, but never completed the task. Unlike Prinz's program, the Turochamp could play a complete game (when hand-simulated) and operated not by exhaustive search but under the guidance of heuristics.

The first fully functioning electronic digital computer to be built in the U.S. was ENIAC, constructed at the Moore School of Electrical Engineering, University of Pennsylvania, for the Army Ordnance Department, by J. Presper Eckert and John Mauchly. Completed in 1945, ENIAC was somewhat similar to the earlier Colossus, but considerably larger and more flexible (although far from general-purpose). The primary function for which ENIAC was designed was the calculation of tables used in aiming artillery. ENIAC was not a stored-program computer, and setting it up for a new job involved reconfiguring the machine by means of plugs and switches. For many years, ENIAC was believed to have been the first functioning electronic digital computer, Colossus being unknown to all but a few.

In 1944, John von Neumann joined the ENIAC group. He had become ‘intrigued’ (Goldstine's word, [1972], p. 275) with Turing's universal machine while Turing was at Princeton University during 1936–1938. At the Moore School, von Neumann emphasised the importance of the stored-program concept for electronic computing, including the possibility of allowing the machine to modify its own program in useful ways while running (for example, in order to control loops and branching). Turing's paper of 1936 (‘On Computable Numbers, with an Application to the Entscheidungsproblem’) was required reading for members of von Neumann's post-war computer project at the Institute for Advanced Study, Princeton University (letter from Julian Bigelow to Copeland, 2002; see also Copeland [2004], p. 23). Eckert appears to have realised independently, and prior to von Neumann's joining the ENIAC group, that the way to take full advantage of the speed at which data is processed by electronic circuits is to place suitably encoded instructions for controlling the processing in the same high-speed storage devices that hold the data itself (documented in Copeland [2004], pp. 26–7). In 1945, while ENIAC was still under construction, von Neumann produced a draft report, mentioned previously, setting out the ENIAC group's ideas for an electronic stored-program general-purpose digital computer, the EDVAC (von Neumann [1945]). The EDVAC was completed six years later, but not by its originators, who left the Moore School to build computers elsewhere. Lectures held at the Moore School in 1946 on the proposed EDVAC were widely attended and contributed greatly to the dissemination of the new ideas.

Von Neumann was a prestigious figure and he made the concept of a high-speed stored-program digital computer widely known through his writings and public addresses. As a result of his high profile in the field, it became customary, although historically inappropriate, to refer to electronic stored-program digital computers as ‘von Neumann machines’.

The Los Alamos physicist Stanley Frankel, responsible with von Neumann and others for mechanising the large-scale calculations involved in the design of the atomic bomb, has described von Neumann's view of the importance of Turing's 1936 paper, in a letter:

I know that in or about 1943 or ‘44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936 … Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing, in so far as not anticipated by Babbage … Both Turing and von Neumann, of course, also made substantial contributions to the "reduction to practice" of these concepts but I would not regard these as comparable in importance with the introduction and explication of the concept of a computer able to store in its memory its program of activities and of modifying that program in the course of these activities. (Quoted in Randell [1972], p. 10)

Other notable early stored-program electronic digital computers were:

  • EDSAC, 1949, built at Cambridge University by Maurice Wilkes
  • BINAC, 1949, built by Eckert's and Mauchly's Electronic Control Co., Philadelphia (opinions differ over whether BINAC ever actually worked)
  • Whirlwind I, 1949, Digital Computer Laboratory, Massachusetts Institute of Technology, Jay Forrester
  • SEAC, 1950, US Bureau of Standards Eastern Division, Washington D.C., Samuel Alexander, Ralph Slutz
  • SWAC, 1950, US Bureau of Standards Western Division, Institute for Numerical Analysis, University of California at Los Angeles, Harry Huskey
  • UNIVAC, 1951, Eckert-Mauchly Computer Corporation, Philadelphia (the first computer to be available commercially in the U.S.)
  • the IAS computer, 1952, Institute for Advanced Study, Princeton University, Julian Bigelow, Arthur Burks, Herman Goldstine, von Neumann, and others (thanks to von Neumann's publishing the specifications of the IAS machine, it became the model for a group of computers known as the Princeton Class machines; the IAS computer was also a strong influence on the IBM 701)
  • IBM 701, 1952, International Business Machines' first mass-produced electronic stored-program computer.

The EDVAC and ACE proposals both advocated the use of mercury-filled tubes, called ‘delay lines’, for high-speed internal memory. This form of memory is known as acoustic memory. Delay lines had initially been developed for echo cancellation in radar; the idea of using them as memory devices originated with Eckert at the Moore School. Here is Turing's description:

It is proposed to build "delay line" units consisting of mercury … tubes about 5′ long and 1″ in diameter in contact with a quartz crystal at each end. The velocity of sound in … mercury … is such that the delay will be 1.024 ms. The information to be stored may be considered to be a sequence of 1024 ‘digits’ (0 or 1) … These digits will be represented by a corresponding sequence of pulses. The digit 0 … will be represented by the absence of a pulse at the appropriate time, the digit 1 … by its presence. This series of pulses is impressed on the end of the line by one piezo-crystal, it is transmitted down the line in the form of supersonic waves, and is reconverted into a varying voltage by the crystal at the far end. This voltage is amplified sufficiently to give an output of the order of 10 volts peak to peak and is used to gate a standard pulse generated by the clock. This pulse may be again fed into the line by means of the transmitting crystal, or we may feed in some altogether different signal. We also have the possibility of leading the gated pulse to some other part of the calculator, if we have need of that information at the time. Making use of the information does not of course preclude keeping it also. (Turing [1945], p. 375)

Mercury delay line memory was used in EDSAC, BINAC, SEAC, Pilot Model ACE, EDVAC, DEUCE, and full-scale ACE (1958). The chief advantage of the delay line as a memory medium was, as Turing put it, that delay lines were "already a going concern" (Turing [1947], p. 380). The fundamental disadvantages of the delay line were that random access is impossible and, moreover, the time taken for an instruction, or number, to emerge from a delay line depends on where in the line it happens to be.

In order to minimize waiting-time, Turing arranged for instructions to be stored not in consecutive positions in the delay line, but in relative positions selected by the programmer in such a way that each instruction would emerge at exactly the time it was required, in so far as this was possible. Each instruction contained a specification of the location of the next. This system subsequently became known as ‘optimum coding’. It was an integral feature of every version of the ACE design. Optimum coding made for difficult and untidy programming, but the advantage in terms of speed was considerable. Thanks to optimum coding, the Pilot Model ACE was able to do a floating point multiplication in 3 milliseconds (Wilkes's EDSAC required 4.5 milliseconds to perform a single fixed point multiplication).
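The effect of optimum coding can be seen with a toy model of a delay line: treat the line as a circular store in which only one word is at the mouth of the line on each clock tick, so that fetching a word costs however many ticks remain until it comes round. The line length, instruction time, and placements below are invented for illustration and are not taken from the ACE documents:

```python
# A delay line modelled as a circular store of LENGTH words; only the word in
# slot (t mod LENGTH) is passing the reading crystal at tick t, so fetching a
# word costs however many ticks remain until its slot comes round.

LENGTH = 32      # words per line (illustrative figure, not the ACE's)
EXEC_TIME = 4    # ticks to read and execute one instruction (also illustrative)

def wait_for(slot, t):
    """Ticks spent waiting, at time t, for the word stored in `slot` to emerge."""
    return (slot - t) % LENGTH

def run_time(slots):
    """Total ticks to fetch and execute instructions stored at the given slots."""
    t = 0
    for s in slots:
        t += wait_for(s, t) + EXEC_TIME
    return t

# Consecutive placement: each instruction has just gone past when it is wanted,
# so the machine waits nearly a full revolution of the line every time.
print(run_time(list(range(8))))             # 235 ticks

# Optimum coding: place each instruction where it will be emerging at the
# moment the previous one finishes, and the waiting vanishes.
print(run_time([4 * i for i in range(8)]))  # 32 ticks
```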

In the Williams tube or electrostatic memory, previously mentioned, a two-dimensional rectangular array of binary digits was stored on the face of a commercially-available cathode ray tube. Access to data was immediate. Williams tube memories were employed in the Manchester series of machines, SWAC, the IAS computer, and the IBM 701, and a modified form of Williams tube in Whirlwind I (until replacement by magnetic core in 1953).

Drum memories, in which data was stored magnetically on the surface of a metal cylinder, were developed on both sides of the Atlantic. The initial idea appears to have been Eckert's. The drum provided reasonably large quantities of medium-speed memory and was used to supplement a high-speed acoustic or electrostatic memory. In 1949, the Manchester computer was successfully equipped with a drum memory; this was constructed by the Manchester engineers on the model of a drum developed by Andrew Booth at Birkbeck College, London.

The final major event in the early history of electronic computation was the development of magnetic core memory. Jay Forrester realised that the hysteresis properties of magnetic core (normally used in transformers) lent themselves to the implementation of a three-dimensional solid array of randomly accessible storage points. In 1949, at Massachusetts Institute of Technology, he began to investigate this idea empirically. Forrester's early experiments with metallic core soon led him to develop the superior ferrite core memory. The Digital Computer Laboratory at MIT undertook to build a computer similar to the Whirlwind I as a test vehicle for a ferrite core memory. The Memory Test Computer was completed in 1953. (This computer was used in 1954 for the first simulations of neural networks, by Belmont Farley and Wesley Clark of MIT's Lincoln Laboratory (see Copeland and Proudfoot [1996]).)

Once the absolute reliability, relative cheapness, high capacity and permanent life of ferrite core memory became apparent, core soon replaced other forms of high-speed memory. The IBM 704 and 705 computers (announced in May and October 1954, respectively) brought core memory into wide use.

Works Cited

  • Babbage, C. (ed. by Campbell-Kelly, M.), 1994, Passages from the Life of a Philosopher, New Brunswick: Rutgers University Press
  • Bennett, S., 1976, ‘F.C. Williams: his contribution to the development of automatic control’, National Archive for the History of Computing, University of Manchester, England. (This is a typescript based on interviews with Williams in 1976.)
  • Bowker, G., and Giordano, R., 1993, ‘Interview with Tom Kilburn’, Annals of the History of Computing, 15: 17–32
  • Copeland, B.J. (ed.), 2004, The Essential Turing, Oxford: Oxford University Press
  • Copeland, B.J. (ed.), 2005, Alan Turing's Automatic Computing Engine: The Master Codebreaker's Struggle to Build the Modern Computer, Oxford: Oxford University Press
  • Copeland, B.J. and others, 2006, Colossus: The Secrets of Bletchley Park's Codebreaking Computers, Oxford: Oxford University Press
  • Copeland, B.J., and Proudfoot, D., 1996, ‘On Alan Turing's Anticipation of Connectionism’, Synthese, 108: 361–377
  • Evans, C., 197?, interview with M.H.A. Newman in ‘The Pioneers of Computing: an Oral History of Computing’, London: Science Museum
  • Fifer, S., 1961, Analog Computation: Theory, Techniques, Applications, New York: McGraw-Hill
  • Ford, H., 1919, ‘Mechanical Movement’, Official Gazette of the United States Patent Office, October 7, 1919: 48
  • Goldstine, H., 1972, The Computer from Pascal to von Neumann, Princeton: Princeton University Press
  • Huskey, H.D., 1947, ‘The State of the Art in Electronic Digital Computing in Britain and the United States’, in [Copeland 2005]
  • Newman, M.H.A., 1948, ‘General Principles of the Design of All-Purpose Computing Machines’, Proceedings of the Royal Society of London, series A, 195: 271–274
  • Randell, B., 1972, ‘On Alan Turing and the Origins of Digital Computers’, in Meltzer, B., Michie, D. (eds), Machine Intelligence 7, Edinburgh: Edinburgh University Press
  • Smith, B.C., 1991, ‘The Owl and the Electric Encyclopaedia’, Artificial Intelligence, 47: 251–288
  • Thomson, J., 1876, ‘On an Integrating Machine Having a New Kinematic Principle’, Proceedings of the Royal Society of London, 24: 262–5
  • Turing, A.M., 1936, ‘On Computable Numbers, with an Application to the Entscheidungsproblem’, Proceedings of the London Mathematical Society, Series 2, 42 (1936–37): 230–265. Reprinted in The Essential Turing (Copeland [2004])
  • Turing, A.M., 1945, ‘Proposed Electronic Calculator’, in Alan Turing's Automatic Computing Engine (Copeland [2005])
  • Turing, A.M., 1947, ‘Lecture on the Automatic Computing Engine’, in The Essential Turing (Copeland [2004])
  • Turing, A.M., and Wilkinson, J.H., 1946–7, ‘The Turing-Wilkinson Lecture Series (1946–7)’, in Alan Turing's Automatic Computing Engine (Copeland [2005])
  • von Neumann, J., 1945, ‘First Draft of a Report on the EDVAC’, in Stern, N., From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers, Bedford, Mass.: Digital Press (1981), pp. 181–246
  • Williams, F.C., 1975, ‘Early Computers at Manchester University’, The Radio and Electronic Engineer, 45: 237–331
  • Wynn-Williams, C.E., 1932, ‘A Thyratron "Scale of Two" Automatic Counter’, Proceedings of the Royal Society of London, series A, 136: 312–324

Further Reading

  • Copeland, B.J., 2004, ‘Colossus — Its Origins and Originators’, Annals of the History of Computing, 26: 38–45
  • Metropolis, N., Howlett, J., Rota, G.C. (eds), 1980, A History of Computing in the Twentieth Century, New York: Academic Press
  • Randell, B. (ed.), 1982, The Origins of Digital Computers: Selected Papers, Berlin: Springer-Verlag
  • Williams, M.R., 1997, A History of Computing Technology, Los Alamitos: IEEE Computer Society Press
Other Internet Resources

  • The Turing Archive for the History of Computing
  • The Alan Turing Home Page
  • Australian Computer Museum Society
  • The Bletchley Park Home Page
  • Charles Babbage Institute
  • Computational Logic Group at St. Andrews
  • The Computer Conservation Society (UK)
  • CSIRAC (a.k.a. CSIR MARK I) Home Page
  • Frode Weierud's CryptoCellar
  • Logic and Computation Group at Penn
  • National Archive for the History of Computing
  • National Cryptologic Museum


Copyright © 2006 by B. Jack Copeland <jack.copeland@canterbury.ac.nz>



The history of computing is both evolution and revolution


Justin Zobel, Head, Department of Computing & Information Systems, The University of Melbourne


This month marks the 60th anniversary of the first computer in an Australian university. The University of Melbourne took possession of the machine from CSIRO and on June 14, 1956, the recommissioned CSIRAC was formally switched on. Six decades on, our series Computing turns 60 looks at how things have changed.

It is a truism that computing continues to change our world. It shapes how objects are designed, what information we receive, how and where we work, and who we meet and do business with. And computing changes our understanding of the world around us and the universe beyond.

For example, while computers were initially used in weather forecasting as no more than an efficient way to assemble observations and do calculations, today our understanding of weather is almost entirely mediated by computational models.

Another example is biology. Where once research was done entirely in the lab (or in the wild) and then captured in a model, it often now begins in a predictive model, which then determines what might be explored in the real world.

The transformation that is due to computation is often described as digital disruption . But an aspect of this transformation that can easily be overlooked is that computing has been disrupting itself.

Evolution and revolution

Each wave of new computational technology has tended to lead to new kinds of systems, new ways of creating tools, new forms of data, and so on, which have often overturned their predecessors. What has seemed to be evolution is, in some ways, a series of revolutions.

But the development of computing technologies is more than a chain of innovation – a process that’s been a hallmark of the physical technologies that shape our world.

For example, there is a chain of inspiration from waterwheel, to steam engine, to internal combustion engine. Underlying this is a process of enablement. The industry of steam engine construction yielded the skills, materials and tools used in construction of the first internal combustion engines.

In computing, something richer is happening where new technologies emerge, not only by replacing predecessors, but also by enveloping them. Computing is creating platforms on which it reinvents itself, reaching up to the next platform.

Getting connected

Arguably, the most dramatic of these innovations is the web. During the 1970s and 1980s, there were independent advances in the availability of cheap, fast computing, of affordable disk storage and of networking.


Compute and storage were taken up in personal computers, which at that stage were standalone, used almost entirely for gaming and word processing. At the same time, networking technologies became pervasive in university computer science departments, where they enabled, for the first time, the collaborative development of software.

This was the emergence of a culture of open-source development, in which widely spread communities not only used common operating systems, programming languages and tools, but collaboratively contributed to them.

As networks spread, tools developed in one place could be rapidly promoted, shared and deployed elsewhere. This dramatically changed the notion of software ownership, of how software was designed and created, and of who controlled the environments we use.

The networks themselves became more uniform and interlinked, creating the global internet, a digital traffic infrastructure. Increases in computing power meant there was spare capacity for providing services remotely.

The falling cost of disk meant that system administrators could set aside storage to host repositories that could be accessed globally. The internet was thus used not just for email and chat forums (known then as news groups) but, increasingly, as an exchange mechanism for data and code.

This was in strong contrast to the systems used in business at that time, which were customised, isolated, and rigid.

With hindsight, the confluence of networking, compute and storage at the start of the 1990s, coupled with the open-source culture of sharing, seems almost miraculous. An environment ready for something remarkable, but without even a hint of what that thing might be.

The ‘superhighway’

It was to enhance this environment that then US Vice President Al Gore proposed in 1992 the “ information superhighway ”, before any major commercial or social uses of the internet had appeared.


Meanwhile, in 1990, researchers at CERN, including Tim Berners-Lee , created a system for storing documents and publishing them to the internet, which they called the world wide web .

As knowledge of this system spread on the internet (transmitted by the new model of open-source software systems), people began using it via increasingly sophisticated browsers. They also began to write documents specifically for online publication – that is, web pages.

As web pages became interactive and resources moved online, the web became a platform that has transformed society. But it also transformed computing.

With the emergence of the web came the decline of the importance of the standalone computer, dependent on local storage.

We all connect

The value of these systems is due to another confluence: the arrival on the web of vast numbers of users. For example, without behaviours to learn from, search engines would not work well, so human actions have become part of the system.

There are (contentious) narratives of ever-improving technology, but also an entirely unarguable narrative of computing itself being transformed by becoming so deeply embedded in our daily lives.

This is, in many ways, the essence of big data. Computing is being fed by human data streams: traffic data, airline trips, banking transactions, social media and so on.

The challenges of the discipline have been dramatically changed by this data, and also by the fact that the products of the data (such as traffic control and targeted marketing) have immediate impacts on people.

Software that runs robustly on a single computer is very different from software that interacts rapidly and extensively with the human world, giving rise to the need for new kinds of technologies and experts, in ways not even remotely anticipated by the researchers who created the technologies that led to this transformation.

Decisions that were once made by hand-coded algorithms are now made entirely by learning from data. Whole fields of study may become obsolete.

The discipline does indeed disrupt itself. And as the next wave of technology arrives (immersive environments? digital implants? aware homes?), it will happen again.


History of Computers



Bruce I. Blum


We are in the midst of a revolution in technology and in the way we process information. The computer is at the heart of this revolution. Where it will take us is unknown, but we do know that events are moving rapidly. In fact, virtually everything related to digital computers took place in my generation’s adult lifetime. And I am not old.


For Further Reading

The computer age is sufficiently mature to have fostered an interest in recording its history. The materials in this chapter have been assembled from a variety of sources. The Encyclopedia of Computer Science and Engineering , 2nd edit., edited by Anthony Ralston and Edwin D. Reilly, Jr., Van Nostrand Reinhold, 1983, is a very useful source with overview sections and biographies.


A hard to find, but very readable, popular introduction is available in M. Harmon, Stretching Man’s Mind: A History of Data Processing , by Mason Charter, New York, 1975; the discussion of computers, however, is dated.

Paul Freiberger and Michael Swaine, Fire in the Valley: The Making of the Personal Computer , Osborne/McGraw-Hill, 1984, is one of several very readable modern histories.

K. D. Fishman, The Computer Establishment , McGraw-Hill, 1981 contains an interesting review of the business aspects of the industry.

Information Systems in the 80’s , by Ulric Weil, Prentice Hall, 1982, looks at the areas of greatest growth in this decade. Finally, I note that some new history texts—which I have not reviewed—are in progress or recently published; rich and up to date reading resources should be available soon.

For persons interested in researching the history of computing, B. Randell has selected an outstanding collection of source materials in The Origins of Digital Computers , third edit, Springer-Verlag, 1982.

A second collection of papers has been edited by N. Metropolis, J. Howlett and G. Rota in A History of Computing in the Twentieth Century , Academic Press, 1981.

Finally, J. W. Cortada has compiled An Annotated Bibliography on the History of Data Processing , Greenwood Press, 1983.

Among the more popular current works that deal with a limited area are Tracy Kidder, The Soul of a New Machine , Little, Brown and Co., 1981

and S. Levy, Hackers: Heroes of the Computer Revolution , Doubleday & Co., 1984.

Papers and articles dealing with the history of computing appear from time to time. AFIPS publishes a journal, the Annals of the History of Computing , and the ACM has sponsored R. L. Wexelblat (ed) A History of Programming Languages , Academic Press, 1981.

Among the more accessible review papers are S. Rosen, Electronic Computers: A Historical Survey, ACM Computing Surveys (1, 1), 1969; the January 3, 1983 issue of Time, which made the computer the Machine of the Year; the October 1982 issue of National Geographic, which was devoted to the chip; and the September 1984 issue of Scientific American dedicated to computer software. Biographical articles periodically appear in magazines such as Scientific American and Datamation.

Cited in Margaret Harmon, Stretching Man’s Mind: A History of Data Processing , Mason Charter, New York, 1975, p66, reprinted by permission. All rights reserved.

Cited in Harmon, Op cit , p74.

Cited in Harmon, Op cit , p85.

Cited in Harmon, Op cit , p87.

Cited in Harmon, Op cit , p103.

Adapted from S. Rosen, Digital Computers: History in Anthony Ralston, Edwin D. Reilly, Jr. (eds), Encyclopedia of Computer Science and Engineering , Second Edition, Van Nostrand Reinhold Co., New York, 1983, pp538–9.

Conversation: Jay W. Forrester, Interviewed by Christopher Evans, Annals of the History of Computing (5, 3) July 1983, pp298–299.


John von Neumann, The Computer and the Brain , Yale University Press, 1958

Cited in J. Bernstein, The Analytical Engine , Random House, N.Y. 1966, p62, reprinted by permission.


Blum, B.I. (1986). History of Computers. In: Clinical Information Systems. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-26537-6_1


History of Computers


Before computers were developed people used sticks, stones, and bones as counting tools. As technology advanced and the human mind improved with time more computing devices were developed like Abacus, Napier’s Bones, etc. These devices were used as computers for performing mathematical computations but not very complex ones. 

Some of the popular computing devices are described below, starting from the oldest to the latest or most advanced technology developed:

Abacus

Around 4000 years ago, the Chinese invented the Abacus, and it is believed to be the first computer. The history of computers begins with the birth of the abacus.

Structure: Abacus is basically a wooden rack that has metal rods with beads mounted on them.

Working of abacus: In the abacus, the beads were moved by the abacus operator according to some rules to perform arithmetic calculations. In some countries like China, Russia, and Japan, the abacus is still used by their people.

Napier’s Bones

Napier’s Bones was a manually operated calculating device and as the name indicates, it was invented by John Napier. In this device, he used 9 different ivory strips (bones) marked with numbers to multiply and divide for calculation. It was also the first machine to use the decimal point system for calculation.

Pascaline

Pascaline is also called an Arithmetic Machine or Adding Machine. The French mathematician-philosopher Blaise Pascal invented it between 1642 and 1644, and it was the first mechanical and automatic calculator. Pascal invented it to help his father, a tax accountant, with his calculations. It could perform addition and subtraction quickly. It was basically a wooden box with a series of gears and wheels: when a wheel completed one revolution, it advanced the neighbouring wheel, and a series of windows on top of the wheels displayed the totals.
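The carry mechanism just described, in which a wheel that completes a full revolution advances its neighbour, works like a car odometer and is easy to sketch (the number of wheels and the digit values below are chosen purely for illustration):

```python
# Odometer-style register of decimal wheels, as in the Pascaline: stepping a
# wheel past 9 resets it to 0 and carries one step into the next wheel.

def add_one(wheels):
    """Advance the rightmost wheel by one step, propagating carries leftward."""
    for i in range(len(wheels) - 1, -1, -1):
        if wheels[i] < 9:
            wheels[i] += 1
            return
        wheels[i] = 0          # full revolution: reset and carry into next wheel

def add(wheels, n):
    """Add n by repeated single steps, the way a mechanical accumulator does."""
    for _ in range(n):
        add_one(wheels)

register = [0, 0, 9, 8]        # four wheels, windows reading 0098
add(register, 7)
print(register)                # [0, 1, 0, 5] -> the windows now read 0105
```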

Stepped Reckoner or Leibniz wheel

The German mathematician-philosopher Gottfried Wilhelm Leibniz developed this device in 1673 by improving on Pascal's invention. It was basically a digital mechanical calculator, and it was called the stepped reckoner because it was built from fluted drums instead of the gears used in the earlier Pascaline.

Difference Engine

Charles Babbage, who is also known as the "Father of the Modern Computer", designed the Difference Engine in the early 1820s. The Difference Engine was a mechanical computer capable of performing simple calculations. It was designed as a steam-driven calculating machine for computing tables of numbers, such as logarithm tables.

Analytical Engine

In the 1830s Charles Babbage developed another calculating machine, the Analytical Engine. The Analytical Engine was a mechanical computer that used punch cards as input. It was capable of solving any mathematical problem and of storing information as permanent memory.

Tabulating Machine

Herman Hollerith, an American statistician, invented this machine in 1890. The Tabulating Machine was a mechanical tabulator based on punch cards, capable of tabulating statistics and of recording and sorting data. It was used in the 1890 U.S. Census. Hollerith founded the Tabulating Machine Company, which later became International Business Machines (IBM) in 1924.

Differential Analyzer

The Differential Analyzer, introduced in the United States around 1930, was an analog mechanical computer invented by Vannevar Bush. Rather than switching electrical signals through vacuum tubes, it used rotating shafts and wheel-and-disc integrators to solve differential equations, and it could carry out about 25 calculations in a few minutes.

Mark I

In 1937, major changes began in the history of computers when Howard Aiken planned to develop a machine that could perform calculations involving large numbers. In 1944, the Mark I computer was built as a partnership between IBM and Harvard. It was one of the first programmable digital computers, marking a new era in the computer world.

Generations of Computers

First Generation Computers

The period 1940-1956 is referred to as the first generation of computers. These machines were slow, huge, and expensive. In this generation, vacuum tubes were used as the basic components of the CPU and memory, and the machines depended mainly on batch operating systems and punch cards. Magnetic tape and paper tape were used as input and output devices. For example, ENIAC, UNIVAC-1, EDVAC, etc.

Second Generation Computers

The period 1957-1963 is referred to as the second generation of computers: the time of the transistor computer. Transistors, which were cheap, compact, and consumed less power than vacuum tubes, were used, and transistor computers were faster than first-generation computers. Magnetic cores were used for primary memory, and magnetic discs and tapes for secondary storage. In second-generation computers, assembly language and programming languages such as COBOL and FORTRAN were used, along with batch processing and multiprogramming operating systems.

For example IBM 1620, IBM 7094, CDC 1604, CDC 3600, etc.

Third Generation Computers

In the third generation of computers, integrated circuits (ICs) were used instead of the transistors of the second generation. A single IC contains many transistors, which increased the power of a computer while reducing its cost. Third-generation computers were more reliable, efficient, and smaller in size. They used remote processing, time-sharing, and multiprogramming operating systems, and high-level programming languages such as FORTRAN-II to IV, COBOL, PASCAL, and PL/1.

For example IBM-360 series, Honeywell-6000 series, IBM-370/168, etc.

Fourth Generation Computers

The period 1971-1980 was mainly the time of fourth-generation computers, which used VLSI (Very Large Scale Integration) circuits. A VLSI chip contains millions of transistors and other circuit elements, and because of these chips the computers of this generation became more compact, powerful, fast, and affordable. Real-time, time-sharing, and distributed operating systems were used, and C and C++ served as the programming languages of this generation.

For example STAR 1000, PDP 11, CRAY-1, CRAY-X-MP, etc.

Fifth Generation Computers

Fifth-generation computers have been in use from 1980 to the present day. ULSI (Ultra Large Scale Integration) technology is used instead of the VLSI technology of the fourth generation, with microprocessor chips containing ten million electronic components. Parallel processing hardware and AI (Artificial Intelligence) software are also used in fifth-generation computers. Programming languages such as C, C++, Java, and .NET are used.

Examples include desktops, laptops, notebooks, and ultrabooks.

Sample Questions

Let us now see some sample questions on the History of computers:

Question 1: The Arithmetic Machine or Adding Machine was built between ___________.

a. 1642 and 1644

b. Around 4000 years ago

c. 1946 – 1956

d. None of the above

Solution:  

a. 1642 and 1644. Explanation: The Pascaline is also called the Arithmetic Machine or Adding Machine. The French mathematician and philosopher Blaise Pascal invented it between 1642 and 1644.

Question 2: Who designed the Difference Engine?

a. Blaise Pascal

b. Gottfried Wilhelm Leibniz 

c. Vannevar Bush

d. Charles Babbage 

Solution: 

d. Charles Babbage. Explanation: Charles Babbage, who is also known as the "Father of the Modern Computer", designed the Difference Engine in the early 1820s.

Question 3: In second-generation computers, _______________ were used as the main high-level programming languages.

a. C and C++.

b. COBOL and FORTRAN 

c. C and .NET

d. None of the above.

Solution:

b. COBOL and FORTRAN. Explanation: In second-generation computers, high-level programming languages such as COBOL and FORTRAN were used, along with batch processing and multiprogramming operating systems.

Question 4: ENIAC and UNIVAC-1 are examples of which generation of computers?

a. First generation of computers.

b. Second generation of computers. 

c. Third generation of computers. 

d. Fourth generation of computers.  

Solution:

a. First generation of computers. Explanation: ENIAC, UNIVAC-1, and EDVAC are examples of first-generation computers.

Question 5: The ______________ technology is used in fifth-generation computers.

a. ULSI (Ultra Large Scale Integration)

b. VLSI (Very Large Scale Integration)

c. vacuum tubes

d. All of the above

Solution:

a. ULSI (Ultra Large Scale Integration). Explanation: Fifth-generation computers have been in use from 1980 to the present day, and they use ULSI (Ultra Large Scale Integration) technology.


Computers: The History of Invention and Development Essay

The invention of the computer in 1948 is often regarded as the beginning of the digital revolution. It is hard to disagree that computers have indeed penetrated into the lives of people and changed them once and for all. Computer technologies have affected every single sphere of human activity, from entertainment to work and education. They facilitate the work of any enterprise, they are of great assistance to scientists in laboratories, they make it possible to diagnose diseases much faster, they control the work of ATMs, and they help banks to function properly. The first computers occupied almost a whole room and were very slow at processing data and in their performance in general. The modern world witnesses the development of computer technologies daily, with computers turning into tiny machines and working unbelievably smoothly. A computer is now trusted as a best friend and advisor. It is treated as a reliable machine able to process and store a large amount of data and help out in any situation. "The storage, retrieval, and use of information are more important than ever" since "(w)e are in the midst of a profound change, going from hardcopy storage to online storage of the collected knowledge of the human race" (Moursund, 2007), which is why computers are of great assistance to us. However, to become a successful person, it is not enough to simply have a computer at home. It is often the case that people use computers merely to play games without knowing about the wide range of activities they can engage a person in. One has to know more about computers and use all their capabilities for one's own benefit. Knowing the capabilities of one's computer can help in work and education, and it can save time and money. In this essay, you will find out why it is important to know your computer and how much time and money you can save by using all of its capabilities.

What should be mentioned above all is that knowing one's computer perfectly gives an opportunity to use it for the most various purposes. It depends on what exactly a person needs a computer for, in other words, whether it is needed for studying, for work, or for entertainment. Using a computer for work or education involves much more than is required for playing computer games. These days most students are permitted to submit only typed essays, research papers, and other works, which makes mastering the computer vital. "Information technologies have played a vital role in higher education for decades" (McArthur & Lewis, n.d.); they contributed and still continue to contribute to students' gaining knowledge from outside sources by means of the World Wide Web, where information is easily accessible and available for everyone. To have access to this information, one has to know how to use a computer and to develop certain skills for this. These skills should include, first of all, using a Web browser. "In 1995, Microsoft invented a competing Web browser called Microsoft Internet Explorer" (Walter, n.d.), but there are other browsers, and the choice depends on the user. Moreover, knowing different search engines (for instance, Google, Yahoo, etc.) is required; the user should also be able to process, analyze, and group similar sources by extracting the most relevant information. At the same time, the user should know that not all Internet sources can be trusted, especially when the information is gathered for a research paper. Trusting the information presented in ad banners is unwise, for their main purpose is attracting users' attention. They may contain false or obsolete data misleading the user. When using information obtained from the Internet for scholarly work, one should also remember plagiarism, that is, the responsibility for copying somebody else's work. Students who use such information should cite it properly and refer to the works of other scholars rather than simply stealing their ideas. Plagiarism is punishable and may result in expulsion from school or college. This shows that using a computer for studies demands mastery of certain computer programs and practice in working with them, which gives a clear idea of how to search for and process the information needed to complete different assignments.

What’s more, knowing a computer for work is no less important. Mastering certain computer programs depends on the type of work. Any prestigious job demands a certain level of computer skills, from basic to advanced. The work of a company sometimes involves more than using standard computer programs; software is often designed specifically for the company depending on the business’s application. This means that acquisition of a special program may be needed, and a new worker will have to complete computer courses and learn that particular program. Nevertheless, knowledge of basic computer programs is crucial for getting the job one desires. Since the work of most companies is computerized, one will need to deal with a computer anyway, and the skills obtained while playing computer games will not suffice. A person seeking a job should be a confident user of basic computer programs, such as Microsoft Office Word, Microsoft Office Excel, Internet Explorer (or other browsers), etc. A confident user is also supposed to know what to do with the computer when some malfunctions arise. Of course, each company has system administrators who deal with computer defects, but minor problems are usually handled by the users themselves. Apart from knowing the computer, a person should be aware of the policy on using it in the office. For instance, some companies prohibit using office computers for personal purposes, especially when it comes to downloading software and installing it on the computer without notifying the system administrator. This may be connected either with the fact that incorrectly installed software may harm the computer system in general or, if the software has been downloaded from the Internet, that it may contain spyware which makes the information on your computer accessible to other users. This can hardly be beneficial for a company dealing with economic, political, governmental, or any other kind of issues. Therefore, knowing a computer is necessary for getting a prestigious job and ensuring the proper and safe performance of the company one works for.

And finally, using all the capabilities of a computer can save time and money. Firstly, a personal computer has a number of tools which make people’s lives easier. Special software, for instance Microsoft Money, makes it possible to plan the budget, discover faults in the plan, and correct it easily without having to rewrite it from the beginning; the program itself can manage financial information provided by the user and, in addition, balance checkbooks. Such computer tools as word processors enable users to make corrections at any stage of the work; moreover, by means of them, one may change the size of the letters and the overall design of the work to give it a better look. Mapping programs can also be useful; by means of a computer, one may install such a program (GPS) in the car, and the program will then take care of planning the route, avoiding traffic jams and choosing the shortest ways. Secondly, electronic mail allows one to keep in touch with people not only in one’s own country but also abroad. It is cheaper and much faster than writing letters or communicating over the telephone, where the connection is often of low quality and the conversation is constantly interrupted. Most telephone companies aim to profit from people’s communication with their friends and relatives, whereas electronic mail is almost free; all one needs to do is pay a monthly fee to the Internet Service Provider. Finally, computer users have an opportunity to do their shopping without leaving the apartment; the choice of products one may want to buy is practically unlimited, and the user can always find recommendations from people who have already purchased the product. A personal computer can also help to save money because it is multifunctional. Knowing much about the capabilities of the computer, one may start using it as a TV set, watching favorite programs online, or as a PlayStation, playing the same games on the personal computer. Not only can a user watch favorite TV shows by means of his/her computer, but he/she can also download them from various torrent sites for free. Using a PC to send faxes through online fax services saves money, for one does not have to buy a fax machine or use an additional telephone line; it also saves the paper and ink which one would otherwise have to buy.

Taking into consideration everything mentioned above, it can be stated that knowing a computer is important, for it can make people’s lives much easier. Firstly, computers are helpful in getting an education, since by means of them students can find any information necessary for writing research papers and other kinds of written assignments. To do this, a student needs to know how to search the Internet and process the information he/she finds there. Secondly, knowing a computer raises one’s chances of getting a good job, because most companies look for employees with a sufficient level of computer skills. When working for a company, one should also remember its policy regarding the use of computers for personal purposes and be able to cope with minor problems that arise in the course of working with the computer. Finally, a computer saves time and money. It saves users’ time through such tools as word processors, budget-planning, and mapping programs, which make everyday tasks easier. The computer can also save money by serving as a TV, a fax machine, and a PlayStation, giving access to TV shows and online fax services and allowing one to play video games without buying special devices for this.

McArthur, D., & Lewis, W. M. (n.d.). Web.

Moursund, D. (2007). A College Student’s Guide to Computers in Education. Web.

Walter, R. (n.d.). The Secret Guide to Computers. Web.


History of Computers

When we study the many aspects of computing and computers, it is important to know about the history of computers. From Charles Babbage’s design of the Analytical Engine, an early general-purpose computer, to the machines of today, this history helps us understand the growth and progress of technology through the times. It is also an important topic for competitive and banking exams.


What is a Computer?

A computer is an electronic machine that collects information, stores it, processes it according to user instructions, and then returns the result.

A computer is a programmable electronic device that performs arithmetic and logical operations automatically using a set of instructions provided by the user.

Early Computing Devices

People used sticks, stones, and bones as counting tools before computers were invented. More computing devices were produced as technology advanced and the human intellect improved over time. Let us look at a few of the early-age computing devices used by mankind.

  • Abacus

The abacus was invented by the Chinese around 4000 years ago. It is a wooden rack holding metal rods with beads mounted on them. The abacus operator moves the beads according to certain guidelines to complete arithmetic computations.

  • Napier’s Bone

John Napier devised Napier’s Bones, a manually operated calculating apparatus. For calculating, this instrument used 9 separate ivory strips (bones) marked with numerals to multiply and divide. It was also the first machine to calculate using the decimal point system.

  • Pascaline

The Pascaline was invented in 1642 by Blaise Pascal, a French mathematician and philosopher. It is thought to be the first mechanical and automated calculator. It was a wooden box with gears and wheels inside.

  • Stepped Reckoner or Leibniz wheel

In 1673, a German mathematician-philosopher named Gottfried Wilhelm Leibniz improved on Pascal’s invention to create this apparatus. It was a digital mechanical calculator known as the stepped reckoner because it used fluted drums instead of gears.

  • Difference Engine

In the early 1820s, Charles Babbage created the Difference Engine. It was a mechanical computer that could do basic computations. It was a steam-powered calculating machine used to solve numerical tables such as logarithmic tables.

  • Analytical Engine 

Charles Babbage created another calculating machine, the Analytical Engine, in 1830. It was a mechanical computer that took input from punch cards. It was capable of solving any mathematical problem and storing data in its memory.

  • Tabulating machine 

An American statistician, Herman Hollerith, invented this machine in 1890. The Tabulating Machine was a punch card-based mechanical tabulator. It could compute statistics and record or sort data or information. Hollerith began manufacturing these machines in his own company, which ultimately became International Business Machines (IBM) in 1924.

  • Differential Analyzer 

Vannevar Bush introduced the Differential Analyzer, an early analog computer, in 1930. The machine used electromechanical components to switch electrical impulses and perform calculations, and it was capable of performing 25 calculations in a matter of minutes.

Howard Aiken planned to build a machine in 1937 that could conduct massive calculations or calculations using enormous numbers. The Mark I computer was constructed in 1944 as a collaboration between IBM and Harvard.

History of Computers Generation

The word ‘computer’ has a very interesting origin. It was first used in the 16th century for a person who used to compute, i.e. do calculations. The word was used in the same sense as a noun until the 20th century. Women were hired as human computers to carry out all forms of calculations and computations.

By the last part of the 19th century, the word was also used to describe machines that did calculations. The modern-day use of the word is generally to describe programmable digital devices that run on electricity.

Early History of Computer

Devices have been used for calculations for thousands of years, since early in human history. One of the earliest and most well-known devices was the abacus. Then, in 1822, the father of computers, Charles Babbage, began developing what would be the first mechanical computer. In 1833 he designed the Analytical Engine, a general-purpose computer. It contained an ALU, some basic flowchart principles, and the concept of integrated memory.

Then, more than a century later in the history of computers, we got our first general-purpose electronic computer: the ENIAC, which stands for Electronic Numerical Integrator and Computer. Its inventors were John W. Mauchly and J. Presper Eckert.

With time, the technology developed: computers got smaller and processing got faster. We got our first laptop in 1981, introduced by Adam Osborne and EPSON.

Generations of Computers


In the history of computers, we often refer to the advancements of modern computers as generations of computers. We are currently in the fifth generation of computers. So let us look at the important features of these five generations of computers.

  • 1st Generation: This was from the period of 1940 to 1955. This was when machine language was developed for the use of computers. They used vacuum tubes for the circuitry. For the purpose of memory, they used magnetic drums. These machines were complicated, large, and expensive. They were mostly reliant on batch operating systems and punch cards. As output and input devices, magnetic tape and paper tape were implemented. For example, ENIAC, UNIVAC-1, EDVAC, and so on.
  • 2nd Generation: The years 1957-1963 are referred to as the second generation of computers. Computers advanced from vacuum tubes to transistors, which made them smaller, faster, and more energy-efficient, and they moved from binary machine code to assembly languages. High-level programming languages such as COBOL and FORTRAN were also employed. For instance, IBM 1620, IBM 7094, CDC 1604, CDC 3600, and so forth.
  • 3rd Generation: The hallmark of this period (1964-1971) was the development of the integrated circuit. A single integrated circuit (IC) is made up of many transistors, which increases the power of a computer while simultaneously lowering its cost. These computers were quicker, smaller, more reliable, and less expensive than their predecessors. High-level programming languages such as FORTRAN (II to IV), COBOL, PASCAL, and PL/1 were utilized. For example, the IBM-360 series, the Honeywell-6000 series, and the IBM-370/168.
  • 4th Generation: The invention of the microprocessor brought along the fourth generation of computers. The years 1971-1980 were dominated by fourth-generation computers. C and C++ were among the programming languages utilized in this generation of computers. For instance, the STAR 1000, PDP 11, CRAY-1, CRAY-X-MP, and Apple II. This was when we started producing computers for home use.
  • 5th Generation:  These computers have been utilized since 1980 and continue to be used now. This is the present and the future of the computer world. The defining aspect of this generation is artificial intelligence. The use of parallel processing and superconductors are making this a reality and provide a lot of scope for the future. Fifth-generation computers use ULSI (Ultra Large Scale Integration) technology. These are the most recent and sophisticated computers. C, C++, Java,.Net, and more programming languages are used. For instance, IBM, Pentium, Desktop, Laptop, Notebook, Ultrabook, and so on.

Brief History of Computers

The naive understanding of computation had to be overcome before the true power of computing could be realized. The inventors who worked tirelessly to bring the computer into the world had to realize that what they were creating was more than just a number cruncher or a calculator. They had to address all of the difficulties associated with inventing such a machine, implementing the design, and actually building the thing. The history of the computer is the history of these difficulties being solved.

19th Century

1801 – Joseph Marie Jacquard, a weaver and businessman from France, devised a loom that employed punched wooden cards to automatically weave cloth designs.

1822 – Charles Babbage, a mathematician, invented the steam-powered calculating machine capable of calculating number tables. The “Difference Engine” idea failed owing to a lack of technology at the time.

1848 – The world’s first computer program was written by Ada Lovelace, an English mathematician. Lovelace also includes a step-by-step tutorial on how to compute Bernoulli numbers using Babbage’s machine.

1890 – Herman Hollerith, an inventor, creates the punch card technique used to tabulate the 1890 U.S. census. He would go on to start the corporation that would become IBM.

Early 20th Century

1930 – Differential Analyzer was the first large-scale automatic general-purpose mechanical analogue computer invented and built by Vannevar Bush.

1936 – Alan Turing had an idea for a universal machine, which he called the Turing machine, that could compute anything that could be computed.

1939 – Hewlett-Packard was founded in a garage in Palo Alto, California, by Bill Hewlett and David Packard.

1941 – Konrad Zuse, a German inventor and engineer, completed his Z3 machine, the world’s first digital computer. However, the machine was destroyed during a World War II bombing strike on Berlin.

1941 – J.V. Atanasoff and graduate student Clifford Berry devise a computer capable of solving 29 equations simultaneously. It was the first time a computer was able to store information in its main memory.

1945 – University of Pennsylvania academics John Mauchly and J. Presper Eckert create the Electronic Numerical Integrator and Computer (ENIAC). It was Turing-complete and capable of solving “a vast class of numerical problems” by reprogramming, earning it the title of “Grandfather of computers.”

1946 – Work began on the UNIVAC I (Universal Automatic Computer), the first general-purpose electronic digital computer designed in the United States for business applications; it was completed in 1951.

1949 – The Electronic Delay Storage Automatic Calculator (EDSAC), developed by a team at the University of Cambridge, is the “first practical stored-program computer.”

1950 – The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC, and it was the first stored-program computer completed in the United States.

Late 20th Century

1953 – Grace Hopper, a computer scientist, develops the ideas behind the first English-like computer language, which later became known as COBOL (COmmon Business-Oriented Language). It allowed a computer user to give the computer instructions in English-like words rather than numbers.

1954 – John Backus and a team of IBM programmers created the FORTRAN programming language, an acronym for FORmula TRANslation. In addition, IBM developed the IBM 650.

1958 – The integrated circuit, sometimes known as the computer chip, was created by Jack Kilby and Robert Noyce.

1962 – Atlas, the computer, makes its appearance. It was the fastest computer in the world at the time, and it pioneered the concept of “virtual memory.”

1964 – Douglas Engelbart proposes a modern computer prototype that combines a mouse and a graphical user interface (GUI).

1969 – Bell Labs developers, led by Ken Thompson and Dennis Ritchie, revealed UNIX, an operating system that was later rewritten in the C programming language and addressed program compatibility difficulties.

1970 – The Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip, is unveiled by Intel.

1971 – The floppy disc was invented by Alan Shugart and a team of IBM engineers. In the same year, Xerox developed the first laser printer, which not only produced billions of dollars but also heralded the beginning of a new age in computer printing.

1973 – Robert Metcalfe, a member of Xerox’s research department, created Ethernet, which is used to connect many computers and other gear.

1974 – Personal computers were introduced to the market. Among the first were the Altair, the Scelbi, the Mark-8, the IBM 5100, and Radio Shack’s TRS-80.

1975 – Popular Electronics magazine touted the Altair 8800 as the world’s first minicomputer kit in January. Paul Allen and Bill Gates offer to build software in the BASIC language for the Altair.

1976 – Apple Computer is founded by Steve Jobs and Steve Wozniak, who introduce the world to the Apple I, the first computer with a single circuit board.

1977 – At the first West Coast Computer Faire, Jobs and Wozniak announce the Apple II. It has colour graphics and a cassette drive for storing programs and data.

1978 – The first computerized spreadsheet program, VisiCalc, is introduced.

1979 – WordStar, a word processing tool from MicroPro International, is released.

1981 – IBM unveils the Acorn, their first personal computer, which has an Intel CPU, two floppy drives, and a colour display. The MS-DOS operating system from Microsoft is used by Acorn.

1983 – The CD-ROM, which could carry 550 megabytes of pre-recorded data, hit the market. This year also saw the release of the Gavilan SC, the first portable computer with a flip-form design and the first to be offered as a “laptop.”

1984 – Apple launched the Macintosh, announcing it with a commercial during Super Bowl XVIII. It was priced at $2,500.

1985 – Microsoft introduces Windows, which enables multitasking via a graphical user interface. In addition, the programming language C++ was released.

1990 – Tim Berners-Lee, an English programmer and scientist, creates HyperText Markup Language, widely known as HTML, and coins the term “WorldWideWeb.” His project included the first browser, a server, HTML, and URLs.

1993 – The Pentium CPU improves the usage of graphics and music on personal computers.

1995 – Microsoft’s Windows 95 operating system was released. A $300 million promotional campaign was launched to get the news out. Sun Microsystems introduces Java 1.0, followed by Netscape Communications’ JavaScript.

1996 – At Stanford University, Sergey Brin and Larry Page created the Google search engine.

1998 – Apple introduces the iMac, an all-in-one Macintosh desktop computer. These PCs cost $1,300 and came with a 4GB hard drive, 32MB RAM, a CD-ROM, and a 15-inch monitor.

1999 – Wi-Fi, an abbreviation for “wireless fidelity,” is created, originally covering a range of up to 300 feet.

21st Century

2000 – The USB flash drive is first introduced in 2000. They were speedier and had more storage space than other storage media options when used for data storage.

2001 – Apple releases Mac OS X, later renamed OS X and eventually simply macOS, as the successor to its conventional Mac Operating System.

2003 – Customers could purchase AMD’s Athlon 64, the first 64-bit CPU for consumer computers.

2004 – Facebook began as a social networking website.

2005 – Google acquires Android, a mobile phone OS based on Linux.

2006 – Apple’s MacBook Pro became available. The Pro was the company’s first dual-core, Intel-based mobile computer.

Amazon Web Services, including Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3), were also launched.

2007 – The first iPhone was produced by Apple, bringing many computer operations into the palm of our hands. Amazon also released the Kindle, one of the first electronic reading systems, in 2007.

2009 – Microsoft released Windows 7.

2011 – Google introduces the Chromebook, which runs Google Chrome OS.

2014 – The University of Michigan Micro Mote (M3), the world’s smallest computer, was constructed.

2015 – Apple introduces the Apple Watch. Windows 10 was also released by Microsoft.

2016 – The world’s first reprogrammable quantum computer is built.

Types of Computers

  • Analog Computers –  Analog computers are built with various components such as gears and levers, with no electrical components. One advantage of analogue computation is that designing and building an analogue computer to tackle a specific problem can be quite straightforward.
  • Mainframe computers –  It is a computer that is generally utilized by large enterprises for mission-critical activities such as massive data processing. Mainframe computers were distinguished by massive storage capacities, quick components, and powerful computational capabilities. Because they were complicated systems, they were managed by a team of systems programmers who had sole access to the computer. These machines are now referred to as servers rather than mainframes.
  • Supercomputers –  The most powerful computers to date are commonly referred to as supercomputers. Supercomputers are enormous systems that are purpose-built to solve complicated scientific and industrial problems. Quantum mechanics, weather forecasting, oil and gas exploration, molecular modelling, physical simulations, aerodynamics, nuclear fusion research, and cryptoanalysis are all done on supercomputers.
  • Minicomputers –  A minicomputer is a type of computer that has many of the same features and capabilities as a larger computer but is smaller in size. Minicomputers, which were relatively small and affordable, were often employed in a single department of an organization and were often dedicated to a specific task or shared by a small group.
  • Microcomputers –  A microcomputer is a small computer that is based on a microprocessor integrated circuit, often known as a chip. A microcomputer is a system that incorporates at a minimum a microprocessor, program memory, data memory, and input-output system (I/O). A microcomputer is now commonly referred to as a personal computer (PC).
  • Embedded processors –  These are miniature computers that control electrical and mechanical processes with basic microprocessors. Embedded processors are often simple in design, have limited processing capability and I/O capabilities, and need little power. Ordinary microprocessors and microcontrollers are the two primary types of embedded processors. Embedded processors are employed in systems that do not require the computing capability of traditional devices such as desktop computers, laptop computers, or workstations.

FAQs on History of Computers

Q: The principle of modern computers was proposed by ____

  • Adam Osborne
  • Alan Turing
  • Charles Babbage

Ans: The correct answer is C.

Q: Who introduced the first computer for home use in 1981?

  • Sun Technology

Ans: IBM. IBM made the first home-use personal computer.

Q: Third-generation computers used which programming languages?

  • Machine language

Ans: The correct option is C.




The History of Computing: Assignments

This section includes reading response paper assignments in the unstructured and structured formats and a final paper assignment.

Weekly Questions

Reading Response Paper Assignment (Sessions 2-6: The Unstructured Format)

Write a 1-2 page reading response paper addressing the issues raised in the readings. You may choose from the provided list of tentative questions, but you are encouraged to raise your own questions. Your paper must touch upon all the readings assigned for the upcoming session.

Strategies for Writing a Good Reading Response Paper

  • Define your personal stance towards the issues raised in the readings.
  • Avoid generalities, be specific.
  • Focus on the points where you disagree, or where you can push the argument further.
  • Cite examples from your personal experience or from other literature.
  • Ask provocative questions, even if you do not know the answers.

Your paper will be made accessible to other members of the class after the deadline. It will be part of discussion in class.

Papers must be submitted in the morning before each class. No late papers are accepted.

Be creative and imaginative! Good luck!

Reading Response Paper Assignment (Sessions 7-13: The Structured Format)

Write a 1-2 page structured paper in response to your readings. The paper must focus on a single question; you may choose from the provided list, but you are encouraged to formulate your own question. Your paper must have the following format:

  • Introduction: State your question; explain its significance; formulate your thesis.
  • Background: Briefly give relevant historical information about the computing developments that you will analyze.
  • Survey of literature: State the existing perspectives (more than one) on the subject of your analysis; these can be gauged from your readings or simply hypothesized (one could argue that…).
  • Analysis: Give your own perspective and supporting argument.
  • Conclusion: What is the lesson here? What are further lines of inquiry, new questions to ask?
  • References: Use the format from the syllabus.

Devote no more than 1-2 paragraphs to each section. You may combine sections 2 and 3, if necessary. I realize that information in your readings may not be sufficient to fill all the sections; do the best you can. Your paper does not have to cite all the readings for the week, but you must read all of them. Spell-check and proof-read your paper before submission.

Final Paper Assignment

Write a 10-15 page paper (double-spaced, 1.25" margins, 12 pt font). You may choose any topic that addresses the use of the computer as a scientific instrument. You may choose something close to your own area of expertise, or something completely different. You can focus on one specific computer system and analyze its uses from different perspectives (designers’, users’, scientists’, humanists’, etc.), or you can address a larger issue that involves a certain category of computer systems (for example, expert systems) and perhaps a range of scientific disciplines. You may choose one of the topics we discussed in class, but you must significantly broaden the range of your sources. Your final paper must analyze both primary sources (participants’ accounts) and secondary sources (works by historians, sociologists, anthropologists, or other commentators). Choose an issue over which there has been (or should have been) some debate, and take a stand on that issue. Provide ample argumentation for your position and explain your objections to the alternative position(s). The final paper should follow the same structured format that is required for the Session 7-13 reading responses.

Final Paper Guide

Proposal for a Final Paper

Write a 1-2 page proposal for your final paper. The proposal should include: (1) the central question the final paper will address; (2) the historical significance of this question and how it relates to discussions in class; (3) a brief outline; and (4) a tentative bibliography, including both primary and secondary sources. Your proposal will receive the instructor’s feedback the following week. The proposal is due in class on Session 9.

Final Paper Guidelines

The final paper should follow the requirements described in the Final Paper Assignment above and use the same structured format that is required for the Session 7-13 reading responses. The final paper is due in class on Session 14.

Sample Final Paper

Anthony Ronald Grue (PDF)



100 Words Essay On The History Of Computer In English

The history of computers is one that is very interesting to look at. The first ever computer to be made was invented in the 1820s by Charles Babbage, who is termed the “Father of the Computer.”

From here, the first electronic digital computers were developed between 1940 and 1945 in the US and UK. Unlike modern-day computers, these were mammoth in size, akin to a large room! They also consumed a large amount of power, equivalent to that of several hundred modern personal computers.

With the passage of time, of course, computers have shrunk in size and today are portable with more features. 



Computer Science > Machine Learning

Title: KAN: Kolmogorov-Arnold Networks

Abstract: Inspired by the Kolmogorov-Arnold representation theorem, we propose Kolmogorov-Arnold Networks (KANs) as promising alternatives to Multi-Layer Perceptrons (MLPs). While MLPs have fixed activation functions on nodes ("neurons"), KANs have learnable activation functions on edges ("weights"). KANs have no linear weights at all -- every weight parameter is replaced by a univariate function parametrized as a spline. We show that this seemingly simple change makes KANs outperform MLPs in terms of accuracy and interpretability. For accuracy, much smaller KANs can achieve comparable or better accuracy than much larger MLPs in data fitting and PDE solving. Theoretically and empirically, KANs possess faster neural scaling laws than MLPs. For interpretability, KANs can be intuitively visualized and can easily interact with human users. Through two examples in mathematics and physics, KANs are shown to be useful collaborators helping scientists (re)discover mathematical and physical laws. In summary, KANs are promising alternatives for MLPs, opening opportunities for further improving today's deep learning models which rely heavily on MLPs.
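To make the idea in the abstract concrete, the sketch below illustrates in plain Python/NumPy what it means to put a learnable univariate function on every edge instead of a scalar weight. This is not the authors' implementation (the paper parametrizes each edge function as a B-spline); here each edge function is a simple piecewise-linear interpolant over a fixed grid, only a forward pass is shown, and the class name KANLayerSketch and all parameter choices are illustrative assumptions.

# A minimal, illustrative sketch of the KAN idea: every edge (i, j) carries its
# own learnable univariate function phi_{i,j}, and output j is the sum over
# inputs i of phi_{i,j}(x_i). There are no separate linear weights at all.
# This is a toy stand-in, not the paper's B-spline implementation.

import numpy as np

class KANLayerSketch:
    def __init__(self, in_dim, out_dim, grid_size=8, x_min=-1.0, x_max=1.0, seed=0):
        rng = np.random.default_rng(seed)
        # Shared grid of points on which every edge function is defined.
        self.grid = np.linspace(x_min, x_max, grid_size)
        # One learnable value per grid point, per (input, output) edge.
        self.values = 0.1 * rng.standard_normal((in_dim, out_dim, grid_size))

    def edge_fn(self, i, j, x):
        # Evaluate phi_{i,j}(x) by linear interpolation between the learnable
        # grid values (a crude substitute for the spline used in the paper).
        return np.interp(x, self.grid, self.values[i, j])

    def forward(self, x):
        # x: 1-D array of shape (in_dim,); returns an array of shape (out_dim,).
        in_dim, out_dim, _ = self.values.shape
        out = np.zeros(out_dim)
        for j in range(out_dim):
            out[j] = sum(self.edge_fn(i, j, x[i]) for i in range(in_dim))
        return out

# Tiny usage example: a 2-input, 3-output KAN-style layer on a single sample.
layer = KANLayerSketch(in_dim=2, out_dim=3)
print(layer.forward(np.array([0.3, -0.7])))

Training such a layer would mean fitting the grid values (or, in the paper's formulation, the spline coefficients) by gradient descent, exactly where an MLP would instead fit its weight matrices.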


