A Brief History of the Invention of Computers


Do you know what a computer is?

A computer is a programmable electronic device that accepts raw data as input and processes it with a set of instructions (a program) to produce a result as output. It renders output after performing mathematical and logical operations and can save the output for future use. It can process numerical as well as non-numerical data. The term "computer" is derived from the Latin word "computare", which means to calculate.
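
As a minimal illustration of this input-process-output cycle, here is a short Python sketch; the data and the little "program" are invented purely for illustration:

```python
# Input -> process -> output, the cycle described above.
raw_data = [7, 5]                    # input: raw data

total = raw_data[0] + raw_data[1]    # process: a mathematical operation
is_positive = total > 0             # process: a logical operation

print("sum =", total)                # output: the result
print("positive?", is_positive)

# The result can also be saved for future use:
with open("result.txt", "w") as f:
    f.write(str(total))
```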

A computer is designed to execute applications and provides a variety of solutions through integrated hardware and software components. It works with the help of programs and represents decimal numbers through strings of binary digits. It also has a memory that stores the data, programs, and results of processing. The physical components of a computer, such as wires, transistors, circuits, and hard disks, are called hardware, whereas the programs and data are called software.
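
To make "represents decimal numbers through strings of binary digits" concrete, here is a small Python sketch of the conversion; it illustrates the idea rather than how any particular hardware implements it:

```python
n = 41
print(format(n, "b"))              # '101001' - built-in conversion

# Manual conversion by repeated division by 2:
bits = ""
value = n
while value > 0:
    bits = str(value % 2) + bits   # each remainder becomes a binary digit
    value //= 2
print(bits)                        # '101001'

print(int("101001", 2))            # 41 - binary string back to decimal
```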

It is believed that the first computer was the Analytical Engine, invented by Charles Babbage in 1837. It used punch cards as read-only memory. Charles Babbage is therefore also known as the father of the computer.

What is the origin of the Computer?

Abacus

The history of computers begins with the birth of the abacus, which is believed to be the first computer. It is said that the Chinese invented the abacus around 4,000 years ago.

It was a wooden rack holding metal rods with beads mounted on them. The operator moved the beads according to certain rules to perform arithmetic calculations. The abacus is still used in some countries, such as China, Russia, and Japan.

Napier's Bones

It was a manually operated calculating device invented by John Napier (1550-1617) of Merchiston. In this calculating tool, he used 9 different ivory strips, or "bones," marked with numbers to multiply and divide, and so the tool became known as "Napier's Bones". It was also the first machine to use the decimal point.
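
In effect, the rods pre-compute single-digit products, so a long multiplication reduces to reading numbers off the rods and adding with carries. A rough modern re-creation of that idea in Python (my own sketch, not Napier's actual rod layout):

```python
# Multiply each digit of n by a single digit d (one product per rod),
# then resolve the carries - which is what reading the bones amounts to.

def bones_multiply(n: int, d: int) -> int:
    digits = [int(c) for c in str(n)]
    partials = [digit * d for digit in digits]   # one product per rod

    result, carry = [], 0
    for p in reversed(partials):                 # add along the diagonals
        p += carry
        result.append(p % 10)
        carry = p // 10
    if carry:
        result.append(carry)
    return int("".join(str(x) for x in reversed(result)))

print(bones_multiply(425, 6))   # 2550, same as 425 * 6
```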

Pascaline

Pascaline is also known as the Arithmetic Machine or Adding Machine. It was invented between 1642 and 1644 by the French mathematician-philosopher Blaise Pascal. It is believed to have been the first mechanical and automatic calculator.

Pascal invented this machine to help his father, a tax accountant. It could only perform addition and subtraction. It was a wooden box with a series of gears and wheels. When a wheel was rotated one full revolution, it rotated the neighboring wheel by one step. A series of windows on top of the wheels allowed the totals to be read.
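
The carry mechanism works like a car's odometer: a full revolution of one wheel advances its neighbor by a single step. A minimal Python simulation of that behavior (assuming a six-wheel machine for illustration; real Pascalines were built with varying numbers of wheels):

```python
wheels = [0, 0, 0, 0, 0, 0]   # least-significant wheel first

def add_one(wheels):
    """Advance the rightmost wheel and propagate any carries."""
    i = 0
    while i < len(wheels):
        wheels[i] += 1
        if wheels[i] < 10:
            break              # no carry needed
        wheels[i] = 0          # the wheel completes a revolution...
        i += 1                 # ...and rotates its neighbor

def add(wheels, n):
    for _ in range(n):         # addition as repeated single steps
        add_one(wheels)

add(wheels, 1234)
print("".join(str(d) for d in reversed(wheels)))   # 001234
```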

Stepped Reckoner or Leibniz Wheel

It was developed by the German mathematician-philosopher Gottfried Wilhelm Leibniz in 1673. It is known both as the Stepped Reckoner and as the Leibniz Wheel. Leibniz improved on Pascal's invention to develop this machine. It was a digital mechanical calculator, called the stepped reckoner because it used fluted drums instead of gears.

Difference Engine

The Difference Engine was designed in the early 1820s by Charles Babbage, who is known as the "Father of the Modern Computer". It was a steam-driven mechanical calculating machine designed to compute tables of numbers, such as logarithm tables.

Analytical Engine

This calculating machine was also developed by Charles Babbage, in 1837. It was a mechanical computer that used punch cards as input. It was capable of solving any mathematical problem and of storing information as permanent memory.

Tabulating Machine

It was invented in 1890 by Herman Hollerith, an American statistician. It was a mechanical tabulator based on punch cards that could tabulate statistics and record or sort data. The machine was used in the 1890 U.S. Census. Hollerith also founded the Tabulating Machine Company, which later became International Business Machines (IBM) in 1924.

Differential Analyzer

It was an early analog computer, introduced in the US in 1930 and invented by Vannevar Bush. The machine used vacuum tubes to switch electrical signals and could do 25 calculations in a few minutes.

Mark I

The next major step in the history of computers came in 1937, when Howard Aiken planned to develop a machine that could perform calculations involving large numbers. In 1944, the Mark I computer was built as a partnership between IBM and Harvard. It was the first programmable digital computer.

A Brief History of the Computer

The computer was born not for entertainment or email but out of a need to solve a serious number-crunching crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card-based computers that took up entire rooms.

Today, we carry more computing power on our smartphones than was available in these early models. The following brief history of computing is a timeline of how computers evolved from their humble beginnings to the machines of today that surf the Internet, play games, and stream multimedia in addition to crunching numbers.

1801: In France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1822: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. The project, funded by the English government, is a failure. More than a century later, however, the world's first computer would actually be built.

1890: Herman Hollerith designs a punch-card system to calculate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.

1936: Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.
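
To give a feel for the idea, here is a toy Turing machine in Python: a tape, a read/write head, a current state, and a table of rules. This particular machine simply flips every bit on the tape and halts; it is a sketch of the concept, not Turing's original formulation:

```python
# (state, symbol) -> (symbol to write, head move, next state)
rules = {
    ("scan", "0"): ("1", +1, "scan"),
    ("scan", "1"): ("0", +1, "scan"),
    ("scan", "_"): ("_", 0, "halt"),   # blank cell: stop
}

def run(tape_str):
    tape = dict(enumerate(tape_str))   # position -> symbol
    head, state = 0, "scan"
    while state != "halt":
        symbol = tape.get(head, "_")   # unwritten cells read as blank
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape) if tape[i] != "_")

print(run("10110"))   # -> 01001 (every bit flipped)
```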

1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts or shafts.

1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in a Palo Alto, California, garage, according to the Computer History Museum. 

1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information on its main memory.
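
"Solving 29 equations simultaneously" means solving a system of linear equations, i.e., finding values that satisfy every equation at once. The sketch below solves a three-equation system in Python using Gaussian elimination; it is my own illustrative example, not a description of how the Atanasoff-Berry machine itself worked:

```python
# Solve:  2x + y - z = 8,  -3x - y + 2z = -11,  -2x + y + 2z = -3

def solve(a, b):
    n = len(b)
    for col in range(n):
        # Partial pivoting: move the largest entry in this column up
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        b[col], b[pivot] = b[pivot], b[col]
        # Eliminate this column from the rows below the pivot
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            a[r] = [x - f * y for x, y in zip(a[r], a[col])]
            b[r] -= f * b[col]
    # Back-substitution from the last equation upward
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(a[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / a[r][r]
    return x

print(solve([[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]],
            [8.0, -11.0, -3.0]))
# -> [2.0, 3.0, -1.0], i.e. x = 2, y = 3, z = -1
```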

1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen, and Walter Brattain of Bell Laboratories invent the transistor. They discover how to make an electric switch with solid materials and no need for a vacuum.

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: The FORTRAN programming language, an acronym for FORmula TRANslation, is developed by a team of programmers at IBM led by John Backus, according to the University of Michigan.

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.

1964: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public.

1969: A group of developers at Bell Labs produce UNIX, an operating system that addressed compatibility issues. Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities. Due to the slow nature of the system, it never quite gained traction among home PC users.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.

1971: Alan Shugart leads a team of IBM engineers who invent the "floppy disk," allowing data to be shared among computers.

1973: Robert Metcalfe, a member of the research staff at Xerox, develops Ethernet for connecting multiple computers and other hardware.

1974-1977: A number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack's TRS-80 — affectionately known as the "Trash 80" — and the Commodore PET.

1975: The January issue of Popular Electronics magazine features the Altair 8800, described as the "world's first minicomputer kit to rival commercial models." Two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak start Apple Computer on April Fools' Day and roll out the Apple I, the first computer with a single circuit board, according to Stanford University.

The TRS-80, introduced in 1977, was one of the first machines whose documentation was intended for non-geeks.

1977: Radio Shack's initial production run of the TRS-80 was just 3,000. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wished.

1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.

1978: Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program.

1979: Word processing becomes a reality as MicroPro International releases WordStar. "The defining change was to add margins and word wrap," said creator Rob Barnaby in an email to Mike Petrie in 2000. "Additional changes included getting rid of command mode and adding a print function. I was the technical brains — I figured out how to do it, and did it, and documented it."
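
Word wrap, the feature Barnaby mentions, greedily packs words onto a line until the next word would cross the margin, then starts a new line. A minimal Python sketch of that rule (Python's standard textwrap module provides a production-quality version):

```python
def word_wrap(text, width):
    lines, line = [], ""
    for word in text.split():
        if line and len(line) + 1 + len(word) > width:
            lines.append(line)     # word doesn't fit: start a new line
            line = word
        else:
            line = word if not line else line + " " + word
    if line:
        lines.append(line)
    return "\n".join(lines)

print(word_wrap("Word processing becomes a reality as WordStar ships.", 20))
```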


The first IBM personal computer, introduced on Aug. 12, 1981, used the MS-DOS operating system.

1981: The first IBM personal computer, code-named "Acorn," is introduced. It uses Microsoft's MS-DOS operating system. It has an Intel chip, two floppy disk drives, and an optional color monitor. Sears & Roebuck and Computerland sell the machines, marking the first time a computer is available through outside distributors. It also popularizes the term PC.

1983: Apple's Lisa is the first personal computer with a GUI. It also features a drop-down menu and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor and the first to be marketed as a "laptop."

1985: Microsoft announces Windows, according to Encyclopedia Britannica. This was the company's response to Apple's GUI. Commodore unveils the Amiga 1000, which features advanced audio and video capabilities.

1985: The first dot-com domain name is registered on March 15, years before the World Wide Web would mark the formal beginning of Internet history. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered.

1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides a speed comparable to mainframes.

1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web.

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1994: PCs become gaming machines as "Command & Conquer," "Alone in the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure" are among the games to hit the market.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple's court case against Microsoft in which it alleged that Microsoft copied the "look and feel" of its operating system.

1999: The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires.

2001: Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned GUI.

2003: The first 64-bit processor, AMD's Athlon 64, becomes available to the consumer market.

2004: Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches.

2005: YouTube, a video-sharing service, is founded. Google acquires Android, a Linux-based mobile phone operating system.

2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo's Wii game console hits the market.

2007: The iPhone brings many computer functions to the smartphone.

2009: Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features.

2010: Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant tablet computer segment.

2011: Google releases the Chromebook, a laptop that runs the Google Chrome OS.

2012: Facebook reaches 1 billion users on October 4.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."

Generations of Computers

A generation of computers refers to the specific improvements in computer technology over time. In 1946, electronic pathways called circuits were developed to perform counting; they replaced the gears and other mechanical parts used for counting in earlier computing machines.

In each new generation, the circuits became smaller and more advanced than in the previous generation. This miniaturization helped increase the speed, memory, and power of computers. There are five generations of computers, which are described below.

First Generation Computers

The first-generation computers (1946-1959) were slow, huge, and expensive. These computers used vacuum tubes as the basic components of the CPU and memory. They mainly relied on batch operating systems and punch cards. Magnetic tape and paper tape were used as input and output devices in this generation.

Some of the popular first-generation computers are:

1. ENIAC (Electronic Numerical Integrator and Computer)
2. EDVAC (Electronic Discrete Variable Automatic Computer)
3. UNIVAC I (Universal Automatic Computer)
4. IBM-701
5. IBM-650

Second Generation Computers

The second generation (1959-1965) was the era of transistor computers. These computers used transistors, which were cheap, compact, and consumed less power; this made transistor computers faster than the first-generation machines.

In this generation, magnetic cores were used as the primary memory, and magnetic discs and tapes were used as secondary storage. These computers used assembly language and high-level programming languages like COBOL and FORTRAN, along with batch-processing and multiprogramming operating systems.

Some of the popular second-generation computers are:

1. IBM 1620
2. IBM 7094
3. CDC 1604
4. CDC 3600
5. UNIVAC 1108

Third Generation Computers

The third-generation computers used integrated circuits (ICs) instead of transistors. A single IC can pack a huge number of transistors, which increased the power of a computer and reduced its cost. Computers also became more reliable, efficient, and smaller in size. This generation used operating systems featuring remote processing, time-sharing, and multiprogramming, along with high-level programming languages like FORTRAN II to IV, COBOL, PASCAL, PL/1, and ALGOL-68.

Some of the popular third-generation computers are:

1. IBM-360 series
2. Honeywell-6000 series
3. PDP (Programmed Data Processor)
4. IBM-370/168
5. TDC-316

Fourth Generation Computers

The fourth-generation computers (1971-1980) used very-large-scale integration (VLSI) circuits: a single chip containing thousands of transistors and other circuit elements. These chips made this generation of computers more compact, powerful, fast, and affordable. Computers of this generation used real-time, time-sharing, and distributed operating systems. Programming languages like C, C++, and DBASE were also used in this generation.

Some of the popular fourth-generation computers are:

1. DEC 10
2. STAR 1000
3. PDP 11
4. CRAY-1 (Supercomputer)
5. CRAY-X-MP (Supercomputer)

Fifth Generation Computers

In fifth-generation computers (1980 to date), VLSI technology was replaced with ULSI (Ultra Large Scale Integration), which made it possible to produce microprocessor chips with ten million electronic components. This generation of computers uses parallel processing hardware and AI (Artificial Intelligence) software. The programming languages used in this generation include C, C++, Java, .Net, etc.

Some of the popular fifth generation computers are:

1. Desktop
2. Laptop
3. NoteBook
4. UltraBook
5. Chromebook

List of the basic parts of a computer:

Processor: It executes instructions from software and hardware.
Memory: It is the primary memory, used for fast data transfer between the CPU and storage.
Motherboard: It is the part that connects all other parts or components of a computer.
Storage Device: It stores data permanently, e.g., a hard drive.
Input Device: It allows you to communicate with the computer or to input data, e.g., a keyboard.
Output Device: It enables you to see the output, e.g., a monitor.

Computers are divided into different types based on different criteria, such as size. Here is the list of computer types:

1. Microcomputer
2. Minicomputer
3. Mainframe Computer
4. Supercomputer
5. Workstation

Now let me explain each type so you get a clear idea about computers:

1. Microcomputer:

It is a single-user computer that has lower speed and less storage capacity than the other types. It uses a microprocessor as its CPU. The first microcomputers were built with 8-bit microprocessor chips. Common examples of microcomputers include laptops, desktop computers, personal digital assistants (PDAs), tablets, and smartphones. Microcomputers are generally designed and developed for general usage like browsing the internet, searching for information, MS Office, social media, etc.

2. Minicomputer:

Minicomputers are also known as "Midrange Computers". They are not designed for a single user; they are multi-user computers designed to support multiple users simultaneously. So, they are generally used by small businesses and firms. Individual departments of a company use these computers for specific purposes. For example, the admission department of a university can use a minicomputer to monitor the admission process.


3. Mainframe Computer:

It is also a multi-user computer capable of supporting thousands of users simultaneously. They are used by large firms and government organizations to run their business operations as they can store and process large amounts of data. For example, Banks, universities, and insurance companies use mainframe computers to store the data of their customers, students, and policyholders, respectively.

4. Supercomputer:

Supercomputers are the fastest and most expensive computers among all types. They have huge storage capacities and computing speeds and can therefore perform millions of instructions per second. Supercomputers are task-specific and thus used for specialized applications such as large-scale numerical problems in scientific and engineering disciplines, including electronics, petroleum engineering, weather forecasting, medicine, and space research. For example, NASA uses supercomputers for launching space satellites and for monitoring and controlling them during space exploration.

5. Workstation:

It is a single-user computer. Although it is like a personal computer, it has a more powerful microprocessor and a higher-quality monitor than a microcomputer. In terms of storage capacity and speed, it falls between a personal computer and a minicomputer. Workstations are generally used for specialized applications such as desktop publishing, software development, and engineering design.

Why do we use Computers?

1. Increases your productivity: A computer increases your productivity. For example, with a basic understanding of a word processor, you can create, edit, store, and print documents easily and quickly.
2. Connects you to the Internet: It connects you to the internet, which allows you to send emails, browse content, gain information, use social media platforms, and more. The internet also lets you stay in touch with long-distance friends and family members.
3. Storage: A computer allows you to store a large amount of information, e.g., your projects, ebooks, documents, movies, pictures, songs, and more.
4. Organized data and information: It not only allows you to store data but also enables you to organize it. For example, you can create different folders for different data and information and thus search for information easily and quickly.
5. Improves your abilities: It helps you write correct English even if you are not good at spelling and grammar. Similarly, if you are not good at math or don't have a great memory, you can use a computer to perform calculations and store the results.
6. Assists the physically challenged: It can be used to help the physically challenged; e.g., Stephen Hawking, who was unable to speak, used a computer to speak. It can also help blind people through special software that reads aloud what is on the screen.
7. Keeps you entertained: You can use the computer to listen to songs, watch movies, play games, and more.
Note: In this article, I have tried to cover the basics of computers. Hopefully, everything is clear now. If you have any questions or suggestions, you can comment or contact us.