A History of Computers
Mankind has evolved from a primitive to a highly advanced society by continually inventing tools. Stone tools, gunpowder, the wheel, and other inventions have changed human life dramatically. In recent history, the computer is arguably the most important invention. In today's highly advanced society, computers affect our lives 24 hours a day: your class schedules are formulated by computers, your student records are maintained by computers, your exams are graded by computers, and numerous other functions that affect you are controlled by computers.
Although the first true computer was invented in the 1940s, the concept of a computer is actually more than 160 years old. Charles Babbage is credited with inventing a precursor to the modern computer. In 1823 he received a grant from the British government to build a mechanical device he called the Difference Engine, intended for computing and printing mathematical tables. The device was based on rotating wheels and was operated by a single crank. Unfortunately, the technology of the time was not advanced enough to build the device. He ran into difficulties and eventually abandoned the project.
But he already had an even more grandiose scheme in mind. In fact, one of the reasons he gave up on the Difference Engine may have been to work on his new concept for a better machine. He called this new device the Analytical Engine. It, too, was never built; like its predecessor, it was ahead of its time, and the technology to make it a reality did not yet exist. Even so, the Analytical Engine was a remarkable achievement, because its design rested on essentially the same fundamental principles as the modern computer. One principle that stands out is programmability. With the Difference Engine, Babbage would have been able to compute only mathematical tables, but with the Analytical Engine he would have been able to perform any calculation by inputting instructions on punch cards. This method of inputting programs on punch cards was actually adopted for real machines and was still in wide use as late as the 1970s.
The Analytical Engine was never built, but a demonstration program was written by Ada Lovelace, the daughter of the poet Lord Byron. The programming language Ada was named in honour of Lady Lovelace, the first computer programmer.
In the late 1930s John Atanasoff of Iowa State University, with his graduate student Clifford Berry, built the prototype of the first automatic electronic calculator. One innovation of their machine was the use of binary numbers.
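To make the idea of binary numbers concrete, here is a small Python sketch (a modern illustration only; it says nothing about the actual circuitry of Atanasoff and Berry's machine). In binary, each digit stands for a power of two, so any number can be written with only the digits 0 and 1:

# 13 in binary is 1101: one 8, one 4, no 2, one 1.
n = 13
print(bin(n))            # prints '0b1101'

# Rebuilding the value from its bits, most significant bit first:
bits = [1, 1, 0, 1]
value = 0
for b in bits:
    value = value * 2 + b    # shift left one binary digit, then add the bit
print(value)                 # prints 13

Because each binary digit has only two states, it maps naturally onto a switch that is either on or off, which is what made binary so well suited to electronic machines.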
At around the same time, Howard Aiken of Harvard University was working on the Automatic Sequence-Controlled Calculator, known more commonly as MARK I, with support from IBM and the U.S. Navy. MARK I was very similar to the Analytical Engine in design and was described as “Babbage’s dream come true.”
MARK I was an electromechanical computer based on relays. Mechanical relays were not fast enough, and MARK I was quickly replaced by machines based on electronic vacuum tubes. The first completely electronic computer, ENIAC I (Electronic Numerical Integrator and Calculator), was built at the University of Pennsylvania under the supervision of John W. Mauchly and J. Presper Eckert. Their work was influenced by the work of John Atanasoff.
ENIAC I was programmed laboriously by plugging wires into a control panel that resembled an old telephone switchboard. Programming took an enormous amount of the engineers’ time, and even making a simple change to a program was a time-consuming effort. While programming activities were going on, the expensive computer sat idle. To improve its productivity, John von Neumann of Princeton University proposed storing programs in the computer’s memory. This stored-program scheme not only improved computation speed but also allowed far more flexible ways of writing programs. For example, because a program is stored in memory, the computer can change the program instructions to alter the sequence of execution, thereby making it possible to get different results from a single program.
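To see what this scheme buys, here is a minimal sketch of a toy stored-program machine in Python (the five-instruction set is invented for illustration; it is not von Neumann's design or ENIAC's). The program lives in ordinary memory as data: the machine can jump around in it, and rewriting an instruction in memory makes the same machine produce different results.

def run(memory):
    """Execute the instructions stored in `memory`, starting at address 0."""
    pc = 0   # program counter: address of the next instruction
    acc = 0  # accumulator register
    while memory[pc][0] != "HALT":
        op, arg = memory[pc]
        if op == "LOAD":          # acc = constant
            acc = arg
        elif op == "ADD":         # acc = acc + constant
            acc += arg
        elif op == "PRINT":       # output the accumulator
            print(acc)
        elif op == "JLT":         # jump: alter the sequence of execution
            limit, dest = arg
            if acc < limit:
                pc = dest
                continue
        pc += 1

program = [
    ("LOAD", 0),      # address 0: acc = 0
    ("ADD", 1),       # address 1: acc = acc + 1
    ("PRINT", None),  # address 2: print acc
    ("JLT", (5, 1)),  # address 3: if acc < 5, jump back to address 1
    ("HALT", None),   # address 4: stop
]

run(program)                  # prints 1 2 3 4 5
program[1] = ("ADD", 2)       # the program is data: rewrite one instruction...
program[3] = ("JLT", (10, 1))
run(program)                  # ...and the same machine now prints 2 4 6 8 10

On ENIAC I, a change like the one made between the two runs meant hours of re-plugging wires; with the program stored in memory, it is just another write to memory.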
We characterize these early computers built with vacuum tubes as first-generation computers. Second-generation computers, with transistors replacing the vacuum tubes, started appearing in the late 1950s. Improvements in memory devices also increased processing speed further. In the early 1960s, transistors were replaced by integrated circuits, and third-generation computers emerged. A single integrated circuit of this period incorporated hundreds of transistors and made the construction of minicomputers possible. Minicomputers were small enough to be placed on desktops in individual offices and labs. The early computers, on the other hand, were so huge that they easily occupied the whole basement of a large building.
The advancement of integrated circuits was phenomenal. Large-scale integrated circuits, commonly known as computer chips or silicon chips, packed the power equivalent of thousands of transistors and made the notion of a “computer on a single chip” a reality. With large-scale integrated circuits, microcomputers emerged in the mid-1970s. The machines we call personal computers today are descendants of the microcomputers of the 1970s. The computer chips used in today’s personal computers pack the power equivalent of several million transistors. Personal computers are fourth-generation computers.
Early microcomputers were isolated, stand-alone machines. The word personal describes a machine intended to be used by an individual. However, it did not take long to realize that there was a need to share computer resources. For example, early microcomputers required a dedicated printer. Wouldn’t it make more sense to have many computers share a single printer? Wouldn’t it also make more sense to share data among computers, instead of duplicating the same data on individual machines? Wouldn’t it be nice to send electronic messages between computers? The notion of networked computers arose to meet these needs.
Today, computers of all kinds are connected into networks. A network that connects computers in a single building or in several nearby buildings is called a local area network (LAN). A network that connects geographically dispersed computers is called a wide area network (WAN). These individual networks can be connected further to form interconnected networks called internets. The most famous internet is simply called the Internet. The Internet makes the sharing of worldwide information possible and easy. The hottest tool for viewing information on the Internet is a web browser, which allows you to view multimedia information consisting of text, audio, video, and other types of information.
Questions:
1) When was the first computer invented?
2) Why was Babbage’s Analytical Engine so remarkable?
3) What are first-generation computers?
4) What generation are personal computers?
5) What do LAN and WAN stand for?

 