History of computer science

Early computation
Main articles: History of computing and Timeline of computing 2400 BC–1949

The earliest known tool for use in computation was the abacus, thought to have been invented in Babylon circa 2400 BCE. It was originally used by drawing lines in sand and placing pebbles on them. It was the first known calculating device and the most advanced system of calculation of its time, preceding Greek methods by some 2,000 years. Abaci of a more modern design are still used as calculation tools today.

The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[2] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC. Technological artifacts of similar complexity did not reappear until the 14th century, when mechanical astronomical clocks appeared in Europe.[3]

In the 3rd century CE, the south-pointing chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also developed a more sophisticated abacus, known as the Chinese abacus, from around the 2nd century BCE.[citation needed]

Mechanical analog computing devices appeared again a thousand years later in the medieval Islamic world. Examples of devices from this period include the equatorium by Arzachel,[4] the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī,[5] and the torquetum by Jabir ibn Aflah.[6] Muslim engineers built a number of automata, including some musical automata that could be ‘programmed’ to play different musical patterns. These devices were developed by the Banū Mūsā brothers[7] and Al-Jazari.[8] Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[9]

When John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools. In 1623 Wilhelm Schickard designed a calculating machine but abandoned the project when the prototype he had started building was destroyed by a fire in 1624. Around 1640, Blaise Pascal, a leading French mathematician, constructed the first mechanical adding device,[10] based on a design described by the Greek mathematician Hero of Alexandria.[11] Then in 1672 Gottfried Wilhelm Leibniz invented the Stepped Reckoner, which he completed in 1694.[12]

None of the early computational devices were really computers in the modern sense, and it took considerable advancement in mathematics and theory before the first modern computers could be designed.
Algorithms

In the 7th century, Indian mathematician Brahmagupta gave the first explanation of the Hindu-Arabic numeral system and the use of zero as both a placeholder and a decimal digit.

Around the year 825, the Persian mathematician Al-Khwarizmi wrote a book, On the Calculation with Hindu Numerals, that was principally responsible for the diffusion of the Indian system of numeration in the Middle East and then Europe. In the 12th century, this book was translated into Latin as Algoritmi de numero Indorum. These books presented new ways of accomplishing a task by carrying out a series of well-defined steps, such as the systematic application of arithmetic to algebra. The term algorithm derives from his name.
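As a small illustration of what “a series of well-defined steps” means in practice, the sketch below carries out column-by-column addition with carries, a procedure made possible by positional numerals. It is a modern Python sketch written for this article; the function name and digit-string representation are illustrative assumptions, not anything from the historical texts.

    def add_decimal(a, b):
        """Add two non-negative integers given as digit strings, column by column."""
        a, b = a.zfill(len(b)), b.zfill(len(a))        # pad to equal length
        carry, result = 0, ""
        for da, db in zip(reversed(a), reversed(b)):   # rightmost column first
            total = int(da) + int(db) + carry
            result = str(total % 10) + result          # write the units digit
            carry = total // 10                        # carry the tens digit
        if carry:
            result = str(carry) + result
        return result

    print(add_decimal("825", "1199"))  # -> "2024"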
Binary logic

Around the 3rd century BC, Indian mathematician Pingala discovered the binary numeral system. In this system, still used today in all modern computers, a sequence of ones and zeros can represent any number.

In 1703, Gottfried Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system. In his system, the ones and zeros also represent true and false values or on and off states. But it took more than a century before George Boole published his Boolean algebra in 1854, with a complete system that allowed computational processes to be mathematically modeled.
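The two ideas can be illustrated briefly. The sketch below, a purely modern Python example whose function name is an illustrative assumption, converts a number to its binary digits by repeated division by two and then reads the same ones and zeros as truth values under Boole’s operations.

    def to_binary(n):
        """Write a non-negative integer as a sequence of ones and zeros
        by repeatedly dividing by two and collecting the remainders."""
        digits = ""
        while n > 0:
            digits = str(n % 2) + digits
            n //= 2
        return digits or "0"

    print(to_binary(19))        # -> "10011"
    print(int("10011", 2))      # -> 19: the representation loses nothing

    # The same symbols read as truth values, with 1 as true and 0 as false.
    x, y = 1, 0
    print(x & y, x | y, 1 - x)  # AND, OR, NOT -> 0 1 0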

By this time, the first mechanical devices driven by a binary pattern had been invented. The industrial revolution had driven forward the mechanization of many tasks, and this included weaving. Punched cards controlled Joseph Marie Jacquard’s loom in 1801, where a hole punched in the card indicated a binary one and an unpunched spot indicated a binary zero. Jacquard’s loom was far from being a computer, but it did illustrate that machines could be driven by binary systems.
Birth of computer science

Before the 1920s, computers (sometimes computors) were human clerks who performed computations, usually working under the direction of a physicist. Many thousands of computers were employed in commerce, government, and research establishments. Most of these computers were women, and many of them had a degree in calculus. Some performed astronomical calculations for calendars.

After the 1920s, the expression computing machine referred to any machine that performed the work of a human computer, especially one that worked in accordance with the effective methods of the Church–Turing thesis. The thesis states that a mathematical method is effective if it can be set out as a list of instructions that a human clerk with paper and pencil can follow, for as long as necessary, and without ingenuity or insight.

Machines that computed with continuous values became known as analog machines. They used mechanisms that represented continuous numeric quantities, such as the angle of a shaft’s rotation or a difference in electrical potential.

Digital machinery, in contrast to analog, represented numeric values in discrete states and stored each individual digit. Digital machines used difference engines or relays before the invention of faster memory devices.

After the late 1940s, as electronic digital machinery became common, the phrase computing machine gradually gave way to simply computer. These computers were able to perform the calculations that had previously been performed by human clerks.

Since the values stored by digital machines were not bound to physical properties in the way analog devices were, a logical computer based on digital equipment was able to do anything that could be described as “purely mechanical.” The theoretical Turing machine, created by Alan Turing, is a hypothetical device conceived in order to study the properties of such hardware.
Emergence of a discipline
The theoretical groundwork

The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions.

1936 was a key year for computer science. Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a “purely mechanical” model for computing.

These topics are covered by what is now called the Church–Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis claims that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.

Turing also included with the thesis a description of the Turing machine. A Turing machine has an infinitely long tape and a read/write head that can move along the tape, changing the values along the way. Clearly such a machine could never be built, but nonetheless, the model can simulate the computation of any algorithm which can be performed on a modern computer.
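The model is simple enough to sketch in a few lines of code. The following Python fragment is a minimal, illustrative simulator of the machine just described; the transition-table format, state names, and example machine are assumptions made for this sketch, not Turing’s own notation.

    # A minimal Turing machine simulator: tape cells, a read/write head, and a
    # transition table mapping (state, symbol) -> (new state, symbol to write, move).
    def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=1000):
        cells = dict(enumerate(tape))     # a dict stands in for the unbounded tape
        head = 0
        for _ in range(max_steps):
            symbol = cells.get(head, blank)
            if (state, symbol) not in transitions:
                break                     # halt when no rule applies
            state, cells[head], move = transitions[(state, symbol)]
            head += move                  # move is -1 (left) or +1 (right)
        return "".join(cells[i] for i in sorted(cells))

    # Example machine: flip every bit on the tape, then halt at the first blank.
    flip = {
        ("start", "0"): ("start", "1", +1),
        ("start", "1"): ("start", "0", +1),
    }
    print(run_turing_machine(flip, "10110"))  # -> "01001"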

Turing is so important to computer science that his name is also featured on the Turing Award and the Turing test. He contributed greatly to British code-breaking successes in the Second World War, and continued to design computers and software through the 1940s, but committed suicide in 1954.

At a symposium on large-scale digital machinery in Cambridge, Turing said, “We are trying to build a machine to do all kinds of different things simply by programming rather than by the addition of extra apparatus”.

In 1948 the first practical computer that could run stored programs, based on the Turing machine model, was built: the Manchester Baby.

In 1950, Britain’s National Physical Laboratory completed Pilot ACE, a small-scale programmable computer based on Turing’s philosophy.
Shannon and information theory

Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor. This changed with Claude Elwood Shannon’s publication of his 1937 master’s thesis, A Symbolic Analysis of Relay and Switching Circuits. While taking an undergraduate philosophy class, Shannon had been exposed to Boole’s work, and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. This concept, of utilizing the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers, and his thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.
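Shannon’s observation was that switching circuits and Boolean algebra obey the same rules: contacts in series behave like AND, contacts in parallel like OR, and a normally-closed contact like NOT. The toy Python sketch below, with made-up function names, shows that correspondence; it is an illustration of the idea rather than Shannon’s notation.

    # Switches in series pass current only if both are closed: Boolean AND.
    def series(a, b):
        return a and b

    # Switches in parallel pass current if either is closed: Boolean OR.
    def parallel(a, b):
        return a or b

    # A normally-closed contact opens when the relay is energized: Boolean NOT.
    def normally_closed(a):
        return not a

    # The circuit "A in series with (B in parallel with not-C)" as an expression.
    def circuit(a, b, c):
        return series(a, parallel(b, normally_closed(c)))

    print(circuit(True, False, False))  # -> True: current flows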

Shannon went on to found the field of information theory with his 1948 paper titled A Mathematical Theory of Communication, which applied probability theory to the problem of how to best encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.
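A small numerical illustration of the idea: the entropy of a message gives the average number of bits per symbol needed to encode it, which is why a repetitive message compresses well and a uniformly varied one does not. The sketch below (Python; the helper name is our own) computes that quantity from observed symbol frequencies.

    import math
    from collections import Counter

    def shannon_entropy(message):
        """Average bits per symbol, assuming symbols occur independently
        with the frequencies observed in the message."""
        counts = Counter(message)
        total = len(message)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(shannon_entropy("aaaab"))  # ~0.72 bits/symbol: highly compressible
    print(shannon_entropy("abcd"))   # 2.0 bits/symbol: no redundancy to exploit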
Wiener and cybernetics

Drawing on experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for “steersman.” He published “Cybernetics” in 1948, which influenced artificial intelligence. Wiener also drew comparisons between computation, computing machinery, memory devices, and his analysis of brain waves.

The first actual computer bug was a moth, found stuck between the relays of the Harvard Mark II.[1] The coining of the term ‘bug’ is often, but erroneously, attributed to Grace Hopper, a future rear admiral in the U.S. Navy, who supposedly logged the “bug” on September 9, 1945; most other accounts conflict with at least these details. According to these accounts, the actual date was September 9, 1947, when operators filed the incident, along with the insect and the notation “First actual case of bug being found” (see software bug for details).
