History of Computer Programming
The abacus is the earliest known tool of computing.
Analog computers are believed to have been first developed by the Greeks with the Antikythera mechanism, which was used for astronomy.
In 825, the Persian mathematician al-Khwārizmī wrote "On the Calculation with Hindu Numerals," a book that helped the diffusion of Hindu-Arabic numerals into Europe.
Charles Babbage and Ada Lovelace together are often thought of as the founders of modern computing.
In the late 1800s the first "programmable" computers appeared, using punch card technology. To be programmable, a machine had to be able to alter its computational process, so that it could simulate the computations of any other machine.
By the 1940s, the age of analog computers was giving way to the age of digital computers.
In 1975 the MITS Altair 8800, the first commercially successful home computer, was released; it was the machine for which Microsoft was founded to write software.
Every modern computer has the same five basic components: memory, a control unit, an arithmetic logic unit (ALU), input, and output.
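As a rough illustration (my own sketch, not from the article; the instruction set and names here are invented for the example), the division of labor among those five parts can be mimicked in a few lines of Python:

```python
# A toy machine with the five basic parts. The instruction set is
# invented for this example; real machines are vastly more elaborate.
memory = {
    "program": [("LOAD", "x"), ("ADD", "y"), ("PRINT", None), ("HALT", None)],
    "data": {"x": 2, "y": 3},   # input values already read into memory
}
acc = 0   # accumulator register, operated on by the ALU
pc = 0    # program counter, maintained by the control unit

while True:
    op, arg = memory["program"][pc]   # control unit: fetch the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory["data"][arg]         # load an input value from memory
    elif op == "ADD":
        acc = acc + memory["data"][arg]   # ALU: perform the arithmetic
    elif op == "PRINT":
        print(acc)                        # output: prints 5
    elif op == "HALT":
        break                             # control unit stops the machine
```

Real computers implement these parts in hardware rather than in a loop of `if` statements, but the same five roles are always present.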
The advent of World War Two prompted the transition from analog computers to digital.
Transistors hold several advantages over vacuum tubes, not the least of which are their smaller size and lower price. With transistor technology, electronic equipment gradually became smaller and smaller.
Around 300 BC, Euclid wrote a series of 13 books called Elements. The definitions, theorems, and proofs covered in the books became a model for formal reasoning, and Elements was instrumental in the development of logic and modern science. It is the first documented work in Mathematics to break down the solution to a problem into a series of numbered steps.
The story really starts in the 800s, where we meet al-Khwārizmī. His name was rendered in Latin as Algoritmi, and he developed a technique called Algorism: performing arithmetic with Hindu-Arabic numerals. Al-Khwārizmī is considered the grandfather of Computer Science, as he was the first to develop the concept of the algorithm in Mathematics.
Then came Ramon Llull, who is considered a pioneer of computation theory. He developed a new system of logic in his work Ars magna, which influenced the development of several subfields within Mathematics.
The binary system was invented by the Indian mathematician Pingala in the 3rd century BCE. In this system, any number can be represented with just zeroes and ones.
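To see how this works in practice, here is a small Python sketch (an illustration added here, not part of the original article) that converts a decimal number to binary by repeated division by two:

```python
def to_binary(n: int) -> str:
    """Build the binary representation of a non-negative integer by
    repeatedly dividing by two and collecting the remainders."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # the remainder is the next bit, least significant first
        n //= 2
    return "".join(reversed(bits))

print(to_binary(42))  # 101010, i.e. 32 + 8 + 2
```

Python's built-in `bin()` performs the same conversion.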
The Analytical Engine is a key step in the formation of the modern computer. Charles Babbage began designing it in 1837 and continued working on it until his death in 1871.
The analog computers and very first digital computers developed in the 1920s and 1930s were known as "computing machines," but the phrase had fallen out of use by the 1940s. By that time, "computer" meant a machine that performed calculations.
From the 1500s through the 1800s, many breakthroughs in computational hardware were developed, including mechanical calculators and punch card technology (which remained in use well into the 20th century).
Charles Babbage laid the foundations of Computer Science, but it is Alan Turing of England who is regarded as the "Father of Computer Science."
Following Euclid, notes about recipes and the like were the closest thing to how we think of algorithms today. There were also inventions of mathematical methods, for example the Sieve of Eratosthenes, Euclid's algorithm (sketched below), and methods for factorization and finding square roots. It seems ordinary to us now to list steps in the order we will carry them out, but in Mathematics at that time it was not a common idea.
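As a taste of how naturally these ancient procedures translate into modern step-by-step code, here is Euclid's algorithm for the greatest common divisor in Python (a sketch added for illustration, not from the original article):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: replace the pair (a, b) with (b, a mod b)
    until the second number is zero; the first is then the GCD."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # 6
```

The loop is exactly the kind of ordered list of steps described above, written down more than two thousand years before the machines that now run it.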