The development and evolution of computer science

If we go back to prehistory, primitive humans used their fingers to count. As the need to count grew, they began to use stones, which is where the word "calculation" comes from: the Latin "calculus" means small stone. It is at that point that we can say the history of calculation, or of mathematics, begins. These prehistoric people also kept track by making notches in stones, bones, tree trunks, and so on.

One of the instruments that comes to mind when we think of the mathematics of antiquity is the abacus, but first I would like to tell its origin. The abacus is at least 3,000 years old and is considered the oldest device used to perform arithmetic operations. For some, the abacus originated in Madagascar as a way to count soldiers: for each soldier a stone was placed in a groove dug in the ground; for every 10 soldiers a second groove was created, the "groove of the tens"; for every 100 soldiers a third groove was created, the "groove of the hundreds"; and so on. In this way they calculated the amount of ammunition needed for a battle. For others, the abacus originated in Central Asia, somewhere in the former Soviet Union; the Russian abacus is known as the schoty, and from there it spread to neighboring countries. People like Socrates, Plato, Aristotle, Thales of Miletus and Pythagoras, among others, were the first to seek rational explanations for the behavior of nature and the universe, which is why science is considered to have been born in Greece. Both the Greeks and the Egyptians made great contributions to science.

The numbering system we use in daily life is the decimal system, so called because it counts quantities 10 by 10. As mentioned before, primitive humans already used their fingers to count, and the fingers of both hands add up to 10, so counting objects is relatively easy: assign one finger per object and keep track of how many times both hands fill up, that is, how many times we complete a ten. The Babylonian and Sumerian civilizations did not use 10 as their base; they took the number 60 as their reference, which is why our measurement of time (minutes, seconds) is still tied to that unit. Until the fourth century BC the zero was unknown. It is said that the first person to use the numerals 1 to 9 in Europe was Gerbert of Aurillac (Pope Sylvester II). Toward the twelfth century the system arrived in Europe through Leonardo of Pisa, also known as Fibonacci. Arabic or Hindu-Arabic numerals are the most widely used symbols for representing numbers. They are called Arabic because the Arabs introduced them into Europe, although they were invented in India. It is thanks to Indian culture that we know the zero. Persian mathematicians adopted the system from India, the Arabs took it from them, and Europe adopted it in the Middle Ages.
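The Babylonian base-60 legacy can be seen in how we still break down time: every 60 seconds roll over into a minute, every 60 minutes into an hour. A minimal Python sketch of that conversion (the function name is my own, purely for illustration):

```python
def to_hms(total_seconds):
    """Split a raw second count into hours, minutes and seconds, base-60 style."""
    minutes, seconds = divmod(total_seconds, 60)  # 60 seconds per minute
    hours, minutes = divmod(minutes, 60)          # 60 minutes per hour
    return hours, minutes, seconds

print(to_hms(3725))  # 3725 seconds -> (1, 2, 5): 1 h, 2 min, 5 s
```

Each `divmod` step is one "digit" of a base-60 representation, just as repeated division by 10 yields decimal digits.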

In 1642 a 19-year-old Frenchman named Blaise Pascal built a mechanism to perform arithmetic operations. This mechanism, called the Pascaline, is considered the first mechanical calculator; it worked through a series of wheels and gears. The device could add and subtract, although subtraction was not done directly, and it could multiply and divide by means of repeated additions or subtractions. In the 1960s machines of this type could still be found; they were replaced in the 1970s by digital calculators. Pascal introduced improvements to his machine and built about 50 units, but they were very expensive and complicated, since all the pieces were made by hand so that they fit together perfectly; it was cheaper to hire six men than to buy a Pascaline. It was also at this time that the fear of unemployment appeared, with the thought that men could be replaced by machines.

In 1671 the German mathematician Leibniz set out to improve on Pascal's calculating machine. It was a long process, since the first prototype dates from 1671 and the definitive model from 1694, but the result was spectacular, and it came to be considered Leibniz's universal calculator. It not only added and subtracted but also multiplied and divided. The machine was based on a cylindrical stepped wheel with nine teeth, driven by another wheel via a dial on which one set the number by which to multiply or divide. Like its predecessors, the machine had no commercial success because of its cost. Leibniz also studied the binary system, which is the basis of modern computers.

Algebra originated in ancient Egypt and Babylon, whose mathematicians were able to solve both linear and quadratic equations; the methods we use to solve equations come from the Babylonians. One of the first books on algebra was written by the mathematician al-Khwarizmi in the ninth century. The word algebra comes from the Arabic "al-jabr", which means reduction. From algebra emerged algorithms, which are the steps followed to solve a problem. Algorithms are part of the foundations of computer science.
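To illustrate what an algorithm is in the spirit of al-Khwarizmi, here is a small Python sketch that solves a quadratic equation step by step using the familiar quadratic formula (the function name and structure are my own, purely for illustration):

```python
import math

def solve_quadratic(a, b, c):
    """Return the real roots of a*x^2 + b*x + c = 0, following fixed steps."""
    discriminant = b * b - 4 * a * c   # step 1: compute the discriminant
    if discriminant < 0:
        return []                      # step 2: no real roots exist
    root = math.sqrt(discriminant)     # step 3: take the square root
    return [(-b + root) / (2 * a),     # step 4: apply the quadratic formula
            (-b - root) / (2 * a)]

print(solve_quadratic(1, -5, 6))  # x^2 - 5x + 6 = 0 -> [3.0, 2.0]
```

The fixed, repeatable sequence of steps, applicable to any inputs, is exactly what distinguishes an algorithm from a one-off calculation.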

In 1822 Charles Babbage, an English mathematician, presented the first model of his difference engine. At first it consisted of 96 wheels and 24 axes, later reduced to 18 wheels and 3 axes. He then signed a contract with the English government to build a new machine, which in 1834 he named the analytical engine. It did not need an operator, since it was automatic, and it could be programmed by the user to execute a repertoire of instructions in the desired order; in truth, Babbage was only able to build parts of the machine. The machine was to have a memory with a capacity for 1,000 numbers of 50 digits each, used auxiliary functions such as logarithmic tables, and could compare numbers and act according to the comparisons, proceeding by rules specified in advance in the instructions. All of this, or at least a large part of it, is what computers do today, but Babbage was limited by having to do it all mechanically: his design included no electrical circuits, no transistors, much less microprocessors; everything worked through punched cards.

In 1855 the Swedish printer Georg Scheutz, who knew of Babbage's earlier work, built a small difference engine, which a New York observatory bought to print astronomical tables. George Boole, an English mathematician, made an important contribution to the development of computers. He created an algebra very similar to ordinary algebra, but one that solves logic problems by operating on symbols the way ordinary algebra operates on numbers. Since the time of the Greek philosopher Aristotle, man had tried to find ways of solving logic problems; to represent the truth or falsehood of logical statements and their conclusions, Boole used the digits 1 and 0. Boole's algebra had no practical use in its own time, and it was not until computers appeared about 100 years later that it became essential; today it underlies computer circuitry. The digits 0 and 1 are also the basis of another very important part of modern computers: the binary numbering system. A computer is, at bottom, a vast collection of switches, and information is stored as a 0 or a 1 depending on whether a switch is open or closed. Morse code is another example of a binary system, using dots and dashes to represent numbers and letters.
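Boole's two-valued algebra and the binary positional system come together in how machines store and combine information. A minimal Python sketch (the function name is my own, purely for illustration):

```python
def to_binary(n):
    """Represent a non-negative integer as a list of 0/1 'switch' settings."""
    if n == 0:
        return [0]
    bits = []
    while n > 0:
        n, bit = divmod(n, 2)  # repeated division by 2 yields the binary digits
        bits.append(bit)
    return bits[::-1]          # most significant bit first

print(to_binary(13))   # 13 is 1101 in binary -> [1, 1, 0, 1]

# Boolean operations on single bits, as in Boole's algebra:
print(1 & 0)           # AND of 1 and 0 is 0
print(1 | 0)           # OR of 1 and 0 is 1
```

Each element of the list corresponds to one switch being closed (1) or open (0), and the `&`/`|` operators act on those bits exactly as Boole's AND and OR act on truth values.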

In 1887 the United States was urgently looking for a way to carry out the 1890 census: the previous census, in 1880, had taken 7 years to process, and it was estimated that processing the 1890 census would take about 100 years due to the spectacular population growth between 1880 and 1890. The United States government held a contest to award a contract for the best census-processing project. Three proposals were submitted, and the contract was awarded to Hollerith, a mechanical engineer who had already worked at the Census Office. Hollerith applied the principle of punched cards for data storage, which Babbage had already used. The Hollerith system made it possible to complete the work of the 1890 census in 2 years. Hollerith's Tabulating Machine Company merged with others, and in 1924 the resulting firm became International Business Machines (the famous IBM).

Alan Turing published "On Computable Numbers" in 1936, with the intention of solving the Entscheidungsproblem (decision problem); it marked the beginning of theoretical computer science. In that article he defined what was computable and what was not, that is, what could be solved by means of algorithms and what could not. Turing also wanted to show that a machine could be capable of "learning", and thus the first idea of artificial intelligence was born. Today the importance of the role Turing played is widely recognized.

First generation

In 1904 John Ambrose Fleming, applying the Edison effect, produced the first vacuum tube, also called a diode because it has only two elements. In 1906 Lee De Forest discovered electronic amplification: by adding a third element to the diode, he could control a large current with a small one. This new element was called the vacuum triode. In 1945 the physicist Mauchly and the electrical engineer Eckert created ENIAC, the first computer built with vacuum valves (in fact, the first such computer was Colossus, created during World War II). At first ENIAC consumed such a quantity of energy that, when it was switched on, some areas of Pennsylvania were left without light.

It was Von Neumann who introduced binary arithmetic into computers, realizing the great advantages it had over decimal arithmetic; this line of work culminated around 1950 in the EDVAC. Another of his contributions was the stored program: the computer keeps both its instructions and its data internally, which allowed it to operate at much greater speed.
