The human need to count and calculate, together with the brain's limited capacity for storing information and its relatively low speed, led to the creation of computers.
Although the human brain is undoubtedly not comparable to any computer in the world in terms of complexity, it is prone to errors under the influence of various factors and performs calculations slowly.
In the distant past, humans built other tools for calculation. Some archaeologists believe, for example, that the Stonehenge monument in England was used as a solar calendar.
In 1642, Blaise Pascal, the French mathematician, philosopher, and physicist, built a mechanical calculator that added numbers by combining several gears. Pascal used this calculator in his father's office to help collect taxes in the province of Haute-Normandie (Upper Normandy).
With the beginning of the 19th century and the flourishing of the basic sciences, including mathematics, in Europe, mathematical logic took on an applied form, and in the 1840s George Boole founded Boolean algebra. This branch of mathematics is the cornerstone of today's digital computers and logic circuits.
With the development of the electrical industry and its combination with Boolean algebra and logic, electrical and electronic circuits became the main tools for building computers.
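To see why Boolean algebra maps so directly onto digital circuits, consider a minimal sketch (the Python functions and the half-adder example below are chosen purely for illustration and are not taken from any historical design): two elementary Boolean operations are enough to build a one-bit half adder, the basic cell of binary addition.

```python
# Illustrative only: elementary Boolean operations composed into a half adder.
# (Function names and the example are assumptions made for this sketch.)

def AND(a: bool, b: bool) -> bool:
    return a and b

def XOR(a: bool, b: bool) -> bool:
    return a != b

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Add two one-bit values; return (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

# Print the truth table of the half adder.
for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} -> sum {int(s)}, carry {int(c)}")
```

Chaining such adders, and the gates they are built from, is essentially how electronic circuits carry out arithmetic.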
In 1936, the English mathematician Alan Turing designed a theoretical computer that became known as the Turing machine. This abstract machine was capable of carrying out a wide range of logical computations. Its finite-state control later reappeared as the finite state machine in computer science, in the theory of languages and automata.
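As a toy example of a finite state machine (the states, input alphabet, and sample strings here are our own, purely for illustration), the sketch below tracks whether a string of bits contains an even or an odd number of 1s. A Turing machine is strictly more powerful: it couples such a finite-state control with an unbounded tape that it can read and write.

```python
# Illustrative finite state machine (states and inputs are assumptions for this
# sketch): decides whether a bit string contains an even or odd number of 1s.

TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

def run_fsm(bits: str, state: str = "even") -> str:
    """Feed the input symbols through the transition table and return the final state."""
    for bit in bits:
        state = TRANSITIONS[(state, bit)]
    return state

print(run_fsm("10110"))  # "odd"  (three 1s)
print(run_fsm("1001"))   # "even" (two 1s)
```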
In 1937, work began on the world's first electronic digital computer, the Atanasoff-Berry Computer, or ABC. Although this machine could not be programmed, it was used to solve systems of linear equations.
Because the ABC was so rudimentary, other computers soon replaced it, but its combined use of binary arithmetic and electronic switching had a great influence on the design of later machines.
In 1946, the computer industry saw the unveiling of the world's first modern computer, ENIAC. This Turing-complete machine was built to perform ballistics calculations for the United States Army.
This programmable computer was capable of performing relatively complex mathematical calculations. Two researchers, John Mauchly and J. Presper Eckert of the University of Pennsylvania, led the ENIAC research team.
ENIAC contained most of the main components of modern computers, including registers, counters, memory, and accumulators.
The main electronic components of ENIAC were vacuum tubes, which made up most of the gates and flip-flops in its circuitry.
Parts of this computer are now on display for visitors at the University of Pennsylvania School of Engineering and Applied Science, the National Museum of American History in Washington, the Science Museum in London, the Computer History Museum in Mountain View, California, the University of Michigan in Ann Arbor, and a US Army museum in Maryland.
Although ENIAC was a great scientific advance for its time, it had major disadvantages, including its enormous size, slow speed, and limited computational capability.
Another disadvantage of this computer, and of others of its generation, was its vacuum tubes. Besides being expensive and power-hungry, the tubes needed a long time to warm up before they were ready for operation.
The next generation of computers was built on the stored-program architecture, also known as the von Neumann architecture, first proposed in 1945 by John von Neumann.
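The essence of the stored-program idea is that instructions and data reside in the same memory, and the processor runs a fetch-decode-execute loop over them. The toy sketch below is purely illustrative; its tiny instruction set and memory layout are invented for this example and do not correspond to any historical machine.

```python
# Toy stored-program machine (instruction set and layout are assumptions for
# this sketch): program and data share one memory; a loop fetches, decodes,
# and executes instructions, using an accumulator and a program counter.

memory = [
    ("LOAD", 6),   # 0: acc <- memory[6]
    ("ADD", 7),    # 1: acc <- acc + memory[7]
    ("STORE", 8),  # 2: memory[8] <- acc
    ("HALT", 0),   # 3: stop
    0, 0,          # 4-5: unused
    2, 3,          # 6-7: data operands
    0,             # 8: result cell
]

acc = 0   # accumulator register
pc = 0    # program counter

while True:
    op, addr = memory[pc]   # fetch and decode
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[8])  # prints 5, the sum of the two operands
```

Because the program itself sits in memory, it can be replaced or even modified without rewiring the machine, which is what distinguished stored-program computers from ENIAC-style designs.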
The first computers based on the von Neumann architecture, the SSEM and the EDSAC, were built at the universities of Manchester and Cambridge in the late 1940s.
The invention and industrial production of the transistor in the 1940s and 1950s revolutionized the computer industry and accelerated its progress many times over.
Transistor computers were many times smaller, faster, cheaper, more energy-efficient, and more reliable than their vacuum-tube predecessors. The first transistor computer was built in 1953 at the University of Manchester.
Alongside the development of the transistor, the computer industry continued to make significant progress. In the 1970s, advances in integrated circuit technology led to the creation of microprocessors such as the Intel 4004.
From the early 1970s, as the computer industry's development accelerated, its products left university laboratories and became available to the general public.
By the early 1980s, devices such as video recorders and players, and even dishwashers and washing machines, were equipped with microcontrollers, which are built around microprocessors.
In the early 1970s, HP designed a complete computer programmable in BASIC, which belonged to the first generation of personal computers. With its display, keyboard, and printer, it closely resembled today's computers.
In 1973, Xerox made another personal computer called the Alto, which had a graphical user interface (GUI). This graphical interface inspired the creation of Macintosh computers and Microsoft Windows.
In 1975, IBM produced its 5100 computer, whose distinguishing feature was that it could be programmed in both BASIC and APL.
Since then, the architecture of IBM computers has become a standard for the computer industry.
In 1976, Steve Jobs and Steve Wozniak, the founders of Apple, built a computer called the Apple I, which marked the beginning of Apple's computer business.
However, the first personal computer to reach the consumer market was the Commodore PET. Its success marked the beginning of personal computers designed largely around graphical software.
In 1982, Commodore produced the best-selling personal computer of that era, the Commodore 64. This computer had sixty-four kilobytes of RAM, a large amount among the computers then available on the open market.