
One of the first counting devices, from which the history of computing can be traced, was a special board later called the abacus (5th-4th centuries BC). Calculations on it were carried out by moving pebbles or beads in grooves on boards made of bronze, stone, ivory and similar materials. In Greece the abacus was in use by the 5th century BC; the Japanese called their version the soroban, the Chinese the suanpan. In ancient Russia a similar device, the counting board, was used; in the 17th century it took the form of the familiar Russian abacus.

Abacus (5th-4th centuries BC)

In 1642, the French mathematician and philosopher Blaise Pascal created the first calculating machine, named the Pascaline in honor of its creator. This mechanical device, a box filled with gears, could perform subtraction as well as addition. Data was entered by turning dials corresponding to the digits 0 to 9, and the answer appeared at the top of the metal case.


Pascaline

In 1673, Gottfried Wilhelm Leibniz created a mechanical calculating device, the stepped reckoner, which for the first time could not only add and subtract but also multiply, divide and extract square roots. The Leibniz wheel later became the prototype for mass-produced calculating devices - adding machines.


Model of the Leibniz stepped reckoner

The English mathematician Charles Babbage developed a device that not only performed arithmetic operations but also printed the results immediately. In 1832, a model reduced tenfold was built from two thousand brass parts; it weighed three tons, yet could perform arithmetic to an accuracy of six decimal places and tabulate functions using second-order differences. This device, called the Difference Engine, became a prototype of real computers.

Difference Engine

A summing apparatus with continuous carrying of tens was created by the Russian mathematician and mechanic Pafnuty Lvovich Chebyshev. The device automated all arithmetic operations. In 1881, an attachment for multiplication and division was added to it. The principle of continuous carrying of tens was subsequently widely used in various counters and computers.


Chebyshev summing apparatus

Automated data processing appeared at the end of the 19th century in the United States. Herman Hollerith created the Hollerith Tabulator, in which information punched onto cards was read by means of electric current.

Hollerith tabulator

In 1936, a young Cambridge scientist, Alan Turing, conceived a calculating machine that existed only on paper. His "smart machine" acted according to a predetermined algorithm, and depending on that algorithm the imaginary machine could be used for a wide variety of purposes. At the time these were purely theoretical constructions, but they served as the prototype of the programmable computer - a computing device that processes data according to a defined sequence of commands.

Information revolutions in history

In the history of civilization there have been several information revolutions - transformations of social relations caused by changes in the processing, storage and transmission of information.

The first revolution is associated with the invention of writing, which produced a gigantic qualitative and quantitative leap for civilization: knowledge could now be passed from generation to generation.

The second revolution (mid-16th century) was caused by the invention of printing, which radically changed industrial society, culture, and the organization of activity.

The third revolution (end of the 19th century) was driven by discoveries in the field of electricity, thanks to which the telegraph, telephone and radio appeared - devices that made it possible to transmit and accumulate information quickly and in any volume.

The fourth revolution (since the 1970s) is associated with the invention of microprocessor technology and the advent of the personal computer. Computers and data transmission systems (information communications) are built on microprocessors and integrated circuits.

This period is characterized by three fundamental innovations:

  • transition from mechanical and electrical means of information conversion to electronic ones;
  • miniaturization of all components, devices, instruments and machines;
  • creation of software-controlled devices and processes.

History of the development of computer technology

The human need to store, convert and transmit information appeared long before the telegraph, the first telephone exchange and the electronic computer were created. In fact, all the experience and knowledge accumulated by mankind contributed, one way or another, to the emergence of computer technology. The history of computers - the general name for electronic machines that perform calculations - begins far in the past and is connected with the development of almost every aspect of human life and activity. For as long as human civilization has existed, some form of automated calculation has been in use.

The history of computer technology spans several decades, during which several generations of computers have succeeded one another. Each generation was distinguished by a new element base (vacuum tubes, transistors, integrated circuits) whose manufacturing technology was fundamentally different. The generally accepted classification of computer generations is as follows:

  • First generation (1946 - early 1950s). Element base - vacuum tubes. These computers were distinguished by large dimensions, high energy consumption, low speed, low reliability, and programming in machine code.
  • Second generation (late 1950s - early 1960s). Element base - semiconductors (transistors). Almost all technical characteristics improved compared with the previous generation. Algorithmic languages were used for programming.
  • Third generation (late 1960s - late 1970s). Element base - integrated circuits and multilayer printed circuit boards. A sharp reduction in the size of computers, an increase in their reliability and performance, and access from remote terminals.
  • Fourth generation (mid-1970s - late 1980s). Element base - microprocessors and large-scale integrated circuits. Improved specifications and mass production of personal computers. Directions of development: powerful high-performance multiprocessor computing systems and cheap microcomputers.
  • Fifth generation (since the mid-1980s). Development of intelligent computers began, so far without success. Computer networks and their interconnection were introduced into all areas, along with distributed data processing and the widespread use of computer information technologies.

Along with the change of generations of computers, the nature of their use also changed. If at first they were created and used mainly for solving computational problems, then later the scope of their application expanded. This includes information processing, automation of management of production, technological and scientific processes, and much more.

Konrad Zuse's principles of computer operation

The idea of building an automated calculating machine occurred to the German engineer Konrad Zuse, and in 1934 he formulated the basic principles on which future computers should work:

  • binary number system;
  • the use of devices operating on the principle of "yes / no" (logical 1 / 0);
  • fully automated operation of the calculator;
  • software control of the computing process;
  • support for floating point arithmetic;
  • use of large capacity memory.

Zuse was the first in the world to determine that data processing begins with the bit (he called the bit "yes/no status" and the formulas of binary algebra "conditional propositions"), the first to introduce the term "machine word" (Word), and the first to combine arithmetic and logical operations in one unit, noting that "the elementary operation of a computer is to check two binary numbers for equality. The result will also be a binary number with two values (equal, not equal)".
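Zuse's "elementary operation" - checking two binary numbers for equality - can be sketched in a few lines of modern code (an illustration in Python, not Zuse's own notation):

```python
# Checking two binary numbers for equality, Zuse's "elementary
# operation": XOR leaves a 1 in every bit position where the
# numbers differ, so they are equal exactly when XOR yields 0.
def bits_equal(a: int, b: int) -> int:
    # Return a binary result, as Zuse described: 1 = equal, 0 = not equal.
    return 1 if (a ^ b) == 0 else 0

print(bits_equal(0b1011, 0b1011))  # 1 (equal)
print(bits_equal(0b1011, 0b1001))  # 0 (not equal)
```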

First generation - computers with vacuum tubes

Colossus I, the first tube-based computer, was created by the British in 1943 to decipher German military ciphers; it contained 1,800 vacuum tubes and was one of the first programmable electronic digital computers.

ENIAC was created to calculate artillery ballistics tables. The computer weighed 30 tons, occupied about 300 m² and required 174 kW of electricity. It contained 17,468 vacuum tubes of sixteen types, 7,200 crystal diodes and 4,100 magnetic elements, housed in cabinets with a total volume of about 100 m³. ENIAC performed 5,000 operations per second, and the total cost of the machine was $750,000.


ENIAC - a device for calculating artillery ballistics tables

Another representative of the first generation worth noting is EDVAC (Electronic Discrete Variable Computer). EDVAC is interesting because it stored programs electronically in so-called "ultrasonic delay lines" - mercury tubes. Its 126 such lines could hold 1,024 44-bit binary words; this was the "fast" memory. As "slow" memory it was intended to record numbers and commands on magnetic wire, but this method proved unreliable, and the designers had to fall back on teletype tape. EDVAC was faster than its predecessor, adding in 1 µs and dividing in 3 µs, yet it contained only 3.5 thousand vacuum tubes and occupied 13 m² of floor space.

UNIVAC (Universal Automatic Computer) was an electronic device with programs stored in memory, which were entered there no longer from punched cards, but using a magnetic tape; this provided a high speed of reading and writing information, and, consequently, a higher speed of the machine as a whole. One tape could contain a million characters written in binary form. Tapes could store both programs and intermediate data.


Representatives of the 1st generation of computers: 1) Electronic Discrete Variable Computer; 2) Universal Automatic Computer

Second generation - computers on transistors

Transistors replaced vacuum tubes in the early 1960s. Transistors, which act as electrical switches, consume less electricity, generate less heat and take up less space. Combining several transistor circuits on one plate yields an integrated circuit (a chip, literally a "sliver" or small plate). Transistors work as binary elements: they register two states - current present or current absent - and thereby process information presented in binary form.

In 1951, William Shockley demonstrated the first reliable junction transistor. The transistor replaced the vacuum tube while operating at higher speed, generating very little heat and consuming almost no electricity. In parallel with the replacement of tubes by transistors, information storage methods improved: magnetic cores and magnetic drums came into use as memory devices, and by the 1960s the storage of information on disks had become widespread.

One of the first transistorized computers, the Atlas Guidance Computer, was launched in 1957 and was used to control the launch of the Atlas rocket.

The RAMAC, created in 1956, was an inexpensive computer with modular external disk memory combined with magnetic-core and drum random access memory. Although not yet fully transistorized, it was highly workable and easy to maintain, and was in great demand in the office automation market. A "large" RAMAC (IBM 305) was therefore released for corporate customers; to hold 5 MB of data, the system needed 50 disks 24 inches in diameter. The information system built on this model smoothly processed streams of requests in 10 languages.

In 1959, IBM created its first fully transistorized large mainframe, the 7090, capable of 229,000 operations per second. In 1964, American Airlines deployed SABRE, the first automated system for selling and booking airline tickets, based on two 7090 mainframes and serving 65 cities around the world.

In 1960, DEC introduced the world's first minicomputer, the PDP-1 (Programmed Data Processor), a computer with a monitor and keyboard that became one of the most notable products on the market. It could perform 100,000 operations per second, and the machine itself occupied only 1.5 m² of floor space. The PDP-1 became, in effect, the world's first gaming platform thanks to MIT student Steve Russell, who wrote the computer game Spacewar! for it.


Representatives of the second generation of computers: 1) RAMAC; 2) PDP-1

In 1968, Digital began the first mass production of minicomputers with the PDP-8: it cost about $10,000 and was the size of a refrigerator. It was this PDP-8 that laboratories, universities and small businesses could afford to buy.

Soviet computers of that time can be characterized as follows: in architectural, circuit and functional solutions they matched their era, but their capabilities were limited by the imperfection of the production and element base. The machines of the BESM series were the most popular. Serial production, on a rather modest scale, began with the Ural-2 computer (1958), followed by the BESM-2, Minsk-1 and Ural-3 (all in 1959). In 1960, the M-20 and Ural-4 went into series. At the end of 1960, the M-20 (4,500 tubes, 35 thousand semiconductor diodes, memory of 4,096 cells) held the performance record of 20 thousand operations per second. The first computers based on semiconductor elements (Razdan-2, Minsk-2, M-220 and Dnepr) were still in development.

Third generation - small-sized computers on integrated circuits

In the 1950s and 60s, assembling electronic equipment was a labor-intensive process, slowed by the increasing complexity of electronic circuits. For example, the CDC 1604 computer (1960, Control Data Corp.) contained about 100,000 diodes and 25,000 transistors.

In 1959, the Americans Jack St. Clair Kilby (Texas Instruments) and Robert N. Noyce (Fairchild Semiconductor) independently invented the integrated circuit (IC) - a set of transistors and other components fabricated on a single silicon chip.

Producing computers on ICs (only later were they called microchips) was much cheaper than building them from discrete transistors. As a result, many organizations were able to acquire and master such machines, which in turn increased demand for general-purpose computers designed to solve a variety of problems. In these years, computer production reached an industrial scale.

At the same time, semiconductor memory appeared, which is still used in personal computers to this day.


Representative of the third generation of computers - ES-1022

Fourth generation - personal computers on microprocessors

The forerunners of the IBM PC were the Apple II, Radio Shack TRS-80, Atari 400 and 800, Commodore 64 and Commodore PET.

The birth of the personal computer (PC) is rightfully associated with Intel processors. The corporation was founded in mid-June 1968; since then Intel has grown into the world's largest manufacturer of microprocessors, with over 64,000 employees. Intel's original goal was to create semiconductor memory, and in order to survive, the company also took third-party orders for the development of semiconductor devices.

In 1969, Intel received an order to develop a set of 12 chips for programmable calculators, but creating 12 specialized chips seemed cumbersome and inefficient to Intel's engineers. They solved the problem of reducing the chip count by pairing semiconductor memory with an execution unit capable of running on commands stored in that memory. This was a breakthrough in computing philosophy: a universal logic device in the form of the 4-bit central processing unit i4004, which was later called the first microprocessor. It was a set of 4 chips, including one chip controlled by commands stored in internal semiconductor memory.

As a commercial product, the microprocessor appeared on the market on November 15, 1971 under the name 4004: 4-bit, containing 2,300 transistors, with a clock frequency of 108 kHz and a price of $200. In 1972, Intel released the eight-bit 8008 microprocessor, and in 1974 its improved version, the Intel 8080, which by the end of the 70s had become the standard for the microcomputer industry. As early as 1973, the first computer based on the 8008 processor, the Micral, appeared in France. For various reasons this processor was not successful in America (in the Soviet Union the 8080 was copied and long produced under the name 580VM80). At about the same time, a group of engineers left Intel and formed Zilog. Its most famous product, the Z80, had an extended 8080 instruction set and ran from a single 5 V supply, which made it a commercial success in home appliances. The ZX Spectrum computer (sometimes called by the name of its creator, Sinclair) was built on it and practically became the prototype of the home PC of the mid-80s. In 1978, Intel released the 16-bit 8086 processor, followed by the 8088, identical to the 8086 except for its external 8-bit data bus (all peripherals of the time were still 8-bit).

The Apple II computer differed in that it was not a completely finished device: the user had some freedom to extend it, installing additional interface boards, memory boards and so on. This feature, which later became known as "open architecture", became its main advantage. Two further innovations contributed to the success of the Apple II: an inexpensive floppy disk drive (1978) and the first commercial spreadsheet program, VisiCalc (1979).

The Altair-8800 computer, built around the Intel 8080 processor, was very popular in the 70s. Although its capabilities were quite limited - only 4 KB of RAM, no keyboard and no screen - its appearance was met with great enthusiasm. It reached the market in 1975, and several thousand kits were sold in the first months.


Representatives of the 4th generation of computers: a) Micral; b) Apple II

This computer, designed by MITS, was sold by mail order as a do-it-yourself kit. The entire kit cost $397, while the Intel processor alone sold for $360.

The spread of PCs by the end of the 70s led to some decline in demand for mainframes and minicomputers, and prompted IBM to develop its own PC based on the 8088 processor. The software of the early 80s was focused on word processing and simple spreadsheets, and the very idea that a "microcomputer" could become a familiar and necessary device at work and at home still seemed incredible.

On August 12, 1981, IBM introduced the Personal Computer (PC), which, combined with software from Microsoft, became the standard for the entire PC fleet of the modern world. An IBM PC with a monochrome display cost about $3,000; with a color display, $6,000. The configuration: an Intel 8088 processor at 4.77 MHz with 29 thousand transistors, 64 KB of RAM, one 160 KB floppy drive and a simple built-in speaker. Launching and working with applications was a real pain at the time: with no hard drive, floppy disks had to be swapped constantly; there was no mouse, no graphical windowed user interface, and no exact correspondence between the image on the screen and the final result (WYSIWYG). Color graphics were extremely primitive, and three-dimensional animation or photo processing was out of the question - but the history of personal computers began with this model.

In 1984, IBM introduced two more novelties. First came the PCjr, an 8088-based model for home users equipped with perhaps the first wireless keyboard; this model did not succeed in the market.

The second novelty was the IBM PC AT. Its most important feature was the transition to higher-level microprocessors (the 80286 with the 80287 math coprocessor) while maintaining compatibility with previous models. This computer set the trend for many years to come: it was the first to introduce a 16-bit expansion bus (which remained the standard for a long time) and EGA graphics adapters with a resolution of 640x350 in 16 colors.

1984 also saw the release of the first Macintosh computers, with a graphical interface, a mouse and many other user interface attributes that modern desktop computers cannot do without. The new interface did not leave users indifferent, but the revolutionary computer was compatible with neither previous programs nor existing hardware components. Meanwhile, in the corporations of the time, WordPerfect and Lotus 1-2-3 had already become standard working tools, and users had grown accustomed to the character-based DOS interface. From their point of view, the Macintosh even looked somewhat frivolous.

Fifth generation of computers (from 1985 to the present)

Distinctive features of the 5th generation:

  1. New production technologies.
  2. Rejection of traditional programming languages such as Cobol and Fortran in favor of languages with enhanced symbol manipulation and elements of logic programming (Prolog and Lisp).
  3. Emphasis on new architectures (for example, dataflow architecture).
  4. New user-friendly input/output methods (e.g. speech and image recognition, speech synthesis, natural language processing).
  5. Artificial intelligence (that is, automation of problem solving, inference and knowledge manipulation).

It was at the turn of the 80s and 90s that the Windows-Intel alliance formed. When Intel released the 486 microprocessor in early 1989, computer manufacturers did not wait for an example from IBM or Compaq. A race began that dozens of firms entered, yet all the new computers were extremely similar to each other, united by their compatibility with Windows and their Intel processors.

The i486 featured a built-in math coprocessor, an instruction pipeline and an on-chip first-level cache.

Directions for the development of computers

Neurocomputers can be attributed to the sixth generation of computers. Although the practical use of neural networks began relatively recently, neurocomputing as a scientific field is in its seventh decade: the first neurocomputer was built in 1958 by Frank Rosenblatt, who named his brainchild the Mark I Perceptron.

The theory of neural networks originates in the 1943 work of McCulloch and Pitts, which showed that any arithmetic or logical function can be implemented by a simple neural network. Interest in neurocomputing flared up again in the early 80s, fueled by new work on multilayer perceptrons and parallel computing.
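The McCulloch-Pitts result can be illustrated with a minimal sketch: a single threshold neuron with suitably chosen weights computes elementary logic functions, from which any logical function can be composed (the weights and function names below are illustrative choices, not taken from the 1943 paper):

```python
# A McCulloch-Pitts threshold neuron: it "fires" (outputs 1)
# when the weighted sum of its binary inputs reaches a threshold.
def mp_neuron(inputs, weights, threshold):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Elementary logic functions realized as single neurons:
logic_and = lambda x, y: mp_neuron((x, y), (1, 1), 2)
logic_or  = lambda x, y: mp_neuron((x, y), (1, 1), 1)
logic_not = lambda x:    mp_neuron((x,),   (-1,),  0)

print(logic_and(1, 1), logic_or(0, 1), logic_not(1))  # 1 1 0
```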

Neurocomputers are computers consisting of many simple computing elements, called neurons, working in parallel. The neurons form so-called neural networks, and the high speed of neurocomputers is achieved precisely through the huge number of neurons. Neurocomputers are built on a biological principle: the human nervous system consists of individual cells - neurons - whose number in the brain reaches 10¹², even though the response time of a single neuron is about 3 ms. Each neuron performs fairly simple functions, but since each is connected on average to 1-10 thousand others, this collective successfully supports the functioning of the human brain.

Representative of the sixth generation of computers - Mark I

In optoelectronic computers, the information carrier is the luminous flux. Electrical signals are converted to optical and vice versa. Optical radiation as an information carrier has a number of potential advantages over electrical signals:

  • Light beams, unlike electrical signals, can cross each other without interference;
  • Light beams can be localized transversely down to nanometre dimensions and transmitted through free space;
  • The interaction of light beams with nonlinear media is distributed throughout the medium, giving new degrees of freedom in organizing communication and building parallel architectures.

Developments are currently under way to create computers consisting entirely of optical information processing devices; today this is the most interesting direction.

An optical computer has unprecedented performance and a completely different architecture from an electronic one: in a single clock cycle of less than 1 nanosecond (corresponding to a clock frequency above 1 GHz), it can process a data array of about 1 megabyte or more. To date, individual components of optical computers have been created and optimized.

An optical computer the size of a laptop can give the user the ability to place in it almost all the information about the world, while the computer can solve problems of any complexity.

Biological computers are ordinary computers in purpose, but based on DNA computation. There are so few genuinely demonstrative works in this area that it is too early to speak of significant results.

Molecular computers are computers whose operating principle is based on changes in the properties of molecules during photosynthesis. During photosynthesis a molecule passes through different states, so researchers need only assign a logical value - "0" or "1" - to each state. Using certain molecules, scientists have determined that their photocycle consists of just two states, which can be "switched" by changing the acid-base balance of the environment - something that is very easy to do with an electrical signal. Modern technologies already make it possible to create entire chains of molecules organized in this way. It is thus quite possible that molecular computers await us "just around the corner".

The history of the development of computers is not over yet, in addition to improving the old ones, there is also the development of completely new technologies. An example of this is quantum computers - devices that operate on the basis of quantum mechanics. A full-scale quantum computer is a hypothetical device, the possibility of building which is associated with the serious development of quantum theory in the field of many particles and complex experiments; this work lies at the forefront of modern physics. Experimental quantum computers already exist; elements of quantum computers can be used to increase the efficiency of calculations on an existing instrument base.

Now the use of personal computers from Apple, Samsung, HP, Dell and other manufacturers seems to us something completely natural. However, less than a century ago, the average person had no idea about computer technology, and any development that is used today on every device has become a real breakthrough in the industry.

In this article, we will talk about what the very first computers in the world were like, who developed them and why, what their capabilities were, and how much they contributed to the development of technology.

Building the very first computers

The very first computers in the world occupied tens of square meters, and their weight was measured in tons. Nevertheless, it was they who allowed humanity to come to those compact and convenient devices that we use now. Unfortunately, there is no exact answer to the question of which computer was really the very first computer. However, there are several variants of this answer, which we will consider below.

Computer "Mark 1"

The Mark 1, also known as the ASCC (Automatic Sequence Controlled Calculator), was designed and built between 1939 and 1944. The US Navy was the customer, and IBM the general contractor. Five engineers were directly involved in developing the device, led by Howard Aiken. As the basis for the project, the developers took the Analytical Engine designed by the famous British inventor Charles Babbage.

At its core, the Mark 1 was an advanced adding machine that could be programmed and did not require human intervention during calculation. The developers did not take advantage of the binary number system used by most modern computers, and instead made the machine operate on decimal numbers.

Information was entered into the device using punched tape. The Mark 1 could not perform conditional jumps, so the code of every program was long and cumbersome. Loops could not be expressed in software either: to loop a section of code, the punched tape literally had to be closed into a ring by joining its beginning and end.

Physically, ASCC looked like this:

  • length about 17 m;
  • height over 2.5 m;
  • weight about 4.5 tons;
  • 765,000 parts;
  • 800 km of connecting wires;
  • 15-meter shaft that provides synchronization of the main computing elements;
  • 4 kW electric motor.

At the insistence of IBM head Thomas Watson, the computer was enclosed in a case of stainless steel and glass, while Howard Aiken had insisted on a transparent case that would leave the "innards" of the computer visible.

The Mark 1 could work with numbers up to 23 digits long. Addition and subtraction took only 0.3 seconds, multiplication 6 seconds, division 15.3 seconds, and trigonometric functions and logarithms more than a minute. At the time this was astonishing speed, allowing calculations that would previously have taken six months to be done in a single day. At the final stage of the Second World War, the device was used quite successfully by the US Navy, after which it worked for about 15 years at Harvard University.

The debate about who created the very first computer in the world, and when, has not subsided to this day. Unsurprisingly, in the USA the Mark 1 is considered the first "ancestor" of modern PCs. In reality, however, it began working about two years after the German engineer Konrad Zuse unveiled his Z3 computer to the public in that same year, 1941. Moreover, Zuse used more advanced technologies (at least the binary number system), while the Mark 1, according to some researchers, was obsolete even before it was built.

Or was it Konrad Zuse's Z3?

Konrad Zuse is one of the most important figures in the history of computer engineering, even though he worked for the benefit of the Third Reich. However, Zuse cited the Anglo-American bombing of Dresden and other German cities, where a predominantly civilian population remained, as the main motivation for his work. Zuse began working on his computers back in the 1930s, while studying at the Berlin Polytechnic Institute.

His work was based on several revolutionary ideas at the time:

  • The memory must be divided: one part of it must be allocated for control data, the other for calculated data.
  • Numbers must be represented in the binary system.
  • The machine must be able to work with floating-point numbers (whereas the Mark 1 worked only with fixed-point numbers). Notably, the scheme Zuse used to implement this idea, which he called "semilogarithmic notation", is similar to the floating-point representation used in modern computers.
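For readers curious what "semilogarithmic notation" looks like in practice, here is a minimal Python sketch that decomposes a number in the modern IEEE-754 double format. It illustrates only the general principle (a sign, an exponent, and a mantissa stored in binary); the Z3's actual 22-bit word layout differed in its field widths.

```python
import struct

def decompose(x: float) -> tuple[int, int, int]:
    """Split an IEEE-754 double into its sign, biased exponent,
    and mantissa fields -- the same three ingredients as Zuse's
    'semilogarithmic' floating-point idea."""
    bits = struct.unpack(">Q", struct.pack(">d", x))[0]  # raw 64 bits
    sign = bits >> 63
    exponent = (bits >> 52) & 0x7FF       # 11-bit biased exponent
    mantissa = bits & ((1 << 52) - 1)     # 52-bit fraction
    return sign, exponent, mantissa

# -6.25 = -1.5625 * 2**2, so the unbiased exponent is 2.
sign, exp, frac = decompose(-6.25)
print(sign, exp - 1023, 1 + frac / 2**52)  # → 1 2 1.5625
```

The bias (1023 here) lets the exponent field store negative powers of two as unsigned integers, a trick already present in Zuse's design in spirit if not in detail.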

Data was entered into the Z3 on punched tape. The instructions the machine could execute fell into three groups: arithmetic, memory, and input/output. Instructions could appear anywhere on the tape, and two special commands, Ld and Lu, displayed a result and read a number from the keyboard, respectively.

Both of these instructions halted the machine so that the operator could write down the result or enter the required number. The computer had no conditional branching, and loops, as with the Mark 1, had to be implemented by gluing the ends of the punched tape together into a loop.

The main characteristics of the machine are as follows:

  • addition took 0.7 seconds;
  • multiplication and division took 3 seconds each;
  • the machine was built from 2,600 telephone relays;
  • its clock frequency was approximately 5.33 Hz;
  • it consumed 4 kW of power;
  • it was about half the size of the "Mark 1";
  • it weighed 1 ton.
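Taken at face value, these figures allow a rough throughput comparison with the Mark 1 (0.3 s per addition, 6 s per multiplication, as quoted earlier). The snippet below is illustrative arithmetic only; real workloads mixed operation types.

```python
# Seconds per operation, as quoted in the text above.
z3 = {"add": 0.7, "mul": 3.0}
mark1 = {"add": 0.3, "mul": 6.0}

# Convert to operations per hour for each machine.
per_hour = {
    name: {op: round(3600 / t) for op, t in times.items()}
    for name, times in [("Z3", z3), ("Mark 1", mark1)]
}
print(per_hour["Mark 1"]["add"])  # → 12000 (the Mark 1 adds faster)
print(per_hour["Z3"]["mul"])      # → 1200 (but the Z3 multiplies twice as fast)
```

So neither machine dominates outright: the relay-based Z3 traded slower addition for markedly faster multiplication and division.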

The machine survived until 1944, performing complex calculations for German aviation. In 1944 the computer burned down, along with its project documentation, in one of the regular air raids. Konrad Zuse soon created the Z4, however, and in 1960 the Z3 was reconstructed by his company Zuse KG. But that is another story entirely.

Unbiased critics agree that the status of the world's first freely programmable, working computer rightfully belongs to the Z3, and that attempts to refute this are pseudo-patriotic speculation by representatives of individual countries. These discussions are unlikely ever to end, but one thing can be said with certainty: while the Mark 1 was outdated even before its release, the Z3 embodied many of the technologies and principles that would be applied in the computers of the future.

The first electronic computer in the USSR and continental Europe

The first computer in the USSR and in continental Europe is considered to be the MESM, which stands for "Small Electronic Computing Machine". It was created in Ukraine, in the computer technology laboratory of the Kyiv Institute of Electrical Engineering, under the leadership of academician Sergei Lebedev.

Sergei Alekseevich, like Zuse, began thinking about building a computer back in the 1930s. He could devote himself to the work in earnest only after the war, and even then not under the best conditions: the Institute of Electrical Engineering was given the premises of a dilapidated monastery hotel in Feofaniya, about 10 km from Kyiv.

Nevertheless, the engineers managed to more or less repair the building and, in just three years, to build and commission the MESM. Only 12 engineers worked on the project, assisted as needed by 15 fitters and technicians. The machine had the following characteristics:

  • it occupied a room of about 60 square meters;
  • it could perform 3,000 operations per minute, an incredible figure at the time;
  • it ran on 6,000 vacuum tubes, which consumed 25 kW;
  • it could perform addition, subtraction, multiplication, division, shift, signed comparison, comparison by absolute value, transfer of numbers from the magnetic drum, transfer of control, and addition of commands.

As you might guess, 6,000 tubes made the climate in the room almost tropical. Nevertheless, the MESM was used successfully in a large number of scientific studies until 1957: space flight, thermonuclear processes, mechanics, long-distance power transmission lines, and so on.

Other early systems

"Mark 1" and Z3 are not the only contenders for the title of the world's first computer. Since in the middle of the twentieth century computer technology began to develop exponentially, and machines acquired more and more features of modern computers, many researchers give first place in this kind of "rating" to the systems discussed below.

The ENIAC computer

Development of the electronic digital computer ENIAC began in 1943 and was completed in 1945. It was built by John Eckert and John Mauchly, scientists at the University of Pennsylvania. ENIAC was commissioned by the US Army, which needed a device for the accurate calculation of firing tables. But since the computer was finished only toward the end of the war, its purpose had to change: from 1947 to 1955 it was used by the US Army's Ballistics Research Laboratory for various calculations in the development of thermonuclear weapons. Notably, the computer's first programmers were six women.

The first commercial UNIVAC computers

The first computer of the UNIVAC series (UNIVersal Automatic Computer I) is conventionally considered the first commercial computer in the United States, and the third in the world. It was developed between 1947 and 1951 by the same John Eckert and John Mauchly, commissioned by the US Air Force and the US Army in cooperation with the Census Bureau. The Bureau received the first officially sold computer of the series; several dozen more went to private corporations, government agencies, and three American universities. UNIVAC I used binary-coded decimal (BCD) arithmetic, ran on 5,200 vacuum tubes consuming 125 kW of electricity, and weighed 13 tons. It could carry out 1,905 operations per second and required a room of 35.5 square meters.

Apple's first computer

The first computer from the famous "apple" brand was called the Apple I and was released in 1976. Its key innovation was that information typed on the keyboard was instantly shown on a display. The presentation of the device showcased the oratorical and entrepreneurial talent of Steve Jobs, while his shy friend Steve Wozniak had done the actual development of the Apple I. The computer was assembled entirely on a single circuit board of about thirty chips, which is why it is sometimes called the world's first full-fledged PC.

The price of the very first computer

Developing the first computers cost far more than today's mid-range machines. About $500,000 was invested in the creation of the Mark 1. The Z3 cost the Third Reich 50,000 Reichsmarks, roughly $20,000 at the exchange rate of the time. For ENIAC, the developers requested $61,700. And to fulfill the first order for the Apple I, placed by Paul Terrell, Jobs and Wozniak needed $15,000; the first "apple" computers sold for $666.66 apiece.

All the information provided above was taken from open sources, mainly from the free encyclopedia Wikipedia.

The computer is one of the greatest inventions of its time. Billions of people around the world use computers in their daily lives.

Over the decades, the computer has evolved from a very expensive and slow device to today's highly intelligent machines with incredible processing power.

No single person is credited with inventing the computer; many believe that Konrad Zuse and his Z1 machine were the first in the long line of innovations that gave us the computer. Zuse was a German who gained fame for creating the first freely programmable mechanical computing device in 1936. The Z1 was built around three basic elements still used in today's calculators. Later, Zuse created the Z2 and Z3.

The first computers in the Mark series were built at Harvard. The Mark I, completed in 1944, was the size of a room: 55 feet long and 8 feet high. It could perform a wide range of calculations, became a successful invention, and was used by the US Navy until 1959.

The ENIAC computer was one of the most important advances in computing. It was commissioned during World War II by the US military. Instead of the electric motors and levers of earlier machines it used vacuum tubes, making it thousands of times faster than any other computing device of the time. It was huge, cost a total of $500,000, and remained in service until 1955.

Random-access memory, or RAM, also appeared in this era: the Williams tube of the late 1940s stored data as spots of electric charge on the face of a cathode-ray tube, read off by a metal pickup plate placed against it. It was a simple way to store computer instructions.

The 1940s saw many innovations. At Manchester, engineers who had come from the Telecommunications Research Establishment built the first computer to run a stored program; it became operational in 1948. Work on its successor, the Manchester Mark I, continued into 1951 and showed tremendous progress.

UNIVAC was built by the creators of ENIAC. It was the fastest and most innovative computer of its day, capable of handling many calculations. A masterpiece of its time, it was highly acclaimed by the public.

IBM built some of the first computers to be widely used and available. The IBM 701 was the company's first general-purpose computer, and a new programming language called Fortran was used on the later 704 model. The IBM 7090 was also a great success and dominated office computing for the next 20 years. In the late 1970s and 1980s, IBM developed the personal computer known as the PC. IBM has had a huge influence on the computers used today.

With the growth of the personal computer market in the early and mid-1980s, many companies realized that a graphical interface was more user-friendly. This led Microsoft to develop the Windows operating system: the first version was called Windows 1.0, followed later by Windows 2.0 and 3.0. Microsoft's popularity has only grown since.

Today computers are extremely powerful and more affordable than ever. They have penetrated almost every aspect of our lives. They are used as powerful tools for communication and commerce. The future of computing is vast.

Our computerized age obliges every educated person to know at least something about computer technology, and a good place to start is with the question of when the first computer appeared. Admit it: few people ever think about the computer's origins.

The history of the creation of the first computer

The computer as we know it was not always like this, but its "life path" is easy to trace. It began in 1642, when Blaise Pascal invented a device capable of adding and subtracting decimal numbers: the Pascaline. It is hard to call this world-first a computer; it was more like a calculator, performing only two operations, addition and subtraction. Then, in 1673, multiplication and division were added to these functions. After that, development of functionality stalled for a while, and attention turned to appearance: designers tried to fit the same capabilities into a more compact, less cumbersome shell. It was not until 1822 that a machine was invented that could solve simple equations, a turning point in the history of computer technology. From then on, automated computing devices evolved at tremendous speed, and in 1946 a new kind of computer appeared to the world. Of course, comparing the first computer to a modern PC, one can only smile skeptically: it is hard to see how a machine weighing 30 tons could turn into something a person can easily lift.

Who is the developer of the first computer

Whoever invented the first computer in the world clearly did a great deal for the development of technology. So who introduced the first computer to the world? It was the German scientist Konrad Zuse. According to Wikipedia, the very first computer in the world is precisely the machine he invented in the 1940s, since it contained all the basic functions of a modern computer. But such machines were still incredibly large, sometimes occupying an entire room. Only with the invention of the microprocessor, thanks to Ted Hoff, did the computer begin to approach the size of the PC familiar to us.

Incidentally, the PC gained wide popularity after competition began between two companies, Apple and Microsoft. In the battle for customers they continually improved their products, delighting us with compactness and functionality. Relatively little time has passed since the first PC was invented, yet the differences between that device and today's are enormous. We can only guess what improvements await the computer in the future.
