EVOLUTION OF ELECTRONIC COMPUTER SYSTEMS

Mechanical computers had two serious drawbacks: computer speed was limited by the inertia of the moving parts, and the transmission of information by mechanical means (gears, levers, etc.) was cumbersome and unreliable. In electronic computers, by contrast, the "moving parts" are electrons, and information can be transmitted by electric currents at speeds approaching the speed of light (300,000 km/s). The triode vacuum tube, invented in 1906 by Lee de Forest, enabled the switching of electrical signals at speeds far exceeding those of any mechanical device. The use of vacuum tube technology marked the beginning of the electronic era in computer design.

In the five decades since 1940, the computer industry has experienced four generations of development. Each computer generation is marked by a rapid change in the implementation of its building blocks: from relays and vacuum tubes (1940s-1950s) to discrete diodes and transistors (1950s-1960s), through small-scale and medium-scale integrated (SSI/MSI) circuits (1960s-1970s) to large-scale and very-large-scale integrated (LSI/VLSI) devices (1970s-1980s). Increases in device speed and reliability and reductions in hardware cost and physical size have greatly enhanced computer performance. However, better devices are not the sole factor contributing to high performance. The division of computer system generations is determined by the device technology, system architecture, processing mode, and languages used. We are currently (1989) in the fourth generation; the fifth generation has not materialized yet, but researchers are working on it.

The First Generation

First-generation machines of the early 1950s were mostly programmed in machine language. Machine language consists of strings of zeros and ones that act as instructions to the computer, specifying the desired electrical states of its internal circuits and memory banks. Obviously, writing a machine language program was extremely cumbersome, tedious, and time-consuming. To make programming easier, symbolic languages were developed. Such languages enable instructions to be written with symbolic codes (called mnemonics, or memory aids) rather than strings of ones and zeros. The symbolic instructions are then translated into corresponding binary codes (machine language instructions). The first set of programs, or instructions, telling the computer how to do this translation was developed in 1952 by Dr. Grace M. Hopper at Remington Rand. (She was also instrumental in making COBOL the "official" language of U.S. government computing.) After this breakthrough, most first-generation computers were programmed in symbolic language.
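The translation from mnemonics to binary machine code can be sketched as follows. This is a minimal illustration only: the mnemonics, opcode values, and 8-bit instruction format here are invented for the example, not those of any historical machine.

```python
# Toy assembler: translate symbolic instructions into binary machine code.
# Instruction format (invented): 4-bit opcode followed by a 4-bit operand.

OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b0000}

def assemble(line: str) -> str:
    """Translate one symbolic instruction, e.g. 'LOAD 5', into an 8-bit word."""
    parts = line.split()
    opcode = OPCODES[parts[0]]
    operand = int(parts[1]) if len(parts) > 1 else 0
    return format((opcode << 4) | operand, "08b")

# A four-line symbolic program and its machine-language translation:
program = ["LOAD 5", "ADD 3", "STORE 9", "HALT"]
machine_code = [assemble(line) for line in program]
```

The programmer writes the readable left-hand column; the translator produces the strings of ones and zeros the hardware actually executes.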

First-generation computers had a central processing unit (CPU) that contained a set of high-speed registers used for temporary storage of data, instructions, and memory addresses. The control unit decoded the instructions, routed information through the system, and provided timing signals. Instructions were fetched and executed in two separate consecutive steps called the fetch cycle and execution cycle, respectively. Together they formed the instruction cycle.
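The two-step instruction cycle described above can be sketched as a loop: a fetch cycle that reads the next word from memory, followed by an execution cycle that decodes and carries it out. The instruction encoding below is invented for illustration (4-bit opcode, 4-bit operand), not a real first-generation format.

```python
# Minimal fetch-execute loop: one accumulator, one program counter.

def run(memory):
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        # --- fetch cycle: read the instruction the program counter points to ---
        instruction = memory[pc]
        pc += 1
        # --- execution cycle: decode the opcode and carry it out ---
        op, operand = instruction >> 4, instruction & 0b1111
        if op == 0b0001:      # LOAD immediate value into accumulator
            acc = operand
        elif op == 0b0010:    # ADD immediate value to accumulator
            acc += operand
        elif op == 0b0000:    # HALT: stop and return the result
            return acc

# LOAD 5; ADD 3; HALT
result = run([0b00010101, 0b00100011, 0b00000000])
```

Each pass through the loop is one complete instruction cycle: fetch, then execute.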

First-generation computers had numerous design shortcomings, including these:

• Inefficient control of I/O operations resulted in poor overall system performance.

• Address modification schemes were inefficient.

• Because the instruction set was oriented toward numeric computations, programming of nonnumeric and logical problems was difficult.

• Facilities for linking programs, such as instructions for calling subroutines that automatically save the return address of the calling program, were not provided.

• Floating-point arithmetic was not implemented, mainly due to the cost of the hardware needed.
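The subroutine-linkage facility the list says was missing, a call instruction that automatically saves the return address of the calling program, can be sketched as follows. The stack-based mechanism shown is the solution adopted by later machines, illustrated here in minimal form; the addresses are invented.

```python
# Sketch of automatic return-address saving for subroutine calls.

def call(pc, target, stack):
    """Jump to `target`, automatically saving the caller's return address."""
    stack.append(pc + 1)   # address of the instruction after the call
    return target          # new program counter: start of the subroutine

def ret(stack):
    """Return control to the saved address in the calling program."""
    return stack.pop()

stack = []
pc = call(pc=10, target=40, stack=stack)   # caller at address 10 calls subroutine at 40
pc = ret(stack)                            # subroutine finishes; control returns to 11
```

On first-generation machines the programmer had to arrange this bookkeeping by hand for every call.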

Characteristics of first-generation computers

• Use of vacuum tubes in electronic circuits and mercury delay lines for memory

• Magnetic drum as primary internal storage medium

• Limited main storage capacity (1000-4000 bytes)

• Low-level symbolic language programming

• Heat and maintenance problems

• Applications: scientific computations, payroll processing, record keeping

• Cycle time: milliseconds

• Cost: $5 per floating-point operation

• Processing speed: 2000 instructions per second


List of words and expressions to remember