

fixed locations where they can be interconnected during the final metallization stages of semiconductor manufacture.

There is a special branch of the tree shown in Fig. 1 purely for memory functions. Memory is used in the processor as conventional memory, but it can also be used as an alternative to conventional logic for performing combinational logic functions. For example, the inputs to a combinational function can be used as an address, and the output can be obtained by reading the contents of that address. Memory can also be used to implement sequential logic functions. For example, it can be used to hold state information for a microprogram. (See Sec. 1 of Part 2.)
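To make the table-lookup idea concrete, here is a minimal sketch in C (used purely for exposition; no particular memory part or tool is implied). The truth table of a one-bit full adder is stored as an eight-word read-only memory image; the three inputs form the address, and reading the addressed word evaluates the function:

    #include <stdio.h>

    /* Combinational logic by table lookup: a one-bit full adder
     * stored as an 8-word ROM image.  The inputs (a, b, carry-in)
     * form a 3-bit address; the 2-bit word at that address holds
     * (carry-out, sum). */
    static const unsigned char full_adder_rom[8] = {
        0x0,   /* a=0 b=0 cin=0 -> sum=0 cout=0 */
        0x1,   /* a=1 b=0 cin=0 -> sum=1 cout=0 */
        0x1,   /* a=0 b=1 cin=0 -> sum=1 cout=0 */
        0x2,   /* a=1 b=1 cin=0 -> sum=0 cout=1 */
        0x1,   /* a=0 b=0 cin=1 -> sum=1 cout=0 */
        0x2,   /* a=1 b=0 cin=1 -> sum=0 cout=1 */
        0x2,   /* a=0 b=1 cin=1 -> sum=0 cout=1 */
        0x3    /* a=1 b=1 cin=1 -> sum=1 cout=1 */
    };

    int main(void)
    {
        unsigned a = 1, b = 1, cin = 0;
        unsigned addr = (cin << 2) | (b << 1) | a;  /* inputs become the address  */
        unsigned word = full_adder_rom[addr];       /* one read = one evaluation  */
        printf("sum=%u carry=%u\n", word & 1, word >> 1);
        return 0;
    }

The sequential case follows the same pattern: feed some of the memory outputs back as part of the next address, and the stored words then encode a state table, much as a microprogram holds its state information in memory.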

There is a special branch for bit-slice components that can be combined to form data paths of arbitrary width. These are being used to construct most of today's high-speed digital systems, mid-range computers, and computer peripherals. Although there have been several bit-slice families, the AMD 2900 series has become the most widely used (see Chaps. 13, 14, and 15).
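The way slices cascade can be sketched in software terms (a deliberately simplified model; a real part such as the Am2901 also provides on-chip registers and a function-select field, none of which is shown here). Each 4-bit slice adds two operand nibbles plus a carry-in, and its carry-out ripples into the next slice, so four slices side by side form a 16-bit data path:

    #include <stdio.h>

    /* A much-simplified 4-bit ALU "slice": add two 4-bit operands
     * plus a carry-in, returning a 4-bit result and a carry-out
     * that ripples to the next slice up. */
    static unsigned slice_add(unsigned a, unsigned b, unsigned cin,
                              unsigned *cout)
    {
        unsigned r = (a & 0xF) + (b & 0xF) + (cin & 1);
        *cout = r >> 4;      /* carry into the next slice */
        return r & 0xF;      /* this slice's 4-bit result */
    }

    int main(void)
    {
        unsigned a = 0xBEEF, b = 0x0123, carry = 0, result = 0;

        /* Four slices side by side form a 16-bit data path. */
        for (int i = 0; i < 16; i += 4) {
            unsigned nibble = slice_add(a >> i, b >> i, carry, &carry);
            result |= nibble << i;
        }
        printf("0x%04X + 0x%04X = 0x%04X carry=%u\n", a, b, result, carry);
        return 0;
    }

Widening the data path to 24 or 32 bits is then just a matter of instantiating more slices, which is the central attraction of the bit-slice approach.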

The final branch of the tree in Fig. 1 is the most complex; it marks the fourth (microprocessor-on-a-chip) generation of technology and the beginning of the fifth (computer-on-a-chip) generation. The fourth generation is marked by the packaging of a complete processor on a single silicon die. By this standard, the fifth generation has already begun, since a complete computer (processor with memory), called a monolithic microcomputer in our computer classification of Chap. 1, now occupies a single die. The evolution in complexity within each generation simply permits processors or computers with larger word lengths to be placed on one chip. At the beginning of the fourth generation, a 4-bit processor was the benchmark; toward the end of the fourth generation, a complete 16-bit processor could be placed on a single chip.

Figure 2 plots the increase in IC complexity as a function of time, a graph known as the Moore plot. In 1964, Gordon E. Moore, then director of research at Fairchild Semiconductor, predicted that the component count per IC chip would double every year. Indeed, since the introduction of the planar transistor (1959), with a component density of 1, this doubling has occurred essentially every year up to the present. According to the Moore plot, integrated-circuit chips composed of 1 million components are predicted for the early 1980s. As pointed out by Moore, three factors contribute about equally to the doubling of component count per year: (1) an increase in chip area, (2) a decrease in the minimum physical dimensions of components, and (3) the invention of new structures and/or circuit cleverness.

The result given in Fig. 2 is exponential and indicates that the number of bits per chip for metal-oxide semiconductor (MOS) read-write memory doubles every year according to the relationship:

Number of bits per chip = 2^(t-1962), where t is the year.

There are separate curves, each following this relationship, for bipolar read-write memories, bipolar read-only memories, and MOS read-only memories. Thus products lead or lag the above state-of-the-art time line by one to three years, according to the following rules:

· Bipolar read-write memories lag by two to three years.

· Bipolar read-only memories lag by about one year.

· MOS read-only memories lead by one year.
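As a rough numerical illustration, the sketch below evaluates the relationship and applies the lead/lag rules as simple shifts along the time axis. The sample year and the treatment of the offsets as pure time shifts are assumptions made for illustration, not something the figure states:

    #include <stdio.h>

    /* Bits per chip = 2^(t - 1962) for the state-of-the-art MOS
     * read-write curve; other product classes are modeled here as
     * the same curve shifted by their lead/lag in years (a lead is
     * a negative lag).  A lag of 3 is used for bipolar read-write,
     * which the text gives as "two to three years". */
    static double bits_per_chip(int year, int lag_years)
    {
        double bits = 1.0;
        for (int t = 1962 + lag_years; t < year; t++)
            bits *= 2.0;     /* one doubling per year */
        return bits;
    }

    int main(void)
    {
        int year = 1976;     /* illustrative sample year */
        printf("Predicted bits per chip in %d:\n", year);
        printf("  MOS read-write:     %8.0f\n", bits_per_chip(year,  0));
        printf("  MOS read-only:      %8.0f\n", bits_per_chip(year, -1));
        printf("  bipolar read-only:  %8.0f\n", bits_per_chip(year,  1));
        printf("  bipolar read-write: %8.0f\n", bits_per_chip(year,  3));
        return 0;
    }

For 1976 the state-of-the-art MOS read-write figure comes out at 16,384 bits, which agrees with the 16K-bit RAM chips of that period.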

Random logic, as represented by the 8-bit microprocessor and SDLC chip in Fig. 2, actually lies on a different exponential curve. Chapter 36 discusses the trends in microprocessor densities.

After density, the most important characteristic of integrated circuits is price. The price of integrated circuits is probably the hardest of all the parameters to identify and predict because it is set by a complex marketplace.

The price history of integrated circuits is reflected very dramatically in the price history of a special class of integrated circuits, semiconductor memory. The semiconductor memory price curves, given in Fig. 3, are also interesting because of the important role of memory in past and future computer structures.¹

¹Discussion of Fig. 3 adapted from Bell, Mudge, and McNamara [1978].
 
 
