Chapter 6 | Structure
The random-access memory seems nearly perfect for the Mp's of present computers. Of course, enthusiasm for this memory may be based on not knowing how computers would have developed if we had not had it. However, with little or no effort an M.random can be organized as a stack, a queue, a linear list, a cyclic list, and even (within limits) a content-addressable or associative memory. It is an organization very hard to beat.
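The point above can be made concrete with a minimal sketch (the class names here are illustrative, not from any particular machine): a single fixed-size random-access memory serves as either a stack or a queue purely through the discipline imposed on its addresses.

```python
class RandomAccessMemory:
    """A fixed-size array of cells, addressed by integer index."""
    def __init__(self, size):
        self.cells = [0] * size

    def read(self, addr):
        return self.cells[addr]

    def write(self, addr, value):
        self.cells[addr] = value


class Stack:
    """LIFO discipline: a single pointer moving up and down."""
    def __init__(self, m):
        self.m, self.top = m, 0

    def push(self, v):
        self.m.write(self.top, v)
        self.top += 1

    def pop(self):
        self.top -= 1
        return self.m.read(self.top)


class Queue:
    """FIFO discipline: head and tail pointers wrapping cyclically,
    which also illustrates the cyclic organization."""
    def __init__(self, m):
        self.m, self.head, self.tail = m, 0, 0

    def enqueue(self, v):
        self.m.write(self.tail, v)
        self.tail = (self.tail + 1) % len(self.m.cells)

    def dequeue(self):
        v = self.m.read(self.head)
        self.head = (self.head + 1) % len(self.m.cells)
        return v
```

No change to the memory itself is needed to switch disciplines; only the addressing pattern differs.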
Content-Addressable and Associative Memories
It is possible to conceive of many exotic accessing capabilities, and numerous proposals have been made involving either theoretical structures or experimental prototypes. Since no particular varieties have become widespread, terminology is still variable. Content-addressable memories are usually taken to mean a collection of cells of predetermined size (i.e., a fixed i-unit) such that if one presents as the address the contents of a predetermined part of the cell (the tag or content address), then the contents of the entire cell will be retrieved. An associative memory is usually taken to mean a system that, when presented with an item of information, delivers one or more "associated" items of information. The principle of association is variable, yielding different kinds of associative memories. Content-addressable memories provide a form of association, as do all memories, in fact. Thus the term associative memory tends to denote forms of association different from familiar ones, forms that presumably have less sharp constraints imposed by the structure of memory (as opposed to the structure of the information in the memory).
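The defining behavior of a content-addressable memory can be sketched as follows (a functional model only; real hardware compares the tag against every cell in parallel rather than by the linear search shown here, and the class name is illustrative):

```python
class ContentAddressableMemory:
    """Cells of predetermined size: here, (tag, data) pairs.
    Presenting the contents of the tag field -- the 'content
    address' -- retrieves every entire cell whose tag matches."""
    def __init__(self):
        self.cells = []

    def store(self, tag, data):
        self.cells.append((tag, data))

    def retrieve(self, tag):
        # Hardware would perform this comparison on all cells at once;
        # the sequential loop is only a behavioral stand-in.
        return [cell for cell in self.cells if cell[0] == tag]
```

Note that the "address" presented is itself data, and that several cells may match, which is why such memories naturally return sets of associated items.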
STARAN implements an associative memory from random-access memories under the control of special bit-serial processors. Variations of associative memories have been used to increase performance in the form of caches and instruction buffers (see Secs. 2 and 3 in Part 2). In the latter two cases there is a large but slower Mp.random behind the content-addressable memory. The purpose of the fast, small content-addressable memory is to hold local, current data so that an access will not have to be made to the random-access memory.
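The cache arrangement described above can be sketched in miniature (a simplified model with an arbitrary eviction rule; the names and sizes are hypothetical, not those of any real machine): a small store addressed associatively by memory address sits in front of a large Mp.random, and on a hit the slow memory is never touched.

```python
class CachedMemory:
    """A small content-addressed store (the cache) in front of a
    large, slow random-access Mp. Hits are satisfied locally."""
    def __init__(self, mp_size, cache_size):
        self.mp = [0] * mp_size      # large, slow Mp.random
        self.cache = {}              # small, fast; keyed by address tag
        self.cache_size = cache_size
        self.accesses_to_mp = 0      # counts trips to the slow memory

    def read(self, addr):
        if addr in self.cache:       # hit: associative match on the tag
            return self.cache[addr]
        self.accesses_to_mp += 1     # miss: must go to Mp.random
        value = self.mp[addr]
        if len(self.cache) >= self.cache_size:
            # Evict the oldest entry (a naive FIFO policy, chosen
            # only for brevity; real caches use other policies).
            self.cache.pop(next(iter(self.cache)))
        self.cache[addr] = value
        return value
```

Repeated references to the same local, current data are then absorbed by the small memory, which is the entire point of the arrangement.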
There are immediate uses for content-addressable memories with a large information-content address. For example, the read-only memories for microprogram processors use long words principally because content-addressable memories are not available. Ideally a microprogrammed processor would like to look at a fairly large processor state to determine what action is to be taken in the microprogram.
It is interesting to speculate about the evolution of computers if a content-addressable memory had been developed in place of the random-access memory.
Multiprocess Environment and Storage Hierarchies
The multiprocess environment region of computer space has become so important that even single-chip microprocessors (e.g., Zilog Z8000, Intel 8086) have added memory management units. Memory management allows multiple processes to be resident in Mp, all in various stages of execution, and for these processes to intercommunicate. A closely related topic is storage hierarchy, whereby several different types of memory technology (from small, fast, and expensive to large, slow, and low-cost) are integrated into the system to appear as one large, fast, and economical memory. The purpose of multiprocess environment and storage hierarchies is to improve not only individual program performance but also system throughput. Section 2 in Part 2 is devoted to these important, interlaced dimensions.
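What a memory management unit contributes can be suggested with a minimal base-and-bounds sketch (one of the simplest relocation schemes; the actual Z8000 and 8086 units use segmentation, and the class below is an illustrative assumption, not a model of either chip): each process sees addresses starting at zero, while the unit maps them into its own region of the shared Mp and traps references outside it.

```python
class MemoryManagementUnit:
    """Base-and-bounds relocation: one (base, limit) pair per
    process lets several processes reside in one Mp at once,
    each protected from the others."""
    def __init__(self, mp):
        self.mp = mp
        self.regions = {}    # process id -> (base, limit)

    def allocate(self, pid, base, limit):
        self.regions[pid] = (base, limit)

    def read(self, pid, virtual_addr):
        base, limit = self.regions[pid]
        if not 0 <= virtual_addr < limit:
            raise MemoryError("address outside process region")
        return self.mp[base + virtual_addr]
```

A storage hierarchy extends the same idea of transparent mapping across technologies: the mapping hardware decides, per reference, which level of the hierarchy actually holds the data.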
Parallelism and Overlap
Several techniques have evolved to increase system performance via overlap and parallelism. Section 3 of Part 2 presents a detailed discussion of these various techniques.