obvious properties when it comes to computing all the things of interest in playing checkers.
MEMORY VERSUS COMPUTATION: TABLE LOOK-UP
The one alternative algorithm that always exists for a simple input-output function is to store the values in memory and use a table look-up to obtain them. This was the strategy followed above in trying to obtain the logarithm and the antilogarithm. Of course, the values must be computed once in order to initialize the table and they must be stored in appropriate cells. If the function is to be used many times, as with multiplication, such one-time developmental costs are justified. But if a function is to be computed only a small number of times, compared to the total number of values in the table, then table look-up is a poor choice.
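The idea can be sketched apart from the RTM hardware. The following Python fragment is purely illustrative (the 8-bit fixed-point scaling of the logarithm is an assumption, not the book's actual encoding): the table is filled once, at "development time", and every later evaluation is a single indexed read.

```python
import math

BITS = 8
# Assumed scaling so that the largest 8-bit argument maps to the largest
# 8-bit result; chosen for illustration only.
SCALE = 255 / math.log2(255)

# One-time developmental cost: compute all 2^8 values and store them.
# (The entry for 0 is set to 0 by convention, since log 0 is undefined.)
log_table = [0] + [round(SCALE * math.log2(x)) for x in range(1, 2**BITS)]

def log_lookup(x):
    """Scaled logarithm of an 8-bit value, obtained by table look-up."""
    return log_table[x]
```

Once the table is initialized, each call costs one memory reference, regardless of how expensive the function was to compute.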
The prime determiner of the feasibility of a table look-up is the size of the table involved. This depends critically on the number of arguments, since values must be available for each combination of arguments, hence on the product of their ranges. Our arguments for the multiplication are 8-bit numbers, hence have a range of 2^8 or 256. This implies a reasonably sized table if only one argument is involved, as in the logarithm. However, if we apply the same strategy to the multiplication itself -- planning to do the whole job by table look-up -- then we need 2^8 x 2^8 = 2^16 = 65,536 cells. This is a large, hence expensive, memory, though it is still possible with available RTM components.
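The "whole job by table look-up" scheme can be made concrete with a small sketch (again in Python, not RTM notation): the address is simply the two 8-bit arguments concatenated, so no address arithmetic at all is needed.

```python
BITS = 8

# One-time initialization: 2^8 x 2^8 = 65,536 products.
# The inner loop runs over b, so entry (a, b) lands at index a*256 + b.
mul_table = [a * b for a in range(2**BITS) for b in range(2**BITS)]

def mul_lookup(a, b):
    # Address = A concatenated with B: the high 8 bits are a, the low 8 are b.
    return mul_table[(a << BITS) | b]
```

The concatenated address is what makes pure look-up so fast; the price is the 65,536-cell memory discussed next.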
It would seem that the size of the table required settles the matter. But let us quantify the judgment. After all, table look-up is extraordinarily simple and fast, and its use here is even more advantageous than in the case above involving the logarithm. A 2^10 word read-only memory costs 85 (we clearly need no capability for writing). This is to be compared to a DMgpa at 31. Counting the extra control and Kbus's (at 14), we might consider that 2 DMgpa's are easily equivalent to 2^10 words of memory. To compete with the pipeline, which has 9 DMgpa's (and whose operation rate of around 2 microseconds would be roughly equivalent to the table look-up scheme), implies a memory of 4-5000 words. But we require around 65,000 words. Thus, unless something can be done to decrease drastically the memory used, the table look-up scheme is out of bounds. This is true, however, only for a particular technology. If the relative costs of memory and data operations were to change significantly, then the solution might not be outlandish. There is nothing inherently wrong with using a 65,000 word table, if such tables are cheap enough and fast enough.
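The break-even arithmetic above can be spelled out directly, using the unit costs quoted in the text:

```python
ROM_1K_COST = 85      # a 2^10-word read-only memory
DMGPA_COST = 31       # one DMgpa
KBUS_COST = 14        # extra control and Kbus's

# Two DMgpa's plus the bus overhead cost about as much as 1K words of ROM:
two_dmgpa = 2 * DMGPA_COST + KBUS_COST   # 76, roughly the ROM's 85

# The pipeline has 9 DMgpa's; at 2 DMgpa's per 2^10 words, the
# equal-cost memory is about 4.5K words -- the "4-5000 words" above.
equivalent_words = int(9 / 2 * 2**10)
```

Since 65,000 words is roughly fourteen times this break-even size, the conclusion that pure look-up is out of bounds follows immediately.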
Can anything be done to decrease the amount of memory used? We can attempt to exploit some of the structure of multiplication. The most obvious property is commutativity: A * B is the same as B * A. Thus, we need only store approximately half the table. This reduces us to a 32,000 word table, which is still quite a way from the desired goal. Furthermore, we must now compute the address into the table from the inputs, rather than simply concatenating the two arguments A and B, which is all that is required in a pure table look-up. Such a calculation will take some precious time, and so reduce the effective gain from the memory reduction. Something much more drastic is needed.
Since any scheme must involve more processing than the pure look-up, let us lower our sights to using 1000 words or less of memory. What could be done then? One could always do a smaller multiplication: 2^10 words permits numbers of size 2^5, since 2^5 x 2^5 = 2^10. Thus one can multiply 5-bit numbers rapidly. This suggests breaking down the total multiplication into multi-