
Friday, January 4, 2013

Quantitative principles of computer design


Guidelines and principles useful in the design and analysis of computers.

❖  ADVANTAGE OF PARALLELISM
➢  Taking advantage of parallelism is one of the most important methods for improving performance.
➢  EXAMPLE:
•  One example is the use of parallelism at the system level.
•  To improve throughput on a typical server benchmark, such as SPECWeb or TPC-C, multiple processors and multiple disks can be used.
•  The workload of handling requests can then be spread among the processors and disks, resulting in improved throughput.
•  This approach is also scalable: as demand grows, more processors and disks can be added.
➢  At the level of an individual processor, taking advantage of parallelism among instructions is critical to achieving high performance.
➢  One of the simplest ways to do this is through pipelining, which overlaps the execution of successive instructions. (A sketch of the system-level idea, spreading work across parallel workers, follows this list.)
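
To make the workload-spreading idea concrete, here is a minimal sketch in C, assuming a POSIX system with pthreads. The "requests" are stand-ins (summing slices of an array), and names such as worker and NUM_THREADS are illustrative, not from the post; the point is that the workers share no state, so throughput scales with the number of processors.

/* Minimal sketch: spread independent work across NUM_THREADS workers.
 * Assumes POSIX threads; compile with: gcc -pthread parallel.c */
#include <pthread.h>
#include <stdio.h>

#define NUM_THREADS 4
#define N 1000000            /* N is divisible by NUM_THREADS */

static long data[N];
static long partial[NUM_THREADS];

/* Each worker handles its own slice of the "requests" independently. */
static void *worker(void *arg) {
    long id = (long)arg;
    long lo = id * (N / NUM_THREADS);
    long hi = lo + (N / NUM_THREADS);
    long sum = 0;
    for (long i = lo; i < hi; i++)
        sum += data[i];
    partial[id] = sum;       /* no sharing between workers, so no locks */
    return NULL;
}

int main(void) {
    for (long i = 0; i < N; i++) data[i] = i % 7;

    pthread_t t[NUM_THREADS];
    for (long id = 0; id < NUM_THREADS; id++)
        pthread_create(&t[id], NULL, worker, (void *)id);

    long total = 0;
    for (long id = 0; id < NUM_THREADS; id++) {
        pthread_join(t[id], NULL);
        total += partial[id];
    }
    printf("total = %ld\n", total);
    return 0;
}

Because the slices are independent, raising NUM_THREADS (up to the number of cores) divides the summing work roughly evenly among processors, which is the same reason adding processors and disks raises server throughput.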

❖  PRINCIPLE OF LOCALITY
➢  Programs tend to reuse data and instructions they have used recently.
➢  A widely quoted rule of thumb is that a program spends 90% of its execution time in only 10% of its code.
➢  We can therefore predict what instructions and data a program will use in the near future based on its accesses in the recent past.
➢  This principle also applies to data accesses, though not as strongly as to code accesses.
➢  Types (contrasted in the sketch after this list):
•  Temporal locality states that recently accessed items are likely to be accessed again in the near future.
•  Spatial locality says that items whose addresses are near one another tend to be referenced close together in time.
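
The payoff of spatial locality is easiest to see in array traversal order. A minimal sketch in C, assuming the usual row-major layout of C arrays: the first loop nest walks memory sequentially and uses every element of each cache line it fetches, while the second jumps a whole row ahead on every access and typically runs several times slower on large arrays.

#include <stdio.h>

#define ROWS 2048
#define COLS 2048

static double a[ROWS][COLS];

int main(void) {
    double sum = 0.0;

    /* Good spatial locality: C lays rows out contiguously, so this
     * order walks memory sequentially, one cache line at a time. */
    for (int i = 0; i < ROWS; i++)
        for (int j = 0; j < COLS; j++)
            sum += a[i][j];

    /* Poor spatial locality: consecutive accesses are COLS doubles
     * apart, so nearly every access touches a different cache line. */
    for (int j = 0; j < COLS; j++)
        for (int i = 0; i < ROWS; i++)
            sum += a[i][j];

    printf("%f\n", sum);
    return 0;
}

Temporal locality shows up in the same code: sum and the loop counters are reused on every iteration, which is why the compiler keeps them in registers or they stay in the nearest cache level.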

❖  FOCUS ON THE COMMON CASE
➢  Perhaps the most important and pervasive principle of computer design is to focus on the common case: in making a design trade-off, favor the frequent case over the infrequent case.
➢  This principle helps in deciding how to spend resources, since an improvement matters more when the case it speeds up occurs often.
➢  In applying this simple principle, we have to decide what the frequent case is and how much performance can be improved by making that case faster.
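
The second question, how much can be gained, is usually quantified with Amdahl's Law; the post does not state it, but it is the standard tool here. If an enhancement applies to a fraction of the original execution time and speeds that fraction up by a given factor, then

\[
\text{Speedup}_{\text{overall}}
  = \frac{1}{\left(1 - \text{Fraction}_{\text{enhanced}}\right)
           + \dfrac{\text{Fraction}_{\text{enhanced}}}{\text{Speedup}_{\text{enhanced}}}}
\]

For example, if the common case accounts for 90% of execution time and is made twice as fast, the overall speedup is 1 / (0.1 + 0.9/2) ≈ 1.82, and no amount of improvement to that 90% can push the overall speedup past 1 / 0.1 = 10.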


