CA Classes-171-175
Structure:
8.1 Introduction
Objectives
8.2 Memory Hierarchy
Cache memory organisation
Basic operation of cache memory
Performance of cache memory
8.3 Cache Addressing Modes
Physical address mode
Virtual address mode
8.4 Mapping
Direct mapping
Associative mapping
8.5 Elements of Cache Design
8.6 Cache Performance
Improving cache performance
Techniques to reduce cache miss
Techniques to decrease cache miss penalty
Techniques to decrease cache hit time
8.7 Shared Memory Organisation
8.8 Interleaved Memory Organisation
8.9 Bandwidth and Fault Tolerance
8.10 Consistency Models
Strong consistency models
Weak consistency models
8.11 Summary
8.12 Glossary
8.13 Terminal Questions
8.14 Answers
8.1 Introduction
The memory system is an essential part of a computer system. The input
data, the instructions that manipulate the input data, and the output data
are all stored in memory.
[Figure: Memory hierarchy. The CPU communicates with cache memory and
main memory; an I/O processor connects main memory to secondary storage
devices, which serve as auxiliary memory.]
Let us now discuss cache memory and the cache memory organisation.
8.2.1 Cache memory organisation
A cache memory is an intermediate memory placed between two memories
that differ greatly in operating speed. It is located between the CPU and
main memory and holds the most frequently used data and instructions.
Placing a cache between the CPU and a slower memory enhances the
performance of a system significantly.

Locality of reference is the observation that, over a given time interval,
memory references tend to be confined to a few localised areas of memory.
A control structure such as a loop illustrates this: whenever a program
executes a loop, the CPU fetches the same instructions repeatedly, so loops
and subroutines act as localities of reference for instruction fetches. Data
references are also localised; a table look-up procedure, for example,
repeatedly refers to the portion of memory in which the table is stored.

Cache memories exploit this behaviour. The fundamental idea of cache
organisation is that by keeping the most frequently accessed instructions
and data in the fast cache memory, the average memory access time
approaches the access time of the cache.
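The effect of locality can be seen in a short sketch: the address trace of a simple loop touches only a handful of distinct locations, which is exactly what a cache exploits. The following Python illustration is hypothetical (the instruction and table addresses are invented for demonstration):

```python
# Illustrative sketch: a loop's memory references cluster in a small region.
# The instruction addresses and table addresses below are hypothetical.

def reference_trace():
    """Collect the addresses touched while summing a 4-entry table in a loop."""
    trace = []
    loop_body = [0x100, 0x104, 0x108]            # instruction addresses of the loop body
    table = [0x2000 + 4 * i for i in range(4)]   # data addresses of the table
    for i in range(4):                           # the loop executes repeatedly...
        trace.extend(loop_body)                  # ...refetching the same instructions
        trace.append(table[i])                   # ...and reading nearby table entries
    return trace

trace = reference_trace()
# Many references, but few distinct addresses -- that is locality of reference.
print(len(trace), len(set(trace)))  # 16 references to only 7 distinct addresses
```

Because the same few addresses are referenced over and over, keeping them in a small fast memory serves most of the traffic.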
8.2.2 Basic operation of cache memory
Whenever the CPU needs to access memory, the cache is examined first. If
the required word is found in the cache, it is read from this fast memory. If
the word is missing from the cache, main memory is accessed to read it,
and a block of words containing the one just accessed is then transferred
from main memory to cache memory.
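The read operation described above can be sketched with a toy cache model. The 4-word block size and the dictionary-based "main memory" here are assumptions for illustration; a real cache works on hardware addresses and fixed-size storage, not Python dictionaries:

```python
# Toy sketch of the basic cache read operation described above.
# The 4-word block size and the fake main-memory contents are hypothetical.

BLOCK_SIZE = 4

main_memory = {addr: addr * 10 for addr in range(64)}  # invented contents
cache = {}  # address -> word

def read(addr):
    """Return (word, 'hit' or 'miss'), consulting the cache first."""
    if addr in cache:                        # found in cache: read fast memory
        return cache[addr], "hit"
    word = main_memory[addr]                 # miss: access main memory
    block_start = addr - (addr % BLOCK_SIZE)
    for a in range(block_start, block_start + BLOCK_SIZE):
        cache[a] = main_memory[a]            # transfer the whole block to cache
    return word, "miss"

print(read(5))   # first access to block 4..7: a miss, block is loaded
print(read(6))   # a neighbouring word in the same block now hits
```

The block transfer on a miss is what turns locality into later hits: words near a recently accessed word are already in the cache when they are needed.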
8.2.3 Performance of cache memory
Cache memory performance is measured in terms of the hit ratio. When the
processor refers to memory and finds the word in the cache, the access is
said to produce a "hit"; if the processor cannot find the word in the cache,
the access is a "miss". The hit ratio is the ratio of the number of hits to the
total number of memory references (hits plus misses). A high hit ratio
confirms the validity of the locality-of-reference principle: when the hit ratio
is high, the processor accesses the cache memory rather than main
memory most of the time.
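The hit ratio and its effect on average access time can be computed directly. The sketch below uses one common simple model (a hit costs the cache access time, a miss costs the main-memory access time); the 10 ns and 100 ns figures are assumptions for illustration, not values from the text:

```python
# Hit ratio = hits / (hits + misses); average access time follows from it.
# The access times (10 ns cache, 100 ns main memory) are assumed values.

def hit_ratio(hits, misses):
    """Fraction of memory references satisfied by the cache."""
    return hits / (hits + misses)

def average_access_time(h, t_cache=10.0, t_main=100.0):
    """Average memory access time in ns for hit ratio h (simple model)."""
    return h * t_cache + (1.0 - h) * t_main

h = hit_ratio(hits=9, misses=1)    # 0.9
print(h, average_access_time(h))   # 0.9 19.0 -- close to the 10 ns cache time
```

As the hit ratio approaches 1, the average access time approaches the access time of the cache, which is the point of the organisation described in section 8.2.1.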