
DIGITAL LOGIC AND COMPUTER ORGANIZATION
Lecture 20: Caches (P1)
ELEC3010
ACKNOWLEDGEMENT

I would like to express my special thanks to Professor Zhiru Zhang,
School of Electrical and Computer Engineering, Cornell University,
and Prof. Rudy Lauwereins, KU Leuven, for sharing their teaching materials.

COVERED IN THIS COURSE
Digital logic:
❑ Binary numbers and logic gates
❑ Boolean algebra and combinational logic
❑ Sequential logic and state machines
❑ Binary arithmetic
❑ Memories

Computer organization:
❑ Instruction set architecture
❑ Processor organization
❑ Caches and virtual memory
❑ Input/output
❑ Advanced topics
MOTIVATION EXAMPLE 1

MOTIVATION EXAMPLE 2 (1)

MOTIVATION EXAMPLE 2 (2)
Caches! (figure: Level 1 Insn $, Level 1 Data $, Level 2 $)
MOTIVATION EXAMPLE 3

USING CACHES IN THE PIPELINE

CACHE
❑ Small SRAM memory that permits rapid access to a subset of instructions or data
▪ If the data is in the cache (cache hit), we retrieve it without slowing down the pipeline
▪ If the data is not in the cache (cache miss), we retrieve it from the main memory (a penalty is incurred in accessing DRAM)
❑ The hit rate is the fraction of memory accesses found in the cache
❑ The miss rate = 1 – hit rate

MEMORY ACCESS WITH CACHE
❑ Average memory access time with cache:
  Hit time + Miss rate × Miss penalty

❑ An example
▪ Main memory access time = 50 ns
▪ Cache hit time = 2 ns
▪ Miss rate = 10%

Average mem access time w/o cache = 50 ns
Average mem access time w/ cache = 2 + 0.1 × 50 = 7 ns
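As a quick sanity check of the formula, here is a minimal C sketch that plugs in the numbers above; the variable names are illustrative and not part of the slides.

    #include <stdio.h>

    int main(void) {
        /* Values from the example above */
        double hit_time_ns     = 2.0;    /* cache hit time          */
        double miss_rate       = 0.10;   /* 10% of accesses miss    */
        double miss_penalty_ns = 50.0;   /* main memory access time */

        /* Average memory access time = hit time + miss rate * miss penalty */
        double amat = hit_time_ns + miss_rate * miss_penalty_ns;

        printf("With cache:    %.1f ns\n", amat);            /* prints 7.0  */
        printf("Without cache: %.1f ns\n", miss_penalty_ns); /* prints 50.0 */
        return 0;
    }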
WHY CACHES WORK: PRINCIPLE OF LOCALITY
❑ Temporal locality
▪ If memory location X is accessed, then it is likely to be accessed again in the near future
• Caches exploit temporal locality by keeping a referenced instruction or data in the cache
❑ Spatial locality
▪ If memory location X is accessed, then locations near X are likely to be accessed in the near future
• Caches exploit spatial locality by bringing a block of instructions or data into the cache on a miss
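Both kinds of locality show up in even the simplest loops. A small illustrative C example (not from the slides):

    #include <stddef.h>

    /* Summing an array: a hypothetical loop exhibiting both kinds of locality */
    long sum(const int *a, size_t n) {
        long total = 0;             /* total and i are reused on every iteration
                                       -> temporal locality                      */
        for (size_t i = 0; i < n; i++) {
            total += a[i];          /* a[i], a[i+1], ... are adjacent in memory
                                       -> spatial locality: one miss brings in a
                                       whole block of neighboring elements       */
        }
        return total;
    }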

SOME IMPORTANT TERMS
❑ Cache is partitioned into blocks
▪ Each cache block typically contains multiple bytes of data (block size is a power of 2)
▪ A whole block is read or written during data transfer between main memory and cache
❑ Each cache block is associated with a tag and a valid bit
▪ Tag: a unique ID to differentiate between different memory blocks, which may be mapped into the same cache location
▪ Valid bit: indicates whether the data in a cache block is valid (=1) or not (=0)
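One way to picture these terms together is as a structure per cache block. This is a sketch only; the 16-byte block size and 32-bit tag field are assumptions, not values from the slides.

    #include <stdint.h>

    #define BLOCK_SIZE 16              /* assumed bytes per block (a power of 2) */

    /* Illustrative layout of one cache block */
    struct cache_block {
        uint8_t  valid;                /* 1 = block holds valid data, 0 = empty  */
        uint32_t tag;                  /* identifies which memory block currently
                                          occupies this cache location           */
        uint8_t  data[BLOCK_SIZE];     /* the whole block is transferred between
                                          main memory and the cache              */
    };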

DIRECT MAPPED CACHE CONCEPTS
❑ Each memory block is mapped to one and only one cache block (many-to-one mapping)
❑ Example:
• A cache with 8 blocks (block addresses in decimal)
• Assume the main memory is 4 times larger than the cache (32 blocks)
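In a direct mapped cache the target slot is simply the block address modulo the number of cache blocks. A minimal sketch for the 8-block cache and 32-block memory of the example above:

    #include <stdio.h>

    #define NUM_CACHE_BLOCKS   8       /* cache with 8 blocks, as in the example */
    #define NUM_MEMORY_BLOCKS 32       /* main memory 4 times larger             */

    int main(void) {
        /* Each memory block maps to exactly one cache block (many-to-one). */
        for (unsigned mem_block = 0; mem_block < NUM_MEMORY_BLOCKS; mem_block++) {
            unsigned cache_index = mem_block % NUM_CACHE_BLOCKS;
            printf("memory block %2u -> cache block %u\n", mem_block, cache_index);
        }
        return 0;  /* e.g. memory blocks 0, 8, 16, 24 all map to cache block 0 */
    }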

DIRECT MAPPED (DM) CACHE CONCEPTS

ADDRESS TRANSLATION FOR DM CACHE
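The slide's figure is not reproduced here. The standard split of a byte address for a direct mapped cache is tag | index | byte offset; a sketch under assumed sizes (32-bit addresses, 16-byte blocks, 8 cache blocks) is:

    #include <stdint.h>

    #define BLOCK_SIZE   16u   /* assumed: 16 bytes per block -> 4 offset bits */
    #define NUM_BLOCKS    8u   /* assumed: 8 cache blocks     -> 3 index bits  */
    #define OFFSET_BITS   4u
    #define INDEX_BITS    3u

    /* Byte offset: which byte within the block */
    static inline uint32_t byte_offset(uint32_t addr) {
        return addr & (BLOCK_SIZE - 1u);
    }

    /* Index: which cache block the address maps to */
    static inline uint32_t cache_index(uint32_t addr) {
        return (addr >> OFFSET_BITS) & (NUM_BLOCKS - 1u);
    }

    /* Tag: the remaining upper address bits, stored with the block */
    static inline uint32_t addr_tag(uint32_t addr) {
        return addr >> (OFFSET_BITS + INDEX_BITS);
    }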

DM CACHE ORGANIZATION

READING DM CACHE
❑ Use the index bits to retrieve the tag, data, and valid bit
❑ Compare the tag from the address with the retrieved tag
❑ If valid & a match in tag (hit), select the desired data using the byte offset
❑ Otherwise (miss)
▪ Bring the memory block into the cache (also set valid)
▪ Store the tag from the address with the block
▪ Select the desired data using the byte offset
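The read steps above, written as a C sketch. It reuses the illustrative struct cache_block and address-split helpers from the earlier snippets; cache[] and main_memory[] are assumed arrays standing in for the cache storage and DRAM.

    #include <string.h>
    #include <stdint.h>

    extern struct cache_block cache[NUM_BLOCKS];   /* assumed cache storage    */
    extern uint8_t main_memory[];                  /* simplified model of DRAM */

    uint8_t dm_read(uint32_t addr) {
        struct cache_block *b = &cache[cache_index(addr)];  /* index bits pick the block */

        if (b->valid && b->tag == addr_tag(addr)) {
            /* Hit: select the desired byte using the byte offset */
            return b->data[byte_offset(addr)];
        }

        /* Miss: bring the memory block into the cache ...                  */
        memcpy(b->data, &main_memory[addr & ~(BLOCK_SIZE - 1u)], BLOCK_SIZE);
        b->tag   = addr_tag(addr);   /* ... store the tag from the address ... */
        b->valid = 1;                /* ... and set the valid bit.             */
        return b->data[byte_offset(addr)];
    }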
WRITING DM CACHE
❑ Use the index bits to retrieve the tag and valid bit
❑ Compare the tag from the address with the retrieved tag
❑ If valid & a match in tag (hit), write the data into the cache location
❑ Otherwise (miss), one option:
▪ Bring the memory block into the cache (also set valid)
▪ Store the tag from the address with the block
▪ Write the data into the cache location
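The same style of sketch for a write, using the miss option listed above (allocate the block, then write). It relies on the same assumed declarations as dm_read; whether the write is also propagated to main memory (write-through vs. write-back) is not covered on this slide, so the sketch leaves that out.

    void dm_write(uint32_t addr, uint8_t value) {
        struct cache_block *b = &cache[cache_index(addr)];

        if (!(b->valid && b->tag == addr_tag(addr))) {
            /* Miss: bring the memory block in, store the tag, set valid */
            memcpy(b->data, &main_memory[addr & ~(BLOCK_SIZE - 1u)], BLOCK_SIZE);
            b->tag   = addr_tag(addr);
            b->valid = 1;
        }
        /* Hit (or freshly allocated block): write into the cache location */
        b->data[byte_offset(addr)] = value;
    }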
DIRECT MAPPED CACHE EXAMPLE
(worked example spanning several slides; figures not reproduced)
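Since the example itself is carried by figures, here is a hypothetical stand-in: a short trace of memory block addresses through an 8-block direct mapped cache, printing the hit/miss outcome of each reference. The reference stream is made up for illustration and is not the one from the slides.

    #include <stdio.h>
    #include <stdbool.h>
    #include <stddef.h>

    #define NUM_BLOCKS 8

    int main(void) {
        /* Hypothetical reference stream of memory block addresses */
        unsigned refs[] = { 22, 26, 22, 26, 16, 3, 16, 18 };
        unsigned tags[NUM_BLOCKS]  = { 0 };
        bool     valid[NUM_BLOCKS] = { false };

        for (size_t i = 0; i < sizeof refs / sizeof refs[0]; i++) {
            unsigned idx = refs[i] % NUM_BLOCKS;      /* direct mapped index    */
            unsigned tag = refs[i] / NUM_BLOCKS;      /* remaining address bits */
            bool hit = valid[idx] && tags[idx] == tag;
            printf("block %2u -> index %u : %s\n", refs[i], idx, hit ? "hit" : "miss");
            tags[idx]  = tag;                         /* fill the slot on a miss */
            valid[idx] = true;
        }
        return 0;
    }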
DOUBLING THE BLOCK SIZE
(worked example spanning several slides; figures not reproduced)
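For a fixed total cache capacity, doubling the block size halves the number of blocks: one address bit moves from the index field to the byte offset field, and each miss brings in twice as many neighboring bytes. A small illustrative calculation (the 128-byte capacity below is an assumption, not a value from the slides):

    #include <stdio.h>

    /* log2 of a power of two, computed by counting shifts */
    static unsigned log2u(unsigned x) {
        unsigned n = 0;
        while (x > 1) { x >>= 1; n++; }
        return n;
    }

    int main(void) {
        unsigned cache_bytes = 128;                    /* assumed total capacity */
        for (unsigned block_bytes = 16; block_bytes <= 32; block_bytes *= 2) {
            unsigned num_blocks  = cache_bytes / block_bytes;
            unsigned offset_bits = log2u(block_bytes);
            unsigned index_bits  = log2u(num_blocks);
            printf("block size %2u B: %u blocks, %u offset bits, %u index bits\n",
                   block_bytes, num_blocks, offset_bits, index_bits);
        }
        return 0;
    }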
BEFORE NEXT CLASS
• H&H 8-8.3
• Next time: More Caches
