Cache Memory

Yi-Ning Huang
Principle of Locality
The phenomenon that a recently used memory location is likely to be used again soon.
What is cache memory?
Based on the principle of locality, designers introduced cache memory to make memory access more efficient.
The cache contains the information most frequently needed by the processor.
Cache memory is a small but fast memory module inserted between the processor and the main memory.
Example of grocery shopping
If you buy only the single item you need immediately at the grocery store, you may have to go back again soon.
That wastes time and gas (assuming you drive to the store).
Example of grocery shopping
What most people do instead:
Buy the items you need immediately, plus additional items you will most likely need in the future.
Example of grocery shopping
The grocery store is like main memory; your home is like the cache.
To speed up access time, the cache stores information the computer needs immediately and will most likely need in the future.
Three general functions in a cache
Address translation function
Address mapping function
Replacement algorithm
Mapping Cache
Determines where blocks are to be located in the cache.
Three types of mapping:
Fully associative mapping
Direct mapping
Set associative mapping
Mapping Cache: Direct mapping
Each main-memory block address maps to a unique cache block.
If the cache has N blocks, then block address X of main memory maps to cache block Y = X mod N
(cache block = block address mod number of blocks).
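As a minimal Python sketch of the mapping rule above (the function name is illustrative):

```python
def direct_map_block(block_address: int, num_cache_blocks: int) -> int:
    """Direct mapping: memory block X always lands in cache block X mod N."""
    return block_address % num_cache_blocks

# With a 32-block cache (N = 32), memory blocks 1, 33, 65, ... all
# compete for cache block 1.
assert direct_map_block(33, 32) == 1
assert direct_map_block(64, 32) == 0
```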
Mapping Cache: Direct mapping
[Figure: direct mapping of a 64KB main memory (frames 0 to 8191, 8 bytes each) into a 256-byte cache of 32 blocks. The 16-bit address splits into an 8-bit tag, a 5-bit block number, and a 3-bit byte offset. Frames 0, 32, 64, ... all map to cache block 0; frames 31, 63, ... map to block 31. Note: tag area is not shown.]
Mapping Cache: Direct mapping
Use the least significant b bits of the frame number to select the cache block in which a frame can reside. The tags then need only (p - n - b) bits.
Main memory address (A), p bits in total:
Tag: p - n - b bits | Block: b bits | Word: n bits
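The field split can be sketched with bit operations; the 3-bit word and 5-bit block widths below follow the 64KB / 256-byte example, and the function name is illustrative:

```python
def split_address(addr: int, word_bits: int, block_bits: int):
    """Split a main-memory address into (tag, block, word) fields
    for a direct-mapped cache: word = n low bits, block = next b bits,
    tag = the remaining high bits."""
    word = addr & ((1 << word_bits) - 1)
    block = (addr >> word_bits) & ((1 << block_bits) - 1)
    tag = addr >> (word_bits + block_bits)
    return tag, block, word

# 16-bit address, 3-bit word, 5-bit block, 8-bit tag (as in the example).
tag, block, word = split_address(0xABCD, word_bits=3, block_bits=5)
```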
Mapping Cache:
Fully associative mapping
A main-memory frame can occupy any of the cache blocks.
Disadvantage: all tags must be searched to determine a hit or miss. If the number of tags is large, this search can be time consuming.
For example, consider a system with 64KB of main memory and a 1KB cache, with 8 bytes per block.
For example, a system with 64KB of primary
memory and a 1KB cache. There are 8
bytes per block
Block
Main memory: 64kb
0
13 bits
1
Frame
2
8 bytes/frame 0
3
1
Cache
4
2
5
3
…
4
5
Tag area
…
8191
8 bytes/block
Data Area
Hit/Miss
Comparator
Tag
13
127
Word
3
Main memory Address (A)
Bits in main memory address: 16
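The tag search can be sketched sequentially in Python (real hardware compares all tags in parallel; names are illustrative):

```python
def fully_associative_lookup(cache_tags, tag):
    """Search every stored tag; any block may hold the requested frame."""
    for block, stored_tag in enumerate(cache_tags):
        if stored_tag == tag:
            return block  # hit: the matching block's index
    return None           # miss: no tag matched

# 128-block cache as in the example; suppose frame 5 sits in block 2.
tags = [None] * 128
tags[2] = 5
```

Every one of the 128 entries may have to be examined, which is exactly the search cost the slide warns about.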
Disadvantages of direct mapping
and fully associative mapping
Disadvantage of fully associative mapping: all tags must be searched to determine a hit or miss. If the number of tags is large, this search can be time consuming.
Disadvantage of direct mapping: it avoids the full tag search of fully associative mapping, but it keeps replacing blocks even when free blocks remain elsewhere in the cache.
Mapping Cache:
Set associative mapping
Set associative mapping combines the advantages of direct mapping and fully associative mapping while moderating their disadvantages.
A block can be placed in a restricted set of places in the cache.
It divides the cache blocks into K sets.
Mapping Cache:
Set associative mapping
[Figure: a 4-way set-associative cache. Main memory: 64KB (frames 0 to 8191). The 16-bit address splits into an 8-bit tag number, a 5-bit set number, and a 3-bit word field. The cache has 32 sets (0 to 31) of 4 slots each; frames 0, 32, 64, ... all map to set 0.]
Mapping Cache:
Set associative mapping
The set is usually chosen by bit selection:
set = (block address) MOD (number of sets in cache)
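A sketch of bit selection followed by the per-set tag search, assuming a cache modeled as a list of sets (names are illustrative):

```python
def set_associative_lookup(sets, block_address, tag):
    """Bit selection picks the set; only that set's ways are searched."""
    index = block_address % len(sets)          # set = block addr mod #sets
    for way, stored_tag in enumerate(sets[index]):
        if stored_tag == tag:
            return index, way                  # hit
    return index, None                         # miss

# 32 sets of 4 ways, as in the 4-way example above; suppose the frame
# with tag 0xAB sits in set 1, way 3.
cache = [[None] * 4 for _ in range(32)]
cache[1][3] = 0xAB
```

Only the four ways of one set are searched, rather than all 128 blocks as in the fully associative case.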
Mapping Cache:
Set associative mapping
Direct mapping is simply one-way set associative.
A fully associative cache with m blocks could be called m-way set associative.
Replacement algorithm
When there is a cache miss, space is needed for the new block.
The replacement algorithm determines which block is replaced by the new block.
With direct mapping there is only one cache block a frame can occupy, so no replacement algorithm is needed.
Replacement algorithm
There are three common replacement algorithms:
LRU (least recently used)
FIFO (first in, first out)
Random
Replacement algorithm: LRU
Replace the least recently used block in the cache.
To determine which block is least recently used, a counter can be associated with each cache block.
Advantage: the algorithm follows the locality principle, so it limits the number of times a block must be replaced.
Disadvantage: implementation is more complex.
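The per-block counter idea can be sketched as a toy model (class and method names are illustrative):

```python
import itertools

class LRUBlocks:
    """One counter per cache block records the time of its last use;
    on a miss, the block with the oldest (smallest) counter is the victim."""

    def __init__(self, num_blocks):
        self._clock = itertools.count()
        self.last_used = [next(self._clock) for _ in range(num_blocks)]

    def touch(self, block):
        """Record a hit on `block` by stamping it with the current time."""
        self.last_used[block] = next(self._clock)

    def victim(self):
        """Return the least recently used block (smallest timestamp)."""
        return min(range(len(self.last_used)), key=self.last_used.__getitem__)
```

The bookkeeping on every access is the "more complex implementation" the slide refers to.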
Replacement algorithm: FIFO
The block that entered the cache first is replaced first. In other words, the block that has been in the cache longest is replaced.
Advantage: easy to implement.
Disadvantage: under some access patterns, blocks are replaced too frequently.
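A toy sketch of FIFO bookkeeping that counts misses over an access trace (names are illustrative):

```python
from collections import deque

def fifo_misses(capacity, accesses):
    """Count misses under FIFO replacement: on a miss with a full cache,
    evict the block resident longest, even if it was just used."""
    resident = deque()
    misses = 0
    for block in accesses:
        if block not in resident:
            misses += 1
            if len(resident) == capacity:
                resident.popleft()    # oldest resident goes first
            resident.append(block)
    return misses
```

In the trace [1, 2, 1, 3, 1] with a 2-block cache, block 1 is evicted to admit block 3 even though it was just reused, illustrating the "replaced too frequently" drawback.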
Reference
Computer Organization, Design, and Architecture by Sajjan G. Shiva
http://www.cs.sjsu.edu/~lee/cs147/cs147.htm
http://www.cs.iastate.edu/~prabhu/Tutorial/CACHE/bl_place_applet.html
http://www.articlesbase.com/hardwarearticles/cache-memory-675304.html