
Classroom Discussion Topic (CDT)

W6 – CDT16
Topic: Cache memory - Mapping techniques, R/W
operations
Dr. B. Rama Devi
Professor
Department of Electronics & Communication Engineering
Kakatiya Institute of Technology and Science, Warangal, TS, India
LECTURE OUTCOMES

❑ Examine various mapping techniques and R/W operations of cache memory
CACHE MEMORY
❑ Analysis of a large number of typical programs shows that memory references at any given interval of time tend to be confined to a few localized areas of memory. This phenomenon is known as the property of locality of reference.
❑ If the active portions of the program and data are placed in a small, fast memory, the average memory access time is reduced, which in turn reduces the total execution time of the program. Such a fast memory is called a cache.
❑ When the CPU needs to access memory, the cache is examined first. If the word is found in the cache, it is read from the cache; if not, main memory is accessed to read the word.
CACHE MEMORY
[Figure: CPU with cache memory and main memory; 15-bit address, 12-bit data]
❑ Hit ratio: the ratio of the number of hits to the total number of CPU references to memory. When the CPU refers to memory and finds the word in the cache, it is a hit; if the word is not found, it is a miss. Hit ratios of 0.9 and above have been reported.
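A small numerical sketch of how the hit ratio translates into an average memory access time (the cache and main-memory latencies used here are illustrative assumptions, not values from the slides):

# Sketch: weighted-average model of memory access time.
# t_cache_ns and t_main_ns are assumed example latencies, not figures from the slides.
def average_access_time(hit_ratio, t_cache_ns=20, t_main_ns=200):
    # On a hit the word is read from the cache; on a miss it is read from main memory.
    return hit_ratio * t_cache_ns + (1 - hit_ratio) * t_main_ns

print(average_access_time(0.9))   # 0.9*20 + 0.1*200 = 38.0 ns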
MAPPING

❑ The transfer of data from main memory to cache memory is referred to as a mapping process. There are three types of mapping procedures:
1. Associative mapping
2. Direct mapping
3. Set-associative mapping
ASSOCIATIVE MAPPING

[Figure: Associative mapping cache - address (15 bits) and data (12 bits) stored together in the cache, with values shown in octal]
ASSOCIATIVE MAPPING…
❑ Associative memory stores both the address and the content (data) of the memory word.
❑ The CPU places the 15-bit address in the argument register and the associative memory is searched for a matching address. If a match is found, the corresponding 12-bit data word is read from the cache. If no match is found, main memory is accessed for that word.
❑ The address-data pair is then transferred into the associative cache memory using a replacement algorithm, e.g. FIFO.
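A minimal Python sketch of this associative lookup, assuming a 512-word cache and FIFO replacement; the class name, variable names and octal values are illustrative, not taken from the figure:

from collections import OrderedDict

class AssociativeCache:
    # Every cache entry stores the full 15-bit address together with its 12-bit data word.
    def __init__(self, size=512):
        self.size = size
        self.lines = OrderedDict()          # address -> data; insertion order gives FIFO

    def read(self, address, main_memory):
        if address in self.lines:           # matching address found anywhere in the cache: hit
            return self.lines[address]
        data = main_memory[address]         # miss: read the word from main memory
        if len(self.lines) >= self.size:
            self.lines.popitem(last=False)  # FIFO replacement: evict the oldest address-data pair
        self.lines[address] = data
        return data

main_memory = {0o02777: 0o6710}             # addresses and data written in octal, as in the figure
cache = AssociativeCache()
cache.read(0o02777, main_memory)            # first access misses and loads the pair into the cache
cache.read(0o02777, main_memory)            # second access hits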
DIRECT MAPPING

The 15-bit address is divided into a 6-bit tag and a 9-bit index (the address within the cache memory).
DIRECT MAPPING…
❑ Each word in the cache consists of a data word and its tag.
❑ When the CPU generates a memory request, the index field of the address is used to access the cache. The tag field of the CPU address is then compared with the tag in the word read from the cache. If the two tags match, there is a hit and the desired data word is in the cache; if they do not match, it is a miss.
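A minimal sketch of the direct-mapped lookup with this 6-bit tag / 9-bit index split, assuming one data word per cache location; names and octal values are illustrative:

TAG_BITS, INDEX_BITS = 6, 9                     # 15-bit address = 6-bit tag + 9-bit index

def split_address(address):
    index = address & ((1 << INDEX_BITS) - 1)   # low 9 bits select the cache word
    tag = address >> INDEX_BITS                 # high 6 bits are stored and compared
    return tag, index

class DirectMappedCache:
    def __init__(self):
        self.lines = [None] * (1 << INDEX_BITS)  # 512 cache words, each a (tag, data) pair

    def read(self, address, main_memory):
        tag, index = split_address(address)
        line = self.lines[index]
        if line is not None and line[0] == tag:  # tags match: hit
            return line[1]
        data = main_memory[address]              # miss: read the word from main memory
        self.lines[index] = (tag, data)          # the new word replaces whatever was at this index
        return data

# Two addresses with the same index but different tags map to the same cache word,
# so each access evicts the other - the disadvantage noted on a later slide.
main_memory = {0o00777: 0o1234, 0o02777: 0o6710}
cache = DirectMappedCache()
cache.read(0o00777, main_memory)
cache.read(0o02777, main_memory)                 # overwrites the entry for 0o00777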
DIRECT MAPPING…

[Figure: direct mapping cache organization]
DIRECT MAPPING…

[Figure: direct mapping cache with a block size of 8 words]

Disadvantage of direct mapping: two words with the same index in their addresses but different tag values cannot reside in the cache memory at the same time.
SET ASSOCIATIVE
Index 9 bits, tag 6 bits, data 12 bits. Word length = 2 × (6 + 12) = 36 bits.

[Figure: two-way set-associative cache - each index position holds two tags and two data words]
SET ASSOCIATIVE..

Index 9 bits, tag 6 bits, data 12 bits. Word length = 2 × (6 + 12) = 36 bits.
Each index position refers to two data words with two different tags.
The 9-bit index addresses 512 words, so the cache is of size 512 × 36 and can accommodate 1024 words of main memory.
The tags are compared in parallel, as in an associative memory search, hence the name set-associative.
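A minimal sketch of the two-way set-associative lookup with these field sizes, assuming FIFO replacement within each set (the slides do not name a replacement policy); names and octal values are illustrative:

INDEX_BITS, WAYS = 9, 2                            # 512 sets, two tag-data pairs per set

class SetAssociativeCache:
    def __init__(self):
        # Each set is one 36-bit cache word holding up to two (tag, data) pairs.
        self.sets = [[] for _ in range(1 << INDEX_BITS)]

    def read(self, address, main_memory):
        index = address & ((1 << INDEX_BITS) - 1)
        tag = address >> INDEX_BITS
        for stored_tag, data in self.sets[index]:  # both tags are compared, as in an associative search
            if stored_tag == tag:
                return data                        # hit
        data = main_memory[address]                # miss: read the word from main memory
        if len(self.sets[index]) >= WAYS:
            self.sets[index].pop(0)                # FIFO replacement within the set (assumed policy)
        self.sets[index].append((tag, data))
        return data

# The two conflicting addresses from the direct-mapping sketch can now reside in the cache together:
main_memory = {0o00777: 0o1234, 0o02777: 0o6710}
cache = SetAssociativeCache()
cache.read(0o00777, main_memory)
cache.read(0o02777, main_memory)                   # both words share index 0o777, with different tags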
LECTURE OUTCOMES - REVISITED

Having completed the discussion on cache memory - mapping techniques and R/W operations, students should now be able to:

❑ Examine various mapping techniques and R/W operations of cache memory
FURTHER READING

Refer:
❑ Computer System Architecture, M. Morris Mano, Third Edition
LECTURE LEVEL PRACTICE QUESTIONS/PROBLEMS

LLQ1/LLP1 (on LLO1): Inspect mapping techniques and R/W operations in cache memory
Thank you!!!!!
