Unit-5
Memory Organization
Static RAM (SRAM) uses a six-transistor flip-flop circuit for each memory cell, while Dynamic RAM
(DRAM) uses a single transistor and a capacitor for each cell.
Static RAM (SRAM)
Structure: SRAM uses a cross-coupled flip-flop circuit, typically made up of six transistors.
Function: The flip-flop stores the data bit in a stable state as long as power is supplied.
Advantages: SRAM is faster and consumes less power when idle compared to DRAM.
Disadvantages: SRAM is more complex to implement and has a higher cost per bit than
DRAM.
Application: SRAM is commonly used for cache memory.
Dynamic RAM (DRAM)
Structure: DRAM uses a single transistor and a capacitor for each memory cell.
Function: The capacitor stores the data bit as a charge. The transistor controls access to the
cell.
Advantages: DRAM is more dense and has a lower cost per bit than SRAM.
Disadvantages: DRAM requires periodic refreshing to maintain the stored charge due to
capacitor leakage. It also has a slower speed compared to SRAM.
Application: DRAM is commonly used for main system memory.
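The refresh behavior described above can be sketched in code. This is a minimal illustrative model (the class and decay numbers are invented for the example, not a real hardware interface): a DRAM cell's stored charge leaks over time, so the bit survives only if it is refreshed before the charge drops below the sensing threshold.

```python
# Illustrative sketch of a DRAM cell: the bit is stored as capacitor charge,
# which leaks and must be periodically refreshed. All values are made up
# to show the concept, not to model real device physics.

class DRAMCell:
    def __init__(self, bit):
        self.charge = 1.0 if bit else 0.0

    def leak(self):
        # Charge decays over time due to capacitor leakage.
        self.charge *= 0.5

    def refresh(self):
        # Periodic refresh: sense the still-readable value and rewrite it fully.
        self.charge = 1.0 if self.charge > 0.25 else 0.0

    def read(self):
        # Sense amplifier threshold: charge above 0.25 reads as a 1.
        return 1 if self.charge > 0.25 else 0

cell = DRAMCell(1)
cell.leak()
cell.refresh()                    # refreshed in time: bit survives
print(cell.read())                # -> 1
cell.leak(); cell.leak(); cell.leak()
print(cell.read())                # -> 0 (refresh missed, charge leaked away)
```

An SRAM cell, by contrast, needs no such refresh loop: its flip-flop holds the bit as long as power is supplied.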
Memory chips
They are integrated circuits that store data, either temporarily or permanently, within electronic
devices. They are crucial for enabling devices to function efficiently, storing data needed for quick
access and long-term storage.
Here's a more detailed look:
Key Aspects of Memory Chips:
Function:
Memory chips store and retrieve data, which is essential for various applications like running
programs, storing files, and holding operating systems.
Types:
There are different types of memory chips, each with its unique characteristics:
RAM (Random Access Memory): Used for temporary data storage, like when a
program is running. It requires power to maintain the data and is erased when the
device is turned off.
ROM (Read-Only Memory): Used for permanent data storage, like the BIOS (Basic
Input/Output System) in a computer, where the data is not erased when the power is
off.
Flash Memory: A type of non-volatile memory that can be rewritten and retains data
even when power is off, commonly found in USB drives and SSDs.
SRAM (Static RAM): Uses a more complex circuit to hold data, typically in a six-
transistor memory cell, and is faster than DRAM.
Associative Memory
An associative memory can be treated as a memory unit whose stored data can be accessed by the content of the data itself rather than by an address or memory location.
Associative memory is also known as Content Addressable Memory (CAM).
The block diagram of associative memory is shown in the figure. It includes a memory array and logic
for m words with n bits per word. The argument register A and key register K each have n bits, one
for each bit of a word.
The match register M has m bits, one for each memory word. Each word in memory is compared in parallel with the content of the argument register.
The words that match the bits of the argument register set a corresponding bit in the match register. After the matching process, the bits that are set in the match register indicate that their corresponding words have matched.
Reading is accomplished by sequential access to memory for those words whose corresponding bits in the match register have been set.
The key register provides a mask for selecting a particular field or key in the argument word. The entire argument is compared with each memory word if the key register contains all 1's. Otherwise, only those bits in the argument that have 1's in the corresponding positions of the key register are compared. Thus, the key provides a mask identifying the piece of information that determines how the reference to memory is made.
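The masked comparison described above can be sketched as follows. This is an illustrative software model (real CAM hardware compares all words in parallel; the loop here stands in for that): words are compared with the argument register A only in the bit positions where the key register K holds a 1, and the result is the match register M.

```python
# Sketch of associative (content-addressable) search: each stored word is
# compared against argument register A, but only in the bit positions
# selected by key register K. Returns the match register M, one bit per word.

def cam_match(memory, A, K):
    M = []                                    # match register: one bit per word
    for word in memory:
        # A 1-bit in K means "compare this position"; K of all 1's
        # compares the whole argument.
        M.append(1 if (word & K) == (A & K) else 0)
    return M

memory = [0b1011, 0b0110, 0b1010, 0b1111]     # m = 4 words, n = 4 bits each
A = 0b1010                                    # argument register
K = 0b1110                                    # mask: compare the three high bits only
print(cam_match(memory, A, K))                # -> [1, 0, 1, 0]
```

Words 0b1011 and 0b1010 match because they agree with A in the three masked-in high bit positions; the low bit is ignored.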
Cache memory Organization
Cache Memory is a special very high-speed memory. It is used to speed up memory access and keep pace with the high-speed CPU. Cache memory is costlier than main memory or disk memory but more economical than CPU registers. It is an extremely fast memory type that acts as a buffer between RAM and the CPU, holding frequently requested data and instructions so that they are immediately available to the CPU when needed.
Cache memory is used to reduce the average time to access data from the Main memory. The cache
is a smaller and faster memory which stores copies of the data from frequently used main memory
locations. There are various different independent caches in a CPU, which store instructions and
data.
Cache Performance:
When the processor needs to read or write a location in main memory, it first checks for a
corresponding entry in the cache.
If the processor finds that the memory location is in the cache, a cache hit has occurred and the data is read from the cache.
If the processor does not find the memory location in the cache, a cache miss has occurred.
For a cache miss, the cache allocates a new entry and copies in data from main memory,
then the request is fulfilled from the contents of the cache.
The performance of cache memory is frequently measured in terms of a quantity called Hit ratio.
Hit ratio = hit / (hit + miss) = no. of hits/total accesses
We can improve cache performance by using a larger cache block size, higher associativity, reducing the miss rate, reducing the miss penalty, and reducing the time to hit in the cache.
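The hit-ratio formula above can be demonstrated with a toy simulation. This is only an illustrative sketch (the cache here is a plain list with FIFO replacement, chosen for brevity; real caches use the mapping schemes described below):

```python
# Illustrative sketch: compute the hit ratio for a stream of memory accesses
# against a tiny cache, modeled as a list with FIFO replacement.

def hit_ratio(accesses, cache_size):
    cache, hits = [], 0
    for addr in accesses:
        if addr in cache:
            hits += 1                         # cache hit
        else:
            if len(cache) == cache_size:      # cache miss on a full cache:
                cache.pop(0)                  # evict the oldest entry (FIFO)
            cache.append(addr)
    # Hit ratio = hits / (hits + misses) = hits / total accesses
    return hits / len(accesses)

accesses = [1, 2, 1, 3, 1, 2, 4, 1]
print(hit_ratio(accesses, cache_size=2))      # -> 0.125
```

Only one of the eight accesses hits with a two-entry cache on this access pattern, giving a hit ratio of 1/8.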
Cache Mapping:
There are three different types of mapping used for the purpose of cache memory which are as
follows: Direct mapping, Associative mapping, and Set-Associative mapping. These are explained
below.
1. Direct Mapping –
The simplest technique, known as direct mapping, maps each block of main memory into only one possible cache line.
In direct mapping, each memory block is assigned to a specific line in the cache. If a line is already occupied by a memory block when a new block needs to be loaded, the old block is discarded. The address space is split into two parts: an index field and a tag field. The cache stores the tag field, whereas the rest is stored in main memory. Direct mapping's performance is directly proportional to the hit ratio.
For purposes of cache access, each main memory address can be viewed as consisting of three fields. The least significant w bits identify a unique word or byte within a block of main memory. In most contemporary machines, the address is at the byte level. The remaining s bits specify one of the 2^s blocks of main memory. The cache logic interprets these s bits as a tag of s-r bits (the most significant portion) and a line field of r bits. This latter field identifies one of the m = 2^r lines of the cache.
Application of Cache Memory –
1. Usually, the cache memory can store a reasonable number of blocks at any given time, but
this number is small compared to the total number of blocks in the main memory.
2. The correspondence between the main memory blocks and those in the cache is specified by
a mapping function.
What is the Function of Cache?
A cache is a data storage area on your device that may be utilized to reduce load times. Caches are frequently embedded into an app's architecture.
Browsing the Internet is simply a never-ending information exchange. Every click on your phone or
laptop, whether it is a browser or another program, is a request for certain bits of data, and
everything you see on your screen is an answer to one of those requests.
However, displaying an entire website, with all of its pictures and code, might take a long time. Web
browsers remember some of that information and utilize it the next time you visit that same page to
speed up the process. This is referred to as caching.
Of course, cached data systems are not only for web browsers. Cache memory is used by every
device and its programs to speed up data access. They don't, however, make the same use of stored
data. While the disc cache is used to store data in advance for device functioning, browsers and
other apps preserve data from the earlier activity so that particular sites and page components may
be loaded faster.
What Role Does Cache Play in Improving User Experience?
It is primarily about speed. Browsers, applications, and operating systems may show information
more quickly, thanks to their caches. But it isn't the only advantage.
By storing data locally, you can conserve bandwidth by not having to download the same information
many times. You may also use applications or see information in offline mode because data is already
on your smartphone.
What Does "Clear Cache" Mean?
When you clear your cache, you are erasing the data that is automatically saved to your device when
you visit a new website or open an app. You might do this if you are running out of space on your
device or if it is running slower than normal. Depending on how many websites or applications you
browse or use, you may have a large amount of cache on your smartphone that is slowing it down.
Because the process is automated, as soon as you access those websites or applications again, your
device will begin to re-download the information. Clearing your cache is just temporary, and it will
need to be done on a regular basis to keep cache storage capacity at a minimum.
What Happens When You Clear Cache Data?
Cache data can accumulate over time, especially if you use many sites or apps and have never
emptied your cache. It is possible to free up space on your smartphone by clearing your cache. It also
won't affect the speed of websites or applications; it'll only take a few seconds longer for them to
load the first time you visit them.
A site will occasionally stop operating, and deleting the cache will resolve the issue. For example, a coworker may be unable to post articles to a website; suggesting that they erase the browser cache may fix the problem.
What is the benefit of this? To put it bluntly, there is occasionally a discrepancy between the version of a website cached (stored) on your computer and the version you're loading from the internet. This mismatch can cause strange issues, and emptying your cache can help if nothing else does. In the example above, the backend of the website had recently been modified, which was most likely the cause of the conflict.
Signing on to public Wi-Fi may also be hampered by the cache.
Do I Need to Clean My Cache On a Frequent Basis?
It is generally advisable not to empty your cache unless you have a compelling reason to do so. The
cache files allow the most frequently visited websites to load faster, which is a positive thing. Your
browser will remove outdated files on a regular basis, so the cache will not expand indefinitely.
It is true that the cache takes up space on your hard drive, which some people find inconvenient.
However, the purpose of a hard drive is to store data, and a cache that speeds up web browsing
appears to be a reasonable use of hard drive space.