Monday, September 6, 2021

Set Associative Cache Memory

 

What is Cache Memory?

A small section of SRAM placed between the processor and main memory to speed up program execution is known as cache memory. Cache memory is a special high-speed memory that acts as a buffer between the CPU and main memory.

It helps increase the effective processing speed of the CPU by quickly supplying the data of the program currently in execution. A cache memory system therefore combines a small amount of fast memory (SRAM) with a large amount of slow memory (DRAM).

Set Associative Cache Memory

Set associative mapping is a combination of direct mapping and fully associative mapping. The cache contains several groups of direct-mapped blocks that operate as direct-mapped caches in parallel. A block of data from any page in the main memory can go into a particular block location of any one of these direct-mapped caches.

Hence the contention problem of the direct-mapped technique (the drawback of a direct-mapped cache) is reduced, because there are now a few choices for block placement. The number of address comparisons required depends on the number of direct-mapped caches in the cache system, and it is always less than the number of comparisons required in fully associative mapping (the drawback of a fully associative cache).
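To make the comparison count concrete, here is a minimal C sketch (not from the original article; the sizes and names are illustrative assumptions) of a k-way lookup: the set index selects one set, and only the tags stored in that set's ways are compared, rather than every tag in the cache as in fully associative mapping.

    /* Minimal sketch of a k-way set-associative lookup (illustrative
     * sizes, not taken from the article's figure). The set index picks
     * one set; only the NUM_WAYS tags stored in that set are compared,
     * instead of every tag in the cache as in fully associative mapping. */
    #include <stdbool.h>

    #define NUM_SETS 64
    #define NUM_WAYS 2              /* two-way set associative */

    struct line { bool valid; unsigned tag; };
    static struct line cache[NUM_SETS][NUM_WAYS];

    /* Returns true on a cache hit; performs at most NUM_WAYS tag comparisons. */
    static bool lookup(unsigned set, unsigned tag) {
        for (int way = 0; way < NUM_WAYS; way++)
            if (cache[set][way].valid && cache[set][way].tag == tag)
                return true;        /* hit */
        return false;               /* miss */
    }

With NUM_WAYS set to 1 this degenerates into a direct-mapped lookup, and with a single set holding every line it behaves like a fully associative lookup, which is why set associative mapping sits between the two.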

The diagram shows a two-way set-associative cache. Each page in the main memory is organized so that the size of a page is the same as the size of one direct-mapped cache. It is called a two-way set-associative cache because each block from main memory has two choices for block placement.

 

Figure: Set Associative Cache Memory

As there are two choices, it is necessary to compare the address of the memory block with the tag bits of the two block locations of the selected set. Thus, for a two-way set-associative cache, we require two comparisons to determine whether a given block is in the cache. To implement the set-associative cache system, the address is divided into three fields, as shown in the diagram. The 4-bit word field selects one of the 16 words in a block. The set field needs 6 bits to select the desired set from 64 sets. There are also 64 pages, so six tag bits are required to identify which page a block belongs to.
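As a rough illustration of this address split, the C sketch below decodes a 16-bit address into the 6-bit tag, 6-bit set, and 4-bit word fields described above; the example address itself is an arbitrary assumption.

    /* Decoding a 16-bit address for the two-way set-associative cache
     * described above:  | 6-bit tag | 6-bit set | 4-bit word |        */
    #include <stdint.h>
    #include <stdio.h>

    #define WORD_BITS 4     /* 16 words per block */
    #define SET_BITS  6     /* 64 sets            */

    static unsigned word_of(uint16_t addr) { return addr & 0xF; }
    static unsigned set_of(uint16_t addr)  { return (addr >> WORD_BITS) & 0x3F; }
    static unsigned tag_of(uint16_t addr)  { return (addr >> (WORD_BITS + SET_BITS)) & 0x3F; }

    int main(void) {
        uint16_t addr = 0x0A53;                     /* arbitrary example address */
        printf("tag=%u set=%u word=%u\n",
               tag_of(addr), set_of(addr), word_of(addr));   /* tag=2 set=37 word=3 */
        return 0;
    }

The set field chooses one of the 64 sets, and only then are the two stored tags of that set compared against the 6-bit tag of the address.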

Which cache memory mapping technique is best?

Because there are two direct-mapped caches, any two blocks having the same offset but coming from different pages can be in the cache at the same time. This improves the hit ratio of the cache system. For this reason, set-associative mapping is generally a better compromise than either fully associative or direct mapping.
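A small self-contained demo (with hypothetical set and tag values) can show this: two blocks with the same set index but different tags, that is, the same offset in two different pages, are resident in a two-way set at the same time, whereas a direct-mapped cache would have had to evict one of them.

    /* Two blocks with the same set index but different tags coexist in a
     * two-way set; a direct-mapped cache could hold only one of them.   */
    #include <stdbool.h>
    #include <stdio.h>

    #define NUM_SETS 64
    #define NUM_WAYS 2

    static struct line { bool valid; unsigned tag; } cache[NUM_SETS][NUM_WAYS];

    static bool hit(unsigned set, unsigned tag) {
        for (int w = 0; w < NUM_WAYS; w++)
            if (cache[set][w].valid && cache[set][w].tag == tag)
                return true;
        return false;
    }

    int main(void) {
        cache[37][0] = (struct line){ true, 2 };    /* block from page 2, set 37 */
        cache[37][1] = (struct line){ true, 5 };    /* block from page 5, set 37 */
        printf("page 2 block: %s\n", hit(37, 2) ? "hit" : "miss");   /* hit */
        printf("page 5 block: %s\n", hit(37, 5) ? "hit" : "miss");   /* hit */
        return 0;
    }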

Click here to watch the video on Set Associative Cache Memory.

To watch more videos, click here.

