Direct Mapped In Sum:

  • A particular memory item is stored in a unique location in the cache.
  • To check whether a particular memory item is in the cache, the index bits
    of its address are used to select the cache entry.
  • The top address bits are then compared with the stored tag; if they are
    equal, we have a hit (a small C sketch after this list illustrates the
    lookup).
  • Two items with the same cache index field will contend for use of that
    location.
  • Only those address bits which are not used to select a word within the
    line or to index the cache RAM need to be stored in the tag field.
  • When a miss occurs, data cannot be read from the cache. A slower read
    from the next level of memory must take place, incurring a miss penalty.
  • A cache line is typically more than one word; the diagram here shows 4
    words per line. A large cache line exploits the principle of spatial
    locality, giving more hits for sequential accesses, but it also incurs a
    higher miss penalty.
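
As a rough illustration of the lookup described above, here is a minimal
sketch in C of a direct-mapped cache. The sizes are assumptions chosen for
the example (a 32-bit byte address, 4-word / 16-byte lines, 256 lines), not
figures taken from the diagram, and read_line_from_memory is a dummy
stand-in for the slower next level of memory.

    #include <stdbool.h>
    #include <stdint.h>

    /* Illustrative parameters: 4 words (16 bytes) per line, 256 lines.
       With a 32-bit byte address this gives a 4-bit offset, an 8-bit
       index and a 20-bit tag; only the tag is stored with each line.  */
    #define WORDS_PER_LINE 4
    #define LINE_BYTES     (WORDS_PER_LINE * 4)
    #define NUM_LINES      256

    typedef struct {
        bool     valid;                   /* has this line been filled?  */
        uint32_t tag;                     /* top address bits            */
        uint32_t data[WORDS_PER_LINE];    /* the cached words            */
    } cache_line_t;

    static cache_line_t cache[NUM_LINES];

    /* Split a byte address into its word-offset, index and tag fields. */
    static uint32_t word_offset(uint32_t addr) { return (addr >> 2) & (WORDS_PER_LINE - 1); }
    static uint32_t line_index(uint32_t addr)  { return (addr / LINE_BYTES) % NUM_LINES; }
    static uint32_t line_tag(uint32_t addr)    { return addr / (LINE_BYTES * NUM_LINES); }

    /* Dummy stand-in for the slower next level of the memory hierarchy. */
    static void read_line_from_memory(uint32_t addr, uint32_t *dst)
    {
        for (int i = 0; i < WORDS_PER_LINE; i++)
            dst[i] = addr + 4u * (uint32_t)i;     /* fake data for the sketch */
    }

    /* Direct-mapped read: exactly one candidate line, one tag comparison. */
    uint32_t cache_read(uint32_t addr)
    {
        cache_line_t *line = &cache[line_index(addr)];

        if (!line->valid || line->tag != line_tag(addr)) {
            /* Miss: pay the miss penalty by fetching the whole line,
               evicting whatever item previously occupied this slot.   */
            read_line_from_memory(addr & ~(uint32_t)(LINE_BYTES - 1), line->data);
            line->tag   = line_tag(addr);
            line->valid = true;
        }
        return line->data[word_offset(addr)];
    }

With this split only the 20 tag bits (plus a valid bit) are stored per line,
two addresses that differ only in their tag map to the same slot and evict
each other, and a sequential walk through memory misses on the first word of
each line and then hits on the remaining three.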

 


