• The purpose of cache memory is to speed up
accesses by storing recently used data closer to the
CPU than main memory is.
• Although cache is much smaller than main memory,
its access time is a fraction of that of main memory.
• Unlike main memory, which is accessed by address,
cache is typically accessed by content; hence, it is
often called content addressable memory.
• Because of this, a single large cache memory isn't
always desirable: the larger the cache, the longer it
takes to search.
• The “content” that is addressed in content
addressable cache memory is a subset of the bits of
a main memory address called a field.
• The fields into which a memory address is divided
provide a many-to-one mapping between larger
main memory and the smaller cache memory.
• Many blocks of main memory map to a single block
of cache. A tag field in the cache block
distinguishes one cached memory block from
another.
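As a sketch of how an address divides into fields, assume (hypothetically; these sizes are not from the text) 16-bit addresses, a cache of 16 blocks, and 8-byte blocks. The low 3 bits are then the offset within a block, the next 4 bits select the cache block, and the remaining 9 bits form the tag:

```python
# Hypothetical parameters: 16-bit addresses, 16 cache blocks, 8-byte blocks.
OFFSET_BITS = 3   # log2(8 bytes per block)
BLOCK_BITS = 4    # log2(16 cache blocks)

def split_address(addr):
    """Split a main memory address into its (tag, block, offset) fields."""
    offset = addr & ((1 << OFFSET_BITS) - 1)
    block = (addr >> OFFSET_BITS) & ((1 << BLOCK_BITS) - 1)
    tag = addr >> (OFFSET_BITS + BLOCK_BITS)
    return tag, block, offset

# Example: address 0x1A5F -> tag 52, cache block 11, byte offset 7.
tag, block, offset = split_address(0x1A5F)
```

Because many distinct tags pair with the same block field, this split is exactly the many-to-one mapping described above.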
• The simplest cache mapping scheme is direct
mapped cache.
• In a direct mapped cache consisting of N blocks of
cache, block X of main memory maps to cache block
Y = X mod N.
• Thus, if we have 10 blocks of cache, cache block 7
may hold main memory blocks 7, 17, 27, 37, . . . .
• Once a block of memory is copied into its slot in
cache, a valid bit is set for the cache block to let the
system know that the block contains valid data.
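The lookup procedure above can be sketched at block granularity. This is a minimal illustration, not a hardware description: it uses the text's N = 10, and the `fetch` callback standing in for a main memory read is a placeholder of our own.

```python
N = 10  # number of cache blocks, matching the text's example

# Each cache slot holds a valid bit, a tag, and the block data;
# every slot starts out invalid.
cache = [{"valid": False, "tag": None, "data": None} for _ in range(N)]

def access(mem_block, fetch):
    """Access main memory block mem_block; fetch(b) loads it on a miss."""
    slot = cache[mem_block % N]      # direct mapping: Y = X mod N
    tag = mem_block // N             # distinguishes blocks sharing this slot
    if slot["valid"] and slot["tag"] == tag:
        return slot["data"], True    # hit: valid bit set and tags match
    # Miss: copy the block into its slot, record its tag, set the valid bit.
    slot["data"] = fetch(mem_block)
    slot["tag"] = tag
    slot["valid"] = True
    return slot["data"], False

# Blocks 7, 17, 27, ... all compete for cache slot 7.
data, hit = access(7, lambda b: f"block-{b}")    # miss: cold cache
data, hit = access(7, lambda b: f"block-{b}")    # hit
data, hit = access(17, lambda b: f"block-{b}")   # miss: replaces block 7
```

Note that the last access silently evicts block 7, which is exactly why the tag comparison is needed on every lookup.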
Tuesday, June 2, 2009