Cache Terms and Definitions
Table 2. Terms and Definitions (Continued)
Term Definition
Tag
A storage element containing the most-significant bits of the address stored in a
particular line. Tag addresses are stored in special tag memories that are not directly
visible to the CPU. The cache queries the tag memories on each access to determine
if the access is a hit or a miss.
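To make the tag/set relationship concrete, the following minimal C sketch shows one
common way an address can be split into a line offset, a set index, and a tag. The
geometry used (64-byte lines, 128 sets) is an assumption for illustration only and is
not a statement about any particular C64x cache level.

/* Sketch: splitting an address into offset, set index, and tag.
 * LINE_SIZE and NUM_SETS are assumed example values, not actual C64x parameters. */
#include <stdint.h>
#include <stdio.h>

#define LINE_SIZE   64u                 /* bytes per cache line (assumed)   */
#define NUM_SETS    128u                /* number of sets       (assumed)   */

int main(void)
{
    uint32_t addr   = 0x80012345u;                   /* example cacheable address */
    uint32_t offset = addr % LINE_SIZE;              /* byte within the line      */
    uint32_t set    = (addr / LINE_SIZE) % NUM_SETS; /* set the line maps to      */
    uint32_t tag    = addr / (LINE_SIZE * NUM_SETS); /* stored in the tag memory  */

    printf("offset=%u set=%u tag=0x%x\n",
           (unsigned)offset, (unsigned)set, (unsigned)tag);
    return 0;
}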
Thrash
An algorithm is said to thrash the cache when its access pattern causes the
performance of the cache to suffer dramatically. Thrashing can occur for multiple
reasons. One possible situation is that the algorithm is accessing too much data or
program code in a short time frame with little or no reuse; that is, its working set is
too large, so the algorithm causes a significant number of capacity misses. Another
situation is that the algorithm repeatedly accesses a small group of different
addresses that all map to the same set in the cache, causing an artificially high
number of conflict misses.
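As an illustration of the conflict-miss case, the C sketch below repeatedly reads
addresses that are spaced exactly one way size apart, so they all map to the same set;
with more such addresses than there are ways, each pass evicts lines the next pass
still needs. The 8K-byte way size and the buffer count are assumptions for illustration.

/* Sketch of a thrashing access pattern. WAY_SIZE is an assumed example value. */
#include <stdint.h>

#define WAY_SIZE  (8 * 1024)    /* bytes covered by one way (assumed)            */
#define NUM_BUFS  8             /* more conflicting addresses than ways (assumed) */

static volatile uint8_t memory[NUM_BUFS * WAY_SIZE];

uint32_t thrashing_sum(void)
{
    uint32_t sum = 0;
    for (int iter = 0; iter < 1000; iter++) {
        /* These addresses differ by multiples of WAY_SIZE, so they compete for
         * the same set; each pass evicts lines the next pass still needs,
         * producing conflict misses instead of reuse. */
        for (int b = 0; b < NUM_BUFS; b++) {
            sum += memory[b * WAY_SIZE];
        }
    }
    return sum;
}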
Touch
A memory operation on a given address is said to touch that address. Touch can also
refer to reading array elements or other ranges of memory addresses for the sole
purpose of allocating them in a particular level of the cache. A CPU-centric loop used
for touching a range of memory in order to allocate it into the cache is often referred
to as a touch loop. Touching an array is a form of software-controlled prefetch for data.
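A touch loop of the kind described above can be written as a short C routine. The
sketch below is a minimal illustration, assuming a 64-byte cache line size; the actual
line size depends on the cache level being targeted.

/* Sketch of a touch loop: read one element per cache line so the buffer is
 * allocated into the cache before it is used. The line size is assumed. */
#include <stddef.h>
#include <stdint.h>

#define CACHE_LINE_SIZE  64   /* bytes per line (assumed) */

uint32_t touch(const volatile uint8_t *buf, size_t len)
{
    uint32_t sum = 0;
    /* One read per line is enough to allocate the whole line. */
    for (size_t i = 0; i < len; i += CACHE_LINE_SIZE)
        sum += buf[i];
    return sum;   /* returning the sum keeps the reads from being optimized away */
}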
Valid
When a cache line holds data that has been fetched from the next level of memory,
that line frame is valid. The invalid state occurs when the line frame holds no data,
either because nothing has been cached yet, or because previously cached data has
been invalidated for whatever reason (coherence protocol, program request, etc.). The
valid state implies nothing about whether the data has been modified since it was
fetched from the lower-level memory; that is indicated by the dirty or clean state of
the line.
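As a rough illustration, the per-line-frame state described here can be pictured as a
small record of status bits kept alongside the tag. The type and field names below are
illustrative only and do not describe the actual C64x tag RAM layout.

#include <stdint.h>

/* Illustrative per-line-frame state; not the actual C64x tag RAM layout. */
typedef struct {
    uint32_t tag;        /* most-significant address bits of the cached line        */
    unsigned valid : 1;  /* 1: frame holds data fetched from the next-level memory  */
    unsigned dirty : 1;  /* 1: data modified since it was fetched; clean otherwise  */
} line_state_t;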
Victim
When space is allocated in a set for a new line and all of the line frames in the set
that the address maps to contain valid data, the cache controller must select one of
the valid lines to evict in order to make room for the new data. Typically, the
least-recently used (LRU) line is selected. The line that is evicted is known as the
victim line. If the victim line is dirty, its contents are written to the next lower level of
memory using a victim writeback.
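The following minimal C sketch illustrates the victim-selection step described above
for a single set, assuming a 2-way set and a simple LRU encoding; the type, field
names, and geometry are illustrative only.

/* Sketch of victim selection in one set. WAYS and the lru encoding are assumed. */
#include <stdint.h>

#define WAYS 2

typedef struct {
    uint32_t tag;
    uint8_t  valid;
    uint8_t  dirty;
    uint8_t  lru;   /* higher value = accessed longer ago (assumed encoding) */
} frame_t;

int choose_victim(const frame_t set[WAYS])
{
    int victim = 0;

    /* An invalid frame can be filled without evicting anything. */
    for (int w = 0; w < WAYS; w++)
        if (!set[w].valid)
            return w;

    /* All frames valid: select the least-recently-used line as the victim. */
    for (int w = 1; w < WAYS; w++)
        if (set[w].lru > set[victim].lru)
            victim = w;

    /* If set[victim].dirty, its contents go to the next lower level of memory
     * (a victim writeback) before the frame is reused. */
    return victim;
}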
Victim Buffer
A special buffer that holds victims until they are written back. Victim lines are moved
to the victim buffer to make room in the cache for incoming data.
Victim Writeback
When a dirty line (that is, a line with updated data) is evicted, the updated data is
written to the lower levels of memory. This process is referred to as a victim
writeback.
Way
In a set-associative cache, each set in the cache contains multiple line frames. The
number of line frames in each set is referred to as the number of ways in the cache.
The collection of corresponding line frames across all sets in the cache is called a
way in the cache. For instance, a 4-way set-associative cache has 4 ways, and each
set in the cache has 4 line frames associated with it, one for each of the 4 ways. As a
result, any given cacheable address in the memory map has 4 possible
locations it can map to in a 4-way set-associative cache.
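The sketch below illustrates this in C: an address selects exactly one set, and its tag
is compared against all 4 ways of that set, so a hit can occur in only one of those
4 line frames. The geometry (64-byte lines, 128 sets) and the tag-array layout are
assumptions for illustration.

/* Sketch of a lookup in a 4-way set-associative cache. Geometry is assumed. */
#include <stdint.h>

#define LINE_SIZE  64u
#define NUM_SETS   128u
#define WAYS       4u

typedef struct {
    uint32_t tag;
    int      valid;
} tag_entry_t;

static tag_entry_t tag_ram[NUM_SETS][WAYS];   /* tag memory, not visible to the CPU */

/* Returns the way that hits (0..3), or -1 on a miss. */
int lookup(uint32_t addr)
{
    uint32_t set = (addr / LINE_SIZE) % NUM_SETS;
    uint32_t tag = addr / (LINE_SIZE * NUM_SETS);

    /* The address can only reside in one of the 4 line frames of this set. */
    for (uint32_t w = 0; w < WAYS; w++) {
        if (tag_ram[set][w].valid && tag_ram[set][w].tag == tag)
            return (int)w;
    }
    return -1;   /* miss: a line frame in this set must be allocated */
}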