Title :
Extending data prefetching to cope with context switch misses
Author :
Cui, Hanyu ; Sair, Suleyman
Abstract :
Among the various costs of a context switch, its impact on the performance of L2 caches is the most significant because of the resulting high miss penalty. To reduce the impact of frequent context switches, we propose restoring a program's locality by prefetching into the L2 cache the data the program was using before it was swapped out. A Global History List is used to record a process's L2 read accesses in LRU order. These accesses are saved along with the process's context when the process is swapped out and are loaded to guide prefetching when it is swapped in. We also propose a feedback mechanism that greatly reduces the memory traffic incurred by our prefetching scheme. Experiments show significant speedup over baseline architectures, with and without traditional prefetching, in the presence of frequent context switches.
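The Global History List described above can be illustrated with a minimal conceptual sketch. This is not the authors' hardware implementation; the class name, capacity, and method names are assumptions chosen for illustration. It models an LRU-ordered record of L2 read accesses that is saved at swap-out and replayed as prefetch candidates at swap-in:

```python
from collections import OrderedDict

class GlobalHistoryList:
    """Conceptual sketch (not the paper's hardware design) of a Global
    History List: records a process's L2 read accesses in LRU order,
    most recently used last."""

    def __init__(self, capacity=8):
        self.capacity = capacity          # number of block addresses retained (illustrative)
        self.entries = OrderedDict()      # block address -> None, kept in LRU order

    def record_access(self, block_addr):
        # Move a re-accessed block to the MRU position; evict the LRU
        # entry when the list is full.
        if block_addr in self.entries:
            self.entries.move_to_end(block_addr)
        else:
            if len(self.entries) >= self.capacity:
                self.entries.popitem(last=False)
            self.entries[block_addr] = None

    def save(self):
        # Saved alongside the process context at swap-out (LRU first).
        return list(self.entries)

    @staticmethod
    def prefetch_candidates(saved):
        # At swap-in, issue prefetches for the saved addresses,
        # most recently used first.
        return list(reversed(saved))
```

In the paper's scheme this state would live in hardware and be spilled to memory with the context; the sketch only shows the LRU bookkeeping and the swap-out/swap-in handoff.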
Keywords :
cache storage; memory architecture; switches; L2 cache; LRU order; context switch; data prefetching; memory traffic reduction; Arithmetic; Costs; Delay; History; Operating systems; Prefetching; State feedback; Streaming media; Switches; Yarn;
Conference_Titel :
2009 IEEE International Conference on Computer Design (ICCD 2009)
Conference_Location :
Lake Tahoe, CA
Print_ISBN :
978-1-4244-5029-9
ISSN :
1063-6404
DOI :
10.1109/ICCD.2009.5413144