DocumentCode :
322254
Title :
A novel demand prefetching algorithm based on Volterra adaptive prediction for virtual memory management systems
Author :
Mumolo, Enzo ; Bernardis, Giulia
Author_Institution :
Dipt. di Elettrotecnica, Elettronica ed Inf., Trieste Univ., Italy
Volume :
5
fYear :
1997
fDate :
7-10 Jan 1997
Firstpage :
160
Abstract :
The performance of a virtual memory system is largely determined by the quality of its memory management policy. The "demand fetch" policy is one of the most popular, mainly for its simplicity. However, at the expense of increased complexity, other policies can be devised. In this paper, a novel approach of relatively low complexity is described for determining a suitable set of pages to bring into memory when a page fault occurs. This algorithm is an example of how the overall performance of a complex system can be improved with little computational effort. To anticipate their future use, pages are selected using a nonlinear predictor based on the truncated Volterra series. The Volterra predictor is updated every time a new page reference arrives. We first give experimental evidence that page reference sequences contain nonlinearities which can be described by a Volterra predictor. We then show how the predictor's performance is improved by exploiting temporal and spatial localities in the trace, on the basis of the page-reference histogram and an input LRU (least recently used) stack filter. When a page fault occurs, a number of pages around the predicted page are brought into memory, in addition to the page which caused the fault, replacing pages chosen on an LRU basis in the same section. Trace-driven simulations show that this algorithm reduces page faults by as much as 10.9% with respect to a conventional demand paging algorithm with the same working set (WS) dimension. Some results in terms of page fault rate vs. WS dimension are reported.
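The adaptive prediction step described above can be illustrated with a minimal sketch. This is not the authors' exact method: the memory length, LMS-style learning rate, and feature layout are illustrative assumptions; it shows only the general shape of a second-order truncated Volterra predictor that is updated on every new page reference.

```python
# Sketch of a second-order truncated Volterra predictor over a page-reference
# sequence, adapted with an LMS-style update (illustrative parameters, not
# the paper's exact algorithm).

def volterra_features(history):
    """Bias, linear terms, and all second-order products of the history."""
    feats = [1.0]                      # bias (zeroth-order kernel)
    feats += list(history)             # first-order (linear) kernel inputs
    n = len(history)
    for i in range(n):
        for j in range(i, n):
            feats.append(history[i] * history[j])  # second-order kernel inputs
    return feats

def adapt_predict(trace, memory=3, mu=1e-4):
    """Run the adaptive predictor over a page trace; return per-step predictions."""
    nfeat = 1 + memory + memory * (memory + 1) // 2
    w = [0.0] * nfeat                  # Volterra kernel coefficients
    preds = []
    for n in range(memory, len(trace)):
        x = volterra_features(trace[n - memory:n])
        y_hat = sum(wi * xi for wi, xi in zip(w, x))
        preds.append(round(y_hat))     # predicted next page number
        err = trace[n] - y_hat         # update on every new page reference
        w = [wi + mu * err * xi for wi, xi in zip(w, x)]
    return preds
```

In the paper's scheme, the predicted page would then seed a prefetch of neighboring pages on a fault; here only the predictor itself is sketched.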
Keywords :
Volterra series; paged storage; prediction theory; sequences; Volterra adaptive prediction; complexity; demand paging algorithm; demand prefetching algorithm; input LRU stack filter; least recently used stack filter; memory management policy; nonlinear predictor; nonlinearities; page fault rate; page reference sequences; page references histogram; performance; spatial localities; temporal localities; trace-driven simulations; truncated Volterra series; virtual memory system; working set dimension; Bandwidth; Cache memory; Costs; Filters; Histograms; Linearity; Memory management; Prediction algorithms; Prefetching;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the Thirtieth Hawaii International Conference on System Sciences, 1997
Conference_Location :
Wailea, HI
ISSN :
1060-3425
Print_ISBN :
0-8186-7743-0
Type :
conf
DOI :
10.1109/HICSS.1997.663171
Filename :
663171