Title :
A case for buffer servers
Author :
Anderson, Darrell ; Yocum, Ken ; Chase, Jeff
Author_Institution :
Dept. of Comput. Sci., Duke Univ., Durham, NC, USA
Abstract :
Faster networks and cheaper storage have brought us to a point where I/O caching servers play an important role in the design of scalable, high-performance file systems. These intermediary I/O servers, or buffer servers, can be deployed at strategic points in the network, interposed between clients and data sources such as standard file servers, Internet data servers, and tertiary storage. Their purpose is to provide a fast and incrementally scalable I/O service throughout the network while reducing and smoothing demands on shared data servers and the network backbone. This paper outlines a case for caching buffer servers and addresses some of the key technical challenges in the design of a buffer service. We also describe the role of buffer servers in the Trapeze project, which uses Gbit/s networks as a vehicle for high-speed network I/O.
Keywords :
cache storage; client-server systems; file servers; I/O caching servers; Trapeze project; buffer servers; demand smoothing; high-speed network I/O; incrementally scalable I/O service; network backbone; scalable high-performance file systems; shared data servers; buffer storage; file systems; high-speed networks; IP networks; network servers; smoothing methods; Web server
Conference_Title :
Proceedings of the Seventh Workshop on Hot Topics in Operating Systems, 1999
Conference_Location :
Rio Rico, AZ, USA
Print_ISBN :
0-7695-0237-7
DOI :
10.1109/HOTOS.1999.798382