DocumentCode :
3753109
Title :
Content Piece Rarity Aware In-Network Caching for BitTorrent
Author :
Daishi Kondo;HyunYong Lee;Akihiro Nakao
Author_Institution :
Univ. of Tokyo, Tokyo, Japan
fYear :
2015
Firstpage :
1
Lastpage :
6
Abstract :
By constructing a topology-agnostic overlay network, BitTorrent generates redundant inter-AS traffic that increases operational costs. One promising way to eliminate this redundant traffic is to use an in-network cache. Although the LRU algorithm is widely used for cache eviction in practice and is believed to perform well in most cases, we posit that LRU may lead to suboptimal performance in the context of BitTorrent. This is because BitTorrent adopts the so-called rarest-first algorithm for exchanging content pieces, whereas LRU exploits temporal locality, which renders it suboptimal for BitTorrent. In this paper, we propose a method consisting of two steps: (1) inference of the content pieces likely to be requested in the near future and (2) a content piece rarity aware caching strategy for BitTorrent. Concretely, our network node infers rare pieces transparently to BitTorrent applications by inspecting HAVE/BITFIELD messages within the network and assigns a high caching priority to rare pieces. Simulation results show that our approach increases the cache hit ratio by 6.9% up to 45.7% compared with LRU. In particular, the smaller the cache size, the more effective our proposed caching algorithm is relative to LRU.
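(Illustrative sketch, not from the paper.) The abstract outlines a rarity-aware caching strategy in which piece availability is inferred from intercepted HAVE/BITFIELD messages and rare pieces are kept in the cache preferentially. The following minimal Python sketch shows one way such a policy could work; all class and method names are hypothetical and the eviction rule (drop the most widely available cached piece first) is an assumption consistent with the abstract, not the authors' implementation.

```python
from collections import defaultdict

class RarityAwareCache:
    """Hypothetical rarity-aware in-network cache for BitTorrent pieces.

    Availability counts come from observed HAVE/BITFIELD messages; when the
    cache is full, the most widely available (least rare) piece is evicted,
    so rare pieces -- which rarest-first peers are likely to request soon --
    stay cached.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.availability = defaultdict(int)  # piece index -> # of peers holding it
        self.store = {}                       # piece index -> cached payload

    def observe_have(self, piece_index):
        # A HAVE message: one more peer now holds this piece.
        self.availability[piece_index] += 1

    def observe_bitfield(self, bitfield):
        # A BITFIELD message: count every piece the announcing peer holds.
        for piece_index, has_piece in enumerate(bitfield):
            if has_piece:
                self.availability[piece_index] += 1

    def put(self, piece_index, payload):
        if piece_index in self.store:
            return
        if len(self.store) >= self.capacity:
            # Evict the cached piece seen on the most peers (the least rare one).
            most_common = max(self.store, key=lambda p: self.availability[p])
            del self.store[most_common]
        self.store[piece_index] = payload

    def get(self, piece_index):
        # Returns the payload on a cache hit, None otherwise.
        return self.store.get(piece_index)
```

The key difference from LRU in this sketch is that eviction is driven by availability rather than recency: under rarest-first piece selection, a rare piece is the one peers are most likely to request next, so retaining it raises the chance of a cache hit.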
Keywords :
"Peer-to-peer computing","Payloads","Indexes","Inference algorithms","IP networks","Context","Internet"
Publisher :
ieee
Conference_Titel :
Global Communications Conference (GLOBECOM), 2015 IEEE
Type :
conf
DOI :
10.1109/GLOCOM.2015.7416996
Filename :
7416996