In this paper, the cache is the repository of browsing results stored on the proxy server's disk. The size of the cache and the choice of cache replacement algorithm affect system speed: deleting the wrong object during replacement can evict the most frequently used objects and cause misses on subsequent requests. In this study, we propose a method of improving throughput by combining the FIFO (First In, First Out) and LRU (Least Recently Used) cache replacement algorithms. We analyzed the effect of cache size on hit rate percentage, response time, delay time, and throughput when the combined FIFO-LRU algorithm is applied. The results indicate improved bandwidth efficiency compared with either algorithm alone, as shown by a 73% throughput improvement with a 200 MB cache. The combined algorithm also reduces bandwidth usage and delay time while minimizing the miss rate and increasing the hit rate.
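The abstract does not spell out how the two policies are combined. One plausible reading, sketched below in Python, is a two-segment cache: newly admitted objects enter a FIFO queue, a repeat hit promotes an object to an LRU segment, and eviction drains the FIFO segment before touching LRU victims. The class name, segment sizes, and promotion rule here are illustrative assumptions, not the paper's exact design.

```python
from collections import OrderedDict


class FifoLruCache:
    """Hypothetical FIFO-LRU combination (illustrative, not the paper's
    exact algorithm): new objects queue in a FIFO segment; a second hit
    promotes an object to an LRU segment; eviction prefers the FIFO queue."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.fifo = OrderedDict()  # insertion-ordered: evict oldest first
        self.lru = OrderedDict()   # access-ordered: evict least recently used

    def get(self, key):
        if key in self.lru:
            self.lru.move_to_end(key)      # refresh recency on hit
            return self.lru[key]
        if key in self.fifo:
            value = self.fifo.pop(key)     # second hit: promote to LRU
            self.lru[key] = value
            return value
        return None                        # miss

    def put(self, key, value):
        if key in self.lru:
            self.lru[key] = value
            self.lru.move_to_end(key)
            return
        if key in self.fifo:
            self.fifo[key] = value
            return
        if len(self.fifo) + len(self.lru) >= self.capacity:
            # Evict from the FIFO segment first; fall back to the LRU victim.
            if self.fifo:
                self.fifo.popitem(last=False)
            else:
                self.lru.popitem(last=False)
        self.fifo[key] = value
```

Under this scheme, objects requested only once never displace frequently reused objects in the LRU segment, which is one way a FIFO-LRU hybrid could reduce the miss rate relative to FIFO or LRU alone.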

Original language: English
Pages (from-to): 710-715
Number of pages: 6
Journal: ARPN Journal of Engineering and Applied Sciences
Issue number: 3
Publication status: Published - 2017


Keywords:
  • Cache size
  • FIFO
  • Hit rate
  • LRU
  • Miss
  • Throughput


Title: Combination of FIFO-LRU cache replacement algorithms on proxy server to improve speed of response to object requests from clients
