Cache herein refers to storage for the results of internet browsing, located on the cache disk. The cache size and the choice of cache replacement algorithm influence system speed and client access throughput. Arbitrarily deleting objects during cache replacement can evict frequently used objects, causing misses when they are requested again. To improve throughput, an algorithm combining three cache replacement algorithms, FIFO-LRU-LFU, is proposed. The new algorithm constructively combines the advantages of the three different algorithms. An analysis was conducted of the effect of cache size on hit-rate percentage, response time, delay time, and throughput when the FIFO-LRU-LFU algorithm is implemented. The results indicate improved bandwidth efficiency in the cache replacement process compared with either a single algorithm or a two-algorithm combination, reaching a hit-rate percentage of 87%. As a result, bandwidth use and delay time decrease, which increases the hit rate. In addition, throughput increases relative to the two-algorithm combination by approximately 95% with a 100 MB cache and 83% with a 200 MB cache. This throughput increase improves bandwidth efficiency and reduces delay time to only 25.6% with a 100 MB cache and 11.8% with a 200 MB cache. These results favour the adoption of FIFO-LRU-LFU as the cache replacement algorithm.
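The abstract does not publish the exact rule by which the FIFO, LRU, and LFU signals are combined. As a minimal illustrative sketch (not the authors' implementation), one way to blend the three policies is to track, per cached object, its access frequency (LFU), last-access time (LRU), and insertion time (FIFO), and evict the object with the lowest combined score. The class name, field names, and tie-breaking order below are all assumptions for illustration only:

```python
class CombinedCache:
    """Hypothetical sketch of a FIFO-LRU-LFU hybrid cache.

    Eviction picks the entry with the smallest (frequency, last-used,
    inserted) tuple: LFU is the primary criterion, with LRU recency and
    FIFO insertion order as successive tie-breakers. This combination
    rule is an assumption; the paper does not specify it.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}      # key -> cached object
        self.freq = {}       # key -> access count (LFU signal)
        self.last_used = {}  # key -> logical time of last access (LRU signal)
        self.inserted = {}   # key -> logical time of insertion (FIFO signal)
        self.clock = 0       # logical clock shared by all three signals

    def _tick(self):
        self.clock += 1
        return self.clock

    def get(self, key):
        if key not in self.store:
            return None                       # cache miss
        self.freq[key] += 1                   # update LFU signal
        self.last_used[key] = self._tick()    # update LRU signal
        return self.store[key]

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            # Evict the entry with the lowest combined score.
            victim = min(
                self.store,
                key=lambda k: (self.freq[k], self.last_used[k], self.inserted[k]),
            )
            for table in (self.store, self.freq, self.last_used, self.inserted):
                del table[victim]
        now = self._tick()
        self.store[key] = value
        self.freq.setdefault(key, 0)
        self.last_used[key] = now
        self.inserted.setdefault(key, now)    # FIFO signal fixed at insertion
```

For example, with a capacity of two, inserting `a` and `b`, touching `a` once, then inserting `c` evicts `b`, since `b` has the lowest access frequency.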

Original language: English
Pages (from-to): 9013-9020
Number of pages: 8
Journal: Journal of Engineering and Applied Sciences
Issue number: Special issue 10
Publication status: Published - 2017


  • Cache
  • Cache replacement algorithm
  • Hit rate
  • Internet networking
  • Proxy server
  • Throughput


Performance improvement of Proxy Server Cache replacement by combination FIFO-LRU-LFU algorithms
