
Effective Web Cache Pre-fetching Technique using Markov Chain

Urmi Vasishnav, Deven Agravat, Sanjay Garg

Abstract


With the rapid growth of the World Wide Web and the Internet, improving web server performance has become a problem of primary importance. When a web user requests a resource from a web server, the delay he or she experiences is known as User Perceived Latency (UPL). Web caching is an effective and economical technique for reducing UPL. Server performance can be improved further by predicting the next request from the current page and moving that resource into the web server's cache in advance; this concept is known as web cache pre-fetching. Pre-fetching based on a Markov tree and on a multidimensional matrix suffers from limitations in time complexity and memory consumption, respectively. In this paper, an attempt has been made to use a Markov model and the concept of stationary distribution for web cache pre-fetching. The proposed algorithms are compared with Least Recently Used (LRU), and empirical results from the implementation confirm that they outperform LRU.
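The abstract does not include the algorithms themselves, but the core idea it describes (estimate page-to-page transition probabilities from the access log, prefetch the most likely successors of the current page, and use the stationary distribution to rank pages by long-run popularity) can be illustrated with a minimal sketch. The session data, function names, and the uniform treatment of dangling pages below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed, not the paper's algorithm): a first-order Markov
# chain over page requests, used to pick prefetch candidates and to
# approximate the stationary distribution by power iteration.

from collections import defaultdict

def build_transition_matrix(sessions):
    """Estimate P[next page | current page] from observed request sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for cur, nxt in zip(session, session[1:]):
            counts[cur][nxt] += 1
    matrix = {}
    for cur, nxts in counts.items():
        total = sum(nxts.values())
        matrix[cur] = {nxt: c / total for nxt, c in nxts.items()}
    return matrix

def predict_prefetch(matrix, current_page, k=1):
    """Return the k most probable next pages to move into the cache."""
    nxts = matrix.get(current_page, {})
    return sorted(nxts, key=nxts.get, reverse=True)[:k]

def stationary_distribution(matrix, iterations=100):
    """Approximate the stationary distribution pi = pi * P by power iteration."""
    pages = sorted({p for cur, nxts in matrix.items() for p in [cur, *nxts]})
    pi = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_pi = {p: 0.0 for p in pages}
        for cur in pages:
            row = matrix.get(cur)
            if row is None:  # dangling page: assume a uniform next-page choice
                row = {p: 1.0 / len(pages) for p in pages}
            for nxt, prob in row.items():
                new_pi[nxt] += pi[cur] * prob
        pi = new_pi
    return pi

# Example with a hypothetical access log of two short sessions.
sessions = [["home", "products", "cart"], ["home", "about", "home", "products"]]
P = build_transition_matrix(sessions)
print(predict_prefetch(P, "home", k=1))  # e.g. ['products']
print(stationary_distribution(P))        # long-run page popularity
```

In this sketch, `predict_prefetch` plays the role of the pre-fetching decision (which page to load into the cache given the current request), while the stationary distribution gives a global popularity ranking that could inform which entries are worth keeping, in contrast to a purely recency-based policy such as LRU.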

Keywords


Markov Model, User Perceived Latency, Web Cache Pre-Fetching




This work is licensed under a Creative Commons Attribution 3.0 License.