Web Cache (a.k.a. HTTP Cache) is temporary storage for frequently accessed static data such as HTML, CSS, and images, used to reduce latency and server load.
Web caching is the process of storing frequently accessed data closer to users to speed up web page/content delivery. The first time a user visits a web page, the original page is fetched and cached; subsequent requests for the same content are answered with the cached copy, reducing the server load. Cache servers are also refreshed at regular intervals to make sure the cached copy of the content is not stale, i.e. that the freshest copy of the content is cached and returned.
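The fetch-on-miss flow described above can be sketched in a few lines of Python. Here `fetch_from_origin()` is a hypothetical stand-in for a real HTTP request to the origin server, not an actual library call:

```python
cache = {}  # url -> cached response body

def fetch_from_origin(url):
    # Hypothetical stand-in for a real HTTP request to the origin server.
    return f"<html>content of {url}</html>"

def get(url):
    if url in cache:               # cache hit: serve the stored copy
        return cache[url]
    body = fetch_from_origin(url)  # cache miss: go to the origin server
    cache[url] = body              # store it for subsequent requests
    return body

first = get("https://example.com/")   # fetched from origin, then cached
second = get("https://example.com/")  # served from the cache
assert first == second
```

The first call populates the cache; every later call for the same URL is served without touching the origin server.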
Types of Web Cache
Web content can be cached at various locations between the origin server and the client. It can be cached in the client's browser or user agent, or a separate proxy server can be used to cache static content.
- Forward Cache (Browser Cache) – A forward cache is stored outside the web server, e.g. in the client's browser or at the Internet Service Provider (ISP). It stores heavily accessed content for reuse with low latency. The browser cache is limited to one user and can store private responses specific to that user. Example – when you press the back button in the browser, instead of re-requesting the page, the browser returns the cached copy. Another example: if you browse Twitter on a poor connection, the basic layout of Twitter (the cached copy) loads immediately, but the actual tweets take time to appear.
- Reverse Cache (Caching Web Servers) – A reverse cache, or caching proxy, sits in front of one or more web servers or web apps and replicates the most visited content on caching servers at many different locations. Since many users request the same popular content (say, a trending video), its hit rate is higher than that of a browser cache.
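The private/shared distinction above is what the HTTP `Cache-Control` directives `private` and `public` encode: a shared (reverse/proxy) cache must not store a `private` response, while a browser cache may. Below is a deliberately simplified sketch of that storage decision, far short of the full rules in RFC 9111:

```python
def may_store(cache_control: str, shared_cache: bool) -> bool:
    """Decide whether a cache may store a response (simplified RFC 9111 rules)."""
    directives = {d.strip().lower() for d in cache_control.split(",")}
    if "no-store" in directives:
        return False               # nobody may store the response
    if shared_cache and "private" in directives:
        return False               # user-specific: browser cache only
    return True

# A user-specific response (e.g. a logged-in page) stays out of shared caches:
assert may_store("private, max-age=600", shared_cache=True) is False
assert may_store("private, max-age=600", shared_cache=False) is True
# A popular public asset may be stored anywhere:
assert may_store("public, max-age=86400", shared_cache=True) is True
```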
Web Cache Terminologies
The following are the important terms used in web caching.
- Origin Server – The origin server (a.k.a. web server) is the main server where the actual content is stored. The content held by a caching server is retrieved from the origin server. If content is not found in the caching server (a cache miss), it is fetched from the origin server.
- Cache Performance – Cache performance is measured in terms of the hit ratio: the number of requests served from the cache divided by the total number of requests made. The greater the hit ratio, the better the cache performance.
- Freshness – Freshness describes whether the content in the cache is still in sync with the origin server and can be used without rechecking with the origin server. Only fresh content is used by the caching server as a response to a request.
- Validation – Expired (no longer fresh) content is known as a stale item. Once a cache entry becomes stale, validation is done with the origin server to check whether the stale item still represents the most recent version of the content.
- Invalidation – Invalidation is the removal of an item from the caching server, done to make sure the user does not get an outdated copy of the content. If a POST, PUT or DELETE request is made to a resource identified by a URL, the resource on the origin server has been updated, so the cached copy is no longer valid and will be invalidated.
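The terms above can be tied together in one small sketch: a cache that tracks its hit ratio, treats entries younger than `max_age` as fresh, revalidates stale entries with an ETag (a 304 means the stale copy is still current), and invalidates entries when an unsafe method updates the resource. The `Origin` class here is a hypothetical stand-in for a real origin server:

```python
import time

class Origin:
    """Hypothetical origin server: stores an (etag, body) pair per URL."""
    def __init__(self):
        self.store = {}   # url -> (etag, body)

    def get(self, url, if_none_match=None):
        etag, body = self.store[url]
        if if_none_match == etag:
            return 304, etag, None      # Not Modified: stale copy still valid
        return 200, etag, body

class Cache:
    def __init__(self, origin, max_age=60):
        self.origin, self.max_age = origin, max_age
        self.entries = {}               # url -> (etag, body, stored_at)
        self.hits = self.requests = 0

    def hit_ratio(self):
        return self.hits / self.requests if self.requests else 0.0

    def get(self, url, now=None):
        now = time.time() if now is None else now
        self.requests += 1
        entry = self.entries.get(url)
        if entry:
            etag, body, stored_at = entry
            if now - stored_at < self.max_age:      # fresh: no recheck needed
                self.hits += 1
                return body
            # Stale: validate with the origin before reuse.
            status, etag, new_body = self.origin.get(url, if_none_match=etag)
            if status == 304:                       # still current: reuse it
                self.entries[url] = (etag, body, now)
                self.hits += 1
                return body
            body = new_body                         # content changed: replace
        else:
            _, etag, body = self.origin.get(url)    # cache miss: fetch
        self.entries[url] = (etag, body, now)
        return body

    def invalidate(self, url):
        """Called when a POST/PUT/DELETE updates `url` on the origin."""
        self.entries.pop(url, None)
```

For example, two requests within `max_age` give one miss and one hit (a hit ratio of 0.5), a later request triggers revalidation, and `invalidate()` forces the next request back to the origin.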
Benefits of Web Caching
- Faster delivery of web content to the user.
- Less Bandwidth consumption.
- Reduced workload on the server.
- Enhanced robustness of web services. If the remote server is unavailable due to a crash or some other reason, the client can still obtain a cached copy from the caching server.
- CPU Cache – A CPU cache is high-speed storage memory: a temporary storage area that lies between the processor and the main memory (RAM) of a computer for faster data retrieval. It stores copies of frequently used data, typically the results of previous accesses to main memory, and its contents need not be synchronized with main memory on every access.
- CDN – A CDN is a collection of caching servers in multiple geographical locations (a.k.a. points of presence, or PoPs) connected together. Each PoP contains a cached version of the website. When a user requests a web page, the CDN redirects the request to the server closest to the requesting user's location, decreasing latency and optimizing performance.
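The nearest-PoP routing idea can be illustrated with a toy example. The PoP coordinates below are made up, and real CDNs route via DNS/anycast and measured latency rather than straight-line distance, but the selection logic is the same in spirit:

```python
import math

# Hypothetical PoP locations as (latitude, longitude) pairs.
POPS = {
    "frankfurt": (50.11, 8.68),
    "virginia":  (38.95, -77.45),
    "singapore": (1.35, 103.82),
}

def closest_pop(user_lat, user_lon):
    """Pick the PoP with the smallest (rough, non-great-circle) distance."""
    def dist(name):
        lat, lon = POPS[name]
        return math.hypot(lat - user_lat, lon - user_lon)
    return min(POPS, key=dist)

# A user in Paris (48.86 N, 2.35 E) is routed to the Frankfurt PoP:
assert closest_pop(48.86, 2.35) == "frankfurt"
```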