Performance is one of the most complicated aspects of the application development cycle. It determines the fate of your application and demands continuous attention and improvement at every stage.
Every application has two sides of the same coin: the content it displays, and the underlying infrastructure where it is hosted. The goal is a balance between the two - making your application load fast while keeping the load on your infrastructure light.
To bring this theory into reality, you cannot depend on a single tool or system. Let's look at the different systems that can be used at different levels, each for a specific reason.
Here is a quick look at the Cache Layers of web applications:
01. Browser Cache
The browser cache can be enabled for performance improvement and faster page delivery in the browser. But when you are already utilizing services such as Cloudfront, Varnish, Redis, and DB caches, the browser cache can create a problem by showing stale content.
Hence, it is best not to rely on the browser cache, as web applications have no (or minimal) control over purging it.
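Since the application cannot purge a visitor's browser cache, the practical lever is the `Cache-Control` response header. A minimal sketch (the function name and TTL values are illustrative, not a specific framework's API):

```python
def browser_cache_headers(cacheable: bool, max_age: int = 0) -> dict:
    """Build response headers that keep the browser cache short-lived.

    'no-store' forbids the browser from keeping a copy at all;
    'must-revalidate' forces a check with the server once max_age expires.
    """
    if not cacheable:
        return {"Cache-Control": "no-store"}
    # Allow only a brief browser cache, then force revalidation.
    return {"Cache-Control": f"max-age={max_age}, must-revalidate"}
```

Any web framework that lets you set response headers can apply the same values; the upstream layers (CDN, reverse proxy) then remain the caches you actually control.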
02. Content Delivery Network
Options: Cloudflare, Cloudfront, Akamai, Fastly.
A CDN decentralizes your cached data and dramatically reduces direct hits to the origin server.
This way, the CDN serves the cached copies of your web pages from the data center nearest to the visitor, over the shortest possible route.
CDN caches the full page and is useful for anonymous users.
We need to configure exactly what kind of data needs to be cached and for how long.
Example
- Bypass the cache for admin pages
- Bypass the cache for anything specific to user sessions
- Cache everything (excluding the above two specs)
The rules above will cache your entire site except the admin pages and pages specific to active user sessions.
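The three rules above can be sketched as a decision function that picks the `Cache-Control` value a CDN would honor. The `/admin` path prefix and the one-hour TTL are assumptions for illustration; real CDNs express the same logic in their own rule or configuration syntax:

```python
def cdn_cache_decision(path: str, has_session: bool) -> str:
    """Return a Cache-Control value implementing the three CDN rules."""
    if path.startswith("/admin"):
        return "no-store"            # rule 1: bypass cache for admin pages
    if has_session:
        return "private, no-store"   # rule 2: bypass for user-session pages
    return "public, max-age=3600"    # rule 3: cache everything else
```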
03. HTTP Reverse Proxy
Options: Varnish, Nginx, Squid, and TrafficServer
An HTTP reverse proxy is also known as an HTTP accelerator. It caches content and serves it directly to browsers or the CDN while acting as a mediator between the client and the server. It speeds up content delivery primarily by skipping the data manipulation in the business and DB layers.
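The mediator role can be sketched as a tiny cache that sits in front of the origin: a hit is served from memory, a miss goes to the origin exactly once per TTL window. This is a toy model of what Varnish or Nginx does, not their actual configuration:

```python
import time

class ReverseProxyCache:
    """Toy HTTP accelerator: serve cached bodies, fetch from origin on miss."""

    def __init__(self, origin_fetch, ttl=60):
        self.origin_fetch = origin_fetch  # callable(url) -> response body
        self.ttl = ttl                    # seconds a cached body stays fresh
        self.store = {}                   # url -> (body, cached_at)

    def get(self, url):
        entry = self.store.get(url)
        if entry and time.time() - entry[1] < self.ttl:
            return entry[0]               # cache hit: origin is never touched
        body = self.origin_fetch(url)     # cache miss: one trip to the origin
        self.store[url] = (body, time.time())
        return body
```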
04. In-Memory Cache
Options: Memcache, Redis, Amazon DynamoDB, Apache Ignite
In-memory caching temporarily stores data in dynamic memory (RAM) and enables significantly faster data retrieval in cases where the application follows common and repetitive data access patterns.
How far such a system can take you depends on the volume of dynamic data and the scalability it needs to achieve.
If the web application caches pages internally, that internal cache can be disabled and an in-memory cache such as Memcache configured instead.
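The usual access pattern is cache-aside: check RAM first, fall back to the database only on a miss. In this sketch a plain dict stands in for a Memcache/Redis client (an assumption - real clients expose similar get/set semantics over the network):

```python
cache = {}  # stand-in for a Memcache/Redis client

def get_user_profile(user_id, load_from_db):
    """Cache-aside read: RAM first, database only on a miss."""
    key = f"user:{user_id}"
    if key in cache:
        return cache[key]             # fast path: served from memory
    profile = load_from_db(user_id)   # slow path: hit the database once
    cache[key] = profile              # keep it for the next request
    return profile
```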
05. DB Query Caching
Options: MySQL, Postgres, MariaDB, MongoDB, DB2
Query caching is a mechanism in which frequently queried data is stored temporarily in memory. Instead of hitting the DB engine and re-running the operation, the queried data is served from the cache.
It is highly efficient in high-read, low-write situations, which describes most websites.
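Database engines implement this internally, but the mechanism can be sketched at the application level: identical queries are answered from memory, and any write flushes the cache so reads never return stale rows. This uses SQLite only as a convenient stand-in for the engines listed above:

```python
import sqlite3

query_cache = {}  # (sql, params) -> rows; stands in for the engine's cache

def cached_query(conn, sql, params=()):
    """High-read path: identical queries skip the DB engine entirely."""
    key = (sql, params)
    if key in query_cache:
        return query_cache[key]
    rows = conn.execute(sql, params).fetchall()
    query_cache[key] = rows
    return rows

def write_and_invalidate(conn, sql, params=()):
    """Low-write path: any write flushes the cache to avoid stale reads."""
    conn.execute(sql, params)
    conn.commit()
    query_cache.clear()
```

Flushing the whole cache on every write is the crude-but-safe choice; it is exactly why this scheme only pays off when reads vastly outnumber writes.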
06. Cache Purging
In simple words, caching generates a static copy of the web page and serves it to the client without making any dynamic calls to the servers (app / DB).
Purging the cache means that the next time a client requests the page, it will get the latest (new) content regenerated by the underlying layers, whether the cache lives in an HTTP proxy, an in-memory store, or the DB.
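The purge cycle can be sketched in a few lines: a cached render skips the dynamic layers, and purging an entry forces the next request to rebuild it. The function names here are illustrative:

```python
pages = {}  # path -> rendered static copy of the page

def render(path, build_page):
    """Serve the cached copy; make dynamic calls only on a cold cache."""
    if path not in pages:
        pages[path] = build_page(path)  # app/DB layers run only here
    return pages[path]

def purge(path):
    """Drop the static copy so the next render() rebuilds it fresh."""
    pages.pop(path, None)
```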
Not every web application needs all the cache layers explained here. Based on the application, the audience it serves, and the nature of its visitors, layers can be added or removed to achieve the right performance efficiency.