Speed Up Your Website With Cache

It's quick, it's easy, and it's free.

If you're like me, you've been running your website without server-side caching for 7 years. We've got plenty of posts and traffic now, so I figured it's about time to do something about that. By enabling nginx caching, I was able to cut page load times by 80%. If your backend runs on Node (or any proxied app server), this is probably for you.

Create an nginx cache directory; I chose /var/cache/nginx/:

mkdir -p /var/cache/nginx

Add a proxy_cache_path directive to the http context in /etc/nginx/nginx.conf:

http {
	...
	proxy_cache_path /var/cache/nginx/ keys_zone=my_cache:10m inactive=4h levels=1:2;
}

Here, we name the keys_zone my_cache and give it a size of 10MB. This is the shared-memory zone for cache keys and metadata, not the cached responses themselves; nginx's documentation estimates roughly 8,000 keys per megabyte. The inactive timeout of 4 hours means files are removed from the cache if they are not accessed for 4 hours, regardless of any Cache-Control headers. levels creates a two-tier directory structure so no single directory accumulates a huge number of cache files, which keeps filesystem performance reasonable.
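To picture what levels=1:2 does on disk: nginx names each cache file after the MD5 hash of its cache key and nests it under directories taken from the tail of that hash. Here's a quick sketch of that layout (the hashing scheme is from nginx's docs; the helper and the example key are purely illustrative — the real key is controlled by proxy_cache_key):

```python
import hashlib

def nginx_cache_path(root, key, levels=(1, 2)):
    """Illustrative only: mimic how nginx lays out cache files on disk."""
    # nginx names each cache file after the MD5 of its cache key
    h = hashlib.md5(key.encode()).hexdigest()
    dirs, pos = [], len(h)
    for n in levels:  # levels=1:2 -> one 1-char dir, then one 2-char dir
        dirs.append(h[pos - n:pos])
        pos -= n
    return "/".join([root.rstrip("/")] + dirs + [h])

# hypothetical key; prints something like /var/cache/nginx/c/29/<md5>
print(nginx_cache_path("/var/cache/nginx", "httpGETcako.io/"))
```

So with levels=1:2, a file never sits more than two small directories deep, and lookups stay fast even with tens of thousands of cached entries.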

Add caching directives to your desired location contexts in /etc/nginx/sites-enabled/default:

server {
	...
	location = / {
		...
		proxy_ignore_headers Cache-Control;
		proxy_hide_header Cache-Control;
		add_header 'Cache-Control' "public, max-age=7200";
		proxy_cache my_cache;
		proxy_cache_valid 10m;
	}
}

Perhaps there are only specific pages you want to cache, so you can set up your location blocks accordingly. I'm only caching the home page of cako.io, since I make frequent edits to posts in the minutes and hours after I publish them, and I want people to always see the latest version.
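If you do want to cache more than the home page, the same directives work in any location block — for example, a prefix match on a hypothetical /posts/ path (the path and values here are illustrative, not from my actual config):

```nginx
location /posts/ {
	...
	proxy_ignore_headers Cache-Control;
	proxy_hide_header Cache-Control;
	add_header 'Cache-Control' "public, max-age=7200";
	proxy_cache my_cache;
	proxy_cache_valid 10m;
}
```

Note that the exact-match location = / above only matches the home page itself; prefix locations like this one let you bring whole sections of the site under the cache.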

Cache-Control headers from your backend can interfere with nginx's caching, so we both ignore them (proxy_ignore_headers tells nginx itself to disregard them in the response from the app) and strip them from the response (proxy_hide_header), then add our own Cache-Control header telling the client that cached versions of pages are good for 120 minutes. The browser will try its locally cached copy first, and only hit the server cache after 120 minutes or when the user hard-refreshes.

proxy_cache enables caching using the my_cache zone we created, and proxy_cache_valid sets the actual TTL for these cache entries to 10 minutes. This means that if a file in the cache is more than 10 minutes old, nginx will proxy the request along to fetch the latest page from your app.
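To confirm the cache is actually working, you can expose nginx's $upstream_cache_status variable in a response header and watch it flip from MISS to HIT (the X-Cache-Status header name is just a convention, not anything nginx requires):

```nginx
location = / {
	...
	add_header X-Cache-Status $upstream_cache_status;
}
```

Then request your home page twice with curl -I: the first response should report MISS, the second HIT, until the 10-minute proxy_cache_valid window lapses and it becomes EXPIRED.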

Finally, check the configuration and restart nginx:

sudo nginx -t
sudo service nginx restart

Done!

Enabling server-side caching means that nginx serves cached versions of files when available, instead of passing every request along to the proxied app. In practice, this makes your site significantly faster: expensive operations like database queries only run when necessary, and every request that follows gets a speedy cached response.

This is the quickest way to speed up your website without any new code or infrastructure. cako.io still runs on a single DigitalOcean VPS in the NYC3 zone, as it has since I started it 7 years ago! This keeps changes and updates to the site super fast, and it gives me the most freedom for system and application configuration. I've found this simple SSH and VSCode workflow very gratifying compared to other cloud options with more complicated infrastructure and containerization. It's just faster and easier to iterate on, and it reduces the complexity to a single Linux host.