When should I add caching to my serverless API?
A client recently asked about adding cloud-side API caching, wondering whether they could and should be using it in their AppSync API.
My short answer is to always default to no caching, irrespective of which service you're using to host your API.
Motivations for adding caching
First let’s look at the reasons why you might be considering adding caching to your API:
- There is latency on some requests that is noticeably affecting the user experience
- The billing cost of hitting the downstream service behind the cache has become an issue
- Availability of a downstream service (and thus of your API) is threatened by the API’s peak throughput
These are all valid reasons, but there’s more to consider before making your decision.
The costs and risks of adding caching
As soon as you decide to implement caching for your API, you’re now on the hook for building and supporting it. This has several costs that you need to take into account.
Implementation costs of caching:
- Writing code to evict stale items from the cache (see the sketch after this list)
- Ensuring that this cache eviction logic is applied to all the places where the origin data can be changed
- Testing that your cache eviction logic is correct
- Capacity planning for what size of cache you need
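To make the first two items concrete, here's a minimal sketch of what eviction logic tends to look like. It assumes a Redis-compatible cache (via ioredis) and a DynamoDB table called `Products`; the table name, key schema, and cache key format are all illustrative, not prescribed:

```typescript
// Illustrative only: evict the stale cache entry whenever the origin data changes.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, UpdateCommand } from "@aws-sdk/lib-dynamodb";
import Redis from "ioredis";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));
const cache = new Redis(process.env.CACHE_ENDPOINT!);

export async function updateProductName(id: string, name: string): Promise<void> {
  // 1. Write the change to the origin data store.
  await ddb.send(
    new UpdateCommand({
      TableName: "Products",
      Key: { id },
      UpdateExpression: "SET #n = :name",
      ExpressionAttributeNames: { "#n": "name" },
      ExpressionAttributeValues: { ":name": name },
    })
  );

  // 2. Evict the now-stale cache entry. Every code path that mutates a
  //    product must remember this step; forgetting it in just one place
  //    is how stale-data bugs are born.
  await cache.del(`product:${id}`);
}
```

The testing burden follows directly from step 2: every mutation path needs a test asserting that the eviction actually happened.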
Operational and maintenance costs and risks of caching:
- Difficult-to-reproduce bugs related to users seeing stale data
- Second order bugs caused by developers wrongly assuming an API request returned fresh data
- Deciding how your API should behave if the cache becomes unavailable (see the fallback sketch after this list)
- Cache billing is pay-per-hour rather than the pay-per-use serverless model, which makes it less cost-effective for low-traffic environments such as per-developer stacks
- Disabling caching in development or test environments means fewer bugs get discovered at dev time
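On the availability question, one common mitigation is to treat the cache as an optimisation rather than a dependency on the read path. Here's a sketch, reusing the same hypothetical table and cache client as above, where a cache outage degrades to slower responses instead of errors:

```typescript
// Illustrative only: if the cache is down, log it and go straight to the origin.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, GetCommand } from "@aws-sdk/lib-dynamodb";
import Redis from "ioredis";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));
const cache = new Redis(process.env.CACHE_ENDPOINT!);

export async function getProduct(
  id: string
): Promise<Record<string, unknown> | undefined> {
  try {
    const cached = await cache.get(`product:${id}`);
    if (cached) return JSON.parse(cached);
  } catch (err) {
    // Cache unreachable: fall through to the origin rather than turning
    // a cache outage into an API outage.
    console.warn("Cache read failed, falling back to origin", err);
  }

  const { Item } = await ddb.send(
    new GetCommand({ TableName: "Products", Key: { id } })
  );

  if (Item) {
    try {
      // Best-effort write-back with a 5-minute TTL; swallow cache errors here too.
      await cache.set(`product:${id}`, JSON.stringify(Item), "EX", 300);
    } catch {
      // Serving slightly slower beats serving errors.
    }
  }
  return Item;
}
```

The design choice here is deliberate: both cache operations are wrapped so that the only hard dependency of the request is the origin itself.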
Most of these items fall into the fuzzy "engineers' time" cost center, which unfortunately doesn't make for as simple an argument as "we can shave 50ms off every request if we add caching to this route". And so well-meaning developers often pre-emptively implement caching because "this route will get a lot of traffic once we launch".
Are you really going to need it?
Of the reasons for implementing caching, user-facing latency is the one that holds most weight with me, as this often contributes directly to the value proposition of your product. But if your downstream service is already highly scalable and extremely fast, like DynamoDB, say, do you really need the extra few milliseconds that caching gives you?
On the billing cost argument, you will need to either be getting a ton of traffic or be using an expensive downstream service before the cost of hitting it outweighs the implementation and operational costs of adding caching in front of it.
And finally, having a downstream service which isn’t capable of handling large volumes of traffic is the exception in my experience of building serverless apps on AWS. While caching could be a good solution here, I would still tend towards deploying it reactively rather than proactively.
Wait until it hurts before adding caching
This advice may seem negligent, and I'm certainly not advocating ignorance or a lack of forward planning; by all means, learn and understand how you might implement caching within your API in the future. But assuming you have appropriate monitoring set up in your AWS account for slow requests and cost thresholds, the problem to which caching is a potential solution will find you when it's ready.
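As one example of the kind of monitoring that will surface the problem for you, here's a sketch using AWS CDK v2 that alarms on sustained p99 latency for an API Gateway stage (AppSync publishes a comparable Latency metric). The 500ms threshold, period, and API name are placeholders to tune for your own workload, not recommendations:

```typescript
// Illustrative only: alarm when p99 latency stays above 500 ms for three
// consecutive 5-minute periods.
import { Duration } from "aws-cdk-lib";
import { Alarm, ComparisonOperator, Metric } from "aws-cdk-lib/aws-cloudwatch";
import { Construct } from "constructs";

export function addSlowRequestAlarm(scope: Construct, apiName: string): Alarm {
  const p99Latency = new Metric({
    namespace: "AWS/ApiGateway",
    metricName: "Latency",
    dimensionsMap: { ApiName: apiName },
    statistic: "p99",
    period: Duration.minutes(5),
  });

  return new Alarm(scope, "SlowRequestsAlarm", {
    metric: p99Latency,
    threshold: 500, // milliseconds
    evaluationPeriods: 3,
    comparisonOperator: ComparisonOperator.GREATER_THAN_THRESHOLD,
  });
}
```

When an alarm like this starts firing regularly, you have the concrete latency numbers to weigh against the costs listed above, and the caching decision makes itself.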
Further reading
- Implementing caching in AWS API Gateway (AWS docs)
- Implementing caching in AWS AppSync (AWS docs)
- How caching works with CloudFront edge locations (AWS docs)