“Maybe we should cache this”. It’s a common thought when we notice a slow response or an expensive computation. Just as common - and even more frustrating - is the follow-up thought: “Must be the cache”. That moment when something returns an unexpected response, a stale value in a view, a wrong number in an API response, or a method returning outdated or just plain incorrect data. And the culprit? Caching gone wrong.
Most Rails apps are already leveraging caching from the get-go, even if it’s not immediately obvious. For example, digested assets (like stylesheets, JavaScript files, and logos) are fingerprinted and cached by the browser to avoid re-downloading them on every visit. Turbo Drive also caches full pages in memory by default, enabling near-instant back and forward navigation.
Let’s explore common caching strategies and gems available in Ruby on Rails, but first, let’s take a step back to understand the purpose of caching and where it fits into a well-architected Rails app. After all, anything can be cached, but not everything should be.
What is caching?
Caching is the practice of storing precomputed or previously retrieved data in a quickly accessible place so your application doesn’t have to regenerate or re-fetch it every time it’s needed. Instead, it uses the most recent stored value, whether it was generated preemptively or stored after a previous request finished.
Key points of caching
When thinking about caching, it helps to break it down into four key questions:
- Is caching the right solution? This is the most important one. Before caching, investigate the root cause of the slowness. Is it a slow database query? An N+1 problem? Inefficient code? Caching should be an optimization, not a fix for underlying issues.
- What should be cached? This could be data, partial or full-page views, API responses, or any computation-heavy output.
- Where should it be stored? In an in-memory data store (e.g., Redis), on a persistent disk (e.g., FileStore), in a relational database-backed store (e.g., Solid Cache), or even on the edge (CDNs, browser cache).
- How and when should it expire?
  - Automatically, using model-based versioning (like Rails’ `cache_key_with_version`).
  - Time-based expiration (e.g., `expires_in: 10.minutes`).
  - Manually, using event-driven logic, for example, when a user changes roles, or a record is updated or deleted.
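To make the expiration semantics concrete, here is a minimal, framework-free sketch of fetch-style caching with time-based expiration. `TinyCache` is purely illustrative; in a Rails app you would call `Rails.cache.fetch` directly, which behaves along these lines:

```ruby
# A tiny illustration of fetch-style caching with time-based expiration.
# Not Rails code -- just a sketch of the semantics of Rails.cache.fetch.
class TinyCache
  Entry = Struct.new(:value, :expires_at)

  def initialize
    @store = {}
  end

  # Returns the cached value for +key+ if present and still fresh;
  # otherwise runs the block, stores the result, and returns it.
  def fetch(key, expires_in: nil)
    entry = @store[key]
    if entry && (entry.expires_at.nil? || Time.now < entry.expires_at)
      entry.value # cache hit: the block never runs
    else
      value = yield # cache miss: do the expensive work once
      expires_at = expires_in ? Time.now + expires_in : nil
      @store[key] = Entry.new(value, expires_at)
      value
    end
  end
end

cache = TinyCache.new
calls = 0
cache.fetch("answer", expires_in: 600) { calls += 1; 42 } # computes
cache.fetch("answer", expires_in: 600) { calls += 1; 42 } # served from cache
# calls is 1: the expensive block ran only once
```

The key design point is that the cache key, not the block, identifies the work: two fetches with the same key share one computation until the entry expires.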
When and what should you be caching?
The purpose of caching is to prevent your application or server from doing repetitive, expensive work. The first step is identifying areas that are frequently accessed or involve data that’s requested repeatedly; if any of those operations are also slow, caching can significantly improve response times.
Slow or expensive database queries
Before you decide to cache a query, check for common performance issues like:
- Missing Indexes: Use `EXPLAIN ANALYZE` on your query to see if it’s performing full table scans. Adding the right database index is often the most effective fix to improve response times.
- N+1 Queries: Are you accidentally running hundreds of queries in a loop? Use tools like the `bullet` gem to detect and fix these by eager-loading associations.
- Inefficient SQL: Sometimes the query itself can be rewritten for better performance. For example, instead of counting associated records in a loop (e.g., `category.posts.count` for multiple categories), which triggers a separate query for each category, you can use Rails’ built-in `counter_cache` functionality to keep a running count stored directly on the parent record.
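As a sketch of the `counter_cache` setup (the `Category`/`Post` models and column names are assumptions for illustration):

```ruby
# Migration: add the counter column that Rails will keep up to date.
class AddPostsCountToCategories < ActiveRecord::Migration[8.0]
  def change
    add_column :categories, :posts_count, :integer, default: 0, null: false
  end
end

class Post < ApplicationRecord
  # Increments/decrements categories.posts_count on create/destroy,
  # so category.posts.size reads the column instead of running COUNT(*).
  belongs_to :category, counter_cache: true
end
```

With this in place, `category.posts.size` becomes a plain column read instead of a per-category `COUNT(*)` query.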
Once you’ve optimized the query, good candidates for caching are those that remain expensive and are frequently run, such as:
- Queries with multiple `JOINS` across large tables.
- Complex aggregations (e.g., `GROUP BY` with `SUM`, `AVG`, `COUNT`).
- Queries that power dashboards, reports, or leaderboards where the data doesn’t need to be real-time, such as weekly, monthly, or yearly summaries.
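As a sketch, a weekly summary like the ones above can be wrapped in `Rails.cache.fetch`. The `Order` model, the `total_cents` column, and the cache key are assumptions for illustration:

```ruby
# Cache an expensive GROUP BY aggregation that powers a weekly dashboard.
# Keying on the ISO week number makes the entry roll over naturally
# each week, while expires_in bounds how stale the numbers can get.
def weekly_sales_by_day
  Rails.cache.fetch(["reports", "weekly_sales", Date.current.cweek], expires_in: 1.hour) do
    Order.where(created_at: 1.week.ago..Time.current)
         .group("DATE(created_at)")
         .sum(:total_cents)
  end
end
```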
Application code that is doing heavy or extensive calculations
Serializations and transformations of all sorts (e.g., generating CSV files or data tables).
Before deciding to cache, you should check that you are only pulling the data you need, and see if any calculations can be refactored for improved response times.
Static content
Blog posts, documentation pages, landing pages, FAQs, and “About us” sections. Anything that looks the same to everyone and is not affected by authentication or authorization.
Frequently accessed API endpoints
Think endpoints like `/countries`, `/categories`, `/articles`, etc. Especially useful when the response doesn’t change often.
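For endpoints like these, HTTP conditional caching is often the simplest win. A sketch using Rails’ built-in `stale?` helper (the `CountriesController` and model are assumed names):

```ruby
class CountriesController < ApplicationController
  def index
    countries = Country.order(:name)
    # stale? sets ETag/Last-Modified headers from the collection and
    # returns false (sending 304 Not Modified) when the client's
    # cached copy is still current, skipping serialization entirely.
    render json: countries if stale?(countries)
  end
end
```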
When to be extra careful with caching
While caching is powerful, caching everything can easily get out of hand. You should be extra careful with:
- Rapidly changing data: Frequent invalidation defeats the purpose and may introduce race conditions.
- Responses relying heavily on authorization (Role-based or User specific): Places where the cache must be scoped to the current user or role. If not handled correctly, it can expose sensitive data to the wrong user.
- Non-deterministic logic: Avoid caching results based on current time, randomized values, or other volatile inputs unless they’re part of the cache key.
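For the authorization case above, one common pattern is to make the user (or role) part of the cache key so entries can never bleed between users. A sketch with assumed names:

```ruby
# Scope the cache key to the current user so cached data is never
# served to the wrong person. "dashboard" and the helper method
# are illustrative names, not from a real codebase.
def cached_dashboard_data
  Rails.cache.fetch(["dashboard", current_user.id], expires_in: 5.minutes) do
    build_dashboard_data_for(current_user)
  end
end
```

The trade-off is one cache entry per user, so this only pays off when the per-user computation is genuinely expensive.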
Where to cache in a Rails app
For a well-architected, conventional Rails app, caching can take place at several levels:
View or Fragment Caching
Cache partials or reusable UI components (e.g., a sidebar, navigation, a table of records). For example:
<% cache(@product) do %>
<!-- ... -->
<% end %>
Page Caching
Full HTML written to disk (the `public` folder). Great for public static pages. This allows your web server (e.g., Nginx, Apache) to serve the page directly, completely bypassing Rails.
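Page caching was extracted from Rails core into the `actionpack-page_caching` gem. A minimal sketch assuming that gem is installed and a hypothetical `PagesController`:

```ruby
# config/application.rb (or an environment file):
# tell the gem where to write the static HTML files.
config.action_controller.page_cache_directory =
  Rails.root.join("public", "cached_pages")

class PagesController < ApplicationController
  # Writes the rendered HTML to disk on the first request;
  # the web server can then serve it without touching Rails.
  caches_page :about
end
```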
Controller / Action Caching
Caches a controller action’s output but still lets filters (like `before_action`) run first, so authorization checks are applied before the cached response is served (useful for JSON API responses and/or tables). See Rails Action Caching.
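Like page caching, action caching now lives in a gem, `actionpack-action_caching`. A sketch assuming that gem and an illustrative controller:

```ruby
class PostsController < ApplicationController
  # Filters still run on every request, so unauthenticated users
  # are redirected before the cached body is ever served.
  before_action :authenticate_user!

  # The rendered action output is cached with a time-based expiry.
  caches_action :index, expires_in: 10.minutes
end
```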
Object / Model Caching
Store serialized or precomputed objects in the cache. See IdentityCache and CacheCrispies.
Low-Level Caching
Strings, hashes, arrays, precomputed results, etc. If you have a method that performs an expensive calculation with predictable output that rarely changes and has fixed inputs, it can be cached.
Often used in service objects, decorators, or model methods.
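A typical low-level caching sketch in a model method, keyed off the record’s version so updates invalidate it automatically (`Competitor::API` is an illustrative external service, not a real library):

```ruby
class Product < ApplicationRecord
  def competing_price
    # cache_key_with_version changes whenever the record is updated,
    # so stale entries are simply never read again.
    Rails.cache.fetch("#{cache_key_with_version}/competing_price", expires_in: 12.hours) do
      Competitor::API.find_price(id) # slow external call, done at most twice a day
    end
  end
end
```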
Cache stores: Where is it stored?
The most common caching setup in Rails involves storing data in memory. `MemoryStore` is local to a single Rails process, meaning that if you’re running multiple processes (which is typically the case when using Puma in clustered mode or Phusion Passenger), the cache won’t be shared across them.
If you want the cache to be shared across processes (or even across servers), you’ll need something like `Redis` or `Memcached`, which run as separate services and are accessed over the network, making them suitable for distributed setups.
`FileStore` is also commonly used in development environments because it stores cached data in the file system, which gives developers easy visibility and control over it. One important exception is page caching, which always writes static HTML files directly to disk, regardless of the configured cache store.
Starting with Rails 8, the default cache store is now Solid Cache, a database-backed store designed to be fast, reliable, and simple to use out of the box. Unlike in-memory stores, Solid Cache persists cached data to disk via your primary database and is optimized to take advantage of modern NVMe SSDs, which keep getting closer and closer to RAM-level performance while being far more affordable and scalable. I highly recommend checking out Rails’ official YouTube video on Solid Cache if you want a deeper walkthrough.
To sum it up, `FileStore` is fine for local development, but in production, choose a centralized, network-accessible cache like `Redis`, and let your infrastructure do the heavy lifting.
Ultimately, the best cache store depends on your application’s architecture. If you’re already using Redis for background jobs, reusing it for caching keeps things simple. If you’re starting fresh, Solid Cache is an excellent default that works seamlessly with your database and scales well as your application grows.
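Switching stores is a one-line configuration change. A sketch for `config/environments/production.rb` (the Redis URL is illustrative):

```ruby
# Centralized Redis instance shared by every app process and server:
config.cache_store = :redis_cache_store, { url: ENV.fetch("REDIS_URL") }

# Or the Rails 8 default, backed by your database via the solid_cache gem:
config.cache_store = :solid_cache_store
```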
Common pitfalls
One very common mistake comes when your application starts to scale and you introduce a load balancer. Suddenly, you’re running multiple Rails instances, but users begin reporting inconsistent or slower responses. That’s when it hits you: Did I think about where the cache is stored?
If you’re using `FileStore`, each instance writes to and reads from its local file system, meaning no cache is shared across instances. Also, some file systems are ephemeral (Heroku, Dockerized deployments), which means each deployment or restart would wipe your cache data clean. This defeats the purpose of caching and can even increase load due to repeated cache misses.
Even when using `Redis` or `Memcached`, the cache store must run outside the app instances, ideally as a separate, centralized service. If each instance runs its own isolated Redis or Memcached process, you’ll end up with the same problem: no shared cache across servers.
Is caching always worth it?
Caching is not intended to be your first performance-boosting option; it should be the final layer of optimization. If your application is facing slow performance, it’s usually more effective (and safer) to:
- Improve your queries (e.g., using proper indexing).
- Make sure to eager load associations when needed.
- Find and reduce N+1 queries.
- Leverage background jobs for slow, async work.
Only after you’ve identified and addressed real bottlenecks should you consider caching to further reduce your server’s load and boost its performance.
Final thoughts
Keep in mind that while caching is a powerful tool, it’s easy to misuse. Caching introduces code complexity and carries the risk of serving stale or inconsistent data. Bugs can easily slip through your cache setup if you are not extra careful, and that added layer makes them harder to debug.
Rails gives us a lot of great tools to implement caching across our applications, and it is our job to do so responsibly. Good caching will reduce load, improve performance, and make everyone happier. Bad caching causes confusion, stale data, and can leak sensitive information if not done properly.
Always weigh the pros and cons before introducing caching into your code; think and test thrice.
Happy caching!