NestJs Caching With Redis - The Ultimate Guide - Medium
Congratulations! You have deployed a NestJs application that is gaining traction! Lots of users are on your app, and traffic is surging.
At some point, you receive emails complaining that your website is slow. You’ve probably
heard that caching can solve the problem, but you are unsure how to implement it.
In this article, I will explain caching, why you need it and how to implement it in your
NestJs application.
What is Caching?
Before we start, please note that you can find the completed project in the accompanying GitHub repository.
Caching is a fairly old technique designed to improve your application’s performance and
reliability.
An HTTP request asking for data cached by the server will receive it directly from the cache store instead of from the database, which is much faster!
Indeed, as the number of users grows, so does the number of HTTP requests made to the server. This results in the same data being fetched over and over again. Caching frequently requested data is therefore an important way to optimize your application for speed and efficiency.
Since most relational databases deal with structured data, they are optimised for reliability rather than speed. The data they store on disk is many times slower to read than data in RAM. Using a NoSQL database does not bring any tremendous performance gains either.
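To make the idea concrete, here is a toy sketch of a cache-aside helper in TypeScript (a hypothetical standalone example, not part of the NestJs app): the first lookup pays the slow fetch, and later lookups within the TTL are served straight from memory.

```typescript
// A minimal in-memory cache with a TTL (time to live) per entry.
const store = new Map<string, { value: unknown; expires: number }>();

// Return the cached value if it is still fresh; otherwise run the
// slow fetch function and remember its result for ttlMs milliseconds.
async function cached<T>(
  key: string,
  ttlMs: number,
  fetchFn: () => Promise<T>
): Promise<T> {
  const hit = store.get(key);
  if (hit && hit.expires > Date.now()) return hit.value as T; // cache hit
  const value = await fetchFn();                              // cache miss: slow path
  store.set(key, { value, expires: Date.now() + ttlMs });
  return value;
}
```

The NestJs CacheModule applies this same pattern for us, with far less boilerplate.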
In this tutorial, we will implement caching in NestJs and ultimately scale it with Redis, a
fast in-memory database that is perfect for this use case.
Pre-requisites
Docker
The NestJs CacheModule is included in the @nestjs/common package. You will need to add
it to your app.module.ts file.
app.module.ts
import { CacheModule, Module } from "@nestjs/common";
import { AppController } from "./app.controller";

@Module({
  imports: [
    CacheModule.register({
      isGlobal: true,
    }),
  ],
  controllers: [AppController],
})
export class AppModule {}
The Cache Module handles a lot of cache configuration for us, and we will customize it later. For now, let's just point out that we can use caching with two different approaches: the Cache Interceptor and the Cache Manager.
As a rule of thumb, you will use the Cache Interceptor if you need an endpoint to return cached data from the primary database in a traditional CRUD app.
However, if you need more control, or do not necessarily want to return cached data, you will inject the Cache Manager service instead.
So to summarise…
You will use the Cache Manager if you need more control, like:
Deleting from cache
Updating cache
We'll create a src/utils.ts file containing a getter function with a small timeout to simulate some database delay.
utils.ts
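The original code listing is not reproduced here, so below is a minimal sketch of what utils.ts could look like; the dog data and the 1-second delay are assumptions taken from the curl output shown later in the article.

```typescript
// utils.ts -- simulate a slow database read with a 1-second timeout.
export interface Dog {
  id: number;
  name: string;
  breed: string;
}

const dogs: Dog[] = [
  { id: 1, name: "Luna", breed: "Caucasian Shepherd" },
  { id: 2, name: "Ralph", breed: "Husky" },
];

// In production this would be a real database query.
export function getDogs(): Promise<Dog[]> {
  return new Promise((resolve) => setTimeout(() => resolve(dogs), 1000));
}
```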
Now that we have a getter function for our dogs, we can use it in the app.controller.ts
app.controller.ts
import { Controller, Get } from "@nestjs/common";
import { getDogs } from "./utils";

@Controller()
export class AppController {
  @Get("dogs")
  getDogs() {
    return getDogs();
  }
}
Let’s add some cache! Adding caching with interceptors is as simple as this 👇🏻
app.controller.ts
import {
  CacheInterceptor,
  Controller,
  Get,
  UseInterceptors,
} from "@nestjs/common";
import { getDogs } from "./utils";

@Controller()
export class AppController {
  @UseInterceptors(CacheInterceptor)
  @Get("dogs")
  getDogs() {
    return getDogs();
  }
}
Note that you can apply caching at the controller level by moving the @UseInterceptors(CacheInterceptor) above the @Controller() decorator. However, caching should be used in specific parts of your application, so it's usually better to apply it selectively, at the endpoint level.
It’s time to make a request now! I will use curl, but you can use any HTTP client of your
choice.
You see that the first request takes approximately 1 second, while the second one takes about 18 milliseconds. This is because the second request gets the array directly from the cache.
This is the power of caching! When applied to specific endpoints that are requested a lot, it
can greatly accelerate your application.
# first request
time curl http://localhost:3333/dogs
[{"id":1,"name":"Luna","breed":"Caucasian Shepherd"},
{"id":2,"name":"Ralph","breed":"Husky"}]curl http://localhost:3333/dogs
0.00s user 0.01s system 1% cpu 1.024 total
# second request
time curl http://localhost:3333/dogs
[{"id":1,"name":"Luna","breed":"Caucasian Shepherd"},
{"id":2,"name":"Ralph","breed":"Husky"}]curl http://localhost:3333/dogs
0.00s user 0.01s system 51% cpu 0.018 total
The cache has an expiration time (TTL), so you don't serve stale data to your users. The default value is 5 seconds, but you can change it with the @CacheTTL decorator.
app.controller.ts
import {
  CacheInterceptor,
  Controller,
  Get,
  UseInterceptors,
  CacheTTL,
} from "@nestjs/common";
import { getDogs } from "./utils";

@Controller()
export class AppController {
  @UseInterceptors(CacheInterceptor)
  @CacheTTL(10)
  @Get("dogs")
  getDogs() {
    return getDogs();
  }
}
Let's also add a custom cache key (by default, the key is derived from the route URL)
import {
  CacheInterceptor,
  Controller,
  Get,
  UseInterceptors,
  CacheTTL,
  CacheKey,
} from "@nestjs/common";
import { getDogs } from "./utils";

@Controller()
export class AppController {
  @UseInterceptors(CacheInterceptor)
  @CacheTTL(10)
  @CacheKey("all-dogs")
  @Get("dogs")
  getDogs() {
    return getDogs();
  }
}
While very practical, this approach does not allow us to delete from the cache or update certain elements manually. You might not need that in most cases, but you will sometimes need more control over how data is saved to your cache store.
To avoid modifying our existing logic, let’s add another getter function to get some cats! 😺
utils.ts
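The original cats listing is also missing, so here is a minimal sketch of the addition to utils.ts; the cat names are made up for illustration, mirroring the getDogs helper above.

```typescript
// utils.ts (continued) -- another simulated slow database read.
export interface Cat {
  id: number;
  name: string;
}

// Hypothetical data; the article does not show the actual cat records.
const cats: Cat[] = [
  { id: 1, name: "Whiskers" },
  { id: 2, name: "Tom" },
];

export function getCats(): Promise<Cat[]> {
  return new Promise((resolve) => setTimeout(() => resolve(cats), 1000));
}
```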
And update our app.controller.ts with an additional endpoint to get the array of cats. This
endpoint will use the cache manager.
app.controller.ts
import {
  CacheInterceptor,
  CacheKey,
  CacheTTL,
  CACHE_MANAGER,
  Controller,
  Get,
  Inject,
  UseInterceptors,
} from "@nestjs/common";
import { Cache } from "cache-manager";
import { getCats, getDogs } from "./utils";

@Controller()
export class AppController {
  constructor(
    @Inject(CACHE_MANAGER)
    private cacheManager: Cache
  ) {}

  @UseInterceptors(CacheInterceptor)
  @CacheTTL(10)
  @CacheKey("all-dogs")
  @Get("dogs")
  getDogs() {
    return getDogs();
  }

  @Get("cats")
  async getCats() {
    const cachedCats = await this.cacheManager.get("all-cats");
    if (cachedCats) return cachedCats;

    // Cache miss: fetch the data and store it with a 10-second TTL.
    // (In cache-manager v5+, the third argument is the TTL in milliseconds instead.)
    const cats = await getCats();
    await this.cacheManager.set("all-cats", cats, { ttl: 10 });
    return cats;
  }
}
Note that the cache manager is injected in the constructor with the
token CACHE_MANAGER. The cache manager gives us more control over how we get
and return fetched data at the expense of a bit of code complexity.
In the code above, we try to get the cats array from the cache with the key all-cats.
If the cached value exists, we immediately return it. Otherwise, we call getCats (in
production, that would be a call to your database) and save the fetched data in the cache
with a TTL of 10 seconds.
That's it! The cache manager also exposes other methods, such as del() to remove a single key and reset() to delete the whole cache store (you would probably never want to use the latter!).
Our application is now cached and can sustain a great load. However, there are some limitations…
Our cache does not scale past one node process, so we can't run our app as a cluster
The cache is stored in the server's RAM, so the server could run out of memory
To fix that, we need to outsource our cache store to an in-memory database that is very
fast and performant. Meet Redis ❤️
We will use the cache-manager-redis-store library, which relies on node-redis under the hood (unfortunately, it does not support ioredis); you can install it with npm install cache-manager-redis-store redis. Integrating it into our app is very simple and can be done in the app.module.ts file.
Adding the redis package also gives us the RedisClientOptions type. It is not mandatory, but a nice-to-have.
app.module.ts
import { CacheModule, Module } from "@nestjs/common";
import type { RedisClientOptions } from "redis";
import * as redisStore from "cache-manager-redis-store";
import { AppController } from "./app.controller";

@Module({
  imports: [
    CacheModule.register<RedisClientOptions>({
      isGlobal: true,
      store: redisStore,
      url: "redis://localhost:6379",
    }),
  ],
  controllers: [AppController],
})
export class AppModule {}
The last thing to add is the Redis database. The easiest way to install it is through docker-
compose! We can spawn a Redis database and redis-commander using Docker, thanks to
a docker-compose.yml file.
Redis commander is similar to pgadmin or MySQL workbench, but for Redis. It will help us
inspect the contents of Redis as we run our application.
docker-compose.yml
version: "3.9"
services:
redis:
image: redis:6.0
ports:
- 6379:6379
redis-commander:
container_name: redis-commander
hostname: redis-commander
image: ghcr.io/joeferner/redis-commander:latest
environment:
- REDIS_HOSTS=local:redis:6379
ports:
- "8081:8081"
To start the Redis database, run docker compose in your terminal.
docker compose up -d
Redis commander will show that our dogs are saved in the cache, with (in my example) a
time to live of 8 seconds before the cache is cleared.
Summary
Well done if you’ve read so far! I hope that you found this content useful and educational.
PS: Want to become an expert with NestJs? Get notified when my NestJs Essentials
course is released here