Caching in Node.js With Redis and PostgreSQL

By onjsdev

Redis is an in-memory data store used as a database, cache, streaming engine, and message broker. It supports various data structures such as strings, hashes, lists, and sets. In this article, we will integrate Redis into an API built with Node.js, Express and PostgreSQL to cache the responses to incoming requests.

Integrating Redis With Node.js and PostgreSQL

Before you start, make sure you have a Redis server running on your local machine. If you don't have it, you can find an installation guide for your operating system on the Redis website. Then, install the required packages:

npm install redis express pg

Configure PostgreSQL Database

We will use a PostgreSQL database to store the data. You can use a different database or an external API for testing purposes. Make sure your database server is running on your local machine.

const { Pool } = require('pg');

// Db Connection
const pool = new Pool({
  user: 'postgres',
  host: 'localhost',
  port: 5432,
  password: 'secret',
  database: 'myblog',
});
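The examples in this article assume a minimal posts table. A sketch of such a schema is below; the column names are assumptions, so adjust them to your own data:

```sql
-- Hypothetical schema for the posts table used in the examples
CREATE TABLE posts (
  id SERIAL PRIMARY KEY,
  title TEXT NOT NULL,
  body TEXT
);
```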

Creating the Express Server

We have a simple server running on port 3001 to respond to requests for getting and deleting a post.

const express = require('express');
const app = express();

// Routes and controllers here

app.listen(3001, () => {
  console.log('Server Started');
});

Setting API Routes

Our goal is to cache responses that have been sent before, so we define two routes: one for getting a post and one for deleting it. Note that we add a cache middleware to the route for getting a post so that cached responses can be served.

app.get('/posts/:id', cache, getPost);
app.delete('/posts/:id', delPost);

Integrating Redis Into the Node.js API

You can connect to a Redis server using the createClient method; by default it connects to localhost on port 6379, and you can specify a host and port for a remote server.

const redis = require('redis');

const redisClient = redis.createClient({
  socket: {
    host: 'localhost',
    port: 6379,
  },
});

redisClient.connect().catch((err) => {
  console.error('Redis connection error:', err);
});

Creating Controllers


The first controller is getPost. It fetches the requested post from the database and stores it in the Redis store, so that the next request can be served without querying the database, which reduces the response time.

async function getPost(req, res, next) {
  try {
    // Get the id of the post from request params
    const id = req.params.id;
    const result = await pool.query('SELECT * FROM posts WHERE id = $1', [id]);
    if (result.rows.length > 0) {
      // Store in Redis with an expiry time of 10 minutes
      await redisClient.setEx(
        `post:${id}`,
        60 * 10,
        JSON.stringify(result.rows[0])
      );
      res.status(200).json({
        post: result.rows[0],
      });
    } else {
      res.status(404).json({ error: 'Post not found' });
    }
  } catch (error) {
    res.status(500).json({ error: 'Something Went Wrong' });
  }
}


The second controller is delPost, which removes the post from both the database and the Redis store, keeping the cache consistent.

async function delPost(req, res, next) {
  try {
    // Get the id of the post from request params
    const id = req.params.id;
    // Delete the post from the database
    await pool.query('DELETE FROM posts WHERE id = $1', [id]);
    // Delete the cached post from Redis
    await redisClient.del(`post:${id}`);
    res.status(200).json({ msg: 'Post was deleted' });
  } catch (error) {
    res.status(500).json({ error: 'Something Went Wrong' });
  }
}

Cache Middleware

The cache middleware checks whether the requested post is in the Redis store. If it is, it sends the cached response; otherwise it calls next() to run the getPost controller.

async function cache(req, res, next) {
  try {
    const id = req.params.id;
    const post = await redisClient.get(`post:${id}`);

    if (!post) {
      return next();
    }

    res.status(200).json({ post: JSON.parse(post) });
  } catch (error) {
    next(error);
  }
}

Results Before And After Caching With Redis

(Screenshots: response times before and after caching.)
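You can reproduce a comparison like this yourself. Below is a minimal sketch; the timed helper and compare function are illustrative additions, not part of the API above, and the sketch assumes the server is running on localhost:3001 and Node 18+ for the built-in fetch:

```javascript
// Hypothetical helper: run an async function and report how long it took in ms
async function timed(fn) {
  const start = process.hrtime.bigint();
  const result = await fn();
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
  return [result, elapsedMs];
}

// Compare an uncached (first) and cached (second) request to the same post
async function compare(url) {
  const [, first] = await timed(() => fetch(url).then((r) => r.json()));
  const [, second] = await timed(() => fetch(url).then((r) => r.json()));
  console.log(`first: ${first.toFixed(1)} ms, second: ${second.toFixed(1)} ms`);
}

// compare('http://localhost:3001/posts/1');
```

The second request should be noticeably faster, since it is served from Redis by the cache middleware instead of hitting PostgreSQL.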


In this article, we have covered how to cache API responses in Node.js with Redis using a cache middleware, reducing response times for repeated requests.

Thank you for reading