Understanding and implementing rate limiting in Node.js

Philip Obosi · Jan 2, 2024 ⋅ 16 min read · #node
Frontend engineer and data visualist 👨🏻‍💻 based in Lagos, Nigeria.


11 Replies to "Understanding and implementing rate limiting in Node.js"

  1. Two of the three cons listed for the fixed window counter aren’t really fair criticisms:
    – “a user’s window should start counting from the time of their first request” -> this is easy to implement (rough sketch below).
    – “burst traffic towards the end of a window” -> this may be an issue if your service serves a single customer, but it is unlikely that all of your thousands of users would make all of their requests at once.
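
    For the first point, something like this would do it. This is only a rough sketch, assuming node-redis v4 and an already-connected `redisClient`; the key prefix, window size, and request limit are made up for illustration. Each user gets a fixed window that starts at their own first request:

    ```js
    const WINDOW_SIZE_IN_SECONDS = 60 * 60; // 1-hour window per user (illustrative)
    const MAX_WINDOW_REQUEST_COUNT = 5;     // max requests allowed per window (illustrative)

    async function perUserFixedWindow(req, res, next) {
      try {
        const key = `rate:${req.ip}`;
        // INCR creates the key with a value of 1 if it doesn't exist yet
        const requestCount = await redisClient.incr(key);
        if (requestCount === 1) {
          // First request: the window starts now and expires after WINDOW_SIZE_IN_SECONDS
          await redisClient.expire(key, WINDOW_SIZE_IN_SECONDS);
        }
        if (requestCount > MAX_WINDOW_REQUEST_COUNT) {
          return res.status(429).json({ error: 'Too many requests, try again later' });
        }
        return next();
      } catch (err) {
        return next(err);
      }
    }
    ```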

  2. Hi,
    It looks like using `app.use()` applies the rate limit to the whole API. How would you go about applying a rate limit to only a particular POST request while letting users make unlimited GET requests?

  3. Michal,

    You can do this by applying the middleware to the POST route directly instead of `app.use`.

    e.g.

    `app.post('/limitedRoute', customRedisRateLimiter, (req, res, next) => {})`
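
    Fleshed out a bit (just a sketch: `customRedisRateLimiter` is the middleware from the article, and the route names are made up), the limiter only runs for the POST route while GET stays unlimited:

    ```js
    const express = require('express');
    const app = express();

    app.use(express.json());

    // Rate limited: the limiter middleware runs before the handler for this route only
    app.post('/posts', customRedisRateLimiter, (req, res) => {
      res.status(201).json({ message: 'Post created' });
    });

    // Not rate limited: no limiter middleware attached to this route
    app.get('/posts', (req, res) => {
      res.json({ posts: [] });
    });

    app.listen(3000);
    ```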

  4. When the record is null in the Redis store, you create the record, store it, and then go on to the next middleware. Shouldn’t there be a return statement after the `next()` call to prevent the middleware from executing the rest of the code? Something like the sketch below is what I mean.
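
    A rough illustration of the pattern (the record shape and variable names here are only guesses at the article’s code, not the actual implementation):

    ```js
    if (record == null) {
      const newRecord = [{ requestTimeStamp: currentRequestTime.unix(), requestCount: 1 }];
      await redisClient.set(req.ip, JSON.stringify(newRecord));
      next();
      return; // without this, execution falls through to the code below
    }
    // ...rest of the middleware, which assumes `record` exists...
    ```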

  5. You should wrap `await redisClient.connect()` in an if statement with the condition `!redisClient.isReady` or `!redisClient.isOpen` so it doesn’t throw a “Socket already opened” error, e.g. the snippet below.
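
    For example (assuming node-redis v4, where `isOpen` is a boolean property on the client):

    ```js
    // Only open the connection if it isn't open already
    if (!redisClient.isOpen) {
      await redisClient.connect();
    }
    ```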

  6. This line gets the time 24 hours before now: `const windowStartTimestamp = moment().subtract(WINDOW_SIZE_IN_HOURS, 'hours').unix();` but the record in Redis is already deleted after 24 hours, so how does this work?

  7. I tested the first implementation. I noticed that `requestCount` is only incremented when you call a different endpoint, but I want the rate limit to apply per request, no matter which endpoint is called.
