
Understanding and implementing rate limiting in Node.js

Philip Obosi, frontend engineer and data visualist 👨🏻‍💻 based in Lagos, Nigeria.
Jan 2, 2024 ⋅ 16 min read


11 Replies to "Understanding and implementing rate limiting in Node.js"

  1. 2 of the 3 cons listed for the fixed window counter don’t seem fair:
    – “user’s window should start counting from the time of their first request” -> this is easy to implement.
    – “burst traffic towards the end of a window” -> this may be an issue if your service has a single customer, but it is unlikely that all of your thousands of users would make all their requests at once.

  2. Hi,
    It looks like using app.use() would apply the rate limit to the whole API. How would you go about applying a rate limit to only a particular POST request while letting users make unlimited GET requests?

  3. Michal,

    You can do this by applying the middleware to the POST route directly instead of `app.use`,

    e.g.

    `app.post('/limitedRoute', customRedisRateLimiter, (req, res, next) => {})`

    (A fuller sketch appears after the replies.)

  4. When the record is null in the Redis store, you create the record, store it, and then go to the next middleware. Shouldn’t there be a return statement on the next() instruction to prevent the middleware from executing the rest of the code? (See the first fix in the sketch after the replies.)

  5. You should wrap `await redisClient.connect()` in an if statement with the condition `!redisClient.isReady` or `!redisClient.isOpen` so it doesn’t throw a “Socket already opened” error. (The sketch after the replies includes this guard.)

  6. This line gets the time 24 hours before now: `const windowStartTimestamp = moment().subtract(WINDOW_SIZE_IN_HOURS, 'hours').unix();` But the record in Redis has already been deleted after 24 hours, so how does that work?

  7. I tested the first implementation. I noticed that requestCount is only incremented when you call a different endpoint, but I want the rate to be per request, no matter which endpoint is hit. (One way to do this is sketched in the last example after the replies.)
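
Following up on replies #2 and #3: a minimal sketch of attaching the limiter to a single POST route while leaving GET routes unlimited could look like the following. It assumes the `customRedisRateLimiter` middleware built in the article is exported from a local module; the module path and the route paths are made up for illustration.

```js
const express = require('express');
// Hypothetical path: wherever you keep the middleware from the article
const { customRedisRateLimiter } = require('./middleware/rateLimiter');

const app = express();
app.use(express.json());

// GET requests stay unlimited: no rate limiter attached
app.get('/api/items', (req, res) => {
  res.json({ items: [] });
});

// Only this POST route passes through the rate limiter
app.post('/api/items', customRedisRateLimiter, (req, res) => {
  res.status(201).json({ created: true });
});

app.listen(3000);
```

Because the limiter is attached to the route itself rather than registered with `app.use`, it never runs for the GET handler.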
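
Replies #4 and #5 both concern the same part of the middleware. Here is a simplified sketch, assuming node-redis v4 and omitting the article’s window and counting logic, that shows where the connection guard and the early return would go.

```js
const { createClient } = require('redis');

const redisClient = createClient();
redisClient.on('error', (err) => console.error('Redis error', err));

const customRedisRateLimiter = async (req, res, next) => {
  try {
    // Reply #5: connect only if the socket isn't already open,
    // otherwise node-redis v4 throws "Socket already opened"
    if (!redisClient.isOpen) {
      await redisClient.connect();
    }

    const record = await redisClient.get(req.ip);

    if (record == null) {
      // ...create and store a fresh counter for this IP here...

      // Reply #4: return so nothing below runs after control
      // has already been handed to the next middleware
      return next();
    }

    // ...existing-record logic (increment, window check, 429 response)...
    next();
  } catch (err) {
    next(err);
  }
};
```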
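
For reply #7, one way to make every request count toward a single per-client limit, regardless of which endpoint is hit, is to key the Redis entry on the client identifier alone (here `req.ip`) and register the middleware globally. This is a simple INCR/EXPIRE fixed-window variant rather than the article’s exact implementation; the limit and window size are placeholder values, and it reuses the `redisClient` from the previous sketch along with an Express `app`.

```js
const MAX_WINDOW_REQUEST_COUNT = 100;  // placeholder limit
const WINDOW_SIZE_IN_SECONDS = 3600;   // placeholder window

const globalRateLimiter = async (req, res, next) => {
  try {
    if (!redisClient.isOpen) {
      await redisClient.connect();
    }

    // Same key for every endpoint, so every request counts toward one limit
    const key = `rate-limit:${req.ip}`;
    const requestCount = await redisClient.incr(key);

    if (requestCount === 1) {
      // First request in this window: start the expiry clock
      await redisClient.expire(key, WINDOW_SIZE_IN_SECONDS);
    }

    if (requestCount > MAX_WINDOW_REQUEST_COUNT) {
      return res.status(429).json({ error: 'Too many requests, please try again later' });
    }

    next();
  } catch (err) {
    next(err);
  }
};

// Register this before your route handlers so it runs for every method and path
app.use(globalRateLimiter);
```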
