
Understanding and implementing rate limiting in Node.js

Philip Obosi · Jan 2, 2024 · 16 min read
Frontend engineer and data visualist 👨🏻‍💻 based in Lagos, Nigeria.


10 Replies to "Understanding and implementing rate limiting in Node.js"

  1. 2 of the 3 cons of the fixed window counter aren't really fair:
    – "a user's window should start counting from the time of their first request" -> this is easy to implement.
    – "burst traffic towards the end of a window" -> this may be an issue if your service has only one customer, but it's unlikely that all of your thousands of users would make all their requests at once.

  2. Hi,
    It looks like using app.use() would apply the rate limit to the whole API. How would you go about applying a rate limit to only a particular POST request while letting users make unlimited GET requests?

  3. Michal,

    You can do this by applying the middleware to the POST route directly instead of `app.use()`

    e.g.

    `app.post('/limitedRoute', customRedisRateLimiter, (req, res, next) => {})`

    (There's a fuller sketch of this after the replies.)

  4. When the record is null in the Redis store, you create the record, store it, and then go to the next middleware. Shouldn't there be a return statement after the next() call to prevent the middleware from executing the rest of the code? (A sketch of this appears after the replies.)

  5. You should wrap `await redisClient.connect()` in an if statement with the condition `!redisClient.isReady` or `!redisClient.isOpen` so it doesn't throw a "Socket already opened" error. (This is also sketched after the replies.)

  6. This line gets the time from 24 hours ago: `const windowStartTimestamp = moment().subtract(WINDOW_SIZE_IN_HOURS, 'hours').unix();`. But the record in Redis is already deleted after 24 hours, so how does that work?

  7. I tested the first implementation and noticed that requestCount is only incremented when you call a different endpoint, but I want the rate limit to apply per request, no matter which endpoint is hit.
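
Following up on replies 2 and 3, here is a minimal sketch of mounting the limiter on one POST route only, while GET routes stay unlimited. It assumes the `customRedisRateLimiter` middleware from the article; the import path and route names below are placeholders, not the article's actual layout.

```js
const express = require('express');
// Placeholder path: point this at wherever customRedisRateLimiter is defined
const { customRedisRateLimiter } = require('./middleware/redisRateLimiter');

const app = express();
app.use(express.json());

// Only this POST route passes through the rate limiter...
app.post('/api/posts', customRedisRateLimiter, (req, res) => {
  res.status(201).json({ message: 'Post created' });
});

// ...while GET requests are never rate limited
app.get('/api/posts', (req, res) => {
  res.json({ posts: [] });
});

app.listen(3000, () => console.log('Listening on port 3000'));
```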
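On reply 4, a simplified sketch of the record-is-null branch shows why returning after `next()` matters: without the `return`, the rest of the middleware body would still run for that same request. The record shape and IP-based key are assumptions in the spirit of the article, not its exact code.

```js
const moment = require('moment');
const { createClient } = require('redis');

// Assumes redisClient.connect() has already been awaited elsewhere (see the next sketch)
const redisClient = createClient();

const customRedisRateLimiter = async (req, res, next) => {
  try {
    const record = await redisClient.get(req.ip);

    if (record == null) {
      // First request from this IP in the current window: create a fresh record
      const newRecord = [{ requestTimeStamp: moment().unix(), requestCount: 1 }];
      await redisClient.set(req.ip, JSON.stringify(newRecord));
      return next(); // return here so the window/threshold logic below is skipped
    }

    // ...existing window and threshold checks on the parsed record go here...
    next();
  } catch (error) {
    next(error);
  }
};
```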
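And on reply 5, a small sketch of guarding the connection with node-redis v4's `isOpen` flag, so that calling the helper more than once does not throw "Socket already opened". The helper name is made up for illustration.

```js
const { createClient } = require('redis');

const redisClient = createClient();
redisClient.on('error', (err) => console.error('Redis client error', err));

// Hypothetical helper: only open the socket if it isn't already open,
// so repeated calls (for example, once per incoming request) are safe
async function getRedisClient() {
  if (!redisClient.isOpen) {
    await redisClient.connect();
  }
  return redisClient;
}

module.exports = { getRedisClient };
```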
