Nov 24, 2022 · 8 min read

Build a durable pub-sub with Kafka in Node.js

Rishabh Rawat is a full-time Software Engineer, GSoC '19 participant, and GCI '18 mentor. He writes about backend web development biweekly. Find him on Twitter and LinkedIn.


8 Replies to "Build a durable pub-sub with Kafka in Node.js"

  1. >This helps in replaying or resyncing your applications and certain operations

    I tend to disagree. Complex applications involve integrations with third-party services, and replaying events can cause inconsistencies in those services if they are invoked multiple times (especially over longer periods of time). Events that represent business facts should be handled, processed, and consumed only once. I do agree on the auditing part, but in a distributed, service-oriented architecture, auditing should be framed as a proper service of its own.

    1. Yes, replaying events is only possible when all of the external services or vendors expose idempotent APIs.

      Absolutely. Replaying Kafka events should not be a go-to for compensating request failures. If unexpected failures happen frequently, the API design needs to be revisited (see the idempotent-consumer sketch after these replies).

  2. Hi, I am getting the issue below after running the first command:

    Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
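As the first reply thread notes, replaying a topic is only safe when consumers are idempotent. Below is a minimal sketch of such a consumer in Node.js; it assumes the kafkajs client, a hypothetical `payment-events` topic, and an in-memory set of processed event IDs where a real service would use a durable store such as a database table or Redis set.

```js
const { Kafka } = require("kafkajs");

// Assumed broker address, client ID, and topic name; adjust to your setup.
const kafka = new Kafka({ clientId: "payments-worker", brokers: ["localhost:9092"] });
const consumer = kafka.consumer({ groupId: "payments-group" });

// In-memory dedup store for illustration only; use a durable store in production.
const processedEventIds = new Set();

async function run() {
  await consumer.connect();
  await consumer.subscribe({ topic: "payment-events", fromBeginning: true });

  await consumer.run({
    eachMessage: async ({ message }) => {
      const event = JSON.parse(message.value.toString());

      // Skip events that were already handled, so a replay becomes a no-op
      // instead of a duplicate call to a third-party service.
      if (processedEventIds.has(event.id)) return;

      // ...call the external (third-party) API exactly once here...

      processedEventIds.add(event.id);
    },
  });
}

run().catch(console.error);
```

With a guard like this in place, replaying the topic to rebuild local state does not re-trigger external side effects, which is the inconsistency the first reply warns about.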
