#node
Rishabh Rawat
Nov 24, 2022 ⋅ 8 min read

Build a durable pub-sub with Kafka in Node.js

Rishabh Rawat is a full-time Software Engineer, GSoC '19 participant, and GCI '18 mentor. He writes about backend web development biweekly. Find him on Twitter and LinkedIn.


8 Replies to "Build a durable pub-sub with Kafka in Node.js"

  1. >This helps in replaying or resyncing your applications and certain operations

    I tend to disagree. Complex applications involve integrations with third-party services, and replaying events might cause inconsistencies on those services if they are invoked multiple times over longer periods. Events that represent business facts should be handled, processed, and consumed only once. I do agree on the auditing part, but in a distributed, service-oriented architecture, that should be framed as a proper service.

    1. Yes, replaying the events is only possible when all of the external services or vendors have idempotent APIs.

      Absolutely. Replaying Kafka events should not be a go-to for compensating request failures. If unexpected failures happen frequently, the API design needs to be revisited (see the idempotent consumer sketch after the comments).

  2. Hi, I am getting the below issue after running the first command (there is a note on this after the comments):

    Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
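
To make the idempotency point from the first thread concrete, here is a minimal sketch of a consumer that deduplicates by event ID before touching a third-party service. It assumes the kafkajs client, a hypothetical `orders` topic and `callThirdPartyService` side effect, and an in-memory Set standing in for a durable store of processed IDs (in production you would persist them in Redis or a database).

```js
const { Kafka } = require("kafkajs");

const kafka = new Kafka({ clientId: "pub-sub-demo", brokers: ["localhost:9092"] });
const consumer = kafka.consumer({ groupId: "order-processor" });

// Stand-in for a durable store of processed event IDs (use Redis or a DB in production)
const processed = new Set();

// Hypothetical side effect against an external service
async function callThirdPartyService(event) {
  console.log("charging payment for order", event.id);
}

async function run() {
  await consumer.connect();
  await consumer.subscribe({ topic: "orders", fromBeginning: true });

  await consumer.run({
    eachMessage: async ({ message }) => {
      const event = JSON.parse(message.value.toString());

      // Skip events we have already handled, so a replay cannot
      // hit the third-party service twice
      if (processed.has(event.id)) return;

      await callThirdPartyService(event);
      processed.add(event.id);
    },
  });
}

run().catch(console.error);
```

With a guard like this in place, replaying a partition only re-reads events; the external call is skipped for IDs that were already handled.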
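On the "Broker may not be available" error in the second comment: that log line usually means nothing is listening on localhost:9092 yet, either because the Kafka broker was never started or because it advertises a different listener. A quick way to verify connectivity from Node.js is the sketch below; it assumes kafkajs is installed and that the broker is expected at localhost:9092.

```js
const { Kafka, logLevel } = require("kafkajs");

// Quick connectivity check: if this fails, the broker is not running
// or not reachable on the advertised address
const kafka = new Kafka({
  clientId: "connectivity-check",
  brokers: ["localhost:9092"], // must match the broker's advertised listener
  logLevel: logLevel.ERROR,
});

async function check() {
  const admin = kafka.admin();
  await admin.connect();
  const topics = await admin.listTopics();
  console.log("Broker reachable, topics:", topics);
  await admin.disconnect();
}

check().catch((err) => {
  console.error("Could not reach Kafka on localhost:9092:", err.message);
  process.exit(1);
});
```

If this fails, start ZooKeeper and the Kafka broker first (or point `brokers` at the address your broker actually advertises), then rerun the commands from the article.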
