In the land of web development, streams (and their building blocks, observables) are an increasingly popular topic. Libraries like BaconJS and RxJS have been around for years now, and RxJS even serves as a foundation for Angular 2+. In fact, there’s a TC39 proposal to add native observables to the language.
So streams are kind of a big deal. But… why? Why do so many people care about streams?
The short answer is that a stream-based approach dramatically simplifies several problems that have caused migraines for decades. We’ll talk about those problems and how streams help to solve them in a sec, but before we do, I want to plant a seed here: an overall theme to keep in the back of your mind as we continue.
The problems that streams solve are all about sending, receiving, and processing data. So here’s our thematic seed: as I see them, what streams provide is a change of perspective from asking for data to listening for data.
It’s nearly too obvious to be worth saying (but here I go) — modern web applications are incredibly complex. They tend to have a ton of more-or-less independent components all sitting on a page at the same time, requesting data from various sources, transforming that data, combining data from different sources in interesting ways, and ultimately, if all goes well, putting some of that data on the screen for us users to look at.
And by the way, “data source” doesn’t just mean “REST API”. Data can come from all sorts of places:
- `postMessage()` communications from web workers, iframes, or related windows
- `localStorage` or IndexedDB

And the list goes on (you can probably think of something I’ve missed).
All of this complexity can be tough to manage, and a few problem situations come up all the time: several data sources that need to interact, data that has to be combined before it’s useful, and disparate parts of an application that all need to update when new data arrives. Streams can handle all of these problems with ease, and do so in a way that’s easy to follow and understand.
Before we get into code samples, let’s talk a tiny bit of theory, just for a minute.
The software design pattern being invoked here is called the Observer pattern. In this pattern, we’ve got two important players: “observers” and “subjects” (also called “observables”). As their names suggest, observers “observe” subjects, and whenever subjects emit any data, observers find out about it. In code, this is accomplished by subjects keeping a list of all observers that are currently observing them, and whenever they’ve got some data to pass along, they run through that list and call a special method on each observer, passing the data as an argument.
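To make the mechanics concrete, here’s a bare-bones sketch of the pattern in plain JavaScript (the class and method names are just illustrative, not any particular library’s API):

```js
// A minimal subject: keeps a list of observers and notifies each one
class Subject {
  constructor() {
    this.observers = [];
  }
  subscribe(observer) {
    this.observers.push(observer);
  }
  next(data) {
    // The "special method" on each observer -- here we call it next()
    this.observers.forEach(observer => observer.next(data));
  }
}

const subject = new Subject();
subject.subscribe({ next: data => console.log('got', data) });
subject.next('hello'); // logs "got hello"
```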
The observer pattern is used all over the place in software. It’s the basic architecture behind all pub/sub interactions. You can even think of everyday event handlers as observers. And I think it’s clear why this pattern is so popular: the ability to easily find out about asynchronous events when they happen, and to get data from a source whenever it’s available without needing to poll for it, is very powerful.
Streams are one layer of abstraction higher than observers and subjects. Streams use subjects that can also act as observers, observing other subjects to receive data. Each subject observes someone else to wait for data, performs some kind of processing on the data it receives, then sends some data along to whoever is observing it. These observer-subjects make it really easy to build up long chains of data processors that can do interesting things with the data and help get it to the components in our app that need it.
Another aspect worth mentioning is that just as a single subject can be observed by multiple observers, a single observer can also observe multiple subjects. This enables merging data from different sources together in all sorts of interesting ways.
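Here’s a tiny RxJS sketch of both ideas at once (the intervals are arbitrary): a Subject subscribes to a merged pair of sources, acting as an observer, while other code subscribes to it in turn.

```js
import { Subject, interval, merge } from 'rxjs';
import { map } from 'rxjs/operators';

// The Subject acts as an observer of two merged sources...
const relay$ = new Subject();
merge(interval(1000), interval(1500)).subscribe(relay$);

// ...and as a subject, observed (after a bit of processing) by whoever is interested
relay$.pipe(map(n => `tick ${n}`)).subscribe(label => console.log(label));
```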
Take a moment and imagine linking lots of these individual observer-subjects together, then step back and look at the big picture. Think about how data flows through this system from sources to destinations, merging with data from other sources, splitting into tributaries and joining up again with more data, creating interesting paths to bring it to where it’s needed all over our system very efficiently. This big picture is what we talk about as “streams”.
So now that we know the theory, let’s put it into practice.
For each data source you have, no matter what kind of source it is, create a subject and make it available to any component that needs data from that source. Different UI frameworks facilitate this in different ways, but for our purposes, we’ll put each subject in a JavaScript module. Then any component that needs data from that source can import the subject.
Note: I’ll use JavaScript as the language and RxJS as the stream library for the code examples here, but this is arbitrary. RxJS is what I’m most familiar with, but there are other stream libraries that accomplish the same thing, both in JS and other languages. In fact, RxJS is just the JavaScript implementation of an abstract sort-of-spec called ReactiveX that has implementations in all sorts of languages.
So suppose we need to poll an API periodically. We can create a subject to handle that, using RxJS’s handy `ajax` helper and the `interval` function, which creates a subject that emits at the specified interval. (The `pipe` method essentially chains together the operators you give it, and `switchMap` creates a new observable from each bit of data it receives, then emits that observable’s data before creating the next one. Don’t get too hung up here; these are specific to RxJS and somewhat beside the point.)
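Here’s a sketch of what such a module might look like; the file name, endpoint, and three-second interval are all placeholders:

```js
// pollingApi.js
import { interval } from 'rxjs';
import { ajax } from 'rxjs/ajax';
import { switchMap } from 'rxjs/operators';

// Every 3 seconds, swap the tick for a fresh request to the API
// and pass its JSON response down the stream.
export const apiData$ = interval(3000).pipe(
  switchMap(() => ajax.getJSON('/api/updates'))
);
```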
We can keep going this way, creating a module for each data source that returns a subject. When it’s time to use the data from these sources in a component, it’s as easy as any other import:
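For instance, a web socket source and a component that consumes both sources might look something like this (the URL and module names are, again, placeholders):

```js
// chatSocket.js -- RxJS's webSocket helper returns a subject wired to the socket
import { webSocket } from 'rxjs/webSocket';

export const chatSocket$ = webSocket('wss://example.com/chat');
```

```js
// someComponent.js
import { apiData$ } from './pollingApi';
import { chatSocket$ } from './chatSocket';

apiData$.subscribe(data => {
  // render the latest server data
});

chatSocket$.subscribe(message => {
  // react to a real-time message
});
```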
Having all data sources produce data through a common interface is already useful. But the real power of streams comes from the incredible ease with which we can process and manipulate data by chaining together those observer-subjects. Stream libraries like RxJS make this very easy by providing “operators” that each internally observe a subject and return a new subject to be observed.
To demonstrate this, let’s imagine a very simple example: a chat room application. In this scenario, the above web socket could be used for real-time chat notifications, and the API could be used for updates from the server that don’t need to be quite as real time. (Yeah, I know, you could do both over web socket, but let’s roll with this for the sake of demonstration).
Suppose our server updates API returns two sorts of things: updates to the list of users currently in the room (“who” messages) and general notices from the server (“notice” messages).
Suppose the packets received from the server are formatted this way:
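The exact field names here are assumptions for this example: a type tag to distinguish the two kinds of packet, plus a data payload.

```js
// A "who" packet: the current list of users in the room
const whoPacket = { type: 'who', data: ['alice', 'bob', 'carol'] };

// A "notice" packet: a general announcement from the server
const noticePacket = { type: 'notice', data: 'The server will restart in 5 minutes' };
```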
We need to handle the “who” messages by updating the user list, and handle the “notice” messages by displaying them in the chatroom. One way to accomplish the second task might be to treat the notices the same as user messages, and give them a special user name, like “SERVER”.
Now suppose that messages received from the web socket are formatted this way:
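Again, the field names are placeholders; the important part is that each message carries a user name and the message text.

```js
const chatMessage = { user: 'alice', text: 'Hi, everyone!' };
```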
We’ll need to transform notices to match this format and combine the notice messages with the web socket messages to send to the chatroom. Fortunately, with streams this is super simple:
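Here’s one way it might look, assuming the module names and packet shapes sketched above (and assuming each poll of the API returns an array of packets):

```js
// chatRoom.js
import { from, merge } from 'rxjs';
import { filter, map, pluck, switchMap } from 'rxjs/operators';
import { apiData$ } from './pollingApi';
import { chatSocket$ } from './chatSocket';

// Flatten each polled array of packets into individual packets.
const serverPackets$ = apiData$.pipe(
  switchMap(packets => from(packets))
);

// "who" packets drive the user list.
export const userList$ = serverPackets$.pipe(
  filter(packet => packet.type === 'who'),
  pluck('data')
);

// "notice" packets become chat messages from a special "SERVER" user.
const notices$ = serverPackets$.pipe(
  filter(packet => packet.type === 'notice'),
  map(packet => ({ user: 'SERVER', text: packet.data }))
);

// Everything the chat room needs to display, as a single stream.
export const chatMessages$ = merge(chatSocket$, notices$);

// In real UI code these subscriptions would update the DOM; here we just log.
chatMessages$.subscribe(message => console.log(`${message.user}: ${message.text}`));
userList$.subscribe(users => console.log('Users in room:', users));
```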
Not bad at all! Something that’s not super obvious from this code, since it’s abstracted away behind fancy helpers and operators, is that every one of those helpers and operators (`webSocket`, `ajax`, `from`, `pluck`, `switchMap`, `filter`, `merge`) creates a new subject that observes the previous subject (or subjects!) in the stream, does something with each bit of data it receives, and sends something new down the stream. The special `subscribe` method creates a simple observer that consumes whatever comes out the end of the stream but can’t itself be observed.
So now that we’ve seen a little of what streams can do, let’s return to the problems we talked about earlier and make sure we have an answer to each of them. Getting data from many different kinds of sources? Each source gets wrapped in a subject that exposes a common interface. Getting that data to every component that needs it? Any component can simply import the subject and subscribe. Combining data from multiple sources? We used `merge` above, and there are plenty of other operators that deal with combining data from multiple streams, such as `combineLatest`, `zip`, or `concat`.
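For instance, `combineLatest` emits the latest value from each of its inputs whenever any of them emits. A quick sketch, reusing the streams from the chat example above:

```js
import { combineLatest } from 'rxjs';
import { userList$, chatMessages$ } from './chatRoom';

// Whenever either stream emits, see the most recent user list and message together.
combineLatest([userList$, chatMessages$]).subscribe(([users, message]) => {
  console.log(`${users.length} users online; latest message from ${message.user}`);
});
```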
This was a relatively shallow dive into streams, but I hope I’ve managed to give a glimpse of the power streams can provide. They can significantly simplify the flow of data through a system, especially when dealing with several data sources that need to interact and update disparate parts of an application simultaneously.
But because I wanted this to stay pretty shallow, there’s a lot I didn’t talk about. How do you handle errors in the stream? How do you clean up your observables to prevent memory leaks? What the heck are “hot” and “cold” observables? All of these are super important and should be some of the first things you learn if you decide to dive into streams (heh), but that’s the part I was focusing on: convincing you to dive in. I hope I’ve done that!
If you want to learn more about what streams can do for you, and I hope you do, here are some links for further reading/viewing: