Did you know that web users on desktop loaded 19 percent more JavaScript over the last two years, while their mobile counterparts loaded 14 percent more in the same time frame?
After managing to load these scripts (hopefully progressively), they still need to be parsed and executed — and executing JavaScript code accounted for up to 40 percent of CPU time, based on a 2018 analysis of roughly 1.3 million websites.
Now throw the mobile web into this mix. Thanks to lower hardware prices, more people are coming online for the first time, and they do so on low-powered mobile devices that often cost less than $200 in places like India and Nigeria. While more people are getting connected today and can easily land on your web app, many of them are on low-end hardware.
I know we are just getting started here, but in summary, we are shipping more JavaScript and demanding plenty of CPU resources from web users. Worse still, the bulk, if not all, of this work is done on the UI thread, the resource meant to help users interact with your app and get things done, thereby deprioritizing and degrading their experience.
In this article, we will be talking about web workers, the problem they solve, and how to use them while building a modern web app. We will explore an approach to a use case without workers and see how adopting a worker significantly improves UX and performance.
We will then refactor our solution to use Comlink, a relatively new JS library that makes working with web workers as intuitive as simply calling functions. Yes, we are ditching the mental overhead of manually managing the call routing and payload marshaling you’d need without Comlink.
In the end, you will see why web workers and Comlink are a match made in heaven!
Our modern web apps are getting bigger and more complex. We often blame it on the fact that such apps are almost fully driven by JavaScript — meaning a lot of code.
While it is one thing to build web apps with code splitting and load bundles per page or per route, running only UI code on the UI thread could very well have the biggest impact on user experience and the bottom line of many web app-driven businesses.
The UI thread (aka the main thread) should be for UI work like layout, painting, dispatching events, capturing data from input sources (forms, cameras, etc.), and rendering data back into the DOM.
Things like data and object manipulation, client-side logic (e.g., validation, state management, etc.), and any form of non-UI-related work — especially compute- or memory-intensive work — should all ideally live in web workers.
Chances are, you’ve already heard about web workers, and you might even know how they work, but let’s recap. A web worker is a native mechanism in the browser that allows background and parallel execution of JavaScript code in a separate context or process — actually, a separate thread, different from the main thread (the UI thread), which is the default code execution thread in browsers.
Web workers are different from service workers. They are simpler, have been around for a long time, and are supported in all major browsers (about 97 percent, according to Can I Use…). However, we are still advocating for web workers today because they are hardly used, meaning web developers are passing up opportunities to deliver better experiences to their users by running all of their code on the UI thread.
For this exploration, our sample case study app allows users to enter free-form text into a multiline field and tries to do basic text analysis while the user is still typing. The analysis includes character count, word count, most-used word, and line count. To simulate a CPU-intensive task, the analysis pipeline also incorporates a complex math operation inspired by this sample, which causes the overall text analysis to slow down as the number of input words increases.
Imagine a web app doing this sort of text analysis while the user is still typing, so as to highlight metadata about the entry and maybe enforce a validation constraint based on word count and correct grammar.
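The demo's actual `analyze()` implementation lives in the GitHub repo and isn't reproduced in this article, but based on the stats the UI consumes (`chars`, `words`, `lines`, `mostUsed`), a rough sketch of it could look like the following. Treat the `simulateHeavyMath` helper as a hypothetical stand-in for the CPU-intensive math mentioned above:

```js
// analyzer.js — a hypothetical sketch, not the demo app's exact code
const simulateHeavyMath = wordCount => {
  // stand-in for the CPU-intensive operation; cost grows with input size
  let acc = 0;
  for (let i = 1; i < wordCount * 50000; i++) {
    acc += Math.sqrt(i) * Math.sin(i);
  }
  return acc;
};

export const analyze = ({ text }) => {
  const words = text.trim().split(/\s+/);
  const counts = {};
  words.forEach(word => (counts[word] = (counts[word] || 0) + 1));

  // [word, count] pair with the highest count
  const mostUsed = Object.entries(counts).sort((a, b) => b[1] - a[1])[0];

  simulateHeavyMath(words.length);

  return {
    stats: {
      chars: text.length,
      words: words.length,
      lines: text.split(/\r\n|\r|\n/).length,
      mostUsed
    }
  };
};
```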
You’ll have to try out the demo app on Chrome Canary since it currently uses worker modules, which are yet to ship in most browsers. This should not be a blocker to adopting workers, since bundlers (webpack, Rollup, etc.) have you covered for modules if you must use them. The complete code for our sample demo app is here on GitHub.
Let’s see how the app behaves when all the code executes on the UI thread, like most of the web is built today.
```js
// analyzer.js
// ...
export const Analyzer = {
  analyzeText(text) {
    return analyze({ text });
  },
  async analyzeGrammar(text, callback) {
    const status = await checkGrammar(text);
    callback({ status });
  }
};
```
And then the HTML file using the above code:
```js
import Analyzer from "../analyzer.js";

const takeOff = () => {
  const statsDisplays = [
    ...document.querySelectorAll("#statsplainer span")
  ];

  const inputElement = document.querySelector("textarea");
  inputElement.addEventListener("input", ({ target: field }) => {
    const text = field.value || "";
    if (text.trim() === "") return;

    const { stats } = Analyzer.analyzeText(text);

    requestAnimationFrame(() => {
      // update the UI
      statsDisplays[0].textContent = stats.chars;
      statsDisplays[1].textContent = stats.words;
      statsDisplays[2].textContent = stats.lines;
      statsDisplays[3].textContent = stats.mostUsed ? stats.mostUsed[0] : "N/A";
    });
  });
};

document.addEventListener("DOMContentLoaded", takeOff);
```
Basically, after the page is loaded and ready, we listen for user input on the `textarea`, and for each input change (i.e., valid keystroke), we attempt to analyze the entire input entry and get the `stats` of the analysis back. We then display the details of the analysis on the UI.
Since all of this code is running on the UI thread, users begin to notice sluggishness and lag from the app as they continue to type into the input field. The app could easily freeze intermittently or even completely. In my tests, I did witness the entire page grinding to a halt and Chrome issuing the dreaded “this page has become unresponsive” warning message.
While you might not be finding prime numbers, mining cryptocurrencies, computing password hashes, or doing other similar, overly expensive tasks that could result in the page freezing, you might still be doing too much and failing to yield back to the UI thread when you should.
You could be doing so much within a short period of time (recall users on low-end hardware) that users (who are still trying to click or scroll) will notice significant lags in responsiveness because the UI thread has no room to breathe.
According to RAIL budgets, users will notice any work that holds the UI thread for more than 100ms! Yielding to the browser within this time, or not interrupting the UI thread at all, is what we ultimately want to achieve.
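If some of that work must stay on the main thread, one common mitigation is to split it into small chunks and yield back to the browser between chunks. Here is a minimal sketch of that idea, with `items` and `processItem` as hypothetical placeholders:

```js
// A minimal sketch of chunking main-thread work and yielding between chunks.
// `items` and `processItem` are hypothetical placeholders.
const processInChunks = (items, processItem, chunkSize = 100) => {
  let index = 0;

  const step = () => {
    const end = Math.min(index + chunkSize, items.length);
    for (; index < end; index++) {
      processItem(items[index]);
    }
    if (index < items.length) {
      // Yield so the browser can handle input, layout, and painting
      setTimeout(step, 0);
    }
  };

  step();
};
```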
Switching our sample code to a web worker was quite trivial, though your mileage may vary — especially with large, preexisting codebases — but even those can be migrated progressively. Let's keep things deliberately simple with our sample app.
Once you have the non-UI JavaScript code in a separate file, you can spin up a web worker with it by passing that file path to the web worker constructor. Any additional scripts needed by the worker file can be loaded with the built-in `importScripts` function, which works for both your local JavaScript files and external files like those loaded from unpkg.com.
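For instance, a classic (non-module) worker could pull in both kinds of dependencies like this (the file names here are purely illustrative):

```js
// webworker.js — illustrative file names only
importScripts(
  "./text-utils.js",                               // a local script
  "https://unpkg.com/comlink/dist/umd/comlink.js"  // an external script from unpkg.com
);
```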
One downside to loading additional code with `importScripts` is that you somewhat lose the benefits of loading just what is needed from the file, as you would with ES modules. You can use module syntax to achieve better code loading into web workers, but you will have to first instantiate your web worker with `type` set to `module` in an options object, the second parameter to the worker constructor. Browser support for this is still limited, though:
const worker = new Worker("./webworker.js", { type: "module" });
The dedicated web worker (controllable only by the page or script that created it) is then able to communicate with the parent page, and vice versa, by sending data with the `postMessage` function and receiving data by listening for a `message` event. Both listeners receive an event object, and your data is accessible from its `data` property.
```js
// In the worker:
self.postMessage(someObject);

// In the main thread:
worker.addEventListener('message', msg => console.log(msg.data));
```
Our app does exactly what we have described above, and the code snippets below show how:
```js
// webworker.js
import { Analyzer } from "../analyzer.js";

self.addEventListener("message", ({ data }) => {
  const { stats } = Analyzer.analyzeText(data);
  self.postMessage({ stats });
});
```
```js
// index.html
const takeOff = () => {
  const worker = new Worker("./webworker.js", { type: "module" });

  worker.addEventListener("message", ({ data }) => {
    const { stats } = data;
    requestAnimationFrame(() => {
      // update UI
    });
  });

  const inputElement = document.querySelector("textarea");
  inputElement.addEventListener("input", ({ target: field }) => {
    const text = field.value || "";
    if (text.trim() === "") return;

    worker.postMessage(text);
  });
};

document.addEventListener("DOMContentLoaded", takeOff);
```
To be fair, using web workers does not necessarily mean your code is running faster; in fact, you could be offloading so much work (e.g., parsing large CSV or JSON data) that there’ll be no telling how long before the tasks are completed.
What it guarantees is that your UI thread is free and remains responsive. You also don’t want to dump a huge request for render on the UI thread from the worker. With the right design in place, you can render updates from the worker to the UI quickly and then bring in even more updates in batches.
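One way to approach that batching (a sketch only, not the demo's actual code — `updateUI` is a hypothetical placeholder, and `worker` is the Worker created earlier) is to queue results coming from the worker and flush them to the DOM once per animation frame:

```js
// A minimal sketch of batching worker results before touching the DOM.
// `updateUI` is a hypothetical placeholder for your render logic.
const pendingUpdates = [];
let frameScheduled = false;

worker.addEventListener("message", ({ data }) => {
  pendingUpdates.push(data);

  if (!frameScheduled) {
    frameScheduled = true;
    requestAnimationFrame(() => {
      frameScheduled = false;
      // apply all queued updates in a single frame
      pendingUpdates.splice(0).forEach(updateUI);
    });
  }
});
```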
This really is not the full story, as there's often more to using web workers in real life. Though web workers are simple and have great browser support, they can be cumbersome to manage, especially having to figure out how to execute the right code within a worker just from the data you send to it with `postMessage`. That routing tends to be handled with unwieldy `if` or `switch` statements, which need to be kept in sync with almost identical structures in the code using the web worker.
```js
// worker.js
self.addEventListener("message", ({ data }) => {
  let result = {};

  if (data.command === "ACTION_1") {
    result = actionOne(data);
  } else if (data.command === "ACTION_2") {
    result = actionTwo(data);
  } else if (data.command === "...") {
    // ...
  } else if (data.command === "ACTION_50") {
    result = actionFifty(data);
  }

  self.postMessage(result);
});
```
Thanks to bundlers, code splitting, and on-demand resource loading, there’s a chance you won’t load everything your app needs to do up front into a single JavaScript file that then gets run as a worker.
Heck, you might not even have the entire code at the time the worker is being constructed, so there has to be a way to ingest new code and processing logic into an existing worker or spin up new ones and then manage them all as shared workers.
Many believe these issues are inhibiting the use and adoption of web workers, and Comlink is here to make things several steps better, with some magic!
To communicate with another thread, web workers offer the `postMessage` API. You can send JavaScript objects as messages using `myWorker.postMessage(someObject)`, triggering a `message` event inside the worker.
Comlink turns this message-based API into something more developer-friendly by providing an RPC implementation: values from one thread can be used within the other thread (and vice versa) just like local values.
All you need to do is expose the aspects of the worker you want to directly call from the main thread with Comlink. To complete the two-way communication setup, you’ll then also use Comlink to wrap the worker in the main thread.
This enables you to call functions or methods declared in the worker from the main thread as though they were local, and Comlink will automatically handle the call routing and data transfer. No more meddling with `postMessage` or reaching into an `event` object to route code or pull out data!
Let's see how this approach sits with our sample application:
```js
// analyzer.js
// Analyzer "API"
export const Analyzer = {
  analyzeText(text) {
    return analyze({ text });
  },
  async analyzeGrammar(text, callback) {
    const status = await checkGrammar(text);
    callback({ status });
  }
};

export default Analyzer;
```
```js
// webworker.js
import { expose } from "https://unpkg.com/comlink/dist/esm/comlink.mjs";
import { Analyzer } from "../analyzer.js";

// expose the Analyzer "API" with Comlink
expose(Analyzer);
```
```js
// main thread javascript
import * as Comlink from "https://unpkg.com/comlink/dist/esm/comlink.mjs";

const takeOff = () => {
  // ...
  const Analyzer = Comlink.wrap(
    new Worker("./webworker.js", { type: "module" })
  );
  // ...

  const inputElement = document.querySelector("textarea");
  inputElement.addEventListener("input", async ({ target: field }) => {
    const text = field.value || "";
    if (text.trim() === "") return;

    const { stats } = await Analyzer.analyzeText(text);

    requestAnimationFrame(() => {
      // update UI with stats
    });
  });
};

document.addEventListener("DOMContentLoaded", takeOff);
```
Since we have our `Analyzer` in another file, we import it into our web worker and use Comlink to `expose` the Analyzer API. In the main thread script, we equally use Comlink to `wrap` the web worker and store a reference to the returned wrapped object as `Analyzer`.
We deliberately made the returned wrapped object and the exposed API share the same name so that client code (main thread code using the web worker) can easily look like Comlink does not exist in the mix. You don’t have to do this!
After all this setup, we can directly call the `analyzeText()` function declared in the Analyzer API and exposed by Comlink.
```js
// ...
const { stats } = await Analyzer.analyzeText(text);
// ...
```
In the above code snippet, `Analyzer` is a proxy to our actual Analyzer API, and this proxy is created and handled by Comlink.
From our code above, when we call `Analyzer.analyzeText(text)`, Comlink is able to transfer the `text` data to the worker because it is a JavaScript value or object and can be copied over with the structured cloning algorithm.
This works for values and objects, but not functions. Recall that functions in JavaScript are first-class citizens that can be passed as parameters or returned from calls, which is why they are used as callbacks to other functions. This means that if the `text` parameter in our code above were a function serving as a callback, it would not get copied to the worker, since the structured cloning algorithm can't handle functions.
Here, Comlink comes through for us again! All we need to do is wrap such callback functions with `Comlink.proxy()` and supply its return value (the proxy) as the callback instead. This proxy value can be transferred like other JavaScript values and objects.
The Analyzer API in our sample app has an `analyzeGrammar` function that does not return immediately since it does asynchronous work, checking the text for grammar and spelling errors. It expects a callback that it can invoke with the results of its async analysis when they are ready. We wrapped this callback with `Comlink.proxy()`.
```js
// Analyzer API exposed by Comlink
// ...
async analyzeGrammar(text, callback) {
  const status = await checkGrammar(text);
  callback({ status });
}
```
```js
// main thread code
// ...
const grammarChecked = ({ status }) => {};

inputElement.addEventListener("input", async ({ target: field }) => {
  const text = field.value || "";
  if (text.trim() === "") return;
  // ...
  await Analyzer.analyzeGrammar(text, Comlink.proxy(grammarChecked));
});
```
Effectively, our `grammarChecked` function in the main thread will be called when the `analyzeGrammar` function in the worker calls `callback({ status })`, and Comlink handles all the plumbing for us. Magic!
There are even more ways Comlink steps in to make our work more intuitive and performant, including letting you send large data by transferring it rather than copying it (copying is the default behavior, since the structured cloning algorithm is used). Transferring data instead of copying it, however, is outside the scope of this article.
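For a quick taste, though, the idea (sketched below with a hypothetical `analyzeBytes` method on our exposed API) is to wrap the value with `Comlink.transfer()` and list the transferables, so a large buffer is handed over rather than cloned:

```js
// A minimal sketch — Analyzer.analyzeBytes is a hypothetical method
const bytes = new TextEncoder().encode(inputElement.value); // potentially large payload

// Transfer the underlying buffer instead of cloning it; after this call,
// the buffer is no longer usable on the main thread
await Analyzer.analyzeBytes(Comlink.transfer(bytes, [bytes.buffer]));
```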
For the UI to stay responsive throughout its use, the UI thread should not be made to do non-UI work, and state management — including all of your app’s business logic buried within it — should be off the main thread. This really isn’t all that different from how our Analyzer is currently set up with Comlink.
Below is how you could achieve off-main-thread state management:
```js
// actions.js
const Actions = {
  ANALYZETEXT: "ANALYZETEXT",
  ANALYZEGRAMMAR: "ANALYZEGRAMMAR"
};
export default Actions;
```

```js
// store.webworker.js
import { expose, proxy } from "https://unpkg.com/.../comlink.mjs";
import { createStore } from "https://unpkg.com/.../redux.mjs";
import { Analyzer } from "../../analyzer.js";
import Actions from "./actions.js";

const initialState = {
  stats: { text: "", chars: 0, words: 0, lines: 0, mostUsed: [] }
};

const handleAnalyzeAction = (state, text) => {
  const { stats } = Analyzer.analyzeText(text);
  return { ...state, ...{ stats } };
};

const reducer = (state = initialState, { type, text }) => {
  switch (type) {
    case Actions.ANALYZETEXT:
      return handleAnalyzeAction(state, text);
    default:
      return state;
  }
};

const subscribers = new Map();
const store = createStore(reducer);

const broadcastChanges = async () => {
  await store.getState();
  subscribers.forEach(fn => fn());
};
store.subscribe(proxy(broadcastChanges));

// state management interface to expose.
// the main thread will call functions in this object,
// and state management will happen in this worker
const StateMngr = {
  getState() {
    return store.getState();
  },
  dispatch(action) {
    store.dispatch(action);
  },
  subscribe(fn) {
    subscribers.set(subscribers.size, fn);
  }
};

expose(StateMngr);
```
And now the main thread client code:
```js
import * as Comlink from "https://unpkg.com/..../comlink.mjs";
import Actions from "./actions.js";

const initApp = async () => {
  // ...
  const StateMngr = Comlink.wrap(
    new Worker("./store.webworker.js", { type: "module" })
  );

  // callback function called when there are state changes
  const stateChanged = async () => {
    const { stats } = await StateMngr.getState();
    // In a typical reactive app, this will be
    // handled by the render() mechanism automatically
    requestAnimationFrame(() => {
      // update the UI
    });
  };

  // wire up the callback and set up a subscription for it
  StateMngr.subscribe(Comlink.proxy(stateChanged));

  const inputElement = document.querySelector("textarea");
  inputElement.addEventListener("input", async ({ target: field }) => {
    const text = field.value || "";
    if (text.trim() === "") return;

    // dispatch an action
    await StateMngr.dispatch({ type: Actions.ANALYZETEXT, text });
  });
};

document.addEventListener("DOMContentLoaded", initApp);
```
In this post, there is a similar `remoteStore` example with a slightly different approach from the one above. However, you might also be wondering how to handle dynamic actions and reducers with all of this setup. That is out of the scope of this article, but I'll be updating our sample app codebase to include an example just for that.
How about service workers, you might ask? With businesses winning on PWAs and service workers poised to drive great experiences like those powered by background sync and offline capabilities, there’s a high chance you want your service worker-to-main-thread relationship to benefit from the intuition Comlink brings. You are in good hands.
The major things we might do differently from your regular service worker usage are:
```js
// sw.js
importScripts("https://unpkg.com/comlink/dist/umd/comlink.js");
importScripts("./sw.analyzer.js");

addEventListener("install", () => self.skipWaiting());
addEventListener("activate", () => self.clients.claim());

addEventListener("message", ({ data }) => {
  // expose the Analyzer "API" when we hear from
  // the ui-thread that it is ready to interact
  // with this ServiceWorker
  if (data.isHandshake === true) {
    Comlink.expose(Analyzer, data.port);
  }
});
```
```js
// main-thread script
import * as Comlink from "https://unpkg.com/comlink/dist/esm/comlink.mjs";

// ...
let Analyzer;
const grammarChecked = ({ status }) => {};

const inputElement = document.querySelector("textarea");
inputElement.addEventListener("input", async ({ target: field }) => {
  const text = field.value || "";
  if (text.trim() === "" || !Analyzer) return;

  const { stats } = await Analyzer.analyzeText(text);
  requestAnimationFrame(() => {
    // update UI
  });

  await Analyzer.analyzeGrammar(text, Comlink.proxy(grammarChecked));
});

const initComlink = async () => {
  const { port1, port2 } = new MessageChannel();
  const initMsg = { isHandshake: true, port: port1 };

  // tell the ServiceWorker that we are ready to roll
  navigator.serviceWorker.controller.postMessage(initMsg, [port1]);
  Analyzer = Comlink.wrap(port2);
};

const initApp = async () => {
  // ...
  if ("serviceWorker" in navigator) {
    if (navigator.serviceWorker.controller) {
      initComlink();
    } else {
      navigator.serviceWorker.oncontrollerchange = function() {
        this.controller.onstatechange = function() {
          if (this.state === "activated") {
            initComlink();
          }
        };
      };
      navigator.serviceWorker.register("./sw.js", { scope: location.pathname });
    }
  }
};

document.addEventListener("DOMContentLoaded", initApp);
```
After the service worker setup and handshake are complete, we are able to call `await Analyzer.analyzeText(text)` as the user types into the `textarea`, even though the `Analyzer.analyzeText()` function could be living entirely in the service worker.
Notice how the `grammarChecked()` function is also set up to be invoked as a callback using `Comlink.proxy(grammarChecked)` in the call to `Analyzer.analyzeGrammar(...)`. As seen in a previous section, this can be handy when you want to use Comlink to empower your service worker to call main-thread functions as callbacks in response to async work happening in the service worker.
Web workers are powerful and can significantly improve the experience of app users if we leverage them for the kind of JavaScript code they were designed to handle on the web, which boils down to most non-UI code.
Web workers are well supported in browsers, but their adoption and use have been very poor, probably because of how cumbersome it can be to overlay any non-trivial architecture over `postMessage`, the primary means of communicating with workers.
Comlink allows you to expose objects and functions from workers such that you can call them directly from the main thread, shielding you from `postMessage`. You can even have main-thread functions called as callbacks when async tasks in the workers are done.
Though we have focused mostly on web workers and service workers in this article, Comlink has support for WebRTC and WebSockets, too.
A lot of web users are on slow networks and low-end devices. Comlink is here to help you leverage web technology that can deliver great experiences to more of your web app users.
If worker modules aren't an option for you yet, you can load worker dependencies with `importScripts` or test your apps on Chrome Canary!