One of the biggest challenges in writing frontend or Node.js code is dealing with asynchronicity. The original generator revolution came when packages like co let us write synchronous-looking async code using familiar constructs like try and catch:

```javascript
co.wrap(function* () {
  try {
    yield fetch('http://some.domain');
  } catch (err) {
    // handle
  }
});
```
Around this time, C# and .NET started shipping the original async...await construct, which flattened async code into a more familiar shape:

```csharp
public static async Task Main()
{
    Task<int> downloading = DownloadDocsMainPageAsync();
    int bytesLoaded = await downloading;
    Console.WriteLine($"{nameof(Main)}: Downloaded {bytesLoaded} bytes.");
}
```
Some very clever people decided that JavaScript should adopt the async and await keywords. Babel and regenerator transpiled these keyword constructs into code that used generators under the hood to achieve the async workflow. The language itself went one step further: ES2017 made async...await a first-class citizen, and Node.js shipped native support.
What makes async...await code so appealing is that it looks synchronous. The code appears to stop and wait until a response returns or an error occurs, and it can be wrapped in a familiar try...catch block.

async...await gained a lot of traction, and the generator revolution was largely overlooked in favour of the more limited async...await.
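To illustrate (a generic sketch; `failingCall` is a made-up stand-in for any promise-returning operation):

```javascript
// A generic sketch of async/await error handling with try/catch.
// failingCall stands in for any promise-returning operation.
const failingCall = async () => {
  throw new Error('network down');
};

async function load() {
  try {
    return await failingCall();
  } catch (err) {
    // The rejection surfaces here, just like a synchronous throw
    return `handled: ${err.message}`;
  }
}
```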
What makes JavaScript generator functions so different is that they do not execute immediately; instead, calling one returns an iterator object with a next method. Execution inside the function can suspend and, between next calls, resume at exactly the point where it was suspended.
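As a side note, next can also pass a value back into the suspended function: each yield expression evaluates to whatever the caller passes to the following next call. This two-way channel is what co-style libraries use to feed resolved results back in. A minimal sketch:

```javascript
// A generator that receives values through next(value).
// Each yield expression evaluates to whatever the caller
// passes to the following next() call.
function* greeter() {
  const name = yield 'What is your name?';
  yield `Hello, ${name}!`;
}

const it = greeter();
console.log(it.next().value);      // "What is your name?"
console.log(it.next('Ada').value); // "Hello, Ada!"
```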
I have been using the npm package thefrontside/effection for some time now.
Effection utilizes the magic of generators to allow us to write code like this:
```javascript
run(function* () {
  let socket = new WebSocket('ws://localhost:8080');

  yield throwOnErrorEvent(socket);

  yield once(socket, "open");

  let messages = yield once(socket, "message");

  while (true) {
    let message = yield messages.next();
    console.log('Got message:', message);
  }
});
```
There are some beautiful abstractions in the code above that ease the path to writing less code and simpler code.
For example:
yield once(socket, "open");
The above code states that execution cannot proceed until the websocket open
event has occurred.
If we were doing this in plain JavaScript, it would look something like this (note that addEventListener itself returns undefined, so the listener has to remove itself):

```javascript
socket.addEventListener('open', (event) => {
  // proceed
}, { once: true });
```
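For comparison, a promise-based version of this pattern can be sketched in plain JavaScript. The helper name `onceEvent` is made up for illustration and is not effection's `once`:

```javascript
// Hypothetical helper: resolves the first time the named event
// fires, then removes its own listener via { once: true }.
// Not effection's implementation.
function onceEvent(target, eventName) {
  return new Promise((resolve) => {
    target.addEventListener(eventName, resolve, { once: true });
  });
}

// Usage:
// await onceEvent(socket, 'open');
```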
Let's quickly recap what makes generators so powerful. A generator looks like a function, but it behaves like an iterator: calling it returns a generator object with a next method.
What makes generators so powerful is their ability to suspend and resume execution.
The everySingleEvenNumber
generator function below illustrates this capability:
```javascript
function* everySingleEvenNumber() {
  let i = 0;
  while (true) {
    yield i += 2;
  }
}

var gen = everySingleEvenNumber();

console.log(gen.next().value); // 2
console.log(gen.next().value); // 4
console.log(gen.next().value); // 6
console.log(gen.next().value); // 8
```
The while (true) construct looks like an infinite loop, but execution is suspended after each yield and only resumes when the iterator's next function is called in the console.log code.

The current value of the local variable i is maintained between calls rather than being reset.
This is where generators differ from async/await: with async/await, execution disappears into the function and only returns when the promise resolves or rejects. The ability to suspend and resume functions at will opens many doors that async/await, in its rapid adoption, left closed.
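To make that concrete, here is a toy runner in the spirit of co (a simplified sketch, not effection's implementation) that drives a generator and feeds each yielded promise's result back into it:

```javascript
// Toy generator runner: steps the iterator, waits for each yielded
// promise, and resumes the generator with the resolved value.
// Errors are thrown back into the generator so try/catch works inside it.
function run(genFn) {
  const it = genFn();

  function step(result) {
    if (result.done) return Promise.resolve(result.value);
    return Promise.resolve(result.value).then(
      (value) => step(it.next(value)),
      (err) => step(it.throw(err)),
    );
  }

  return step(it.next());
}

// Usage: looks synchronous, but each yield suspends
// until the promise settles.
// run(function* () {
//   const a = yield Promise.resolve(1);
//   const b = yield Promise.resolve(2);
//   return a + b;
// });
```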
effection allows you to spawn separate processes as generator functions, and it takes care of the teardown of all child processes that it started. This technique is known as structured concurrency.

Effection exposes a task object that can spawn new detached processes:
```typescript
main(function* (task: Task) {
  console.log('in main');

  task.spawn(function* () {
    while (true) {
      yield sleep(100);
      console.log('awake');
    }
  });

  yield;
});
```
Below is a flakyConnection
function that will not connect until the fifth attempt:
```typescript
let attempt = 1;

function flakyConnection(): Promise<{ connected: boolean }> {
  return new Promise<{ connected: boolean }>((resolve) => {
    setTimeout(() => {
      attempt++;
      resolve({ connected: attempt === 5 });
    }, 100);
  });
}
```
To get a connection, a client will have to attempt five times before being successful. Good client code will also include a timeout and throw an exception if the operation takes too long.
Polling code with a timeout is annoying to write, but effection and the suspend-and-resume qualities of generators make it a very pleasant experience:
```typescript
main(function* (parent: Task) {
  parent.spawn(function* (child) {
    child.spawn(function* () {
      console.log('primed to throw an Error');
      yield sleep(8000);
      throw new Error('you are out of time! Better luck next time.');
    });

    while (true) {
      console.log(`connection attempt ${attempt}...`);

      const { connected } = yield flakyConnection();

      if (connected) {
        console.log('we are connected!');
        return true;
      }

      console.log('no cigar, we try again');

      yield sleep(2000);
    }
  });

  yield;
});
```
A new process is attached to the parent
task object made available through main
.
The code below elegantly takes care of setting a timeout that will throw an exception if the client cannot connect after 8000 milliseconds:
```typescript
child.spawn(function* () {
  console.log('primed to throw an Error');
  yield sleep(8000);
  throw new Error('you are out of time! Better luck next time.');
});
```
The effection sleep function suspends execution for 8,000 milliseconds. If the parent task is still running when the sleep completes, the spawned process throws an exception.
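Under the hood, a sleep of this style can be modeled as a promise that resolves after the delay (an illustrative sketch only; effection's real sleep also participates in task cancellation, which this sketch omits):

```javascript
// Sketch: a promise-based sleep that a generator runner can yield on.
// Effection's real sleep is also cancelled when its task is halted.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Usage:
// yield sleep(8000);
```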
The code below keeps attempting to connect at two-second intervals until it succeeds:

```typescript
while (true) {
  console.log(`connection attempt ${attempt}...`);

  const { connected } = yield flakyConnection();

  if (connected) {
    console.log('we are connected!');
    return true;
  }

  console.log('no cigar, we try again');

  yield sleep(2000);
}
```
The code above keeps executing until a connection occurs or the timeout exception is thrown, at which point effection closes down all the child processes.
Running the above code results in this output:
```
primed to throw an Error
connection attempt 1...
no cigar, we try again
connection attempt 2...
no cigar, we try again
connection attempt 3...
no cigar, we try again
connection attempt 4...
we are connected!
```
Here is a repo with the above code.
You can check if the timeout works by changing the timeout code to something like this:
```typescript
child.spawn(function* () {
  console.log('primed to throw an Error');
  yield sleep(4000);
  throw new Error('you are out of time! Better luck next time.');
});
```
The timeout occurring results in this output:
```
primed to throw an Error
connection attempt 1...
no cigar, we try again
connection attempt 2...
no cigar, we try again
Error: you are out of time! Better luck next time.
```
I still use async/await for simple one-shot async tasks with no workflow, but it is a limited paradigm.

Generator functions can solve a whole class of problems that nothing else can. Starting, suspending, and resuming threads of execution is incredibly powerful, and generators offer this functionality out of the box.
Jump in! The water is warm.
6 Replies to "JavaScript generators: The superior async/await"
None of these show any reason to use them over the promise variants.
I think you are misunderstanding the differences between promises and yield. Just about all of your examples are simpler with async versions and the error handling is far easier than with yield. Especially when nested. You also seem to be confusing asynchronously I/O with yielding during iteration… They are vastly different use cases and the last thing you want to do for asynchronous I/O is yield back to the caller.
I’m not sure the author knows what he is writing about. The websocket sample is not a good idea, because what happens when we suspend the generator -> the websocket hits an issue -> we resume the generator -> an error occurs? I think writing this code in an async/await manner would be much simpler in terms of error handling.
1. I think you are misunderstanding the differences between promises and yield. Just about all of your examples are simpler with async versions and the error handling is far easier than with yield.
2. Especially when nested.
3. You also seem to be confusing asynchronously I/O with yielding during iteration…
4. They are vastly different use cases and the last thing you want to do for asynchronous I/O is yield back to the caller.
This is 1 opinion followed by 3 assertions. Do you have evidence for any of these claims?
That last example doesn’t really show any benefit over the async await experience. Also I’m wary of anything in javascript that uses “process” semantics. Under the hood JS engines handle a single function call at a time so suggesting otherwise seems disingenuous. Unless the idea is that multiprocess patterns are just inherently more readable? That’s even more dubious.
Here’s an async/await equivalent to your flakeyconnection example. It’s written fairly quickly but it should work. Whether or not one is more confusing than the other is a matter of debate, but a point in favor of the async await example is that it introduces no new libraries or concepts other than the standard:
```typescript
const slep = (ms: number) => {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

const someApiCall = async () => {
  throw new Error("oops");
}

const flakey = async () => {
  let tries = 5;
  let result = undefined;
  while (tries > 0) {
    try {
      result = await someApiCall();
      break;
    } catch (e) {
    }
    tries--;
    await slep(1000);
  }
  if (!result) {
    // do something, probably throw error
  }
  return result;
}
```
Another benefit of the async/await example is that it’s exactly the same pattern as if you had a flakey function that was not async.
Ah actually my previous comment was in error. Your example was for a timeout, not for a number of re-tries.
In that case the promise version would be slightly altered:
```typescript
const timeout = (ms: number) => {
  return new Promise((resolve, reject) => {
    setTimeout(() => reject("timeout exceeded"), ms);
  });
}

const sleep = (ms: number) => {
  return new Promise((resolve) => {
    setTimeout(resolve, ms);
  });
}

const someApiCall = async () => {
  throw new Error("oops");
}

const flakey = async () => {
  while (true) {
    try {
      return await someApiCall();
    } catch (e) {
    }
    await sleep(1000);
  }
}

const main = async () => {
  try {
    const result = await Promise.race([flakey(), timeout(500)]);
  } catch (e) {
    // time out exceeded.
  }
}
```
Still more readable than the generator version. Especially the Promise.race(…) makes it immediately obvious that this code is dealing with “timeouts”. Whereas in the generator version, we have to read through the whole setup to understand fully what’s going on. Which, honestly is why I missed the point of your example in the first place. Granted, that is just me being the lowest common denominator 😀