Node.js has been a game-changing technology since its initial release back in 2009. In a nutshell, it lets developers use JavaScript to run scripts on the server side producing dynamic web content before the page is sent to the user’s web browser. Consequently, Node.js represents a “JavaScript everywhere” paradigm, unifying web application development around a single programming language, rather than needing different languages for server-side and client-side scripts.
If you’re a fan of JavaScript and Node.js, like I am, you’ll be excited to know it’s about to get a whole lot better.
Why is JavaScript about to get a lot better? Node.js 12 just dropped a few months ago.
On April 23rd, 2019, Node.js 12 officially launched, and JavaScript enthusiasts everywhere rejoiced. And let's be clear: this isn't just a regular old version update; it's a big overhaul with some major upgrades. Let's go down the list of highlights.
In addition to the expected performance tweaks and improvements that come with every new version of the JavaScript V8 engine, there are some really noteworthy upgrades this time around. These include async stack traces, which populate the error.stack property with asynchronous call frames without adding extra runtime overhead to the V8 engine.
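Here's a minimal sketch of what that buys you in practice (the function names are just placeholders):

async function inner() {
  throw new Error('boom');
}

async function outer() {
  await inner();
}

outer().catch((err) => {
  // On Node.js 12 the trace also lists the awaiting async frames
  // (e.g. "at async outer"), not just the synchronous ones.
  console.error(err.stack);
});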
TLS, which stands for transport layer security, is how Node handles encrypted stream communication. With the release of Node.js 12, TLS gets an upgrade to version 1.3, which sounds insignificant but is actually a major update, with numerous performance and security enhancements. Although it sounds counterintuitive at first, TLS 1.3 is actually a simpler protocol to implement than TLS 1.2, making it more secure, easier to configure, and quicker to negotiate sessions between applications.
By using TLS 1.3, Node apps will have increased end-user privacy while also improving the performance of requests by reducing the time required for the HTTPS handshake.
Bottom line: better security for everyone using it and less latency between communicating services. That's a major win in my book.
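If you want to insist on the new protocol version explicitly, here's a minimal sketch using Node's tls module (the host and port are placeholders):

const tls = require('tls');

// Require TLS 1.3 for this connection; Node.js 12 can negotiate it
// automatically when the other side supports it as well.
const socket = tls.connect({
  host: 'example.com',
  port: 443,
  minVersion: 'TLSv1.3',
  maxVersion: 'TLSv1.3',
}, () => {
  console.log('negotiated protocol:', socket.getProtocol()); // e.g. 'TLSv1.3'
  socket.end();
});

socket.on('error', (err) => console.error(err));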
Now, let’s talk about some lower level improvements. Up to this point, the JavaScript heap size defaulted to the max heap sizes set by V8 for use with browsers, unless manually configured otherwise. With the release of Node.js 12, the JS heap size will be configured based on available memory, which ensures Node doesn’t try to use more memory than is available and terminate processes when its memory is exhausted.
Say goodbye to out-of-memory errors – at least some of the time – when processing large amounts of data. The old --max-old-space-size flag will still be available to set a different limit if needed, but hopefully this feature will reduce the need for setting the flag.
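And if you do still want to pin the limit yourself, the flag behaves as before; here's a small sketch for confirming what limit a process actually got (the 4096 value is purely illustrative):

// check-heap.js – run with: node --max-old-space-size=4096 check-heap.js
const v8 = require('v8');

// heap_size_limit reflects whatever V8 was configured with at startup.
const { heap_size_limit } = v8.getHeapStatistics();
console.log(`V8 heap limit: ${(heap_size_limit / 1024 / 1024).toFixed(0)} MB`);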
Unbeknownst to many (myself included), the http_parser library used in Node has been extremely difficult to maintain and improve upon, which is why llhttp was born. The project is a port of http_parser to TypeScript, which is then run through llparse to generate C or bitcode output.
Turns out, llhttp is faster than http_parser by 156%, it’s written in fewer lines of code, and all performance optimizations are generated automatically, as opposed to http_parser’s hand-optimized code.
In Node.js 12, the team decided to switch the default parser to llhttp for the first time and put it more thoroughly to the test. Let's hope it continues to perform well when lots of different applications with lots of different needs are trying it out.
Switching the conversation to debugging, there’s a new experimental feature in Node.js 12 allowing users to generate a report on demand or when certain trigger events occur.
This kind of real-time reporting can help diagnose problems in production including crashes, slow performance, memory leaks, high CPU usage, unexpected errors, etc. – the kind of stuff that usually takes hours if not days to debug, diagnose and fix.
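As a rough sketch of opting in from the command line (these are the experimental flag names documented for Node.js 12, so they may change before the feature stabilizes):

# Emit a JSON diagnostic report if the process hits an uncaught
# exception or a fatal error (out of memory, native crash, etc.).
$ node --experimental-report --report-uncaught-exception --report-on-fatalerror app.js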
Another heap-related feature in this release that's sure to speed up the debugging process is integrated heap dumps, which ship with Node.js 12 out of the box.
Now there’s no need to install new modules to investigate memory issues – just tell Node what kind of JSON-formatted diagnostic summary you want via the command line or an API call and parse through all of the info you can handle.
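For example, here's a minimal sketch of triggering a heap snapshot from inside a running process; the resulting file can be opened in Chrome DevTools:

const v8 = require('v8');

// Writes a .heapsnapshot file to the current working directory and
// returns the generated filename.
const file = v8.writeHeapSnapshot();
console.log(`heap snapshot written to ${file}`);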
Stepping back from the low-level improvements, there’s some cool stuff also coming for developers and module makers within the Node ecosystem.
Making and building native modules for Node continues to improve, with changes that include better support for native modules in combination with worker threads, as well as the version 4 release of the N-API, which makes it easier to configure your own threads for native asynchronous functions.
Summed up, this means that creators and maintainers of Node-specific modules have almost as easy a time maintaining these modules as pure JavaScript module creators. The increased complexity that resulted from maintainers needing to rebuild the distributed binaries for each Node.js version they wanted their modules to support is now largely abstracted away courtesy of the N-API.
Worker threads, while they've been around since Node 10, no longer require a flag to be enabled – they're well on their way to moving out of the experimental phase. Prior to Node.js 11.7.0, you could not access the worker_threads module unless you started node with the --experimental-worker flag on the command line.
$ node -e "require('worker_threads'); console.log('success');" internal/modules/cjs/loader.js:605 throw err; ^ Error: Cannot find module 'worker_threads' at Function.Module._resolveFilename (internal/modules/cjs/loader.js:603:15) at Function.Module._load (internal/modules/cjs/loader.js:529:25) at Module.require (internal/modules/cjs/loader.js:657:17) at require (internal/modules/cjs/helpers.js:22:18) at [eval]:1:1 at Script.runInThisContext (vm.js:123:20) at Object.runInThisContext (vm.js:312:38) at Object. ([eval]-wrapper:6:22) at Module._compile (internal/modules/cjs/loader.js:721:30) at evalScript (internal/bootstrap/node.js:720:27) $ $ node --experimental-worker -e "require('worker_threads'); console.log('success');" success $
Workers really shine when performing CPU-intensive JavaScript operations; they won't help much with I/O-intensive work, since Node's built-in asynchronous I/O operations are more efficient than workers can be.
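Here's a minimal sketch of the worker_threads API handling a CPU-bound task off the main thread (the naive fib function is just a stand-in for real work):

const {
  Worker, isMainThread, parentPort, workerData
} = require('worker_threads');

// Toy CPU-bound work we want to keep off the main thread.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

if (isMainThread) {
  // Spawn this same file as a worker and hand it some input.
  const worker = new Worker(__filename, { workerData: 35 });
  worker.on('message', (result) => console.log(`fib(35) = ${result}`));
  worker.on('error', (err) => console.error(err));
} else {
  // Inside the worker: do the heavy lifting and report back.
  parentPort.postMessage(fib(workerData));
}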
Node.js 11 reduced startup time of worker threads almost 60% by using built-in code cache support.
Node 12 has built upon this idea to generate the code cache for built-in libraries in advance at build time, allowing the main thread to use the code cache to start up the initial load of any built-in library written in JavaScript.
The end result is another 30% speedup in startup time for the main thread, and your apps will load for users faster than ever before.
I saved the best for last. One of the most exciting features to me is ES6 module support – the thing so many of us have been waiting for. This feature is still experimental, and the Node team is looking for feedback from people trying it out, but just imagine being able to transition seamlessly from front-end to back-end JavaScript with nary a care in the world.
Here's the best of what the latest version of --experimental-modules contains:
Relative URLs like ./examples.js, absolute URLs like file:///opt.app/examples.js, package names like example-package, and paths within packages like example-package/lib/examples.js are all supported as import specifiers:

// relative URLs
import './examples.js'

// absolute URLs
import 'file:///opt.app/examples.js'

// package names
import 'example-package'

// paths within packages
import 'example-package/lib/examples.js'
Importing from .js files works as well. Finally, devs can specify default exports (import test from './examples'), named exports (import {example1, example2} from './examples'), and namespace exports (import * as samples from './examples') just as we've been doing in traditional JavaScript since ES6 came about:

// default imports / exports
import test from './examples'

// named imports / exports
import { example1, example2 } from './examples'

// namespace exports
import * as samples from './examples'

Adding "type": "module" to the package.json for a project tells Node.js to treat all .js files in that project as ES modules. This approach allows Node to use the package.json for package-level metadata and configuration, similar to how it's already used by Babel and other bundling and configuration tools.

Files can also be marked explicitly as ES modules with the .mjs ending, while files to be treated as CommonJS can use the .cjs ending. These are files which still use require and module.exports-type syntax.

Hallelujah! I'm really stoked for when this comes out from behind the flag for full adoption.
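Putting the pieces together, here's a minimal sketch of a project opting in under the flag (the file and package names are placeholders):

// package.json
// {
//   "name": "esm-demo",
//   "type": "module"
// }

// examples.js
export default function test() {
  return 'hello from the default export';
}
export const example1 = 1;
export const example2 = 2;

// index.js – run with: node --experimental-modules index.js
import test, { example1, example2 } from './examples.js';
console.log(test(), example1 + example2);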
And last but not least, there are new requirements for running Node itself.
With newer features coming to Node.js via internal improvements and upgrades to the C++ of the V8 engine come new minimum requirements for Node.js 12. The codebase now needs a minimum of GCC 6 and glibc 2.17 on platforms other than macOS and Windows. Released binaries use this new toolchain minimum and include new compile-time performance and security enhancements.
If you're using Mac or Windows machines, you should be fine: the Windows minimums are the same as for running Node.js 11, and Mac users will need at least Xcode 8 and macOS 10.10 "Yosemite". Linux-compatible binaries from nodejs.org will support Enterprise Linux 7, Debian 8, and Ubuntu 14.04, but custom toolchains may be necessary on systems that don't natively support GCC 6. I'm sure you'll figure out what's needed quickly enough.
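If you're on Linux and not sure where your system stands, a quick sanity check of the toolchain looks something like this (exact commands and package names vary by distro):

# Check the compiler and C library versions against the new minimums.
$ gcc --version    # should report 6.x or newer
$ ldd --version    # glibc should be 2.17 or newer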
Yes, Node.js is only 10 years old; yes, it's single threaded; and yes, it's not as widely adopted and leveraged as some other programming languages. But Node boasts something no other platform can claim: it's built on JavaScript, which can run on both the client and the server side.
And the teams and companies working to support and improve Node are some of the best and brightest in the business. Node has continued to learn from core JavaScript and other languages, cherry-picking the right pieces to incorporate into itself and becoming a better and better platform for developers and applications alike.
Node.js 12 brings about some extremely exciting improvements like ES6 module support, better application security, and quicker startup times. Although it will not go into LTS (long-term support) mode until October 2019, I'm pumped to dig into these new features and see what else the team can dream up to continue making this platform a great server-side solution.
Deploying a Node-based web app or website is the easy part. Making sure your Node instance continues to serve resources to your app is where things get tougher. If you’re interested in ensuring requests to the backend or third-party services are successful, try LogRocket.
LogRocket is like a DVR for web and mobile apps, recording literally everything that happens while a user interacts with your app. Instead of guessing why problems happen, you can aggregate and report on problematic network requests to quickly understand the root cause.
LogRocket instruments your app to record baseline performance timings such as page load time, time to first byte, slow network requests, and also logs Redux, NgRx, and Vuex actions/state. Start monitoring for free.
7 Replies to "Node.js 12: The future of server-side JavaScript"
Clarification towards the end… Node is *NOT* single-threaded. The main JS runs in an event loop on a single thread. Async I/O (and often other compiled modules) run within a thread pool. Node doesn’t run server and browser, but the code can run on both.
Clarification, not all async events are using thread pool, many of them use low level underlying OS functionality, but not separate thread polling. Http module is the best example.
Also, node doesn’t have to “produce dynamic web content”. It does any type of server-side (or even command line) work. It can power a websocket server, PDF export service, host an event/message system or do any other work not related to rendering web pages.
Thanks for sharing this.
Nice article, helps a lot. Thank you for this awesome content.
Thanks for this node js update details
Thanks for sharing.