Editor's note: This article was last updated on 28 September 2023 to add information about best practices for working with JSON files in Node.js, such as the importance of backups when writing to files, and a section about reading and writing large JSON files using streams.
JavaScript Object Notation (JSON) is one of the most popular formats for data storage and data interchange over the internet. The simplicity of the JSON syntax makes it very easy for humans and machines alike to read and write.
Despite its name, the use of the JSON data format is not limited to JavaScript. Most programming languages implement data structures that you can easily convert to a JSON string and vice versa. JavaScript, and therefore the Node.js runtime environment, is no exception. More often than not, this JSON data needs to be read from or written to a file for persistence. The Node runtime environment has the built-in fs module specifically for working with files.
This article is a comprehensive guide on how to use the built-in fs module to read and write data in JSON format. We'll also look at some third-party npm packages that simplify working with data in the JSON format.
Serialization is the process of converting an object or data structure into a format that is easy to store or transfer over the internet. You can recover the serialized data by applying the reverse process.
Deserialization refers to transforming the serialized data back to its original format.
You will almost always need to serialize JSON or JavaScript objects to a JSON string in Node. You can do so with the JSON.stringify method before writing it to a storage device or transmitting it over the internet:
const config = { ip: '192.0.2.1', port: 3000 };
console.log(JSON.stringify(config));
On the other hand, after reading a JSON file, you will need to deserialize the JSON string to a plain JavaScript object using the JSON.parse method before accessing or manipulating the data:
const config = JSON.stringify({ ip: '192.0.2.1', port: 3000 });
console.log(JSON.parse(config));
JSON.stringify and JSON.parse are globally available methods in Node. You don't need to install or require them before using them.
The fs module

The fs module is built in, and it provides functions that you can use to read and write data in the JSON format and much more.
Each function exposed by the fs module has synchronous, callback, and promise-based forms. The synchronous and callback variants of a method are accessible from the synchronous and callback APIs, and the promise-based variant of a function is accessible from the promise-based API.
The synchronous methods of the built-in fs module block the event loop and further execution of the remaining code until the operation has succeeded or failed. More often than not, blocking the event loop is not something you want to do.
The names of all synchronous functions end with Sync. For example, writeFileSync and readFileSync are both synchronous functions.
You can access the synchronous API by requiring fs:

const fs = require('fs');

// Blocks the event loop
fs.readFileSync(path, options);
Unlike the synchronous methods that block the execution of the remaining code until the operation has succeeded or failed, the corresponding methods of the callback API are asynchronous. You’ll pass a callback function to the method as the last argument.
The callback function is invoked with an Error object as the first argument if an error occurs. The remaining arguments to the callback function depend on the fs method.
You can also access the methods of the callback API by requiring fs, just like the synchronous API:

const fs = require('fs');

fs.readFile(path, options, callback);
The promise-based API is asynchronous, like the callback API. It returns a promise, which you can manage via promise chaining or async/await.
You can access the promise-based API by requiring fs/promises:

const fs = require('fs/promises');

fs.readFile(path)
  .then((data) => {
    // Do something with the data
  })
  .catch((error) => {
    // Do something if an error occurs
  });
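Because the promise-based API returns promises, you can also consume it with async/await. Here's a minimal sketch, assuming a config.json file exists in the current directory:

const fs = require('fs/promises');

async function readConfig() {
  // readFile resolves with a string because an encoding is passed
  const data = await fs.readFile('./config.json', 'utf8');
  return JSON.parse(data);
}

readConfig()
  .then((config) => console.log(config))
  .catch((error) => console.log(error));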
We used the CommonJS syntax for accessing the modules in the code snippets above. We’ll use the CommonJS syntax throughout this article. You can also use ES6 modules if you want.
According to the Node documentation, the callback API of the built-in fs module is more performant than the promise-based API. Therefore, most examples in this article will use the callback API.
In the Node runtime environment, you can use the built-in require function and the fs module for loading or reading JSON files. Because require is globally available in every module, you don't need to import it. However, you will need to require the fs module before using it. We'll discuss how to read JSON files using both the require function and the fs module in the following sections.
The require function

You can use the require function to synchronously load JSON files in Node. After a file is loaded with require, it is cached. Loading the same file again with require therefore returns the cached version; in a server environment, the file is only reloaded on the next server restart.
It is therefore advisable to use require only for loading static JSON files, such as configuration files, that do not change often. Do not use require if the JSON file you load keeps changing: require caches the loaded file and serves the cached version on subsequent calls, so your latest changes will not be reflected.
Assuming you have a config.json file with the following contents:

{
  "port": "3000",
  "ip": "127.0.0.1"
}
You can load the config.json file in a JavaScript file using the code below. require will always load the JSON data as a JavaScript object:

const config = require('./config.json');
console.log(config);
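To see the caching in action, consider this small sketch: both calls to require return the same cached object, so a change made through one reference is visible through the other:

const configA = require('./config.json');
const configB = require('./config.json');

// Both variables reference the same cached object
console.log(configA === configB); // true

configA.port = '4000';
console.log(configB.port); // '4000', because the cached copy was mutated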
The fs.readFile method

You can use the readFile method to read JSON files. It asynchronously reads the entire contents of the file into memory, so it is not the most optimal method for reading large JSON files.
The readFile method takes three arguments. The code snippet below shows its function signature:

fs.readFile(path, options, callback);
The first argument, path, is the file name or a file descriptor. The second is an optional options object, and the third is a callback function. You can also pass a string as the second argument instead of an object; in that case, the string specifies the character encoding, such as 'utf8'.
The callback function takes two arguments. The first argument is an error object, which is set if an error occurs, and the second is the serialized JSON data.
The code snippet below will read the JSON data in the config.json file and log it to the terminal:

const fs = require("fs");

fs.readFile("./config.json", "utf8", (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  console.log(JSON.parse(data));
});
Make sure to deserialize the JSON string passed to the callback function before you start working with the resulting JavaScript object.
The fs.readFileSync method

readFileSync is another built-in method for reading files in Node, similar to readFile. The difference between the two is that readFile reads the file asynchronously while readFileSync reads it synchronously. Therefore, readFileSync blocks the event loop and the execution of the remaining code until all the data has been read.
Check out this article for more information about the difference between synchronous and asynchronous code.
Below is the function signature of fs.readFileSync:

fs.readFileSync(path, options);
path refers to the location of the JSON file you wish to read. Optionally, you can provide an options object as the second argument.
In the code snippet below, we are reading JSON data from the config.json file using readFileSync:

const { readFileSync } = require('fs');

const data = readFileSync('./config.json');
console.log(JSON.parse(data));
Just like for reading JSON files, the fs module provides built-in methods for writing to them. You can use the writeFile and writeFileSync methods of the fs module. The difference between the two is that writeFile is asynchronous while writeFileSync is synchronous.
Before writing to a JSON file, make sure to serialize the JavaScript object to a JSON string using the JSON.stringify method. JSON.stringify will put your JSON data on a single line unless you pass it the optional formatting argument that specifies how to indent your JSON data.
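For example, here's how the optional third argument of JSON.stringify, commonly called space, changes the output:

const config = { ip: '192.0.2.1', port: 3000 };

// Without the space argument, the output is a single line
console.log(JSON.stringify(config));
// {"ip":"192.0.2.1","port":3000}

// With a space argument of 2, the output is indented by two spaces
console.log(JSON.stringify(config, null, 2));
// {
//   "ip": "192.0.2.1",
//   "port": 3000
// }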
The fs.writeFile method

The writeFile method takes four arguments. The code snippet below shows its function signature:
fs.writeFile(file, data, options, callback);
If the path you pass to the writeFile method points to an existing JSON file, the method will overwrite the data in that file. It will create a new file if the file does not exist:
const { writeFile } = require('fs');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

writeFile(path, JSON.stringify(config, null, 2), (error) => {
  if (error) {
    console.log('An error has occurred ', error);
    return;
  }
  console.log('Data written successfully to disk');
});
The fs.writeFileSync method

Unlike writeFile, writeFileSync writes to a file synchronously. If you use writeFileSync, you will block the event loop and the execution of the rest of the code until the operation succeeds or an error occurs. It will create a new file if the path you pass doesn't exist and overwrite the file if it does.
In the code snippet below, we are writing to the config.json file. We wrap the code in a try-catch block so that we can catch any errors:
const { writeFileSync } = require('fs');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

try {
  writeFileSync(path, JSON.stringify(config, null, 2), 'utf8');
  console.log('Data successfully saved to disk');
} catch (error) {
  console.log('An error has occurred ', error);
}
Node doesn't have a built-in function for appending to or updating the fields of an existing JSON file. However, you can read the JSON file using the readFile method of the fs module, update the data, and overwrite the JSON file with the updated JSON.
Below is a code snippet illustrating how to do this:
const { writeFile, readFile } = require('fs');

const path = './config.json';

readFile(path, (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  const parsedData = JSON.parse(data);
  parsedData.createdAt = new Date().toISOString();
  writeFile(path, JSON.stringify(parsedData, null, 2), (err) => {
    if (err) {
      console.log('Failed to write updated data to file');
      return;
    }
    console.log('Updated file successfully');
  });
});
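If you prefer the promise-based API, the same read-update-overwrite pattern can be written with async/await. Here's a minimal sketch using fs/promises:

const { readFile, writeFile } = require('fs/promises');

const path = './config.json';

async function updateConfig() {
  const data = await readFile(path, 'utf8');
  const parsedData = JSON.parse(data);
  parsedData.createdAt = new Date().toISOString();
  await writeFile(path, JSON.stringify(parsedData, null, 2));
}

updateConfig()
  .then(() => console.log('Updated file successfully'))
  .catch((error) => console.log(error));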
In this section, we’ll look at the most popular third-party Node packages for reading and writing data in JSON format.
The jsonfile npm package

jsonfile is a popular npm package for reading and writing JSON files in Node. You can install it using the following command:
npm install jsonfile
It is similar to the readFile and writeFile methods of the built-in fs module, though jsonfile has some advantages over the built-in methods: it parses the file contents into a JavaScript object after reading and serializes your object before writing, so you don't need to call JSON.parse or JSON.stringify yourself.
You can see the jsonfile package in action in the code snippet below:

const jsonfile = require('jsonfile');

const path = './config.json';

jsonfile.readFile(path, (err, data) => {
  if (err) {
    console.log(err);
    return;
  }
  console.log(data);
});
You can also use promise chaining instead of passing a callback function like in the example above:
const jsonfile = require('jsonfile');

const path = './config.json';

jsonfile
  .readFile(path)
  .then((data) => {
    console.log(data);
  })
  .catch((err) => {
    console.log(err);
  });
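Writing works the same way. The sketch below uses the package's writeFile method with its spaces option; jsonfile serializes the object for you, so no JSON.stringify call is needed:

const jsonfile = require('jsonfile');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

// jsonfile serializes the object and writes it with two-space indentation
jsonfile.writeFile(path, config, { spaces: 2 }, (err) => {
  if (err) {
    console.log(err);
    return;
  }
  console.log('Data written successfully to disk');
});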
The fs-extra npm package

fs-extra is another popular Node package you can use to work with files. Though you can use this package to manage JSON files, its methods extend well beyond just reading and writing JSON files.
As its name suggests, fs-extra has all the functionality provided by the fs module and more. According to the documentation, you can use the fs-extra package as a drop-in replacement for the fs module.
Before using it, you need to first install fs-extra from npm:
npm install fs-extra
The code below shows how you can read JSON files using the readJson method of the fs-extra package. You can use a callback function, promise chaining, or async/await:
const fsExtra = require('fs-extra');

const path = './config.json';

// Using a callback
fsExtra.readJson(path, (error, config) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log(config);
});

// Using promise chaining
fsExtra
  .readJson(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function readJsonData() {
  try {
    const config = await fsExtra.readJson(path);
    console.log(config);
  } catch (error) {
    console.log(error);
  }
}

readJsonData();
The code below illustrates how you can write JSON data using the writeJson method:
const { writeJson } = require('fs-extra');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

// Using a callback
writeJson(path, config, (error) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log('Data written to file successfully');
});

// Using promise chaining
writeJson(path, config)
  .then(() => {
    console.log('Data written to file successfully');
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function writeJsonData() {
  try {
    await writeJson(path, config);
    console.log('Data written to file successfully');
  } catch (error) {
    console.log(error);
  }
}

writeJsonData();
Just like the fs module, fs-extra has both asynchronous and synchronous methods. You don't need to stringify your JavaScript object before writing to a JSON file, and you don't need to parse the contents into a JavaScript object after reading one; the module does both for you out of the box.
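fs-extra also provides synchronous counterparts, readJsonSync and writeJsonSync, which block the event loop just like the built-in Sync methods. A minimal sketch:

const { readJsonSync, writeJsonSync } = require('fs-extra');

const path = './config.json';

try {
  // Serializes the object and blocks until the write completes
  writeJsonSync(path, { ip: '192.0.2.1', port: 3000 }, { spaces: 2 });

  // Reads and parses the file in one blocking call
  const config = readJsonSync(path);
  console.log(config);
} catch (error) {
  console.log(error);
}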
The bfj npm package

bfj is another npm package you can use to handle data in JSON format. According to the documentation, it was created for managing large JSON datasets:
bfj implements asynchronous functions and uses pre-allocated fixed-length arrays to try and alleviate issues associated with parsing and stringifying large JSON or JavaScript datasets. – bfj documentation
To install bfj from the npm package registry, run the following command:
npm install bfj
You can read JSON data using the read method, which is asynchronous and returns a promise.
Assuming you have a config.json file, you can use the following code to read it:
const bfj = require('bfj');

const path = './config.json';

bfj
  .read(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });
Similarly, you can use the write method to write data to a JSON file:
const bfj = require('bfj');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

bfj
  .write(path, config)
  .then(() => {
    console.log('Data has been successfully written to disk');
  })
  .catch((error) => {
    console.log(error);
  });
bfj was created purposely for handling large JSON data, but it is comparatively slow, so you should only use it if you are handling relatively large JSON datasets.
Reading and writing large JSON files using streams

As explained above, the built-in functions of the synchronous and asynchronous APIs read the entire file into memory. This is inefficient in terms of both time and memory: you have to wait until the entire file has been read before processing can start, which can take a long time for a large JSON file, and you may even run out of memory while reading it.
To remedy these issues, you may want to use streams to read and process JSON data. The stream-json package comes in handy when streaming large JSON data. You need to first install it from npm like so:
npm install stream-json
Depending on the shape of your JSON data, you can use one of the package's built-in streamers, like StreamArray in the example below. This reduces your application's memory footprint and enables you to process chunks of data as soon as they become available:
const StreamArray = require("stream-json/streamers/StreamArray");
const fs = require("fs");

const pipeline = fs
  .createReadStream("large-file.json")
  .pipe(StreamArray.withParser());

pipeline.on("data", (data) => console.log(data));
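Each data event delivers one element of the top-level JSON array as an object of the form { key, value }. In practice, you'll also want to react to the end of the stream and to errors; here's a sketch of that, assuming large-file.json contains a JSON array:

const StreamArray = require("stream-json/streamers/StreamArray");
const fs = require("fs");

const pipeline = fs
  .createReadStream("large-file.json")
  .pipe(StreamArray.withParser());

// Each chunk is one parsed array element: { key: index, value: element }
pipeline.on("data", ({ key, value }) => console.log(key, value));

pipeline.on("end", () => console.log("Finished processing the file"));
pipeline.on("error", (error) => console.log(error));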
Best practices for working with JSON files in Node.js

When dealing with file operations, it's essential to first create a backup of your data to avoid losing or corrupting it. The require method loads the entire JSON file into memory and caches it, so for frequently changing JSON files, it's advisable to avoid require and instead use functions from the fs module.
Error handling is vital, especially with the synchronous API and with the promise-based API used via async/await, because it prevents application failures. The space parameter of JSON.stringify improves the readability of the JSON string, but it's best avoided for network transmissions to keep payloads small. Lastly, remember that Node.js's promise-based API isn't thread-safe according to its documentation: concurrent operations on the same file might lead to issues, so use this API with caution.
It is not uncommon to encounter the TypeError: Converting circular structure to JSON error when serializing a JavaScript object with the JSON.stringify method. This error occurs when you attempt to stringify a JavaScript object that references itself, as in the example below:
const object = { a: 1 };
object.itself = object;

try {
  JSON.stringify(object);
} catch (e) {
  // TypeError: Converting circular structure to JSON
  console.log(e);
}
There is no straightforward fix to this error. However, you can manually find and replace the circular references with serializable values or use a third-party library like cycle.js, which was created by Douglas Crockford, the brain behind the JSON format.
A fork of the library is maintained in the npm package registry as cycle. You can install it like so:
npm install cycle
Then, you can use it in your application, as shown below:
const cycle = require("cycle");

const originalObj = { a: 1 };
originalObj.itself = originalObj;

const stringifiedObj = JSON.stringify(cycle.decycle(originalObj));
const originalObjCopy = cycle.retrocycle(JSON.parse(stringifiedObj));

console.log(originalObjCopy);
The decycle function of the cycle package highlighted above will create a copy of the object, find duplicate references, which might be circular references, and replace them with objects of the form { "$ref": PATH }.
You can then stringify and parse the resulting object without encountering the TypeError mentioned above. After that, you can store the resulting object on disk or transfer it over the network.
You can use the retrocycle function of the cycle package to get a copy of the original object back.
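If you only need to drop circular references rather than restore them later, a replacer function passed to JSON.stringify can do the job without a third-party package. Here's a minimal sketch of that common pattern, which tracks visited objects in a WeakSet and omits repeated references:

// Returns a replacer that omits any object it has already visited
function getCircularReplacer() {
  const seen = new WeakSet();
  return (key, value) => {
    if (typeof value === 'object' && value !== null) {
      if (seen.has(value)) return undefined; // drop the repeated reference
      seen.add(value);
    }
    return value;
  };
}

const object = { a: 1 };
object.itself = object;

console.log(JSON.stringify(object, getCircularReplacer()));
// {"a":1}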
As explained in the sections above, JSON is one of the most popular formats for data exchange over the internet. The Node runtime environment has the built-in fs module you can use to work with files in general. The fs module has methods you can use to read and write JSON files via the callback API, the promise-based API, or the synchronous API.
Because the methods of the callback API are more performant than those of the promise-based API, you are better off using the callback API when performance matters.
In addition to the built-in fs module, several popular third-party packages, such as jsonfile, fs-extra, and bfj, exist. They have additional utility functions that make working with JSON files a breeze. On the flip side, you should evaluate the limitations of adding third-party packages to your application.