Editor’s note: This article was last updated by Oyinkansola Awosan on 18 October 2024 to cover handling large JSON files using fs.createReadStream() and to add information about the third-party library jsonfile.
Node.js provides built-in modules and third-party libraries for reading and writing JSON files. It offers flexible methods to suit various needs, such as handling small JSON objects or large datasets. These modules and libraries make it easy to work with structured data in JSON format. More often than not, this JSON data needs to be read from or written to a file for persistence, and the Node runtime environment has the built-in fs module specifically for working with files.
In the Node runtime environment, you can use the built-in require function and the fs module to load or read JSON files. Because the require function is available in every module, you don’t need to require it. However, you must require the fs module before using it.
In the following sections, we will dive into how to read JSON files using the built-in fs module and the require function.
First, let’s create a JSON file. To do that, we can input the data below in a file called books.json:

{
  "title": "Ali goes to school",
  "genre": "Fiction",
  "type": "Children book"
}
You can use the readFile method to read JSON files. It asynchronously reads the entire contents of the file into memory and is therefore not the most optimal method for reading large JSON files.
The readFile method takes three arguments. The code snippet below shows its function signature:
fs.readFile(path, options, callback);
- path is the file name or the file descriptor
- options can be an object specifying, among other things, the encoding; you can also pass a string as the second argument instead of an object, in which case it is treated as the encoding
- callback is the function invoked once the file has been read

The callback function takes two arguments. The first argument is the error object if an error occurs, and the second is the serialized JSON data:
const fs = require('fs');

fs.readFile('books.json', function(err, data) {
  if (err) throw err;
  const books = JSON.parse(data);
  console.log(books);
});
In the example above, we import the fs module, then use fs.readFile to read our previously created JSON file, books.json. The code retrieves the file’s content, checks for errors, and, if successful, parses the JSON data into a JavaScript object, which is then printed.
Make sure to deserialize the JSON string passed to the callback function before you start working with the resulting JavaScript object.
The require function

You can use the require function to synchronously load JSON files in Node. After a file is loaded using require, it is cached. Therefore, loading the same file again with require will return the cached version. In a server environment, the file will only be loaded again on the next server restart.

Therefore, it is advisable to use require for loading static JSON files such as configuration files that don’t change often. Don’t use require if the JSON file you load keeps changing, because require will cache the loaded file and return the cached version if you require the same file again, so your latest changes will not be reflected.
Assuming you have a config.json file with the following contents:

{
  "port": "3000",
  "ip": "127.0.0.1"
}
You can load the config.json file in a JavaScript file using the code below. require will always load the JSON data as a JavaScript object:

const config = require('./config.json');
console.log(config);
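To see the caching behavior described above, here is a minimal sketch (reusing the config.json file from above) that overwrites the file after the first require and then requires it again:

const fs = require('fs');

const first = require('./config.json');
console.log(first.port); // "3000"

// Overwrite the file on disk with a different port
fs.writeFileSync('./config.json', JSON.stringify({ port: '4000', ip: '127.0.0.1' }, null, 2));

// require returns the cached object, so the change on disk is not reflected
const second = require('./config.json');
console.log(second.port); // still "3000"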
The fs.readFileSync method

readFileSync is another built-in method for reading files in Node, similar to readFile. The difference between the two is that readFile reads the file asynchronously while readFileSync reads the file synchronously. Therefore, readFileSync blocks the event loop and execution of the remaining code until all the data has been read.
Check out this article for more information about the difference between synchronous and asynchronous code.
Below is the function signature of fs.readFileSync:
fs.readFileSync(path, options);
path refers to the location of the JSON file you wish to read. Optionally, you can provide an options object (or an encoding string) as the second argument.
In the code snippet below, we are reading JSON data from the config.json file using readFileSync:
const { readFileSync } = require('fs');

const data = readFileSync('./config.json');
console.log(JSON.parse(data));
Just like reading JSON files, the fs module provides built-in methods for writing to JSON files. You can use the writeFile and writeFileSync methods, both discussed below. The difference between the two is that:

- writeFile is asynchronous
- writeFileSync is synchronous

Before writing a JSON file, make sure to serialize the JavaScript object to a JSON string using the JSON.stringify method. JSON.stringify will format your JSON data in a single line if you do not pass the optional formatting argument specifying how to format your JSON data.
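For example, here’s a quick illustration of the difference the optional space argument makes to the output of JSON.stringify:

const book = { title: 'Ali goes to school', genre: 'Fiction' };

// Without the formatting argument, the output is a single line
console.log(JSON.stringify(book));
// {"title":"Ali goes to school","genre":"Fiction"}

// Passing 2 as the space argument pretty-prints the output
console.log(JSON.stringify(book, null, 2));
// {
//   "title": "Ali goes to school",
//   "genre": "Fiction"
// }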
The fs.writeFile method

To write a JSON file, the fs module is also required. Here, we will use the fs.writeFile method. The writeFile method takes four arguments. The code snippet below shows its function signature:
fs.writeFile(file, data, options, callback);
When the writeFile method is given the path of an existing JSON file, it will overwrite that file’s data. It will create a new file if the file does not exist:
const fs = require("fs"); const books = { title: "Alli goes to school", genre: "Fiction", type: "Children", pages: 56 }; const jsonData = JSON.stringify(books, null, 2); fs.writeFile("books1.json", jsonData, 'utf8', (err) => { if (err) { console.error('Error writing to file', err); } else { console.log('Data written to file'); } });
Here, we use the fs.writeFile method to write the JSON string to a file named books1.json and log a success or error message depending on which one occurs.
The fs.writeFileSync method

Unlike writeFile, writeFileSync writes to a file synchronously. If you use writeFileSync, you will block the execution of the event loop and the rest of the code until the operation is successful or an error occurs. It will create a new file if the path you pass doesn’t exist and overwrite it if it does.
In the code snippet below, we are writing to the config.json file. We wrap the code in a try...catch block so that we can catch any errors:
const { writeFileSync } = require('fs');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

try {
  writeFileSync(path, JSON.stringify(config, null, 2), 'utf8');
  console.log('Data successfully saved to disk');
} catch (error) {
  console.log('An error has occurred ', error);
}
Node doesn’t have a built-in function for appending to or updating the fields of an existing JSON file. However, you can read the JSON file using the readFile method of the fs module, update it, and overwrite the JSON file with the updated JSON.

Below is a code snippet illustrating how to do this:
const { writeFile, readFile } = require('fs');

const path = './config.json';

readFile(path, (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  const parsedData = JSON.parse(data);
  parsedData.createdAt = new Date().toISOString();
  writeFile(path, JSON.stringify(parsedData, null, 2), (err) => {
    if (err) {
      console.log('Failed to write updated data to file');
      return;
    }
    console.log('Updated file successfully');
  });
});
There are other methods for reading and writing JSON files, which we’ll cover in the following sections.
Serialization is the process of converting an object or data structure into a format that is easy to store or transfer over the internet. You can recover the serialized data by applying the reverse process. Deserialization refers to transforming the serialized data back to its original format.

You will almost always need to serialize JavaScript objects to a JSON string in Node. You can do so with the JSON.stringify method before writing the data to a storage device or transmitting it over the internet:
const config = { ip: '192.0.2.1', port: 3000 };
console.log(JSON.stringify(config));
On the other hand, after reading the JSON file, you will need to deserialize the JSON string to a plain JavaScript object using the JSON.parse method before accessing or manipulating the data:
const config = JSON.stringify({ ip: '192.0.2.1', port: 3000 });
console.log(JSON.parse(config));
JSON.stringify and JSON.parse are globally available methods in Node. You don’t need to install or require them before using them.
The fs module

The fs module is built-in, and it provides functions that you can use to read and write data in the JSON format and much more.

Each function exposed by the fs module has synchronous, callback, and promise-based forms. The synchronous and callback variants of a method are accessible from the synchronous and callback API, while the promise-based variant of a function is accessible from the promise-based API.
The synchronous methods of the built-in fs module block the event loop and further execution of the remaining code until the operation has succeeded or failed. More often than not, blocking the event loop is not something you want to do.

The names of all synchronous functions end with the Sync suffix. For example, writeFileSync and readFileSync are both synchronous functions.
You can access the synchronous API by requiring fs:
const fs = require('fs');

// Blocks the event loop
fs.readFileSync(path, options);
Unlike the synchronous methods that block the execution of the remaining code until the operation has succeeded or failed, the corresponding methods of the callback API are asynchronous. You’ll pass a callback function to the method as the last argument.

The callback function is invoked with an Error object as the first argument if an error occurs. The remainder of the arguments to the callback function depend on the fs method.
You can also access the methods of the callback API by requiring fs, just like the synchronous API:
const fs = require('fs');

fs.readFile(path, options, callback);
The promise-based API is asynchronous, like the callback API. It returns a promise, which you can manage via promise chaining or async/await.
You can access the promise-based API by requiring fs/promises:
const fs = require('fs/promises');

fs.readFile(path)
  .then((data) => {
    // Do something with the data
  })
  .catch((error) => {
    // Do something if error
  });
We used the CommonJS syntax to access the modules in the code snippets above. We’ll continue using the CommonJS syntax throughout this article. You can also use ES6 modules if you want.
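For reference, a minimal sketch of the ES module equivalent might look like the following (assuming your project is configured for ESM, for example with "type": "module" in package.json, and a Node version that supports top-level await):

// ES module syntax for reading a JSON file with the promise-based API
import { readFile } from 'fs/promises';

const data = await readFile('./config.json', 'utf8');
console.log(JSON.parse(data));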
According to the Node documentation, the callback API of the built-in fs module is more performant than the promise-based API. Therefore, most examples in this article will use the callback API.
In this section, we’ll look at the most popular third-party Node packages for reading and writing data in JSON format.
jsonfile is a popular npm package for reading and writing JSON files in Node. You can install it using the following command:
npm install jsonfile
It is similar to the readFile and writeFile methods of the built-in fs module, though jsonfile has an advantage over the built-in methods: it serializes and deserializes the JSON for you, so you don’t need to call JSON.stringify or JSON.parse yourself.
You can see the jsonfile package in action in the code snippet below:
const jsonfile = require('jsonfile');

const path = './config.json';

jsonfile.readFile(path, (err, data) => {
  if (err) {
    console.log(err);
    return;
  }
  console.log(data);
});
You can also use promise chaining instead of passing a callback function like the example above:
const jsonfile = require('jsonfile');

const path = './config.json';

jsonfile
  .readFile(path)
  .then((data) => {
    console.log(data);
  })
  .catch((err) => {
    console.log(err);
  });
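jsonfile also exposes writeFile and writeFileSync methods for writing a JavaScript object to a JSON file. Below is a minimal sketch using the promise form; the spaces option for pretty-printing the output is an assumption about how you might want the file formatted:

const jsonfile = require('jsonfile');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

// Write the object as JSON; the spaces option controls indentation
jsonfile
  .writeFile(path, config, { spaces: 2 })
  .then(() => {
    console.log('Data written to file successfully');
  })
  .catch((err) => {
    console.log(err);
  });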
fs-extra is another popular Node package you can use to work with files. Though you can use this package for managing JSON files, its methods extend beyond just reading and writing JSON files.

As its name suggests, fs-extra has all the functionality provided by the fs module and more. According to the documentation, you can use the fs-extra package instead of the fs module.
Before using it, you need to first install fs-extra from npm:
npm install fs-extra
The code below shows how you can read JSON files using the readJson method of the fs-extra package. You can use a callback function, promise chaining, or async/await:
const fsExtra = require('fs-extra');

const path = './config.json';

// Using callback
fsExtra.readJson(path, (error, config) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log(config);
});

// Using promise chaining
fsExtra
  .readJson(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function readJsonData() {
  try {
    const config = await fsExtra.readJson(path);
    console.log(config);
  } catch (error) {
    console.log(error);
  }
}

readJsonData();
The code below illustrates how you can write JSON data using the writeJson method:
const { writeJson } = require('fs-extra');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

// Using callback
writeJson(path, config, (error) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log('Data written to file successfully');
});

// Using promise chaining
writeJson(path, config)
  .then(() => {
    console.log('Data written to file successfully');
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function writeJsonData() {
  try {
    await writeJson(path, config);
    console.log('Data written to file successfully');
  } catch (error) {
    console.log(error);
  }
}

writeJsonData();
Just like the fs module, fs-extra has both asynchronous and synchronous methods. You don’t need to stringify your JavaScript object before writing it to a JSON file. Similarly, you don’t need to parse the JSON string into a JavaScript object after reading a JSON file. The module does it for you out of the box.
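As a minimal sketch of the synchronous variants, readJsonSync and writeJsonSync behave like their asynchronous counterparts but throw on error, so you would typically wrap them in a try...catch block:

const fsExtra = require('fs-extra');

const path = './config.json';

try {
  // Write and then read back the JSON file synchronously
  fsExtra.writeJsonSync(path, { ip: '192.0.2.1', port: 3000 }, { spaces: 2 });
  const config = fsExtra.readJsonSync(path);
  console.log(config);
} catch (error) {
  console.log(error);
}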
bfj is another npm package you can use to handle data in JSON format. According to the documentation, it was created for managing large JSON datasets:
“bfj implements asynchronous functions and uses pre-allocated fixed-length arrays to try and alleviate issues associated with parsing and stringifying large JSON or JavaScript datasets.” — bfj documentation
To install bfj from the npm package registry, run the following code:
npm install bfj
You can read JSON data using the read method, which is asynchronous and returns a promise. Assuming you have a config.json file, you can use the following code to read it:
const bfj = require('bfj');

const path = './config.json';

bfj
  .read(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });
Similarly, you can use the write method to write data to a JSON file:
const bfj = require('bfj');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

bfj
  .write(path, config)
  .then(() => {
    console.log('Data has been successfully written to disk');
  })
  .catch((error) => {
    console.log(error);
  });
bfj was created purposely for handling large JSON data. However, it is slow, so you should only use it if you are handling relatively large JSON datasets.
As explained above, the built-in functions of the synchronous and asynchronous APIs read the entire file into memory. This is inefficient in terms of both time and memory as you need to wait until the entire file is read into memory before processing. If you are dealing with a large JSON file, you may wait for a long time. Similarly, you may run out of memory while reading large JSON files.
To remedy these issues, you may want to use streams to read and process JSON data. Streams in Node.js enable you to handle data in chunks rather than loading the entire file into memory. This is useful for managing memory usage and improving performance when reading and writing large files. The stream-json package comes in handy when streaming large JSON data.
The fs.createReadStream() method in Node.js’s fs (file system) module is used to read large files in chunks or streams, rather than loading the entire file into memory at once. This approach is particularly useful for efficiently handling large files, such as logs and JSON data.
Here, we will look at how to use fs.createReadStream(), but first, we need to install the stream-json package as seen below:
npm install stream-json
In the example below, we used fs.createReadStream() with the stream-json package to read and parse large-file.json. This reduces your application’s memory footprint and enables you to process chunks of data immediately after they become available:
const StreamArray = require("stream-json/streamers/StreamArray");
const fs = require("fs");

const pipeline = fs
  .createReadStream("large-file.json")
  .pipe(StreamArray.withParser());

pipeline.on("data", (data) => console.log(data));
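Each data event emitted by StreamArray is an object containing a key (the array index) and a value (the parsed element), so you can process items one at a time. As a sketch, the pipeline below counts the items in the array without ever holding the whole file in memory:

const StreamArray = require("stream-json/streamers/StreamArray");
const fs = require("fs");

const pipeline = fs
  .createReadStream("large-file.json")
  .pipe(StreamArray.withParser());

let count = 0;

pipeline.on("data", (data) => {
  // data.key is the array index, data.value is the parsed element
  count += 1;
});

pipeline.on("end", () => console.log(`Processed ${count} items`));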
A few points worth keeping in mind:

- The require method loads the entire JSON file into memory and caches it. For JSON files that change frequently, it’s best to avoid require and use the fs module functions instead
- The space parameter in JSON.stringify improves JSON string readability, but it’s best avoided for network transmissions to reduce payload size

It is not uncommon to encounter the TypeError: Converting circular structure to JSON error when serializing a JavaScript object using the JSON.stringify function. This error occurs when you attempt to stringify a JavaScript object that references itself, as in the example below:
const object = { a: 1 };
object.itself = object;

try {
  JSON.stringify(object);
} catch (e) {
  // TypeError: Converting circular structure to JSON
  console.log(e);
}
There is no straightforward fix to this error. However, you can manually find and replace the circular references with serializable values or use a third-party library like cycle.js, which was created by Douglas Crockford, the brain behind the JSON format.
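For the manual approach, one common pattern (shown here as a sketch, not something cycle.js does for you) is to pass a replacer function to JSON.stringify that keeps track of objects it has already visited and drops repeated references:

const object = { a: 1 };
object.itself = object;

// Returns a replacer that omits any object it has already serialized
function getCircularReplacer() {
  const seen = new WeakSet();
  return (key, value) => {
    if (typeof value === 'object' && value !== null) {
      if (seen.has(value)) {
        return undefined; // drop the circular reference
      }
      seen.add(value);
    }
    return value;
  };
}

console.log(JSON.stringify(object, getCircularReplacer()));
// {"a":1}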
A fork of the library is maintained at the npm package registry as cycle. You can install it like so:
npm install cycle
Then, you can use it in your application, as shown below:
const cycle = require("cycle"); const originalObj = { a: 1 }; originalObj.itself = originalObj; const stringifiedObj = JSON.stringify(cycle.decycle(originalObj)); const originalObjCopy = cycle.retrocycle(JSON.parse(stringifiedObj)); console.log(originalObjCopy);
The decycle function of the cycle package highlighted above will create a copy of the object, look for duplicate references, which might be circular references, and replace them with objects of the form { "$ref": PATH }.
You can then stringify and parse the resulting object without encountering the TypeError mentioned above. After that, you can store the resulting object on disk or transfer it over the network.
You can use the retrocycle function of the cycle package to get a copy of the original object.
As explained in the above sections, JSON is one of the most popular formats for data exchange over the internet. The Node runtime environment has the built-in fs module you can use to work with files in general. The fs module has methods that you can use to read and write to JSON files using the callback API, promise-based API, or synchronous API.
Because methods of the callback API are more performant than those of the promise-based API, you are better off using the callback API.
In addition to the built-in fs module, several popular third-party packages such as jsonfile, fs-extra, and bfj exist. They have additional utility functions that make working with JSON files a breeze. On the flip side, you should evaluate the limitations of adding third-party packages to your application.