
Background

I am doing some experimentation with Node.js and would like to read a JSON object, either from a text file or a .js file (which is better?), into memory so that I can access that object quickly from code. I realize there are things like Mongo, Alfred, etc. out there, but that is not what I need right now.

Question

How do I read a JSON object out of a text or js file and into server memory using JavaScript/Node?


12 Answers


Sync:

var fs = require('fs');
var obj = JSON.parse(fs.readFileSync('file', 'utf8'));

Async:

var fs = require('fs');
var obj;
fs.readFile('file', 'utf8', function (err, data) {
  if (err) throw err;
  obj = JSON.parse(data);
});
  • I think JSON.parse is synchronous (it comes straight from V8), which means that even with the async approach people have to be careful with large JSON files, since parsing will tie up Node.
    – Sean_A91
    Aug 3, 2015 at 4:29
  • For the sake of completeness: there is an npm package called jsonfile.
    – Stefan
    Feb 23, 2016 at 15:37
  • I can't believe it was so difficult to find this simple thing. Every answer I got from Google was doing an HTTP request, using jQuery, or doing it in the browser. Jun 14, 2017 at 17:26
  • Two points: (1) The synchronous answer should just be let imported = require("file.json"). (2) JSON.parse must be asynchronous, because I used this code to load a 70 MB JSON file into memory as an object. It takes milliseconds this way, but if I use require(), it chugs.
    – Kyle Baker
    Apr 3, 2018 at 4:09
  • For people finding this answer in 2019 and on: Node.js has had native JSON support through require for many, many versions, so this answer is no longer applicable if you just want to load a JSON file. Just use let data = require('./yourjsonfile.json') and off you go (with the footnote that if the performance of require is impacting your code, you have problems well beyond "wanting to load a .json file"). Feb 11, 2019 at 3:12

The easiest way I have found to do this is to just use require and the path to your JSON file.

For example, suppose you have the following JSON file.

test.json

{
  "firstName": "Joe",
  "lastName": "Smith"
}

You can then easily load this in your Node.js application using require:

var config = require('./test.json');
console.log(config.firstName + ' ' + config.lastName);
  • Just so folks know, and if I remember correctly, require in Node runs synchronously. Dive in deep here
    – prasanthv
    Nov 5, 2014 at 3:04
  • Another issue (or benefit) of this method is that the required data is cached unless you specifically delete the cached instance. Jan 1, 2016 at 11:46
  • require is meant to be used to load modules or config files that you use throughout the lifespan of your application. It does not seem right to use it to load arbitrary data files.
    – Yaki Klein
    May 18, 2016 at 9:55
  • I'd say this is potentially a security threat. If the JSON file you're loading contains JS code, would require-ing it run that code? If so, then you really need to control where your JSON files are coming from, or an attacker could run malicious code on your machine.
    – sokkyoku
    Apr 4, 2019 at 17:46
  • This is a sound solution for small DevOps scripts or batch operations. You have to balance human time with performance. As far as something you can commit to memory and use quickly goes, for these appropriate cases this is tops. Not every task involves Big Data™ and hostile execution environments. Mar 20, 2020 at 15:25

Asynchronous is there for a reason! Throws stone at @mihai

Otherwise, here is his code rewritten as the asynchronous version:

// Declare variables
var fs = require('fs'),
    obj

// Read the file and send to the callback
fs.readFile('path/to/file', handleFile)

// Write the callback function
function handleFile(err, data) {
    if (err) throw err
    obj = JSON.parse(data)
    // You can now work with your data
}
  • Agreed :), added async as well
    – mihai
    Apr 4, 2012 at 12:34
  • Great :-) I don't like inline callbacks though; they can lead to callback nightmares that I'd rather avoid. Apr 4, 2012 at 12:37
  • It's there for a reason.. unless you want it synchronously. May 31, 2016 at 9:15

At least in Node v8.9.1, you can just do

var json_data = require('/path/to/local/file.json');

and access all the elements of the JSON object.

  • This approach loads the file only once. If you change file.json after requiring it (without restarting the program), the data will still be from the first load. I don't have a source to back this up, but I saw this behavior in an app I am building. Mar 12, 2018 at 18:22
  • Your answer is woefully incomplete. What that gets you is an object, and it doesn't even bother to implement toString(). Apr 9, 2018 at 3:05
  • @DavidA.Gray The question wants to be able to access the objects as objects, not as strings. Aside from the singleton issue Lukas mentioned, this answer is fine. Jan 30, 2019 at 11:20
  • Using require will also execute arbitrary code in the file. This method is insecure and I would recommend against it.
    – spoulson
    Apr 16, 2019 at 12:46

Answer for 2021, using ES6 module syntax and async/await

In modern JavaScript, this can be done as a one-liner, without the need to install additional packages:

import { readFile } from 'fs/promises';

let data = JSON.parse(await readFile("filename.json", "utf8"));

Add a try/catch block to handle exceptions as needed.
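One way to place the try/catch is around the read-and-parse line itself; a sketch, where `demo.json` is a throwaway file created only so the example runs on its own:

```javascript
import { readFile, writeFile, unlink } from 'fs/promises';

// Create a throwaway demo file so the sketch is self-contained
await writeFile('demo.json', '{"name":"Joe"}');

let data;
try {
  data = JSON.parse(await readFile('demo.json', 'utf8'));
} catch (err) {
  // err.code === 'ENOENT' means the file is missing;
  // a SyntaxError means the file is not valid JSON
  console.error('Could not load JSON:', err.message);
  data = null;
}

await unlink('demo.json'); // clean up the demo file
```

Note that top-level await like this only works in an ES module; in CommonJS, wrap the same code in an async function.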

  • Where would you put the try catch? Aug 22, 2021 at 22:08
  • I was looking for this, thank you! It works great when I know that the file's content is JSON data but the extension is customized. The usual require('./jsonfile.xyz') cannot be used in this situation.
    – szegheo
    Dec 11, 2021 at 13:48

In Node 8 you can use the built-in util.promisify() to asynchronously read a file like this:

const {promisify} = require('util')
const fs = require('fs')
const readFileAsync = promisify(fs.readFile)

readFileAsync(`${__dirname}/my.json`, {encoding: 'utf8'})
  .then(contents => {
    const obj = JSON.parse(contents)
    console.log(obj)
  })
  .catch(error => {
    throw error
  })
  • .readFile is already async; if you're looking for the sync version, its name is .readFileSync.
    – Aternus
    May 26, 2018 at 17:45
  • If you want to use promises, there's also fs/promises as of Node 10. Note: the API is experimental: nodejs.org/api/fs.html#fs_fs_promises_api
    – aboutaaron
    Sep 25, 2018 at 18:25
  • @Aternus .readFile is asynchronous, but not async. Meaning, the function is not defined with async keyword, nor does it return a Promise, so you can't do await fs.readFile('whatever.json');
    – Kip
    Oct 1, 2019 at 2:37
  • @Kip how about a CodeSandBox?
    – Aternus
    Oct 8, 2019 at 14:39

Using the fs-extra package is quite simple:

Sync:

const fs = require('fs-extra')

const packageObj = fs.readJsonSync('./package.json')
console.log(packageObj.version) 

Async (note that await must be used inside an async function):

const fs = require('fs-extra')

const packageObj = await fs.readJson('./package.json')
console.log(packageObj.version) 

Using node-fs-extra (async/await), remembering to import it first:

const fs = require('fs-extra');

const readJsonFile = async () => {
    const myJsonObject = await fs.readJson('./my_json_file.json');
    console.log(myJsonObject);
}

readJsonFile() // prints your json object

https://nodejs.org/dist/latest-v6.x/docs/api/fs.html#fs_fs_readfile_file_options_callback

var fs = require('fs');  

fs.readFile('/etc/passwd', (err, data) => {
  if (err) throw err;
  console.log(data);
});  

// options
fs.readFile('/etc/passwd', 'utf8', callback);

https://nodejs.org/dist/latest-v6.x/docs/api/fs.html#fs_fs_readfilesync_file_options

You can find all of the usage in the Node.js File System docs. Hope this helps!

function parseIt() {
    return new Promise(function (res) {
        try {
            var fs = require('fs');
            const dirPath = 'K:\\merge-xml-junit\\xml-results\\master.json';
            fs.readFile(dirPath, 'utf8', function (err, data) {
                if (err) throw err;
                res(data);
            });
        } catch (err) {
            res(err);
        }
    });
}

async function test() {
    const jsonData = await parseIt();
    var parsedJSON = JSON.parse(jsonData);
    var testSuite = parsedJSON['testsuites']['testsuite'];
    console.log(testSuite);
}

test();

So many answers, and no one has made a benchmark comparing sync vs. async vs. require. I described the difference in the use cases of reading JSON into memory via require, readFileSync and readFile here.


If you are looking for a complete solution for asynchronously loading a JSON file from a relative path, with error handling:

// Request the path module for relative paths
const path = require('path');
// Request the File System module
var fs = require('fs');

// GET request for the /listUsers page
router.get('/listUsers', function (req, res) {
    console.log("Got a GET request for list of users");

    // Create a relative path URL
    let reqPath = path.join(__dirname, '../mock/users.json');

    // Read JSON from the relative path of this file
    fs.readFile(reqPath, 'utf8', function (err, data) {
        if (!err) {
            // Handle success
            console.log("Success: " + data);
            // Parse data to JSON
            var jsonObj = JSON.parse(data);
            // Send back as response
            res.end(data);
        } else {
            // Handle error
            res.end("Error: " + err);
        }
    });
});

Directory structure: [screenshot omitted]
