Tania Rascia Software developer, writer, maker of things. (http://tania.dev )

Setting up a RESTful API with Node.js and PostgreSQL


An extremely important aspect of being a modern web developer is knowing how to work with APIs to facilitate communication between different software systems.

In this tutorial, you’ll learn how to create your own RESTful API in a Node.js environment running on an Express server and utilizing a PostgreSQL database.

Prerequisites

In order to get the most out of this tutorial, there are a few prerequisites:

  • You should be familiar with JavaScript syntax and fundamentals
  • You should have basic knowledge of working with the command line
  • You should have Node.js and npm installed

Goals

By the time you complete this article, you should have a fully functional API server running on an Express framework in Node.js. The API should be able to handle the HTTP request methods that correspond to the PostgreSQL database that the API gets its data from. You will learn how to install PostgreSQL and work with it through the command line interface.

What is a RESTful API?

REST stands for Representational State Transfer and defines a set of standards for web services. An API is an interface that different software programs use to communicate with each other. Therefore, a RESTful API is an API that conforms to the REST architectural style and constraints. REST systems are stateless, scalable, cacheable, and have a uniform interface.

RESTful APIs most commonly utilize HTTP requests. Four of the most common HTTP methods are GET, POST, PUT, and DELETE, which are the methods by which a developer can create a CRUD system – create, read, update, delete.
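The correspondence between the HTTP methods and the CRUD operations is standard convention; as a quick illustration (the lookup object below is just for reference, not part of the tutorial code):

```javascript
// The conventional mapping between HTTP methods and CRUD operations.
const crud = {
  POST: 'create',
  GET: 'read',
  PUT: 'update',
  DELETE: 'delete',
}

console.log(crud.GET) // read
```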

PostgreSQL database

PostgreSQL, commonly referred to as Postgres, is a free and open source relational database management system. You might be familiar with a few other similar database systems, such as MySQL, Microsoft SQL Server, or MariaDB, which compete with PostgreSQL.

PostgreSQL is a robust and stable relational database that has been around since 1997 and is available on all major operating systems — Linux, Windows, and macOS. Since PostgreSQL is known for stability, extensibility, and standards compliance, it’s a popular choice for developers and companies to use for their database needs.

We’ll begin this tutorial by installing PostgreSQL, creating a new user, creating a database, and initializing a table with schema and some data.

Installation

If you’re using Windows, download a Windows installer of PostgreSQL.

If you’re using a Mac, this tutorial assumes you have Homebrew installed on your computer as a package manager for installing new programs. If you don’t, visit brew.sh and follow the instructions to install Homebrew.

Open up the Terminal and install postgresql with brew.

brew install postgresql

You may see instructions on the web that will say brew install postgres instead of postgresql. Both of these options will install PostgreSQL on your computer.

After the installation is complete, we’ll want to get the postgresql service up and running, which we can do with brew services start.

brew services start postgresql
==> Successfully started `postgresql` (label: homebrew.mxcl.postgresql)

If at any point you want to stop the postgresql service, you can run brew services stop postgresql.

PostgreSQL is installed now, so the next step is to connect to the postgres command line, where we can run SQL commands.

PostgreSQL command prompt

psql is the PostgreSQL interactive terminal. Running psql will connect you to a PostgreSQL host. Running psql --help will give you more information about the available options for connecting with psql.

  • -h, --host=HOSTNAME | database server host or socket directory (default: “local socket”)
  • -p, --port=PORT | database server port (default: “5432”)
  • -U, --username=USERNAME | database username (default: “your_username”)
  • -w, --no-password | never prompt for password
  • -W, --password | force password prompt (should happen automatically)

We’ll just connect to the default postgres database with the default login information – no option flags.

psql postgres

You’ll see that we’ve entered into a new connection. We’re now inside psql in the postgres database. The prompt ends with a # to denote that we’re logged in as the superuser, or root.

postgres=#

Commands within psql start with a backslash (\). To test our first command, we can ensure what database, user, and port we’ve connected to by using the \conninfo command.

postgres=# \conninfo
You are connected to database "postgres" as user "your_username" via socket in "/tmp" at port "5432".

Here is a reference table of a few common commands which we’ll be using in this tutorial.

  • \q | Exit psql connection
  • \c | Connect to a new database
  • \dt | List all tables
  • \du | List all roles
  • \list | List databases

Let’s create a new database and user so we’re not using the default accounts, which have superuser privileges.

Create a user

First, we’ll create a role called me and give it a password of password. A role can function as a user or a group, so in this case, we’ll be using it as a user.

postgres=# CREATE ROLE me WITH LOGIN PASSWORD 'password';

We want me to be able to create a database.

postgres=# ALTER ROLE me CREATEDB;

You can run \du to list all roles/users.

Role name   | Attributes                          | Member of
me          | Create DB                           | {}
postgres    | Superuser, Create role, Create DB   | {}

Now we want to create a database as the me user. Exit from the default session with \q for quit.

postgres=# \q

We’re back in our computer’s default Terminal connection. Now we’ll connect to the postgres database as me.

psql -d postgres -U me

Instead of postgres=#, our prompt shows postgres=> now, meaning we’re no longer logged in as a superuser.

Create a database

We can create a database with the CREATE DATABASE SQL command.

postgres=> CREATE DATABASE api;

Use the \list command to see the available databases.

Name    |    Owner    | Encoding |   Collate   |    Ctype    |
api     | me          | UTF8     | en_US.UTF-8 | en_US.UTF-8 |

Let’s connect to the new api database with me using the \c (connect) command.

postgres=> \c api
You are now connected to database "api" as user "me".
api=>

Our prompt now displays that we’re connected to api.

Create a table

The last thing we’ll do in the psql command prompt is create a table called users with three fields – two VARCHAR types and an auto-incrementing PRIMARY KEY id.

api=>
CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  name VARCHAR(30),
  email VARCHAR(30)
);

Make sure not to use the backtick (`) character when creating and working with tables in PostgreSQL. While backticks are allowed in MySQL, they’re not valid in PostgreSQL. Also, ensure you do not have a trailing comma in the CREATE TABLE command.

We’ll add two entries to users to have some data to work with.

INSERT INTO users (name, email)
  VALUES ('Jerry', 'jerry@example.com'), ('George', 'george@example.com');

Let’s make sure that got added correctly by getting all entries in users.

api=> SELECT * FROM users;
id |  name  |       email        
----+--------+--------------------
  1 | Jerry  | jerry@example.com
  2 | George | george@example.com

Now we have a user, database, table, and some data. We can begin building our Node.js RESTful API to connect to this data stored in a PostgreSQL database.

Setting up an Express server

At this point, we’re finished with all of our PostgreSQL tasks, and we can begin setting up our Node.js app and Express server.

Create a directory for the project to live in.

mkdir node-api-postgres
cd node-api-postgres

You can either run npm init -y to create a package.json, or copy the code below into a package.json file.

{
  "name": "node-api-postgres",
  "version": "1.0.0",
  "description": "RESTful API with Node.js, Express, and PostgreSQL",
  "main": "index.js",
  "license": "MIT"
}

We’ll want to install Express for the server and node-postgres (pg) to be able to connect to PostgreSQL.

npm i express pg

Now we have our dependencies loaded into node_modules and package.json.

Create an index.js file, which we’ll use as the entry point for our server. At the top, we’ll require the express module and the body-parser middleware, and set our app and port variables.

const express = require('express')
const bodyParser = require('body-parser')
const app = express()
const port = 3000

app.use(bodyParser.json())
app.use(
  bodyParser.urlencoded({
    extended: true,
  })
)

We’ll create a route that looks for a GET request on the root (/) URL and returns some JSON.

app.get('/', (request, response) => {
  response.json({ info: 'Node.js, Express, and Postgres API' })
})

Now set the app to listen on the port you set.

app.listen(port, () => {
  console.log(`App running on port ${port}.`)
})

From the command line, we can start the server by running index.js with node.

node index.js
App running on port 3000.

Go to http://localhost:3000 in the URL bar of your browser, and you’ll see the JSON we set earlier.

{
  info: "Node.js, Express, and Postgres API"
}

The Express server is running now, but it’s only sending some static JSON data that we created. The next step is to connect to PostgreSQL from Node.js to be able to make dynamic queries.

Connecting to the database from Node.js

We’ll be using the node-postgres module to create a pool of connections. This way we don’t have to open a client and close it every time we make a query.

A popular option for production pooling would be to use pgBouncer, a lightweight connection pooler for PostgreSQL.

Create a file called queries.js and set up the configuration of your PostgreSQL connection.

const Pool = require('pg').Pool
const pool = new Pool({
  user: 'me',
  host: 'localhost',
  database: 'api',
  password: 'password',
  port: 5432,
})

In a production environment, you would want to put your configuration details in a separate file with restrictive permissions that is not checked into version control, but for the simplicity of this tutorial, we’re keeping it in the same file as the queries.
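One lightweight way to move in that direction is to read the connection details from environment variables and fall back to the tutorial’s local defaults. A minimal sketch, assuming you export variables such as PGUSER and PGPASSWORD in your shell:

```javascript
// A sketch: read connection settings from the environment,
// falling back to this tutorial's local defaults.
const config = {
  user: process.env.PGUSER || 'me',
  host: process.env.PGHOST || 'localhost',
  database: process.env.PGDATABASE || 'api',
  password: process.env.PGPASSWORD || 'password',
  port: parseInt(process.env.PGPORT, 10) || 5432,
}

console.log(config.host)
```

As it happens, node-postgres also reads the standard PG* libpq environment variables on its own, so in many setups you can construct `new Pool()` with no arguments at all.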

The aim of this tutorial is to allow GET, POST, PUT, and DELETE operations on the API which will run the corresponding database commands. To do this, we’ll set up a route for each endpoint, and a function to correspond to each query.

Creating routes

We’re going to create six functions for six routes, seen in the chart below. First, we’ll go through and create all of the functions for each route, then we’ll export the functions so they’re accessible:

  • GET / | displayHome()
  • GET /users | getUsers()
  • GET /users/:id | getUserById()
  • POST /users | createUser()
  • PUT /users/:id | updateUser()
  • DELETE /users/:id | deleteUser()

In index.js, we made an app.get() for the root endpoint with a function in it. Now in queries.js, we’ll create endpoints that will display all users, display a single user, create a new user, update an existing user, and delete a user.

GET all users

Our first endpoint will be a GET request. Inside the pool.query() we can put the raw SQL that will touch the api database. We’ll SELECT all users and order by id.

const getUsers = (request, response) => {
  pool.query('SELECT * FROM users ORDER BY id ASC', (error, results) => {
    if (error) {
      throw error
    }
    response.status(200).json(results.rows)
  })
}

GET a single user by id

For our /users/:id request, we’ll be getting the custom id parameter from the URL and using a WHERE clause to display the result.

In the SQL query, we’re looking for id=$1. In this instance, $1 is a numbered placeholder, which PostgreSQL uses natively instead of the ? placeholder you may be familiar with from other flavors of SQL.

const getUserById = (request, response) => {
  const id = parseInt(request.params.id)

  pool.query('SELECT * FROM users WHERE id = $1', [id], (error, results) => {
    if (error) {
      throw error
    }
    response.status(200).json(results.rows)
  })
}
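Note that parseInt returns NaN for a non-numeric :id such as /users/abc, and passing NaN to the query produces a database error rather than a clean response. A small guard handles that case — this is an optional hardening step, and parseId is a hypothetical helper, not part of the original tutorial:

```javascript
// Optional hardening sketch: validate the :id route parameter
// before handing it to pool.query. parseId is a hypothetical helper.
const parseId = (raw) => {
  const id = parseInt(raw, 10)
  return Number.isInteger(id) && id > 0 ? id : null
}

console.log(parseId('1'))   // 1
console.log(parseId('abc')) // null
```

In getUserById you could then respond with `response.status(400).send('Invalid user id')` whenever parseId returns null.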

POST a new user

The API will take a GET and POST request to the /users endpoint. In the POST request, we’ll be adding a new user. In this function, we’re extracting the name and email properties from the request body, and INSERTing the values.

const createUser = (request, response) => {
  const { name, email } = request.body

  pool.query('INSERT INTO users (name, email) VALUES ($1, $2) RETURNING id', [name, email], (error, results) => {
    if (error) {
      throw error
    }
    response.status(201).send(`User added with ID: ${results.rows[0].id}`)
  })
}

Note the RETURNING id clause at the end of the INSERT statement — without it, results.rows would be empty and there would be no id to report back in the response.

PUT updated data in an existing user

The /users/:id endpoint will also take two HTTP requests — the GET we created for getUserById, and also a PUT, to modify an existing user. For this query, we’ll combine what we learned in GET and POST to use the UPDATE clause.

It is worth noting that PUT is idempotent, meaning the exact same call can be made over and over and will produce the same result. This differs from POST, where repeating the exact same call will keep creating new users with the same data.

const updateUser = (request, response) => {
  const id = parseInt(request.params.id)
  const { name, email } = request.body

  pool.query(
    'UPDATE users SET name = $1, email = $2 WHERE id = $3',
    [name, email, id],
    (error, results) => {
      if (error) {
        throw error
      }
      response.status(200).send(`User modified with ID: ${id}`)
    }
  )
}
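That difference can be seen with a toy in-memory store — a plain Map standing in for the database, purely for illustration. Repeating the same update leaves one record unchanged, while repeating the same insert keeps adding rows:

```javascript
// Illustration of idempotency using a Map in place of the database.
const users = new Map()
let nextId = 1

const post = (data) => users.set(nextId++, data) // POST: always creates a new row
const put = (id, data) => users.set(id, data)    // PUT: replaces the row in place

post({ name: 'Jerry' })
post({ name: 'Jerry' })        // same call again: a second row appears
console.log(users.size)        // 2

put(1, { name: 'Kramer' })
put(1, { name: 'Kramer' })     // same call again: still one record with id 1
console.log(users.size)        // 2
console.log(users.get(1).name) // Kramer
```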

DELETE a user

Finally, we’ll use the DELETE clause on /users/:id to delete a specific user by id. This call is very similar to our getUserById() function.

const deleteUser = (request, response) => {
  const id = parseInt(request.params.id)

  pool.query('DELETE FROM users WHERE id = $1', [id], (error, results) => {
    if (error) {
      throw error
    }
    response.status(200).send(`User deleted with ID: ${id}`)
  })
}

Export

In order to access these functions from index.js, we’ll need to export them. We can do this with module.exports, creating an object of functions. Since we’re using ES6 shorthand property syntax, we can write getUsers instead of getUsers: getUsers, and so on.

module.exports = {
  getUsers,
  getUserById,
  createUser,
  updateUser,
  deleteUser,
}

Here is our complete queries.js file.

const Pool = require('pg').Pool
const pool = new Pool({
  user: 'me',
  host: 'localhost',
  database: 'api',
  password: 'password',
  port: 5432,
})
const getUsers = (request, response) => {
  pool.query('SELECT * FROM users ORDER BY id ASC', (error, results) => {
    if (error) {
      throw error
    }
    response.status(200).json(results.rows)
  })
}

const getUserById = (request, response) => {
  const id = parseInt(request.params.id)

  pool.query('SELECT * FROM users WHERE id = $1', [id], (error, results) => {
    if (error) {
      throw error
    }
    response.status(200).json(results.rows)
  })
}

const createUser = (request, response) => {
  const { name, email } = request.body

  pool.query('INSERT INTO users (name, email) VALUES ($1, $2) RETURNING id', [name, email], (error, results) => {
    if (error) {
      throw error
    }
    response.status(201).send(`User added with ID: ${results.rows[0].id}`)
  })
}

const updateUser = (request, response) => {
  const id = parseInt(request.params.id)
  const { name, email } = request.body

  pool.query(
    'UPDATE users SET name = $1, email = $2 WHERE id = $3',
    [name, email, id],
    (error, results) => {
      if (error) {
        throw error
      }
      response.status(200).send(`User modified with ID: ${id}`)
    }
  )
}

const deleteUser = (request, response) => {
  const id = parseInt(request.params.id)

  pool.query('DELETE FROM users WHERE id = $1', [id], (error, results) => {
    if (error) {
      throw error
    }
    response.status(200).send(`User deleted with ID: ${id}`)
  })
}

module.exports = {
  getUsers,
  getUserById,
  createUser,
  updateUser,
  deleteUser,
}

Completing the setup

Now that we have all of our queries, the last thing we need to do is pull them into the index.js file and make endpoint routes for all the query functions we created.

To get all the exported functions from queries.js, we’ll require the file and assign it to a variable.

const db = require('./queries')

Now for each endpoint, we’ll set the HTTP request method, the endpoint URL path, and the relevant function.

app.get('/users', db.getUsers)
app.get('/users/:id', db.getUserById)
app.post('/users', db.createUser)
app.put('/users/:id', db.updateUser)
app.delete('/users/:id', db.deleteUser)

Here is our complete index.js, the entry point of the API server.

const express = require('express')
const bodyParser = require('body-parser')
const app = express()
const db = require('./queries')
const port = 3000

app.use(bodyParser.json())
app.use(
  bodyParser.urlencoded({
    extended: true,
  })
)

app.get('/', (request, response) => {
  response.json({ info: 'Node.js, Express, and Postgres API' })
})

app.get('/users', db.getUsers)
app.get('/users/:id', db.getUserById)
app.post('/users', db.createUser)
app.put('/users/:id', db.updateUser)
app.delete('/users/:id', db.deleteUser)

app.listen(port, () => {
  console.log(`App running on port ${port}.`)
})

Now with just these two files, we have a server, database, and API all set up. You can start up the server by running index.js again.

node index.js
App running on port 3000.

Now if you go to http://localhost:3000/users or http://localhost:3000/users/1, you’ll see the JSON response of the two GET requests. But how can we test our POST, PUT, and DELETE requests?

This can be done with curl, a command line tool that’s already available on your terminal. Below are examples you can run on the command line to test the remaining methods.

Make sure that the server is actively running in one Terminal window while you run these commands in a separate window.


POST

Add a new user with the name Elaine and email elaine@example.com.

curl --data "name=Elaine&email=elaine@example.com" 
http://localhost:3000/users

PUT

Update the user with id 1 to have the name Kramer and email kramer@example.com.

curl -X PUT -d "name=Kramer" -d "email=kramer@example.com" 
http://localhost:3000/users/1

DELETE

Delete the user with id 1.

curl -X "DELETE" http://localhost:3000/users/1

Conclusion

Congratulations, you should now have a functioning API server running on Node.js and hooked up to an active PostgreSQL database. In this tutorial, we learned how to install and set up PostgreSQL in the command line, how to create users, databases, and tables, and how to run SQL commands. We also learned how to create an Express server that can handle multiple HTTP methods, and how to use the pg module to connect to PostgreSQL from Node.

With this knowledge, you should be able to build on this API and utilize it for your own personal or professional development projects.
