Projects with a clear separation of backend and frontend typically have a good API setup for communication between the two. If the API is well documented, this makes frontend development relatively easy without needing to talk to the backend in a local development environment. When that is the case, utilizing a solution like json-server makes development that much easier.
Out of the box
Assuming you already have a Node project going, and after installing the package globally (as it requires), json-server setup is a breeze if no extra customization is needed.
npm install -g json-server
Next, create a db.json file with some seed data:

{
  "users": [ { "id": 1, "name": "Haidee" } ],
  "books": [ { "id": 1, "title": "It" } ],
  ...
}
json-server --watch db.json
Your db.json file ends up producing routes based on the top-level keys of the object, which your frontend can then call for data.
GET http://localhost:3000/users

[
  { "id": 1, "name": "Haidee" }
]

GET http://localhost:3000/books/1

{ "id": 1, "title": "It" }
This is great for getting a server up and running quickly, but for projects that need a bit more customization, json-server does provide ample options.
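For instance, json-server supports filtering, pagination, and sorting out of the box through query parameters (the values below are illustrative):

```
GET http://localhost:3000/users?name=Haidee          filter by a field
GET http://localhost:3000/users?_page=1&_limit=10    paginate
GET http://localhost:3000/users?_sort=name&_order=asc   sort
```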
Customization
To customize, instead of pointing the CLI at a db.json file, we need to create a JavaScript file like server.js. This file will include all the options we need to adhere to the agreements set forth with the backend.
const jsonServer = require('json-server');
const path = require('path');

const server = jsonServer.create();
const router = jsonServer.router(path.join(__dirname, 'db.json'));
const middlewares = jsonServer.defaults();

server.use(middlewares);
server.use(router); // mount the router so the routes are actually served
server.listen(4000, () => {
  console.log('JSON Server is running: http://localhost:4000');
});
The code above exposes what the CLI command does behind the scenes, and sets up the environment to allow for more customization later.
Utilizing the JSON:API spec
In recent projects I have followed the JSON:API spec to make handshakes between backend and frontend simpler. The spec lays out a fairly robust guideline for how to shape and send payloads back and forth. Using it in conjunction with json-server requires a little more setup than what is outlined in the json-server docs.
Using the initial server.js file that was set up, we are going to adjust it so that we adhere to some of the JSON:API spec requirements. One requirement in particular is the use of a data key on every payload. By default, json-server serves a payload at the root, whereas JSON:API requires the root to use a data key.
{ data: { /* payload object */ } }
To adjust for this, we can modify the server.js file accordingly.
/* removed for brevity */
router.render = (req, res) => {
  res.jsonp({
    data: res.locals.data,
  });
};

server.use(jsonServer.bodyParser);
server.use((req, res, next) => {
  if (['PATCH', 'PUT', 'POST'].includes(req.method)) {
    req.body = req.body.data;
  }
  next();
});
/* removed for brevity */
In the above snippet, whenever a response is rendered by the server, we make sure to re-wrap the payload in a data key.
The other part of the snippet changes how json-server reacts to non-GET requests. Since it does not work directly with the JSON:API spec, it expects a flat payload, so we conform the incoming payload to that shape by lifting the contents of the data key to the root.
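The two transformations can be sketched as plain functions; note these helpers (wrapData, unwrapBody) are illustrative names of mine, not anything json-server exports:

```javascript
// Wrap an outgoing payload in a `data` key, as JSON:API expects.
function wrapData(payload) {
  return { data: payload };
}

// Unwrap an incoming JSON:API request body back to the flat
// shape that json-server expects internally.
function unwrapBody(body) {
  return body && body.data !== undefined ? body.data : body;
}

console.log(JSON.stringify(wrapData({ id: 1, name: 'Haidee' })));
// {"data":{"id":1,"name":"Haidee"}}
console.log(JSON.stringify(unwrapBody({ data: { id: 2 } })));
// {"id":2}
```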
Adjusting routes
Most of the time, API routes aren't served at the root like http://my.api.com/. It's good to namespace or even version the routes.
/* removed for brevity */
server.use('/api', router);
This will now make the API URL conform to: http://localhost:4000/api/:resource.
Dynamic data
The last thing that makes all this worthwhile is serving up dynamic data. This can easily be achieved with a JS file that uses a library like Faker.js to generate random string output if desired.
const faker = require('faker');

// Schema
const data = {
  users: [],
  books: [],
};

data.users.push({ id: 1, name: 'Haidee' });
data.users.push({ id: 2, name: faker.name.firstName() });
data.books.push({ id: 1, title: 'It' });
data.books.push({ id: 2, title: faker.lorem.words(2) });

console.log(JSON.stringify(data));
The final line is key: the output is logged as a JSON string, which an npm script will pipe into a .json file.
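If you would rather avoid the extra dependency, the same idea works with plain Math.random and a loop to build a larger dataset. A minimal sketch, assuming no Faker.js; the randomName helper is mine:

```javascript
// Generate a pseudo-random lowercase string of the given length.
function randomName(length) {
  const letters = 'abcdefghijklmnopqrstuvwxyz';
  let name = '';
  for (let i = 0; i < length; i += 1) {
    name += letters[Math.floor(Math.random() * letters.length)];
  }
  return name;
}

// Build ten users and ten books with random values.
const data = { users: [], books: [] };
for (let id = 1; id <= 10; id += 1) {
  data.users.push({ id, name: randomName(8) });
  data.books.push({ id, title: randomName(12) });
}

console.log(JSON.stringify(data));
```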
Putting it all together
After all the setup is done, you can finally add a simple script to your package.json to make it all work together.
node ./path/to/data-builder.js > ./path/to/db.json && node ./path/to/server.js
Running this takes the JSON output of data-builder.js, saves it to the db.json file, and subsequently starts the json-server instance previously defined. Each time the script is run, fresh data is generated into the db.json file.
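In package.json, that command could live under a script entry like the following (the script name mock-api is just an example):

```
{
  "scripts": {
    "mock-api": "node ./path/to/data-builder.js > ./path/to/db.json && node ./path/to/server.js"
  }
}
```

Then the whole pipeline runs with `npm run mock-api`.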
All of this is great to develop directly against, but a step further would be to have tests interact directly with this data instead of mocking out the data sets per test. That is my next item to figure out and tackle, probably using cypress.io.