Creating a scalable API using Node, GraphQL, MySQL, and Knex
Imagine a new bookstore has set up shop in your neighborhood. The manager decides she really wants an API built from scratch to handle all the store’s business, and you’ll be the one to do it. You come in on day one and your first task is to build a couple of API endpoints to handle the CRUD operations for books. You also have to go to the dry cleaners later and take your dog for a walk, so time is precious. With that said, let’s jump right to it.
Run the following in your terminal:
mkdir node_api_tutorial && cd node_api_tutorial
Let’s scaffold an Express application using the express-generator library. Install the CLI globally:
npm i -g express-generator
Now run:
express && npm i && npm start
Look at that! We have a basic Express server running. By the way, unless you want to be restarting your server every time you make a change, I recommend using nodemon. Install it globally (npm i -g nodemon), kill the current server (Ctrl-C), and run:
nodemon ./bin/www
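If you’d rather not depend on a global nodemon install, a common alternative (a suggestion on my part, not something the generator sets up for you) is to save it as a dev dependency:
npm i --save-dev nodemon
Then add a script next to the generator’s existing start script in package.json:
"scripts": {
  "start": "node ./bin/www",
  "dev": "nodemon ./bin/www"
}
Now npm run dev gives you the auto-restarting server.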
Let’s get back to it. That scaffolding script came with extra baggage we don’t need in our API-only app, so let’s go ahead and do some trimming.
- Remove the views/ folder
- Remove routes/users.js
- In routes/index.js, swap the res.render(...) call for a plain JSON response, e.g. res.json({ result: 'API root' }), since we no longer have views to render
- Change app.js at the root of your project to look like the following:
var express = require('express');
var path = require('path');
var favicon = require('serve-favicon');
var logger = require('morgan');
var cookieParser = require('cookie-parser');
var bodyParser = require('body-parser');
var index = require('./routes/index');
var app = express();
// uncomment after placing your favicon in /public
//app.use(favicon(path.join(__dirname, 'public', 'favicon.ico')));
app.use(logger('dev'));
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: false }));
app.use(cookieParser());
app.use(express.static(path.join(__dirname, 'public')));
app.use('/api/', index);
// catch 404 and forward to error handler
app.use(function(req, res, next) {
var err = new Error('Not Found');
err.status = 404;
next(err);
});
// error handler
app.use(function(err, req, res, next) {
  // we removed the views/ folder, so instead of rendering an error page,
  // send a JSON payload (only include the full error in development)
  res.status(err.status || 500).json({
    message: err.message,
    error: req.app.get('env') === 'development' ? err : {}
  });
});
module.exports = app;
We got rid of the view engine, the views references, and the users endpoint, and the error handler now responds with JSON instead of rendering an error page. If you look closely, we also changed the path for the index route to:
app.use('/api/', index);
This means that the paths for your endpoints will begin with /api/. You’ll thank me when it’s time to build the frontend and you need that /books path.
Go to localhost:3000/api/ for a quick sanity check. Hopefully your page isn’t giving you any errors. We’re almost past the setup process! Let’s just quickly hook our project up to a MySQL database.
Run the following from the root of the project:
touch config.js && npm i --save mysql
Copy-paste this code inside your newly-created config.js file:
var mysql = require('mysql');
var connection = mysql.createConnection({
host : 'localhost',
user : 'admin',
password : 'password',
database : 'Bookstore'
});
// connect() reports failures asynchronously through its callback;
// a try/catch around it would never fire
connection.connect(function(err) {
  if (err) {
    console.log('Database connection failed: ' + err);
  }
});
module.exports = connection;
Replace user with your MySQL user and password with your MySQL password. If you don’t have MySQL set up on your computer yet, there are a lot of resources online to walk you through it. Consider figuring it out a rite of passage into the heavenly world of relational databases.
Log into the MySQL terminal. I’m logging in with my admin account.
mysql -u admin -p
Create the database. I ran the following:
CREATE DATABASE Bookstore;
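A quick sanity check never hurts. Still in the MySQL terminal, this should list Bookstore among your databases:
SHOW DATABASES;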
Setting up a database migration system
I lied. There’s still one more step before we can begin the endpoint madness. We should really set up a database migration system. You probably use a tool like git to version control your code, but how do you version control your database schema? You guessed it! (I hope.) Migrations. Using migrations we can create, delete, and update tables, and if we ever need to move backward or forward through different versions of our schema, we can. I have found Knex to be really easy to use, so let’s give it a shot.
We need to install it both locally and globally. We use the global version for its CLI capabilities and the local version to manage the project’s migrations. From the root of the project, run the following:
npm i --save knex && npm i -g knex && mkdir db && cd db
In the terminal, you should have been magically transported to the db/ folder.
If so, run the following:
knex init
A file ‘knexfile.js’ should pop up in the db/ folder. If it did, give yourself a pat on the back. When you’re done with that, open up the new file and let’s update the configuration for ‘development’.
Change it to something like:
development: {
  client: 'mysql',
  connection: {
    host     : 'localhost',
    user     : 'admin',
    password : 'password',
    // this must match the database we created; MySQL database names are
    // case-sensitive on some systems, so keep the capital B
    database : 'Bookstore'
  }
},
The user and password values should be the same as the ones from config.js. By the way, I wouldn’t recommend pushing your bookstore app to GitHub because your password will be exposed. If you really wish to make your source code publicly available, consider setting up environment variables.
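For example, here’s a minimal sketch of that idea (DB_USER and DB_PASSWORD are placeholder variable names I just made up):
development: {
  client: 'mysql',
  connection: {
    host     : 'localhost',
    // read credentials from the environment instead of hard-coding them
    user     : process.env.DB_USER,
    password : process.env.DB_PASSWORD,
    database : 'Bookstore'
  }
},
You’d then start the app with something like DB_USER=admin DB_PASSWORD=password nodemon ./bin/www.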
Let’s create our first migration. Still inside the db/ folder, run the following:
knex migrate:make create-books
A ‘migrations’ folder should have popped up, with a file inside whose name is prefixed by a bunch of numbers. PROTIP: those numbers are a timestamp of when the file was created. This convention exists to avoid file name collisions and to keep migrations ordered: if the timestamp weren’t there and you ran the same migration command again, you would suddenly have two identically-named files, and the migration manager would become very, very confused.
Creating our books table migration should now be a relatively trivial task. Make your newly created file look like this:
exports.up = async db => {
  await db.schema.createTable('books', t => {
    // increments() already produces an unsigned, auto-incrementing primary key
    t.increments('id');
    t.string('name').notNullable();
    t.string('isbn').notNullable();
  });
};
exports.down = async db => {
  await db.schema.dropTableIfExists('books');
};
The up method is called when you migrate up; in practical terms, after generating the migration file and running knex migrate:latest. The down method is called when you migrate down, that is, when you roll back the migration.
Let’s migrate up. Still within the db/ folder, run:
knex migrate:latest
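And if you ever need to undo that migration, the following runs the down method:
knex migrate:rollback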
OK, let’s start building some endpoints! The dry cleaners won’t be open all night after all. Let’s set up our books controller and, while we’re at it, a 404-response controller as well.
In the terminal, let’s go to the root of the project and run the following:
mkdir controllers && touch controllers/books-controller.js controllers/not-found-controller.js
Get your CRUD on
Alright, the time has come! If you haven’t yet done so, roll up your sleeves because we’re going to make some endpoints. Let’s get the skeleton going.
const connection = require('../config.js');
module.exports = {
all: function(req, res) {
},
create: function(req, res, next) {
},
get: function (req,res) {
},
update: function (req,res) {
},
destroy: function (req,res) {
}
};
Let’s smash these out of the park.
Change your all method to:
all: function(req, res) {
connection.query('SELECT * FROM books', (err, rows) => {
if (!err) {
res.setHeader('Content-Type', 'application/json');
res.status(200).send(JSON.stringify(
{
'result' : 'success',
'data': rows
})
);
} else {
res.status(400).send(err);
}
});
},
Change your create method to:
create: function(req, res, next) {
let response;
const name = req.body.name;
const isbn = req.body.isbn;
if (
typeof name !== 'undefined'
&& typeof isbn !== 'undefined'
) {
connection.query('INSERT INTO books (name, isbn) VALUES (?, ?)',
[name, isbn],
function(err, result) {
handleSuccessOrErrorMessage(err, result, res);
});
} else {
response = {
'result' : 'error',
'msg' : 'Please fill required details'
};
res.setHeader('Content-Type', 'application/json');
res.status(400).send(JSON.stringify(response)); // a missing field is a client error, so 400 rather than 200
}
},
Change your get method to:
get: function (req,res) {
  connection.query('SELECT * FROM books WHERE id = ? LIMIT 1', [req.params.id], (err, rows) => {
    // guard against query errors before touching rows
    if (err) {
      return res.status(400).send(err);
    }
    res.setHeader('Content-Type', 'application/json');
    res.status(200).send(JSON.stringify(
      {
        'result' : 'success',
        'data': rows[0]
      })
    );
  })
},
Change your update method to:
update: function (req,res) {
let response;
const name = req.body.name;
const isbn = req.body.isbn;
const id = req.params.id;
if (
typeof name !== 'undefined'
&& typeof isbn !== 'undefined'
) {
connection.query('UPDATE books SET name = ?, isbn = ? WHERE id = ?',
[name, isbn, id],
function(err, result) {
handleSuccessOrErrorMessage(err, result, res);
});
} else {
response = {'result' : 'error', 'msg' : 'Please fill required information'};
res.setHeader('Content-Type', 'application/json');
res.status(400).send(JSON.stringify(response));
}
},
Change your destroy method to:
destroy: function (req,res) {
connection.query('DELETE FROM books WHERE id = ?', [req.params.id], function(err, result) {
handleSuccessOrErrorMessage(err, result, res);
});
}
Let’s add the handleSuccessOrErrorMessage function to the end of the file for now.
function handleSuccessOrErrorMessage(err, result, res) {
  if (!err) {
    let response;
    if (result.affectedRows != 0) {
      response = {'result' : 'success'};
    } else {
      response = {'msg' : 'No Result Found'};
    }
    res.setHeader('Content-Type', 'application/json');
    res.status(200).send(JSON.stringify(response));
  } else {
    res.status(400).send(err);
  }
}
There you go, the full thing:
const connection = require('../config.js');
module.exports = {
all: (req, res) => {
connection.query('SELECT * from books', (err, rows) => {
if (!err) {
res.setHeader('Content-Type', 'application/json');
res.status(200).send(JSON.stringify(
{
'result' : 'success',
'data': rows
})
);
} else {
res.status(400).send(err);
}
});
},
create: (req, res, next) => {
let response;
const name = req.body.name;
const isbn = req.body.isbn;
if (
typeof name !== 'undefined'
&& typeof isbn !== 'undefined'
) {
connection.query('INSERT INTO books (name, isbn) VALUES (?, ?)',
[name, isbn],
(err, result) => {
handleSuccessOrErrorMessage(err, result, res);
});
} else {
response = {
'result' : 'error',
'msg' : 'Please fill required details'
};
res.setHeader('Content-Type', 'application/json');
res.status(400).send(JSON.stringify(response));
}
},
get: (req, res) => {
    connection.query('SELECT * FROM books WHERE id = ? LIMIT 1', [req.params.id], (err, rows) => {
      // guard against query errors before touching rows
      if (err) {
        return res.status(400).send(err);
      }
      res.setHeader('Content-Type', 'application/json');
      res.status(200).send(JSON.stringify(
        {
          'result' : 'success',
          'data': rows[0]
        })
      );
    })
  },
update: (req, res) => {
let response;
const name = req.body.name;
const isbn = req.body.isbn;
const id = req.params.id;
if (
typeof name !== 'undefined'
&& typeof isbn !== 'undefined'
) {
connection.query('UPDATE books SET name = ?, isbn = ? WHERE id = ?',
[name, isbn, id],
(err, result) => {
handleSuccessOrErrorMessage(err, result, res);
});
} else {
response = {'result' : 'error', 'msg' : 'Please fill required information'};
      res.setHeader('Content-Type', 'application/json');
      res.status(400).send(JSON.stringify(response));
}
},
destroy: (req, res) => {
connection.query('DELETE FROM books WHERE id = ?', [req.params.id], (err, result) => {
handleSuccessOrErrorMessage(err, result, res);
});
}
};
function handleSuccessOrErrorMessage(err, result, res) {
if (!err){
let response;
if (result.affectedRows != 0) {
response = {'result' : 'success'};
} else {
response = {'msg' : 'No Result Found'};
}
res.setHeader('Content-Type', 'application/json');
res.status(200).send(JSON.stringify(response));
} else {
res.status(400).send(err);
}
}
To test the endpoints, let’s give each one a route. Make your index.js file inside of routes look something like this:
const express = require('express');
const router = express.Router();
const booksController = require('../controllers/books-controller');
const notFoundController = require('../controllers/not-found-controller');
router.get('/books', booksController.all);
router.post('/books', booksController.create);
router.get('/book/:id', booksController.get);
router.put('/book/:id', booksController.update);
router.delete('/book/:id', booksController.destroy);
router.get('*', notFoundController.show);
module.exports = router;
Oh, almost forgot: the NotFound controller. One note on routes/index.js: the catch-all '*' route has to come after the others, because Express matches routes in the order they’re registered. Now add this to controllers/not-found-controller.js:
module.exports.show = (req, res, next) => {
res.setHeader('Content-Type', 'application/json');
res.status(404).send(JSON.stringify(
{
'result' : 'Not Found',
'data' : null
}
));
};
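As a quick smoke test for that controller, hit any route we didn’t define:
curl http://localhost:3000/api/not-a-real-route
You should get the Not Found JSON back.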
Let’s check each endpoint. Navigate to localhost:3000/api/books. You should get a success response, though the data array will be empty until we add a book. By the way, I use the Google Chrome extension JSON Viewer to make the JSON look all nice and readable.
Let’s submit a POST request to /api/books to create a book. If you have Postman or similar software, feel free to use it; that’s what I would probably use. Otherwise, sending a curl request from the terminal is simple enough as well. Run this in your terminal:
curl -X POST -H "Content-Type: application/json" -d '{"name":"Lord of the Fries","isbn":"som3isbnvalu3"}' http://localhost:3000/api/books
Hopefully you get a response in the terminal saying {"result":"success"}.
Go back to http://localhost:3000/api/books and you should now see some data. Pretty neat right?
Let’s check our Get function. Go to http://localhost:3000/api/book/1. Hopefully you see data here as well.
Now, you may have noticed we misspelled William Golding’s canonical work. We also didn’t put in the correct ISBN, so let’s fix that using our update endpoint.
curl -X PUT -H "Content-Type: application/json" -d '{"name":"Lord of the Flies","isbn":"978-0-39-950148-7"}' http://localhost:3000/api/book/1
I hope you got a success message. If so, going back to http://localhost:3000/api/books should now show you our corrected JSON object.
Don’t grow too attached to the Lord of the Flies record. We still have to try out our destroy function. Cover your eyes, and run the following:
curl -X DELETE -H "Content-Type: application/json" http://localhost:3000/api/book/1
Yup, it’s gone. Hopefully. If the terminal returned another success message, congrats! You now have the basic CRUD features working.
What’s next? Well, you can either call it quits or tag along for one last wild ride. We’re going to take a quick look at GraphQL and incorporate it into our API development workflow.
So what is GraphQL?
Well, it’s a query language that lets clients ask for exactly the pieces of your API data they need, and a runtime that resolves those queries and spits out the result. This is by no means a revolutionary concept, but what is revolutionary is the simplicity of the interface. I’m going to show you perhaps the simplest application of it, to demonstrate how to use it and to start a conversation about how it might be useful.
Step 1: We need some data. Send a couple of POST requests to our API. I’m going to go with:
curl -X POST -H "Content-Type: application/json" -d '{"name":"The Real Fake Book","isbn":"1234567891011"}' http://localhost:3000/api/books
and
curl -X POST -H "Content-Type: application/json" -d '{"name":"Paradise Lost (and Found)","isbn":"2345678910111"}' http://localhost:3000/api/books
Now think for a moment. What parts of the data do you actually want to render? You definitely want the name, you most likely want the ISBN value, but do you really need that ID value? Though the database needs it for indexing, someone looking for a book at the bookstore can’t do much with that number. So why bother sending that data to the client side? Let’s trim the fat. GraphQL to the rescue!
I’ll be honest. It took me a bit of time to understand how GraphQL worked. I went through a lot of tutorials that had code that didn’t quite work, and I didn’t have the know-how to fix the errors. It suddenly clicked, however, with this simple understanding:
To use GraphQL, you need to do three things:
- Give GraphQL a schema of the data.
- Tell GraphQL what parts of the data you want.
- Execute the request.
Sound simple enough? Let’s roll.
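Before we wire that into our endpoints, here’s a minimal, self-contained sketch of those three steps (it assumes the graphql package is installed, which we’ll do in a moment; the hello field is just a toy example):
const { graphql, buildSchema } = require('graphql');

// 1. Give GraphQL a schema of the data.
const schema = buildSchema(`
  type Query { hello: String }
`);

// 2. Tell GraphQL what parts of the data you want.
const query = '{ hello }';

// 3. Execute the request, supplying the actual data as the root value.
graphql(schema, query, { hello: 'world' })
  .then(response => console.log(response.data)); // { hello: 'world' }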
From the root of the project, run the following:
mkdir schemas_and_queries && touch schemas_and_queries/book.js schemas_and_queries/books.js
Copy this code into schemas_and_queries/books.js and then let’s take a look at it:
module.exports = (buildSchema) => {
return {
booksSchema: buildSchema(`
type Book {
id: ID
name: String
isbn: String
}
type Query {
books(id: ID): [Book]
}`
),
booksQuery: `
{
books {
id
name
isbn
}
}`
}
};
FYI, we’re using that funky code at the top
module.exports = (buildSchema) => {
to take advantage of JavaScript’s closure capabilities, so that we only have to import the buildSchema function from GraphQL once. You’ll see what I mean soon enough.
Looking at the schema, we define the Book and Query types. name and isbn are both string types while id is the special ID type.
Looking at the Query type definition, you might notice its return type is [Book]. This notation indicates an array of the Book type. In TypeScript this would look something like Book[].
Looking at booksQuery, what we are doing here is telling GraphQL what parts of the data we want returned. We are saying we want the id, name, and isbn of each book.
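To make that concrete: once everything is wired up, running booksQuery against this schema will resolve to an object shaped roughly like this (values depending on what’s in your table; note that the ID type serializes as a string):
{
  "data": {
    "books": [
      { "id": "1", "name": "Lord of the Flies", "isbn": "978-0-39-950148-7" }
    ]
  }
}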
Go to schemas_and_queries/book.js and copy-paste this code:
module.exports = (buildSchema) => {
return {
bookSchema: buildSchema(`
type Book {
id: ID
name: String
isbn: String
}
type Query {
book(id: ID): Book
}`
),
bookQuery: `
{
book {
id
name
isbn
}
}`
}
};
Take a quick look at it. It’s fairly similar to the books.js schema. Let’s install graphql. Run the following:
npm i --save graphql
Go to the controllers/books-controller.js file. Let’s add some imports. Add this to the top of the file:
const { graphql, buildSchema } = require('graphql');
const { bookSchema, bookQuery } = require('../schemas_and_queries/book')(buildSchema);
const { booksSchema, booksQuery } = require('../schemas_and_queries/books')(buildSchema);
There are two things going on here. First, we’re using destructuring assignments. I hope you’re familiar with them. Once you start using them, you won’t want to go back. The second thing is passing buildSchema to our schemas_and_queries files. Technically we could require ‘buildSchema’ in each of the files, but I think this is a bit cleaner.
Let’s modify our .all and .get functions. The .all function should now be:
all: (req, res) => {
connection.query('SELECT * from books', async (err, rows) => {
if (!err) {
const response = await graphql(booksSchema, booksQuery, {books: rows});
res.setHeader('Content-Type', 'application/json');
res.status(200).send(JSON.stringify(
{
'result' : 'success',
'data': response.data
})
);
} else {
res.status(400).send(err);
}
});
},
The .get method should become:
get: (req, res) => {
    connection.query('SELECT * FROM books WHERE id = ? LIMIT 1', [req.params.id], async (err, rows) => {
      // same guard as before: bail out on query errors
      if (err) {
        return res.status(400).send(err);
      }
      const response = await graphql(bookSchema, bookQuery, {book: rows[0]});
      res.setHeader('Content-Type', 'application/json');
      res.status(200).send(JSON.stringify(
        {
          'result' : 'success',
          'data': response.data
        })
      );
    })
  },
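One aside: the schema declares book(id: ID), but we never actually pass the id through GraphQL, because the SQL query already did the filtering. If you wanted GraphQL itself to handle the argument, one approach (just a sketch, not something this tutorial needs) is to make the root value field a function; graphql-js then calls it with the query’s arguments:
const response = await graphql(
  bookSchema,
  `{ book(id: "1") { name isbn } }`,
  // a root field that is a function receives the field arguments
  { book: (args) => rows.find(row => String(row.id) === args.id) }
);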
OK, check out http://localhost:3000/api/books and http://localhost:3000/api/book/1
Hopefully they’re still rendering. If they are, you may be wondering why we went through all that effort just to get the same data. Well, check this out.
Go to schemas_and_queries/book.js, and change bookQuery to:
bookQuery: `
{
book {
name
isbn
}
}`
and then in schemas_and_queries/books.js, change booksQuery to:
booksQuery: `
{
books {
name
isbn
}
}`
Check out those API endpoints again. Notice anything missing? Perhaps the id values? Well, there you have it: we just built the start of a scalable, easy-to-manage Node API that takes advantage of a migration system. Not too bad, right?
Catch you later, and look out for a part 2 in which we add authors or something.