Ricardo Borges

Personal blog

Learning GraphQL by building a chat application - Part 1

When I decided to learn GraphQL, I knew that the best way to do it was to put its concepts into practice, so I figured that developing a chat application would be a good way to achieve that goal, since it would let me exercise all of GraphQL's features. That's what this post is about: learning some GraphQL concepts by building a chat application.

Our application will be split into two parts, back-end and front-end, and so will these posts. In this first post we will develop the server side; to do so we'll use NodeJS, Apollo Server, and of course GraphQL. We will also need a database and a query builder module: I used Knex and MySQL.

Before we continue, all the code is in this repository.

Initial setup

Ok, first things first, let's start by creating the project and installing its dependencies.

Inside the project folder, run:

npm init

And:

npm i apollo-server bcrypt dotenv graphql jsonwebtoken knex lodash mysql

npm i --save-dev @babel/cli @babel/core @babel/node @babel/plugin-transform-runtime @babel/preset-env babel-jest jest nodemon standard

In the scripts section of package.json, add the following commands:

"start": "nodemon --exec babel-node ./src/index.js",
"test": "jest",
"test:watch": "jest --watch",
"migrate": "knex migrate:latest",
"unmigrate": "knex migrate:rollback",
"seed": "knex seed:run",
"lint": "standard",
"lint:fix": "standard --fix"

In the root folder create a .babelrc file:

{
  "presets": [
    "@babel/preset-env"
  ],
  "env": {
    "test": {
      "plugins": [
        "@babel/plugin-transform-runtime"
      ]
    }
  }
}

Also in the root folder, create a .env file; this file contains the project's environment variables:

NODE_ENV=development

DB_HOST=localhost
DB_USER=root
DB_PASSWORD=toor
DB_NAME=chat

SECRET=secret

The first variable is the environment; let's leave it as development for now. The next four variables are the database host, user, password, and name; for these you can set the values according to your database configuration. The last one is the secret value that we'll use later for authentication.

Feel free to use any relational database. I used MySQL; if you want to use another, like PostgreSQL, you'll just have to do a different setup in knexfile.js.

Database and models

In this section we'll configure our database and implement our models. In the root folder, create a knexfile.js file; it contains the database configuration for the development, test, and production environments:

require('dotenv').config()

module.exports = {

  development: {
    client: 'mysql',
    connection: {
      host: process.env.DB_HOST,
      user: process.env.DB_USER,
      password: process.env.DB_PASSWORD,
      database: process.env.DB_NAME
    },
    migrations: {
      directory: './src/data/migrations'
    },
    seeds: { directory: './src/data/seeds' }
  },

  test: {
    client: 'mysql',
    connection: {
      host: process.env.DB_HOST,
      user: process.env.DB_USER,
      password: process.env.DB_PASSWORD,
      database: process.env.DB_NAME
    },
    migrations: {
      directory: './src/data/migrations'
    },
    seeds: { directory: './src/data/seeds' }
  },

  production: {
    client: 'mysql',
    connection: {
      host: process.env.DB_HOST,
      user: process.env.DB_USER,
      password: process.env.DB_PASSWORD,
      database: process.env.DB_NAME
    },
    migrations: {
      directory: './src/data/migrations'
    },
    seeds: { directory: './src/data/seeds' }
  }
}

In src/data/ we can store our database migrations, seeds, and a file that exports a database object with the configuration from knexfile.js:

// src/data/db.js

import knex from 'knex'
import knexfile from '../../knexfile'

const env = process.env.NODE_ENV || 'development'
const configs = knexfile[env]
const database = knex(configs)

export default database

Now let's create our migrations. Run:

knex migrate:make user

knex migrate:make message

The generated files are in the directory configured in knexfile.js; they should have the following contents:

// src/data/migrations/20200107121031_user.js

exports.up = (knex) =>
  knex.schema.createTable('user', table => {
    table.bigIncrements('id').unsigned()
    table.string('name').notNullable()
    table.string('email').notNullable()
    table.string('password').notNullable()
  })

exports.down = (knex) => knex.schema.dropTableIfExists('user')

// src/data/migrations/20200107121034_message.js

exports.up = (knex) =>
  knex.schema.createTable('message', table => {
    table.bigIncrements('id').unsigned()
    table.string('message').notNullable()
    table.bigInteger('senderId').unsigned().references('id').inTable('user')
    table.bigInteger('receiverId').unsigned().references('id').inTable('user')
  })

exports.down = (knex) => knex.schema.dropTableIfExists('message')

Now we can run our migrations. The following command will create the user and message tables in the database:

npm run migrate

Next, we create our models. Let's start with the Model class; it contains common methods used by the other models that will extend it:

// src/model/Model.js

export default class Model {
  constructor (database, table) {
    this.database = database
    this.table = table
  }

  all () {
    return this.database(this.table).select()
  }

  find (conditions) {
    return this.database(this.table).where(conditions).select()
  }

  findOne (conditions) {
    return this.database(this.table).where(conditions).first()
  }

  findById (id) {
    return this.database(this.table).where({ id }).select().first()
  }

  insert (values) {
    return this.database(this.table).insert(values)
  }
}

Then we create the User and Message models. Notice that in the User model there is a method to generate a token using the SECRET environment variable that we defined before; there are also methods to find a user by a token and to retrieve a user's messages.

// src/model/User.js

import Model from './Model'
import bcrypt from 'bcrypt'
import jwt from 'jsonwebtoken'

export class User extends Model {
  constructor (database) {
    super(database, 'user')
  }

  async hash (password) {
    return bcrypt.hash(password, 10)
  }

  async compare (hash, password) {
    return bcrypt.compare(password, hash)
  }

  generateToken (user) {
    /* knex returns a RowDataPacket object and the jwt.sign function
      expects a plain object; stringifying and parsing it back does the trick */
    return jwt.sign(
      JSON.parse(JSON.stringify(user)),
      process.env.SECRET,
      {
        expiresIn: 86400
      }
    )
  }

  async getUserByToken (token) {
    try {
      const decoded = jwt.verify(token, process.env.SECRET)
      return decoded
    } catch (error) {
      console.log(error)
      return null
    }
  }

  async getMessages (senderId, lastId) {
    return this.database('message')
      .where('id', '>', lastId)
      .andWhere(q => q.where({ senderId: senderId })
        .orWhere({ receiverId: senderId }))
      .limit(10)
  }
}
// src/model/Message.js

import Model from './Model'

export class Message extends Model {
  constructor (database) {
    super(database, 'message')
  }

  async getConversation (senderId, receiverId, lastId) {
    return this.database('message')
      .where('id', '>', lastId)
      .andWhere({ senderId })
      .andWhere({ receiverId })
      .limit(10)
  }
}
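The JSON.parse(JSON.stringify(...)) trick used in generateToken deserves a quick note: knex rows are class instances, and jwt.sign expects a plain object, so serializing and parsing strips the prototype. A small stdlib-only sketch of the idea (the Row class here is just a stand-in for knex's RowDataPacket):

```javascript
// A class instance, similar in spirit to knex's RowDataPacket:
class Row {
  constructor (fields) { Object.assign(this, fields) }
}

const row = new Row({ id: 1, name: 'Alice' })

// Stringify-and-parse strips the prototype, leaving a plain object
// with the same enumerable fields:
const plain = JSON.parse(JSON.stringify(row))

console.log(row.constructor.name) // Row
console.log(plain.constructor.name) // Object
```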

Now we have to export all those models. For the sake of organization, I've created an index.js file in src/model that exports an object, models, containing all our models.

// src/model/index.js

import database from '../data/db'
import { User } from '../model/User'
import { Message } from '../model/Message'

const user = new User(database)
const message = new Message(database)

const models = {
  user,
  message
}

export default models

Schema

Finally, we'll deal with GraphQL. Let's start with the schema. But what is the schema? The schema uses the GraphQL schema language to define the set of types that our application will provide; a type can be, among others, a query, a mutation, a subscription, an object type, or a scalar type.

The query type defines the possible queries that our application will provide, for example, fetching all messages.

The mutation type is like queries, but allows data to be modified, for example, sending a message.

Subscriptions allow the server to send data to a client when an event happens, and are usually implemented with WebSockets. For example, in our chat application, when a client sends a message, the receiving client must receive that message without requesting it from the server.

An object type defines an object that our application allows to be fetched, like a user or a message.

And scalar types? Well, object types have fields, and these fields must have a value of some type, like string or int; those types are scalar types. The possible scalar types are Int, String, Float, Boolean, and ID. Some GraphQL implementations also allow custom scalar types. Using ! means that a field is non-nullable, i.e., our service promises to return a non-nullable value. If we want to specify that our service will return an array, we use [], for example, [String]!.
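For example, a hypothetical object type combining these scalars and modifiers might look like this (illustrative only, not part of our application's schema):

```graphql
type Profile {
  id: ID!           # non-nullable unique identifier
  name: String!     # non-nullable string
  age: Int          # nullable integer
  tags: [String!]!  # non-nullable array of non-nullable strings
}
```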

Our GraphQL schema could be defined entirely in a single file, but as the application grows that file would become a mess, so I decided to split the schema into files that represent entities. We'll have one file to define the user schema and another to define the message schema, plus a file to bring the whole schema together. Let's start with that file:

// src/schema/index.js

import { merge } from 'lodash'
import { gql, makeExecutableSchema } from 'apollo-server'
import {
  typeDef as User,
  resolvers as userResolvers
} from './user'

import {
  typeDef as Message,
  resolvers as messageResolvers
} from './message'

const Query = gql`
  type Query {
    _empty: String
  }
  type Mutation {
    _empty: String
  }
  type Subscription {
    _empty: String
  }
`
export const schema = makeExecutableSchema({
  typeDefs: [Query, User, Message],
  resolvers: merge(userResolvers, messageResolvers)
})

Next, we create the user and message schemas. You will notice that each file has an object called resolvers; we will talk about those in a bit. Also notice that when we define the schema in the typeDef constant, we are extending the Query, Mutation, and Subscription types; we have to do it this way because a GraphQL schema must have only one of each of these types.

// src/schema/message.js

import { gql } from 'apollo-server'

export const subscriptionEnum = Object.freeze({
  MESSAGE_SENT: 'MESSAGE_SENT'
})

export const typeDef = gql`
  extend type Query {
    messages(cursor: String!): [Message!]!
    conversation(cursor: String!, receiverId: ID!): [Message!]!
  }
  extend type Subscription {
    messageSent: Message
  }
  extend type Mutation {
    sendMessage(sendMessageInput: SendMessageInput!): Message!
  }
  type Message {
    id: ID!
    message: String!
    sender: User!
    receiver: User!
  }
  input SendMessageInput {
    message: String!
    receiverId: ID!
  }
`

export const resolvers = {
  Query: {
    messages: async (parent, args, { models, user }, info) => {
      if (!user) { throw new Error('You must be logged in') }

      const { cursor } = args
      const users = await models.user.all()
      const messages = await models.user.getMessages(user.id, cursor)

      const filteredMessages = messages.map(message => {
        const sender = users.find(user => user.id === message.senderId)
        const receiver = users.find(user => user.id === message.receiverId)
        return { ...message, sender, receiver }
      })

      return filteredMessages
    },

    conversation: async (parent, args, { models, user }, info) => {
      if (!user) { throw new Error('You must be logged in') }

      const { cursor, receiverId } = args
      const users = await models.user.all()
      const messages = await models.message.getConversation(user.id, receiverId, cursor)

      const filteredMessages = messages.map(message => {
        const sender = users.find(user => user.id === message.senderId)
        const receiver = users.find(user => user.id === message.receiverId)
        return { ...message, sender, receiver }
      })

      return filteredMessages
    }
  },

  Subscription: {
    messageSent: {
      subscribe: (parent, args, { pubsub, user }, info) => {
        if (!user) { throw new Error('You must be logged in') }

        return pubsub.asyncIterator([subscriptionEnum.MESSAGE_SENT])
      }
    }
  },

  Mutation: {
    sendMessage: async (parent, args, { models, user, pubsub }, info) => {
      if (!user) { throw new Error('You must be logged in') }

      const { message, receiverId } = args.sendMessageInput

      const receiver = await models.user.findById(receiverId)

      if (!receiver) { throw new Error('receiver not found') }

      const result = await models.message.insert([{
        message,
        senderId: user.id,
        receiverId
      }])

      const newMessage = {
        id: result[0],
        message,
        receiver,
        sender: user
      }

      pubsub.publish(subscriptionEnum.MESSAGE_SENT, { messageSent: newMessage })

      return newMessage
    }
  }
}
// src/schema/user.js

import { gql } from 'apollo-server'

export const typeDef = gql`
  extend type Query {
    users: [User!]!
  }
  extend type Mutation {
    createUser(createUserInput: CreateUserInput!): User!
    login(email: String!, password: String!): String!
  }
  type User {
    id: ID!
    name: String!
    email: String!
    password: String!
  }
  input CreateUserInput {
    name: String!
    email: String!
    password: String!
  }
`

export const resolvers = {
  Query: {
    users: async (parent, args, { models, user }, info) => {
      if (!user) { throw new Error('You must be logged in') }

      const users = await models.user.all()
      return users
    }
  },

  Mutation: {
    createUser: async (parent, args, { models }, info) => {
      const { name, email, password } = args.createUserInput
      const user = await models.user.findOne({ email })

      if (user) { throw new Error('Email already taken') }

      const hash = await models.user.hash(password)

      const result = await models.user.insert([{
        name,
        email,
        password: hash
      }])

      return {
        id: result[0],
        password: hash,
        name,
        email
      }
    },

    login: async (parent, args, { models }, info) => {
      const { email, password } = args

      const user = await models.user.findOne({ email })

      if (!user) { throw new Error('Invalid credentials') }

      if (!await models.user.compare(user.password, password)) { throw new Error('Invalid credentials') }

      return models.user.generateToken(user)
    }
  }
}

Each file has the schema defined in the typeDef constant, and the resolvers for that schema are in the resolvers object.

So what are those resolvers objects? Resolvers contain the logic that is executed when a query, mutation, or subscription defined in our application schema is called. They are functions that accept the following arguments:

parent: the object that contains the result returned from the resolver on the parent field.

args: the arguments passed to the query; for example, the login mutation receives email and password arguments.

context: an object shared by all resolvers; in our application it contains the models object that we defined before and the logged-in user.

info: contains information about the execution state of the query.

So if you want to define the resolvers for the Query type, put them inside the Query object; for the Mutation type, put them inside the Mutation object; and so on.
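As a rough illustration of the shape of these functions, here is how a resolver like users might be invoked. The fakeContext below is a hypothetical stand-in for the context the server builds per request; it is not how Apollo actually wires things internally:

```javascript
// A resolver is just a function of (parent, args, context, info).
const resolvers = {
  Query: {
    users: async (parent, args, context, info) => {
      if (!context.user) { throw new Error('You must be logged in') }
      return context.models.user.all()
    }
  }
}

// Stand-ins for the models and logged-in user the server would provide:
const fakeContext = {
  user: { id: 1, name: 'Alice' },
  models: { user: { all: async () => [{ id: 1, name: 'Alice' }] } }
}

// Simulating how the server would call the resolver for a users query:
resolvers.Query.users(null, {}, fakeContext, null)
  .then(users => console.log(users.length)) // 1
```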

About pagination: I chose cursor-based pagination. As you can see in the messages query in the message schema, that query accepts a cursor as an argument (yes, we can pass arguments to GraphQL queries), and the cursor value is the ID of the last message returned.
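The idea can be sketched with an in-memory array, as a simplification of what getMessages does against the database (the page size of 10 matches the limit in our models):

```javascript
// Cursor-based pagination: the cursor is the id of the last row already
// returned, and each page contains up to 10 rows with a greater id.
const messages = Array.from({ length: 25 }, (_, i) => ({ id: i + 1, message: `msg ${i + 1}` }))

function getMessages (cursor) {
  return messages.filter(m => m.id > cursor).slice(0, 10)
}

const firstPage = getMessages(0) // ids 1..10
const nextCursor = firstPage[firstPage.length - 1].id // 10
const secondPage = getMessages(nextCursor) // ids 11..20

console.log(firstPage.length, secondPage[0].id) // 10 11
```

Unlike offset-based pagination, this stays consistent when new rows are inserted between page fetches, which suits a chat feed well.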

Now we have one last thing to do, and that is to define the application entry point (src/index.js):

// src/index.js

import { ApolloServer, PubSub } from 'apollo-server'

import { schema } from './schema'
import models from './model/index'

const pubsub = new PubSub()

const getUser = async (req, connection) => {
  let user = null

  if (req && req.headers.authorization) {
    const token = req.headers.authorization.replace('Bearer ', '')
    user = await models.user.getUserByToken(token)
  } else if (connection && connection.context.Authorization) {
    const token = connection.context.Authorization.replace('Bearer ', '')
    user = await models.user.getUserByToken(token)
  }

  return user
}

const server = new ApolloServer({
  schema,
  context: async ({ req, res, connection }) => {
    return {
      models,
      pubsub,
      user: await getUser(req, connection)
    }
  }
})

server.listen().then(({ url }) => {
  console.log(`🚀  Server ready at ${url}`)
})

Here we create an instance of ApolloServer with the schema we defined before. In the context option we set which resources will be available to the resolvers through the context argument; before returning these resources, we check whether there is a logged-in user, using the token received with the request. If you use Express, you can put the logic of fetching a user by a token in a middleware, like in this example.

The server will run at the default URL http://localhost:4000/, where you can test the application by making some queries in the GraphQL Playground; you can learn more about it here.
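For example, you could create a user and then log in with the createUser and login mutations defined in our user schema (the name, email, and password values here are just placeholders):

```graphql
mutation {
  createUser(createUserInput: {
    name: "Alice"
    email: "alice@email.com"
    password: "123456"
  }) {
    id
    name
    email
  }
}

mutation {
  login(email: "alice@email.com", password: "123456")
}
```

The login mutation returns a token; to call authenticated queries like users or messages, set it in the Playground's HTTP headers panel as `{ "Authorization": "Bearer <token>" }`, matching the header check we implemented in getUser.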

In part two we will develop the front-end using Apollo Client and ReactJS.