Advanced GraphQL — Subscriptions

Originally published on Medium - 29th October 2019

If you read my previous posts on GraphQL, you’ll know that I’m a fan. I left it at the basics, however. So, starting with this post, we’ll dive into some of the more advanced parts of GraphQL. And when I say advanced, I don’t mean tricky; it’s more that most introductory GraphQL tutorials don’t cover these things.

One limitation of REST is that it is request based: you only get data when you ask for it. A lot of people assume the same is true for GraphQL, but the specification does support subscriptions, where the server pushes responses to the client. The spec doesn’t mandate a transport, and most implementations use HTTP(S) for queries and mutations. Unfortunately, HTTP is a request-based protocol and not well suited to subscriptions, so implementations typically get around this by using websockets, which keep a connection open and push data down that pipe as needed.

So how does this work with GraphQL, given that in our requests we specify the fields we want? It works in much the same way: when you make your subscription request, you specify the fields you want pushed to you.
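
For example, the subscription we'll define for the chat app later in this post selects exactly the fields it wants pushed for each new post:

subscription {
  postAdded {
    author
    comment
  }
}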

In the following, I’m using both Apollo Server and Apollo Client. In my last tutorial I used Go, and that’s still valid and similar to Apollo (if you use a library such as gqlgen). I'll also be using React on the frontend with Apollo's latest hooks features, which have been added since my last GraphQL post. If you haven't read that post, you won't miss much, but if you have, this will give you an idea of how to move from Apollo's render-props approach to hooks.

We’ll start with a very simple chat app. Clients connect and get a history of all the existing chats (using a basic GraphQL query), then get updates via a subscription. Initially we’ll keep the history in memory, then iterate on this to hook into Redis’s Pub/Sub mechanism. We’re only using Redis as an example here; there are plenty of libraries that hook into Kafka, PostgreSQL and so on with the same interface.

Let’s dig into the server. Open a command prompt and create a new project

mkdir graphql-chat-server
cd graphql-chat-server
yarn init -y
yarn add apollo-server graphql

Next, fire up your editor of choice and create an index.js

const {
  ApolloServer,
  gql,
  PubSub,
} = require('apollo-server');
 
// The GraphQL schema
const typeDefs = gql`
  type Subscription {
    postAdded: Post
  }
 
  type Query {
    posts: [Post]
  }
 
  type Mutation {
    addPost(author: String, comment: String): Post
  }
 
  type Post {
    author: String
    comment: String
  }
`;
 
// Publish/Subscribe listener
const pubsub = new PubSub();
 
// Our in memory store for posts
const posts = [];
const addPost = item => {
  posts.push(item);
  return item;
};
 
const POST_ADDED = 'POST_ADDED';
 
const resolvers = {
  Subscription: {
    postAdded: {
      // Additional event labels can be passed to asyncIterator creation
      subscribe: () => pubsub.asyncIterator([POST_ADDED]),
    },
  },
  Query: {
    posts(root, args, context) {
      return posts;
    },
  },
  Mutation: {
    addPost(root, args, context) {
      pubsub.publish(POST_ADDED, {
        postAdded: args
      });
      return addPost(args);
    },
  },
};
 
// The actual server
const server = new ApolloServer({
  typeDefs,
  resolvers,
});
 
server.listen().then(({
  url,
  subscriptionsUrl
}) => {
  console.log(`🚀 Server ready at ${url}`);
  console.log(`🚀 Subscriptions ready at ${subscriptionsUrl}`);
});

The fact that we have a Subscription type in our GraphQL schema causes the ApolloServer class to create a websocket server alongside our regular HTTP server.
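
If you want to hook into or customise that websocket server, Apollo Server 2 also accepts a subscriptions option in the constructor. The following is just an illustrative sketch (the chat server above relies purely on the defaults):

const server = new ApolloServer({
  typeDefs,
  resolvers,
  subscriptions: {
    path: '/graphql', // the default path for the websocket endpoint
    onConnect: () => console.log('Client connected to subscriptions'),
    onDisconnect: () => console.log('Client disconnected'),
  },
});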

You’ll notice the pubsub.publish call within the Mutation resolver; this is what actually notifies all subscribed clients that something has changed.

This server is simple to run; just type:

node index.js
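
With the server running, you can try the subscription out in GraphQL Playground, which Apollo Server 2 serves at the URL printed to the console. Open two tabs: in the first, run the subscription below and leave it listening; in the second, run the mutation (the author and comment values are just examples) and watch the first tab receive the new post.

subscription {
  postAdded {
    author
    comment
  }
}

mutation {
  addPost(author: "tester", comment: "hello from the playground") {
    author
    comment
  }
}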

On the client side, we’ll create a simple React app. Create a new project:

create-react-app graphql-chat-client
cd graphql-chat-client
yarn add @apollo/react-hooks apollo-cache-inmemory apollo-client apollo-link apollo-link-http apollo-link-ws apollo-utilities graphql graphql-tag subscriptions-transport-ws

Once again, fire up your editor. Let’s create a couple of components for displaying our chat. Firstly, AddChat.js

import React from "react";
import gql from "graphql-tag";
import { useMutation } from "@apollo/react-hooks";
 
const ADD_POST = gql`
  mutation AddPost($author: String!, $comment: String!) {
    addPost(author: $author, comment: $comment) {
      author
      comment
    }
  }
`;
 
export default (props) => {
  let name, comment;
  const [addPost] = useMutation(ADD_POST);
  return (
    <div className="AddChat">
      <form
        onSubmit={e => {
          e.preventDefault();
          addPost({
            variables: { author: name.value, comment: comment.value }
          });
          comment.value = "";
        }}
      >
        <label htmlFor="name">Name:</label>
        <input
          type="text"
          name="name"
          id="name"
          ref={node => {
            name = node;
          }}
        />
        <br />
        <label htmlFor="comment">Comment:</label>
        <input
          type="text"
          name="comment"
          id="comment"
          ref={node => {
            comment = node;
          }}
        />
        <br />
        <input type="submit" value="Send" />
      </form>
    </div>
  );
};

This is a simple component with text boxes for the name and comment and a ‘Send’ button. Generally I would break this up one level further, with a component that pairs a label with its text field, but I’ve kept it as one component to keep the post simple.

Beyond that, we have the ADD_POST GraphQL document: a basic mutation which sends the author and comment and receives the same fields back.

We have our functional component with a basic hook inside for calling the mutation, and the form’s submit handler calls it with the appropriate variables. As a side note, when doing this sort of work it’s preferable to handle the form’s submit event rather than the button’s click event: you get submission when the user hits ‘enter’ for free, without having to add key listeners and the like.

The next component is the chat window itself, ChatWindow.js

import React, { useEffect } from "react";
import gql from "graphql-tag";
import { useQuery } from "@apollo/react-hooks";
 
const INITIAL_CHAT = gql`
  query {
    posts {
      author
      comment
    }
  }
`;
 
const SUBSCRIPTION = gql`
  subscription {
    postAdded {
      author
      comment
    }
  }
`;
 
export default () => {
  const { subscribeToMore, loading, error, data } = useQuery(INITIAL_CHAT);
 
  useEffect(
    () =>
      subscribeToMore({
        document: SUBSCRIPTION,
        updateQuery: (prev, { subscriptionData }) => {
          if (!subscriptionData.data) return prev;
          const newFeedItem = subscriptionData.data.postAdded;
          console.log(newFeedItem);
          return {
            posts: [...prev.posts, newFeedItem]
          };
        }
      }),
    [subscribeToMore]
  );
 
  if (loading) return <p>Loading...</p>;
  if (error) return <p>Error</p>;
 
  return (
    <div className="ChatWindow">
      {data.posts.map((val, idx) => (
        <div key={idx}>{`${val.author}:${val.comment}`}</div>
      ))}
    </div>
  );
};

Once again, a fairly simple component. The JSX itself just iterates over all posts and creates a div for each with the string inside it. In a real world app, you’d probably make this a bit nicer by splitting the author from the comment.

We have two GraphQL documents here: INITIAL_CHAT, the initial query which gets all the saved chat history, and SUBSCRIPTION, which we use to set up our websocket connection.

The useQuery hook for our initial query returns a function, subscribeToMore, which both makes the subscription call to the server and sets up a listener for when more data arrives.

We use the React hook ‘useEffect’ to create our connection. This will only be called when the component mounts (well, technically it’ll also be called if subscribeToMore changes, but this should never happen).

There are two important parameters here: document, which is our subscription GraphQL query, and updateQuery, which tells Apollo how to merge each pushed result into the local cache (and therefore into the data our query hook returns).
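
As an aside, if you only needed to display posts as they arrive, and didn't care about merging them into the cached result of an existing query, @apollo/react-hooks also provides a useSubscription hook. A rough sketch of how that might look (this LatestPost component is hypothetical and not part of the app we're building):

import React from "react";
import gql from "graphql-tag";
import { useSubscription } from "@apollo/react-hooks";

const POSTS_SUBSCRIPTION = gql`
  subscription {
    postAdded {
      author
      comment
    }
  }
`;

// Renders only the most recent post pushed over the websocket.
const LatestPost = () => {
  const { data, loading } = useSubscription(POSTS_SUBSCRIPTION);
  if (loading || !data) return <p>Waiting for posts...</p>;
  return <div>{`${data.postAdded.author}:${data.postAdded.comment}`}</div>;
};

export default LatestPost;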

Finally, let’s modify the App.js file. Add the following to the imports

import ChatWindow from './ChatWindow';
import AddChat from './AddChat';
import { WebSocketLink } from 'apollo-link-ws';
import { split } from 'apollo-link';
import { HttpLink } from 'apollo-link-http';
import { getMainDefinition } from 'apollo-utilities';
import { ApolloClient } from 'apollo-client';
import { ApolloProvider } from '@apollo/react-hooks';
import { InMemoryCache } from 'apollo-cache-inmemory';

Next, add the following above the component

const httpLink = new HttpLink({
  uri: 'http://localhost:4000/graphql'
});
 
const wsLink = new WebSocketLink({
  uri: `ws://localhost:4000/graphql`,
  options: {
    reconnect: true
  }
});
 
const link = split(
  // split based on operation type
  ({ query }) => {
    const definition = getMainDefinition(query);
    return (
      definition.kind === 'OperationDefinition' &&
      definition.operation === 'subscription'
    );
  },
  wsLink,
  httpLink,
);
 
const client = new ApolloClient({
  link,
  cache: new InMemoryCache()
});

This creates our client. Unfortunately, Apollo Boost can’t be used here because we need to build a split link that routes subscriptions over the websocket and everything else over HTTP. You’ll notice that both links connect to the same port on the server; only the protocol differs.

Replace the existing App component with the following

function App() {
  return (
    <div className="App">
      <ApolloProvider client={client}>
        <ChatWindow /><br />
        <AddChat />
      </ApolloProvider>
    </div>
  );
}

Now modify the App.css and add the following

.ChatWindow {
  height: 800px;
  width: 100%;
}

And as simple as that, we’re done. Fire up your client from the command line with the following.

yarn start

For some real fun, open a second browser, point it at the same URL (http://localhost:3000 by default for create-react-app) and enjoy chatting to yourself.

So that’s all well and good, but how about we persist the data on the server? Even better, we’ll use Redis’s built-in Pub/Sub mechanism, meaning we could publish updates from elsewhere (possibly another app entirely) and have them pushed to all the connected clients.

We’ll also put this into a docker container so we can stand up Redis and the Chat Server at the same time.

Start by updating the server dependencies

yarn add async-redis graphql-redis-subscriptions nodemon redis

Add or update the scripts section of your package.json

"scripts": {
  "start": "./node_modules/.bin/nodemon index.js"
}

Next, modify your index.js requires with

const {
  ApolloServer,
  gql,
} = require('apollo-server');
const redis = require("redis");
const {
  RedisPubSub
} = require("graphql-redis-subscriptions");
const {
  promisify
} = require('util');

We have removed PubSub from the apollo-server imports and brought in the Redis-backed version instead. We’ve also brought in the Redis client and promisify from Node’s util module so we can use async calls rather than callbacks.

Next, we set up our Redis client, promisify the functions we’ll be using, and create our Pub/Sub connection.

console.log(`Connecting to Redis: ${process.env.redis}`);
const store = redis.createClient(6379, process.env.redis);
const lRangeRedis = promisify(store.lrange).bind(store);
const rPushRedis = promisify(store.rpush).bind(store);
const pubsub = new RedisPubSub({
  publisher: redis.createClient(6379, process.env.redis),
  subscriber: redis.createClient(6379, process.env.redis)
});

We now need to update our addPost function to use Redis

const addPost = async item => {
  await rPushRedis("messages", JSON.stringify(item));
  return item;
};

The final change we need to make is in our Query resolver. This just grabs the data from Redis rather than our in-memory store.

Query: {
  async posts(root, args, context) {
    // Pull the whole chat history from the Redis list and deserialise each entry
    const vals = await lRangeRedis("messages", "0", "-1");
    return vals.map(val => JSON.parse(val));
  }
}

Let’s tie all this together using Docker by creating a docker-compose.yml

version: "2"
 
services:
  chatServer:
    image: node:10-alpine
    container_name: chat_server
    ports:
      - "3000:3000"
      - "4000:4000"
    volumes:
      - ./:/home/node/app
    user: "node"
    working_dir: /home/node/app
    command: "npm start"
    depends_on:
      - "chat_db"
    environment:
      redis: "chat_db"
  chat_db:
    image: redis:latest
    container_name: chat_db

Earlier we set up nodemon in the package.json file; here, we map our source into the container, so when you update your source code nodemon automatically restarts the node server. And because we’re now persisting the data in Redis, you no longer lose the chat history when that happens.

Bring up your server with

docker-compose up -d
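
If you want to poke at the stored history directly, you can run redis-cli inside the Redis container (this assumes the chat_db container name from the compose file above):

docker exec -it chat_db redis-cli lrange messages 0 -1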

From a client perspective, this will look exactly the same as before (go ahead and try it), but now we’re saving the data, and any time something publishes to the POST_ADDED channel in Redis, the connected clients will update.
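
To see that in action, a standalone script along these lines could push a message to every connected client without going through the GraphQL mutation at all. It's only a sketch: it assumes graphql-redis-subscriptions' default JSON serialisation and the POST_ADDED channel used above, and it only publishes the live event, so the post won't be added to the 'messages' list or appear in the history on reload.

// publish.js - a hypothetical standalone publisher; run with: node publish.js
const redis = require('redis');

// Connect to the same Redis instance the chat server uses
const client = redis.createClient(6379, process.env.redis || 'localhost');

// graphql-redis-subscriptions JSON-parses whatever arrives on the channel,
// so the payload needs the same shape the subscription resolver expects
const payload = {
  postAdded: { author: 'bot', comment: 'hello from outside the GraphQL server' },
};

client.publish('POST_ADDED', JSON.stringify(payload), () => client.quit());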

For the simple use case presented here, the addition of Redis is overkill. However, if you have an existing service that could benefit from this (for example, an info screen that currently polls a backend), it could be a real win in terms of bandwidth used and freshness of updates.