
The Stream Firehose with SQS, Lambda, and Redis


Introduction

As of late August 2016, Stream has officially opened a Firehose to its real-time platform, allowing users to listen to all feed changes via Amazon SQS or webhooks. The Firehose complements the existing websocket framework and was put in place to improve speed in applications where real-time behavior is core functionality (games, chat, etc.).

Our real-time infrastructure supports sub-500-millisecond round-trip times between posting an activity and that activity showing up.

This blog post will get you up to speed on how to integrate the Firehose into your existing or new Stream application by walking you through the best practices for the Firehose setup process. At the end of this post, you will understand how to generate emails and/or push notifications using a serverless infrastructure for your application.

Prerequisites

This demo uses Node.js and, while it may work on earlier releases, we recommend the 4.3.2 runtime, as this is the version supported by AWS Lambda.

Additionally, you’ll need the following:

  - An AWS account with access to IAM, SQS, Lambda, and a Redis instance (e.g. ElastiCache)
  - A Stream account (free at getstream.io)
  - Node.js and npm installed locally

Getting Started

To get started, let’s briefly touch on the core technologies that we’ll be using in this tutorial: AWS Lambda, Simple Queue Service (SQS) and Redis.

AWS Lambda allows you to implement scalable and fault-tolerant applications without managing a single virtual machine.

Amazon SQS is a managed message queuing service, offering fault-tolerant and scalable distributed message queues that are simple to use but very powerful.

Redis is an open source, in-memory data structure store, used as a database, cache, and message broker.

The combination of these three services allows you to receive and process data from the Stream Firehose: SQS handles incoming messages, Lambda processes the queued messages, and ElastiCache (Redis) serves as the datastore for batching messages.
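Concretely, each message that lands on your queue carries a realtime update as its body. The exact schema is best confirmed against Stream’s documentation; the shape below (feed, new, and deleted fields) is a hedged sketch based on Stream’s realtime update format, not a contract:

```javascript
// A hypothetical Firehose update and a small parser for it.
// Treat the field names ("feed", "new", "deleted") as assumptions.
const sampleBody = JSON.stringify({
  feed: 'timeline:user-1',
  new: [
    { id: 'aaa-111', actor: 'user-1', verb: 'post', object: 'picture:10' }
  ],
  deleted: []
});

// Extract the added and removed activities from a raw SQS message body.
function parseFirehoseMessage(body) {
  const update = JSON.parse(body);
  return {
    feed: update.feed,
    added: update.new || [],
    removed: update.deleted || []
  };
}

const parsed = parseFirehoseMessage(sampleBody);
console.log(parsed.feed, parsed.added.length); // timeline:user-1 1
```

Parsing the body up front like this keeps the rest of the pipeline (Lambda handler, Redis writes) working with plain activity objects.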

[Image: architecture diagram – Stream Firehose feeding SQS, Lambda, and Redis]

1. Create an IAM Role on AWS

The fastest way to get up and running with an IAM role on AWS is to follow the steps found at http://docs.aws.amazon.com/lambda/latest/dg/with-s3-example-create-iam-role.html.

Save your AWS access key and secret from this setup; you’ll need them in a later step.

2. Creating an App on Stream

Let’s kick things off by creating an application on Stream. Sign in or register a new account at http://getstream.io. Once you’re logged in, click “Create App” in the top right corner of the screen.

This will open a modal where you define your app’s credentials. For now, let’s keep things simple and just enter an app name (you can play around with the rest of the settings later). You can call your app whatever you’d like; for this demo, we’re going to name the application “firehose-sqs-lambda”.

[Screenshot: the “Create App” modal on Stream]

Note: App Names are globally unique, so you’ll need to be creative when assigning an application name.

Next up, we’ll need to create a new Feed Group. Let’s use the Flat Timeline Group, and call it “timeline” for simplicity. Ensure “Realtime Notifications” are enabled, as this is the primary feature we’re going to be looking at throughout the rest of this tutorial.

[Screenshot: creating the “timeline” flat feed group with Realtime Notifications enabled]

Now that we have a working application, let’s make a few adjustments to allow for real-time support via SQS. Click on the blue link to your application (in this case, “firehose-sqs-lambda”). Next, click on “Realtime Configuration”. Disable Websockets and make SQS active by clicking on the toggle.

[Screenshot: the Realtime Configuration panel with SQS enabled]

As seen in the above image, there are fields for an AWS Key and AWS Secret; enter the credentials you saved in step 1.

Since we have not yet created our SQS Queue, we don’t have a URL. Let’s keep this screen open and move on to the next step of the post. We’ll return to this page in a moment.
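Once the queue is wired up, a quick way to exercise the pipeline end to end is to publish an activity to the new feed group from Node. A minimal sketch using the getstream npm package — the key, secret, and user ID below are placeholders, and the network calls are shown commented out so you can paste in your own credentials:

```javascript
// Sketch: publish a test activity to the "timeline" feed group.
// The payload fields here are illustrative.
function publishTestActivity(client, userId) {
  const activity = { actor: userId, verb: 'post', object: 'picture:10' };
  return client.feed('timeline', userId).addActivity(activity);
}

// With a real client (requires `npm install getstream`):
//   const stream = require('getstream');
//   const client = stream.connect('YOUR_API_KEY', 'YOUR_API_SECRET');
//   publishTestActivity(client, 'user-1')
//     .then(activity => console.log('published', activity.id));
```

If SQS is configured correctly, publishing an activity like this should result in a message appearing on your queue.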

3. Create a New SQS Queue

In a new browser tab, log into AWS, head over to the SQS section, and click “Create New Queue”. Similar to Stream, you will be prompted with a modal asking for some information. Use the screenshot below to guide you through which settings to enable:

[Screenshot: the “Create New Queue” settings in the SQS console]

Once you create your queue, you’ll see your SQS Queue “Details” at the bottom of the page. Copy the URL and go back to the GetStream.io settings window left open from the previous step, and populate the SQS URL field in Stream.

[Screenshot: the SQS URL field in Stream’s realtime settings]

4. Create a Lambda Function

Back at AWS, go to the Lambda section and click on “Create a Lambda Function” (currently a blue button in the top left hand corner). This button will open a blueprint screen, where you can narrow down your Lambda function to a preset “blueprint” – in our case: “sqs-poller”. Click on the title of the SQS Poller blueprint and it will bring you to a configuration screen.

You’ll need to configure your Lambda function and enable your trigger. Your settings should match the following before you click the “Next” button:

[Screenshot: Lambda function and trigger configuration for the sqs-poller blueprint]

We’ve prepared some examples to get you up and running quickly, but you’re welcome to write your own code as well. Clone the repo github.com/getstream/firehose-sqs-lambda into a directory of your choice. For this tutorial we will focus on three main files from the repo:

  1. config.js: holds on to our configuration variables used throughout the codebase.
  2. index.js: contains the handler function for Lambda.
  3. scheduler.js: is a simple set of functions that periodically pull from Redis and notify users via push notification on their mobile devices.

Each file above is heavily commented, so you shouldn’t run into any issues when attempting to modify them.
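To give a feel for the kind of logic index.js contains, here is a hypothetical handler sketch — the function names, the Redis key, and the message shape are illustrative assumptions, not the repo’s exact code. It flattens the new activities out of a batch of Firehose messages and pushes each one onto a Redis list for the scheduler to batch later:

```javascript
// Hypothetical handler sketch: store each new activity from a batch of
// Firehose messages in a Redis list. The Redis client is injected so
// the logic stays testable; in the repo it would come from config.js.
function makeHandler(redis) {
  return function handle(messages, callback) {
    // Flatten all new activities out of the batch first.
    const activities = [];
    messages.forEach(msg => {
      const update = JSON.parse(msg.Body);
      (update.new || []).forEach(a => activities.push(a));
    });
    if (activities.length === 0) return callback(null, 0);

    let remaining = activities.length;
    activities.forEach(activity => {
      // LPUSH keeps the newest activity at the head of the list.
      redis.lpush('notifications', JSON.stringify(activity), err => {
        if (err) return callback(err);
        remaining -= 1;
        if (remaining === 0) callback(null, activities.length);
      });
    });
  };
}

// Example with an in-memory stand-in for Redis:
const store = { notifications: [] };
const fakeRedis = {
  lpush: (key, value, cb) => { store[key].unshift(value); cb(null); }
};
makeHandler(fakeRedis)(
  [{ Body: JSON.stringify({ new: [{ id: 'aaa-111', verb: 'post' }] }) }],
  (err, count) => console.log('stored', count) // stored 1
);
```

Injecting the Redis client (rather than requiring it inside the handler) also makes local testing possible without an ElastiCache instance.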

We’ll need to make some changes to the config.js file, which contains credentials for your Redis instance and AWS. In that file, you will see the following variables that you need to update:


'use strict'

module.exports = {
    name: 'YOUR_APP_NAME',      // the Stream app name from step 2
    redis: {
        port: 6379,             // default Redis port
        host: 'YOUR_REDIS_HOST' // e.g. your ElastiCache endpoint
    }
}

Next, make sure to run npm install from the project directory to pull in the dependencies.

5. Uploading Your Code & Configuring Your Settings

Back at AWS where you were setting up your triggers, we need to upload a ZIP file of the code.

Note: One common mistake is zipping the directory itself rather than its contents — Lambda expects index.js at the root of the archive. We recommend using a build tool such as node-lambda on npm, a CLI for locally running and remotely deploying your Node.js applications to AWS Lambda.

Assuming you’re on a Mac, select the contents of the directory, right-click, and choose “Compress X Items” (“X” will be the number of files).

At AWS, choose “Upload a .ZIP file” from the “Code entry type” dropdown.

[Screenshot: the “Code entry type” dropdown in the Lambda console]

Click on the “Upload” button next to “Function package” and choose the archive that you recently zipped. Continue to follow the prompts, keeping the defaults in place. Once you’re finished, click the “Next” button and your code will be uploaded to AWS!

6. Running the Scheduler

The script scheduler.js is a simple set of functions meant to show you how to pull a specific number of items off the Redis queue. The example code pulls the last 10 activity events, and sends them to the notify function where they are consumed and sent out via push notification.

Note: The iOS and Android push notification code is included for demo purposes only; it should work once uncommented and properly configured.

While scheduler.js is not a production-ready push notification pipeline, it serves as a great blueprint for building one.
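The scheduler’s core loop can be sketched roughly as follows — the drainBatch name, the notifications key, and the batch size are illustrative assumptions, not the repo’s exact code. It reads the ten most recent items with LRANGE, hands them to notify, then trims the consumed range away with LTRIM so nothing is sent twice:

```javascript
// Hypothetical sketch of the scheduler's core loop.
function drainBatch(redis, notify, callback) {
  const BATCH_SIZE = 10;
  // LRANGE 0..9 returns the 10 most recently pushed items
  // (LPUSH puts new items at the head of the list).
  redis.lrange('notifications', 0, BATCH_SIZE - 1, (err, items) => {
    if (err) return callback(err);
    notify(items.map(s => JSON.parse(s)));
    // Drop what we just consumed, keep the rest.
    redis.ltrim('notifications', items.length, -1, callback);
  });
}

// Example with an in-memory stand-in for Redis:
const lists = { notifications: ['{"id":"b"}', '{"id":"a"}'] };
const fakeRedis = {
  lrange: (key, start, stop, cb) => cb(null, lists[key].slice(start, stop + 1)),
  ltrim: (key, start, stop, cb) => {
    lists[key] = stop === -1
      ? lists[key].slice(start)
      : lists[key].slice(start, stop + 1);
    cb(null);
  }
};

const sent = [];
drainBatch(fakeRedis, batch => batch.forEach(a => sent.push(a)), () => {
  console.log(sent.length, lists.notifications.length); // 2 0
});
```

Running this on a timer (e.g. a scheduled Lambda or a cron job) gives you simple batched delivery without any extra infrastructure.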

Conclusion

You should now have a fully functioning Firehose that will allow you to offer your users near real-time updates on your Stream application!

You have now seen how to use SQS to handle incoming messages, how to use Lambda to process them, and how Redis can be used to store messages to allow batch sending to your users.

Let us know what you’ve learned, how easy you found this tutorial, and if you have any questions about the process; we always love to hear from you.

The post The Stream Firehose with SQS, Lambda, and Redis appeared first on The Stream Blog.

