
Building Serverless with Docker



If you have experience with Azure Functions or AWS Lambdas, then the title may sound a bit like an oxymoron. One of the key benefits of “Function as a Service” (FaaS) or “serverless” offerings is that developers do not have to worry about infrastructural concerns such as virtual machines, containers, and the like. You create a function in your favorite language and ship it, while effectively all administration is handled by your cloud provider. So, where does Docker come into play? In this article, we will discuss how you can use Docker and the serverless framework to supplement your local development experience.

Moving to the Cloud

Serverless

Several frameworks are available to create, build, and deploy serverless functions. I tend to gravitate towards the serverless framework. Because it is not native to Azure or AWS, it is not tied to a single cloud provider.

It is also super intuitive and easy to use! While the serverless framework supports Azure, AWS, and GCP, we will be focusing on AWS going forward. Getting started is a breeze. First, you will need to install the serverless CLI, which can be accomplished with the following command.

npm install -g serverless

Next, type serverless, and a wizard will guide you through creating your new project.

Building a Serverless project

The serverless framework is a topic in itself so, we will not go into details in this article. That being said, there is a wealth of information on their website. If you are new to the serverless framework, please take a look at their getting started guide and AWS provider documentation.
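
To give a flavor of what a project looks like, here is a minimal serverless.yml for an AWS Node.js service; the service and function names are purely illustrative:

service: espresso-demo

provider:
  name: aws
  runtime: nodejs12.x

functions:
  hello:
    handler: handler.hello   # the exported "hello" function in handler.js
    events:
      - http:
          path: hello
          method: get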

Serverless Offline

One of my favorite aspects of the serverless framework is the extensibility through its plugin architecture. Coupling this with its strong community support provides developers with plugins for the majority of their development needs. The most popular of these is the serverless-offline plugin, which allows developers to run serverless functions behind a local HTTP server. By simulating AWS Lambda functions and API Gateway, developers can quickly run and debug their serverless applications entirely on their own machine!

As is with most plugins, installation is very straight forward. First, we need to install the plugin with npm.

npm install serverless-offline --save-dev

Next, the plugin needs to be declared in the serverless.yml configuration file.

plugins:
  - serverless-offline

An additional section can be added to the serverless.yml file to configure the plugin. It is common to override default behavior here, such as the HTTP port the local server listens on.

custom:
  serverless-offline:
    httpPort: 9999

For further reading, view the serverless-offline readme on GitHub.
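
With the plugin declared and configured, a quick sanity check is to start the local server and call one of our routes. Assuming the illustrative hello function from earlier and the port configured above, that looks something like this (serverless-offline prints the exact URLs it is serving, so check its output if your stage prefix differs):

npx serverless offline

# in a second terminal
curl http://localhost:9999/dev/hello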

Visual Studio Code Remote Containers

Now that we have our serverless solution running locally, let’s start incorporating Docker into the mix. I use Visual Studio Code for the majority of my development work. Aside from being an excellent IDE, it comes with a handy feature called Remote Containers. Using a remote container allows us to do our development completely within a Docker container. Sorry guys, no more “it works on my machine” excuses.

Visual Studio Code walks us through creating the required files for running in a container. All we have to do is run the ‘Remote-Containers: Add Development Container Configuration Files’ command as shown below.

Adding Remote Containers to a Serverless project

Once complete, we have all the required files for running a development container. Executing the ‘Remote-Containers: Reopen in Container’ command will reopen Visual Studio Code within our new container. If we open the terminal, we will notice it is a bash shell! As before, we can run our application with the following command.

npx serverless offline

Pretty cool!

DynamoDB

As exciting as building serverless functions is, these functions seldom live in a vacuum. For instance, it is typical for a Lambda function in AWS to integrate with DynamoDB. In this case, how can I run my functions locally? Sure, I can point my Lambda to a Dynamo table in AWS, but this is not always desirable, especially for large teams. Enter DynamoDB local, a downloadable version of DynamoDB designed for local development. Lucky for us, Amazon also provides an easy-to-use Docker image, which we can run along with our serverless application. To configure serverless to use our local container, we must install another plugin, serverless-dynamodb-local. As with our other plugin, we need to install it with npm…

npm install serverless-dynamodb-local --save-dev

…and add it to our configuration file.

plugins:
  - serverless-dynamodb-local

Lastly, we will need to tell our Lambda to connect to our local DynamoDB instance. This requires a slight modification to how we instantiate our DynamoDB client, as shown below.

new AWS.DynamoDB.DocumentClient({
    region: 'localhost',
    endpoint: 'http://localhost:8000'
})
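
In practice, we only want the local endpoint while running offline. The serverless-offline plugin sets the IS_OFFLINE environment variable when it runs our functions, so a small sketch along these lines (the variable names are my own) keeps deployed code pointing at the real service:

const AWS = require('aws-sdk');

// serverless-offline sets IS_OFFLINE when running locally
const isOffline = !!process.env.IS_OFFLINE;

// Use the local DynamoDB container offline; fall back to the default
// AWS configuration when deployed.
const documentClient = isOffline
    ? new AWS.DynamoDB.DocumentClient({
          region: 'localhost',
          endpoint: 'http://localhost:8000'
      })
    : new AWS.DynamoDB.DocumentClient();

module.exports = documentClient;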

This plugin also has some pretty cool features, such as schema migrations and seeding your tables. For more information, please take a look at its GitHub page.
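
If you want to take advantage of those features, the plugin is configured under the custom section of serverless.yml. Something along these lines is typical, though the exact option names should be checked against the plugin readme; the noStart flag in particular is an assumption that fits our setup, where DynamoDB already runs in its own container:

custom:
  dynamodb:
    stages:
      - dev
    start:
      port: 8000
      noStart: true   # assume DynamoDB local is already running (our Docker container)
      migrate: true   # create tables from the resources section on startup
      seed: true      # load any seed data defined for the tables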

Bringing it all together

Now that we have serverless AND DynamoDB running in a container, how can we bring the two together? This is where Visual Studio Code comes to the rescue again! We can configure our environment to run multiple containers via docker-compose. These enhancements can be done by making a few small updates to the devcontainer.json file.

"name": "Node.js",
"dockerComposeFile": "docker-compose.yml",
"service": "serverless.app",
"workspaceFolder": "/workspace",
// Set *default* container specific settings.json values on container create.
"settings": {
"terminal.integrated.shell.linux": "/bin/bash"
// Add the IDs of extensions you want installed when the container is created.
"extensions": [
"dbaeumer.vscode-eslint"
// Use 'forwardPorts' to make a list of ports inside the container available locally.
// "forwardPorts": [],
// Use 'postCreateCommand' to run commands after the container is created.
// "postCreateCommand": "yarn install",
// Comment out connect as root instead. More info: https://aka.ms/vscode-remote/containers/non-root.
"remoteUser": "node"
{
    "name": "Node.js",
    "dockerComposeFile": "docker-compose.yml",
    "service": "serverless.app",
    "workspaceFolder": "/workspace",
    // Set *default* container specific settings.json values on container create.
    "settings": { 
        "terminal.integrated.shell.linux": "/bin/bash"
    },
    // Add the IDs of extensions you want installed when the container is created.
    "extensions": [
        "dbaeumer.vscode-eslint"
    ],
    // Use 'forwardPorts' to make a list of ports inside the container available locally.
    // "forwardPorts": [],
    // Use 'postCreateCommand' to run commands after the container is created.
    // "postCreateCommand": "yarn install",
    // Comment out connect as root instead. More info: https://aka.ms/vscode-remote/containers/non-root.
    "remoteUser": "node"
}

Now we need to create a docker-compose.yml file. Of course, we can run any container our heart desires; however, three containers will do in our case. We will run one container for our serverless app, one for DynamoDB, and one for dynamodb-admin. Dynamodb-admin is a lightweight web application that is useful for managing your local DynamoDB instance.

Below is a copy of the docker-compose file I use.

version: '3'
services:
  serverless.app:
    build: 
      context: .
      dockerfile: Dockerfile
      args:
        VARIANT: 12
    volumes:
      - ..:/workspace:cached
    command: sleep infinity
  dynamodb.local:
    image: amazon/dynamodb-local
    ports:
      - "8000:8000"
    volumes:
      - ./db:/home/dynamodblocal/db
    command: ["-jar", "DynamoDBLocal.jar", "-sharedDb", "-dbPath", "/home/dynamodblocal/db"]
  dynamodb.admin:
    image: aaronshaf/dynamodb-admin
    ports:
      - 8001:8001
    environment: 
      - DYNAMO_ENDPOINT=http://dynamodb.local:8000
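
For completeness, the Dockerfile referenced by the serverless.app service is the one generated earlier by the Remote-Containers command. In the Node.js template of that era it is essentially a thin wrapper around Microsoft's dev container image, roughly along these lines (the exact image tag depends on the template version):

ARG VARIANT=12
FROM mcr.microsoft.com/vscode/devcontainers/javascript-node:0-${VARIANT}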

Now when we open our project in the development container, we can navigate to dynamodb-admin by browsing to http://localhost:8001. All this provides us with the ability to run our serverless functions and DynamoDB locally, with the stability of a local Docker environment!

serverless with the dynamo-db admin app

I hope you enjoyed this article! Please feel free to share it on social media. Happy Coding!
