Testing a DynamoDB + Node.js App


Do you have an Express application backed by DynamoDB and want to set up continuous integration and deployment? This article is for you.

Setting up the Testing Environment

According to the twelve-factor app methodology, env vars are granular controls, each orthogonal to the others. We are going to look at an app where environment variables are the source of truth. We keep env variables in three different places: the development environment (a .env file), Bitbucket environment variables, and AWS Elastic Beanstalk (staging).

PORT=5000
DYNAMO_TESTING_PORT=4567
SECRET=secret for JWT token encryption
AWS_REGION=eu-west-2
AWS_ACCESS_KEY_ID=<AWS_ACCESS_KEY_ID>
AWS_SECRET_ACCESS_KEY=<AWS_SECRET_ACCESS_KEY>
JWT_LIFESPAN_IN_SECONDS=<seconds>
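With variables spread across three places, it is easy to deploy with one of them missing. A minimal fail-fast check at startup helps; this sketch is ours, not from the app (the `assertEnv` helper and the variable list are hypothetical):

```javascript
// Hypothetical fail-fast check: crash at startup if a required variable is missing.
const REQUIRED_VARS = ['PORT', 'DYNAMO_TESTING_PORT', 'SECRET', 'AWS_REGION']

const assertEnv = env => {
  const missing = REQUIRED_VARS.filter(name => !env[name])
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`)
  }
  return env
}

module.exports = { assertEnv }
```

Calling `assertEnv(process.env)` right after `require('dotenv').config()` surfaces a misconfiguration before the app ever touches DynamoDB.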

Since the AWS SDK has a global config object, we run the setup code before any interaction with the database. It specifies credentials and, in the test environment, points the SDK at local DynamoDB rather than the remote one.

const aws = require('aws-sdk')

module.exports = {
  setupAWS: () => {
    aws.config.update(
      process.env.NODE_ENV === 'test'
        ? {
            endpoint: `http://localhost:${process.env.DYNAMO_TESTING_PORT}`,
            region: 'mock',
            credentials: {
              accessKeyId: 'accessKeyId',
              secretAccessKey: 'secretAccessKey'
            }
          }
        : {
            credentials: {
              accessKeyId: process.env.AWS_ACCESS_KEY_ID,
              secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
            },
            region: process.env.AWS_REGION
          }
    )
  }
}
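The branching above becomes easy to unit-test if we extract it into a pure function that returns the config object instead of mutating `aws.config`. A sketch under that assumption (the `buildAwsConfig` name is ours):

```javascript
// Hypothetical pure variant of the same branching: returns the config object
// instead of mutating aws.config, so it can be tested without the SDK.
const buildAwsConfig = env =>
  env.NODE_ENV === 'test'
    ? {
        endpoint: `http://localhost:${env.DYNAMO_TESTING_PORT}`,
        region: 'mock',
        credentials: {
          accessKeyId: 'accessKeyId',
          secretAccessKey: 'secretAccessKey'
        }
      }
    : {
        credentials: {
          accessKeyId: env.AWS_ACCESS_KEY_ID,
          secretAccessKey: env.AWS_SECRET_ACCESS_KEY
        },
        region: env.AWS_REGION
      }

module.exports = { buildAwsConfig }
```

`setupAWS` would then reduce to `aws.config.update(buildAwsConfig(process.env))`.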

We launch local DynamoDB and create the tables before running tests; once the tests have finished, we delete those tables. To run these functions, we set paths to them in package.json. According to the Jest documentation, both of them should return a promise.

{
  "scripts": {
    "test": "jest --forceExit"
  },
  "jest": {
    "globalSetup": "./test/globalSetup",
    "globalTeardown": "./test/globalTeardown"
  }
}

The logic of the setup function is pretty straightforward. If you are curious how the table params are represented, check this post.

require('dotenv').config()
const aws = require('aws-sdk')
const localDynamo = require('local-dynamo')
const { setupAWS } = require('../src/utils/aws')
const { TABLES_PARAMS } = require('../../constants/db')

module.exports = async () => {
  setupAWS()
  localDynamo.launch(null, process.env.DYNAMO_TESTING_PORT)
  const dynamodb = new aws.DynamoDB()
  try {
    const data = await Promise.all(
      TABLES_PARAMS.map(params => dynamodb.createTable(params).promise())
    )
    console.log('Database created: ', JSON.stringify(data, null, 2))
  } catch (err) {
    console.error('Unable to create table: ', JSON.stringify(err, null, 2))
  }
}

We use the same approach in the teardown function.

const aws = require('aws-sdk')
const { TABLES_PARAMS } = require('../../constants/db')

module.exports = async () => {
  const dynamodb = new aws.DynamoDB()
  try {
    const data = await Promise.all(
      TABLES_PARAMS.map(({ TableName }) => dynamodb.deleteTable({ TableName }).promise())
    )
    console.log('Deleted table: ', JSON.stringify(data, null, 2))
  } catch (err) {
    console.error('Delete fail: ', JSON.stringify(err, null, 2))
  }
}

Test Termination

Sometimes tests get aborted before they finish running, and in such cases Jest won't execute globalTeardown. It is therefore safer to delete any leftover tables first and create fresh ones afterwards. Below is the improved setup function.

require('dotenv').config()
const AWS = require('aws-sdk')
const localDynamo = require('local-dynamo')
const { setupAWS } = require('../src/utils/aws')
const { TABLES_PARAMS } = require('../../constants/db')

const recreateTables = async (silent = true) => {
  const database = new AWS.DynamoDB()
  // Delete every existing table, then wait until they are actually gone.
  const existingTables = await database
    .listTables()
    .promise()
    .then(data => data.TableNames)
  await Promise.all(
    existingTables.map(TableName =>
      database
        .deleteTable({ TableName })
        .promise()
        .catch(
          err =>
            !silent &&
            console.error('Delete fail: ', JSON.stringify(err, null, 2))
        )
        .then(
          data =>
            !silent &&
            data &&
            console.log('Deleted table: ', JSON.stringify(data, null, 2))
        )
    )
  )
  const waiter = new AWS.ResourceWaiter(database, 'tableNotExists')
  await Promise.all(
    existingTables.map(TableName =>
      waiter.wait({ TableName, $waiter: { delay: 1 } }).promise()
    )
  )
  // Recreate the tables and wait until they are ready.
  await Promise.all(
    TABLES_PARAMS.map(params =>
      database
        .createTable(params)
        .promise()
        .catch(
          err =>
            !silent &&
            console.error('Unable to create table: ', JSON.stringify(err, null, 2))
        )
        .then(
          data =>
            !silent &&
            data &&
            console.log('Database created: ', JSON.stringify(data, null, 2))
        )
    )
  )
  const createdWaiter = new AWS.ResourceWaiter(database, 'tableExists')
  await Promise.all(
    TABLES_PARAMS.map(({ TableName }) =>
      createdWaiter.wait({ TableName, $waiter: { delay: 1 } }).promise()
    )
  )
}

module.exports = () => {
  setupAWS()
  localDynamo.launch(null, process.env.DYNAMO_TESTING_PORT)
  return recreateTables()
}

CI/CD

This time, for continuous deployment, we will use Bitbucket Pipelines. In the pipeline we need to install Java (required by local DynamoDB) and some packages required for AWS deployment.

image: node:latest

pipelines:
  branches:
    dev:
      - step:
          caches:
            - node
          script:
            - apt-get update
            - apt-get install -y default-jre
            - npm install
            - npm test
            - apt-get install -y python-dev
            - curl -O https://bootstrap.pypa.io/get-pip.py
            - python get-pip.py
            - pip install awsebcli --upgrade
            - eb init <app> -r <region> -p node.js
            - eb deploy <app>

To cut pipeline execution time, we can build a Docker image with the required packages preinstalled.

FROM node:latest

RUN apt-get update \
  && apt-get install -y default-jre \
  && apt-get install -y python-dev \
  && curl -O https://bootstrap.pypa.io/get-pip.py \
  && python get-pip.py \
  && pip install awsebcli --upgrade
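Since the whole point is to avoid a local Docker daemon, Cloud Build can build and push the image remotely; roughly like this (the project and image names are placeholders, and the gcloud CLI is assumed to be authenticated):

```shell
# Build the Dockerfile remotely and push the result to Container Registry;
# no local Docker installation required.
gcloud builds submit --tag gcr.io/<project-id>/node-eb-ci .
```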

I was too lazy to install and configure Docker on a local machine. Fortunately, there is a service on the Google Cloud Platform for this: Container Registry.

image:
  name: <image>
  username: _json_key
  password: '$GCR_JSON_KEY'

pipelines:
  branches:
    dev:
      - step:
          caches:
            - node
          script:
            - npm install
            - npm test
            - eb init <app> -r <region> -p node.js
            - eb deploy <app>