
Using the AWS CDK to build scheduled Lambda Functions

25 Nov 2019

8 min. read


A complete sample for creating Lambda functions, including a CI/CD pipeline, using infrastructure as code



At Hatch, we typically use Terraform, Serverless or AWS SAM to script our cloud infrastructure. You’re probably familiar with at least one of these tools, but for about a year now there has been a new kid on the block: the AWS Cloud Development Kit (CDK).

The AWS CDK developer preview was released at AWS re:Invent 2018. We experimented a bit with it in the early days and back then it was clearly not a mature product. In July 2019, however, it became generally available for both TypeScript and Python, and it has been gaining traction ever since.

UPDATE: On November 25th 2019, the CDK became generally available for Java and .NET as well.

As I’ve been hearing more and more positive things about the CDK lately, I thought the time was right to give it another try and share my findings with you. As a use case, I’ll recreate the functionality described in my previous blog posts.

In the next steps I will explain how to build an automated way to shut down (and start up) an RDS database, together with a complete CI/CD pipeline. This time we won’t use any YAML files or manual AWS actions, only pure TypeScript code.




Prerequisites
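
To follow along you’ll need an AWS account, the AWS CLI configured with credentials that are allowed to manage the resources below, Node.js with npm, and the AWS CDK CLI (npm install -g aws-cdk). It also helps to have read the previous blog posts, since we’ll reuse the lambda function code from there.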




Setup


Database


To test our code, we will use the AWS CLI to set up a MySQL database on a db.t2.micro instance.

aws rds create-db-instance --db-instance-identifier testdb --db-instance-class db.t2.micro --engine mysql --allocated-storage 20 --master-username admin --master-user-password adminPwd



Project


Create a new directory

mkdir automatic-aws-db-shutdown-cdk

Change your working directory to the newly created one

cd automatic-aws-db-shutdown-cdk

Initialize a new CDK project using TypeScript

cdk init --language typescript

Install all the CDK modules we will need and save them as dependencies

npm install @aws-cdk/aws-codepipeline @aws-cdk/aws-codepipeline-actions @aws-cdk/aws-events @aws-cdk/aws-events-targets @aws-cdk/aws-ssm --save

For the sake of simplicity, you can remove the ‘test’ folder. Your project should now look like this.
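
Roughly like this, that is; generated files such as the README, .gitignore and node_modules are left out below.

automatic-aws-db-shutdown-cdk/
    bin/automatic-aws-db-shutdown-cdk.ts
    lib/automatic-aws-db-shutdown-cdk-stack.ts
    cdk.json
    package.json
    tsconfig.json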



We will be mainly editing two files:

  • bin/automatic-aws-db-shutdown-cdk.ts => This file is the entry point of our application. It will reference the stacks we are going to build. The CDK will start from this file to synthesize CloudFormation templates.

  • lib/automatic-aws-db-shutdown-cdk-stack.ts => This file describes one of the infrastructure stacks.



Stacks


Lambda Stack


To recreate all functionality, we will build two stacks: one for the lambda functions and one for the CI/CD pipeline. We’ll start with the stack for our lambda functions, so let’s rename the ‘automatic-aws-db-shutdown-cdk-stack.ts’ file to ‘lambda-stack.ts’.



We won’t change the actual lambda function code, so we can reuse the files we created earlier.

Create a ‘lambda’ folder and copy the files from my previous blog post.
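
The build projects in the pipeline stack further down expect the function code under ‘lambda/shut-down’ and ‘lambda/start-up’, so the lambda folder ends up looking like this:

lambda/
    shut-down/
        app.js
        stop.js
    start-up/
        app.js
        start.js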



For the sake of completeness, I’ve added them below.


lambda/shut-down/app.js

const stopInstance = require('./stop');

exports.lambdaHandler = async (event, context) => {
    const instanceIdentifier = process.env.INSTANCE_IDENTIFIER;
    const result = await stopInstance(instanceIdentifier);
    return {
        statusCode: 200,
        body: result,
    }
};

lambda/shut-down/stop.js

const AWS = require('aws-sdk');

module.exports = (instanceIdentifier) => {
    return new Promise((resolve, reject) => {
        const rds = new AWS.RDS();
        const params = {
            DBInstanceIdentifier: instanceIdentifier,
        };
        rds.stopDBInstance(params, (err, data)=> {
            if (err) {
                reject(err);
            } else {
                resolve(data);
            }
        });
    });
};

lambda/start-up/app.js

const startInstance = require('./start');

exports.lambdaHandler = async (event, context) => {
    const instanceIdentifier = process.env.INSTANCE_IDENTIFIER;
    const result = await startInstance(instanceIdentifier);
    return {
        statusCode: 200,
        body: result,
    }
};

lambda/start-up/start.js

const AWS = require('aws-sdk');

module.exports = (instanceIdentifier) => {
    return new Promise((resolve, reject) => {
        const rds = new AWS.RDS();
        const params = {
            DBInstanceIdentifier: instanceIdentifier,
        };
        rds.startDBInstance(params, (err, data)=> {
            if (err) {
                reject(err);
            } else {
                resolve(data);
            }
        });
    });
};

It’s time to build our actual lambda stack. We create some specific StackProps to pass in the variables that reference the target RDS instance. We also create some helper methods to easily build our two lambda functions, the events that trigger them and the necessary access policies.

lambda-stack.ts

import {Construct, Duration, Stack, StackProps} from "@aws-cdk/core";
import {CfnParametersCode, Code, Function, Runtime} from "@aws-cdk/aws-lambda";
import {LambdaFunction} from "@aws-cdk/aws-events-targets";
import {PolicyStatement, Effect} from "@aws-cdk/aws-iam";
import {Rule, Schedule} from "@aws-cdk/aws-events";

export interface LambdaStackProps extends StackProps {
    readonly instanceId: string;
    readonly instanceARN: string;
}

export class LambdaStack extends Stack {

    public readonly startUpLambdaCode: CfnParametersCode;
    public readonly shutDownLambdaCode: CfnParametersCode;

    constructor(scope: Construct, id: string, props: LambdaStackProps) {
        super(scope, id, props);

        this.shutDownLambdaCode = Code.fromCfnParameters();
        this.buildEventTriggeredLambdaFunction("DBShutDown", props.instanceId, props.instanceARN, "rds:StopDBInstance", "0 17 ? * MON-FRI *", this.shutDownLambdaCode);

        this.startUpLambdaCode = Code.fromCfnParameters();
        this.buildEventTriggeredLambdaFunction("DBStartUp", props.instanceId, props.instanceARN, "rds:StartDBInstance", "0 5 ? * MON-FRI *", this.startUpLambdaCode);
    }

    private buildEventTriggeredLambdaFunction(name: string, instanceId: string, instanceARN: string, instanceAction: string, scheduleExpression: string, lambdaCode: CfnParametersCode): Function {
        const lambdaFn = this.buildLambdaFunction(`${name}Function`, "app", lambdaCode, instanceId);

        const instanceActionPolicy = this.buildPolicy(instanceAction, instanceARN);
        lambdaFn.addToRolePolicy(instanceActionPolicy);

        const eventRule = this.buildEventRule(`${name}Rule`, scheduleExpression);
        eventRule.addTarget(new LambdaFunction(lambdaFn));

        return lambdaFn;
    }

    private buildLambdaFunction(id: string, filename: string, code: CfnParametersCode, instanceId: string): Function {
        return new Function(this, id, {
            code: code,
            handler: filename + '.lambdaHandler',
            memorySize: 128,
            timeout: Duration.seconds(300),
            runtime: Runtime.NODEJS_10_X,
            environment: {
                INSTANCE_IDENTIFIER: instanceId
            }
        });
    }

    private buildPolicy(actionToAllow: string, instanceARN: string): PolicyStatement {
        return new PolicyStatement({
            effect: Effect.ALLOW,
            actions: [actionToAllow],
            resources: [instanceARN]
        });
    }

    private buildEventRule(id: string, scheduleExpression: string): Rule {
        return new Rule(this, id, {
            schedule: Schedule.expression('cron(' + scheduleExpression + ')')
        });
    }
}
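
A small aside about the props: if you would rather not look up the instance ARN by hand, you could also derive it inside the stack from the instance ID. A minimal sketch, assuming the standard RDS instance ARN format (arn:aws:rds:<region>:<account>:db:<instance-id>):

// Sketch: build the ARN from the instance ID, using the region and account of the stack itself.
const instanceARN = this.formatArn({
    service: 'rds',
    resource: 'db',
    sep: ':',                        // RDS puts a colon between 'db' and the instance identifier
    resourceName: props.instanceId
});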

If you paid close attention, you probably noticed that there is something missing. The SNS dead letter queue we scripted in our previous AWS SAM setup is not included in this stack, because the AWS CDK currently only supports SQS dead letter queues.
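
If an SQS queue would do as a dead letter target for your use case, you can already wire one up today. A minimal sketch, assuming you install the @aws-cdk/aws-sqs module as well and extend the buildLambdaFunction helper above:

import {Queue} from "@aws-cdk/aws-sqs";

// Inside buildLambdaFunction: create a queue per function and hand it to the Function construct.
const deadLetterQueue = new Queue(this, `${id}DeadLetterQueue`, {
    retentionPeriod: Duration.days(14)   // keep failed events around for two weeks
});

return new Function(this, id, {
    code: code,
    handler: filename + '.lambdaHandler',
    memorySize: 128,
    timeout: Duration.seconds(300),
    runtime: Runtime.NODEJS_10_X,
    deadLetterQueue: deadLetterQueue,    // failed asynchronous invocations end up on this queue
    environment: {
        INSTANCE_IDENTIFIER: instanceId
    }
});

Lambda only routes events to the dead letter queue for asynchronous invocations, which is exactly what the scheduled event rule gives us.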

I reached out to the CDK team about this (see the GitHub issue). They were very responsive and created a pull request to add the functionality. As of today, however, it hasn’t been merged and released yet. I guess they are too busy preparing for this year’s edition of re:Invent.



Pipeline Stack


Now let’s create the stack for the CI/CD pipeline that will build and deploy our lambda functions automatically on every code update.

Create a new file ‘pipeline-stack.ts’ in the lib folder.



Once this is done, we can write the pipeline stack code.


pipeline-stack.ts

import {Construct, SecretValue, Stack, StackProps} from "@aws-cdk/core";
import {Artifact, Pipeline} from "@aws-cdk/aws-codepipeline";
import {
    CloudFormationCreateUpdateStackAction,
    CodeBuildAction,
    GitHubSourceAction
} from "@aws-cdk/aws-codepipeline-actions";
import { CfnParametersCode } from "@aws-cdk/aws-lambda";
import {StringParameter} from "@aws-cdk/aws-ssm";
import { BuildSpec, PipelineProject, LinuxBuildImage } from "@aws-cdk/aws-codebuild";

export interface PipelineStackProps extends StackProps {
    readonly startUpLambdaCode: CfnParametersCode;
    readonly shutDownLambdaCode: CfnParametersCode;
}

export class PipelineStack extends Stack {
    constructor(scope: Construct, id: string, props: PipelineStackProps) {
        super(scope, id, props);

        // Source action
        const oauthToken = SecretValue.secretsManager('/automatic-aws-db-shutdown-cdk/github/token', {jsonField: 'github-token'});
        const githubRepo = StringParameter.valueFromLookup(this, "/automatic-aws-db-shutdown-cdk/github/repo");
        const githubOwner = StringParameter.valueFromLookup(this, "/automatic-aws-db-shutdown-cdk/github/owner");

        const sourceOutput = new Artifact("SourceOutput");
        const sourceAction = new GitHubSourceAction({
            actionName: 'Source',
            owner: githubOwner,
            repo: githubRepo,
            branch: 'master',
            oauthToken: oauthToken,
            output: sourceOutput
        });


        // Build actions
        const lambdaTemplateFileName = 'LambdaStack.template.json';
        const cdkBuild = this.createCDKBuildProject('CdkBuild', lambdaTemplateFileName);
        const cdkBuildOutput = new Artifact('CdkBuildOutput');
        const cdkBuildAction = new CodeBuildAction({
            actionName: 'CDK_Build',
            project: cdkBuild,
            input: sourceOutput,
            outputs: [cdkBuildOutput],
        });

        const shutDownLambdaBuild = this.createLambdaBuildProject('ShutDownLambdaBuild', 'lambda/shut-down');
        const shutDownLambdaBuildOutput = new Artifact('ShutDownLambdaBuildOutput');
        const shutDownLambdaBuildAction = new CodeBuildAction({
            actionName: 'Shut_Down_Lambda_Build',
            project: shutDownLambdaBuild,
            input: sourceOutput,
            outputs: [shutDownLambdaBuildOutput],
        });

        const startUpLambdaBuild = this.createLambdaBuildProject('StartUpLambdaBuild', 'lambda/start-up');
        const startUpLambdaBuildOutput = new Artifact('StartUpLambdaBuildOutput');
        const startUpLambdaBuildAction = new CodeBuildAction({
            actionName: 'Start_Up_Lambda_Build',
            project: startUpLambdaBuild,
            input: sourceOutput,
            outputs: [startUpLambdaBuildOutput],
        });

        // Deployment action
        const deployAction = new CloudFormationCreateUpdateStackAction({
            actionName: 'Lambda_Deploy',
            templatePath: cdkBuildOutput.atPath(lambdaTemplateFileName),
            stackName: 'LambdaDeploymentStack',
            adminPermissions: true,
            parameterOverrides: {
                ...props.startUpLambdaCode.assign(startUpLambdaBuildOutput.s3Location),
                ...props.shutDownLambdaCode.assign(shutDownLambdaBuildOutput.s3Location),
            },
            extraInputs: [startUpLambdaBuildOutput, shutDownLambdaBuildOutput]
        });


        // Construct the pipeline
        const pipelineName = "automatic-aws-db-shutdown-cdk-pipeline";
        const pipeline = new Pipeline(this, pipelineName, {
            pipelineName: pipelineName,
            stages: [
                {
                    stageName: 'Source',
                    actions: [sourceAction],
                },
                {
                    stageName: 'Build',
                    actions: [startUpLambdaBuildAction, shutDownLambdaBuildAction, cdkBuildAction],
                },
                {
                    stageName: 'Deploy',
                    actions: [deployAction],
                }
            ]
        });

        // Make sure the deployment role can get the artifacts from the S3 bucket
        pipeline.artifactBucket.grantRead(deployAction.deploymentRole);
    }

    private createCDKBuildProject(id: string, templateFilename: string) {
        return new PipelineProject(this, id, {
            buildSpec: BuildSpec.fromObject({
                version: '0.2',
                phases: {
                    install: {
                        commands: [
                            "npm install",
                            "npm install -g cdk",
                        ],
                    },
                    build: {
                        commands: [
                            'npm run build',
                            'npm run cdk synth -- -o dist'
                        ],
                    },
                },
                artifacts: {
                    'base-directory': 'dist',
                    files: [
                        templateFilename,
                    ],
                },
            }),
            environment: {
                buildImage: LinuxBuildImage.UBUNTU_14_04_NODEJS_10_1_0,
            },
        });
    }

    private createLambdaBuildProject(id: string, sourceCodeBaseDirectory: string) {
        return new PipelineProject(this, id, {
            buildSpec: BuildSpec.fromObject({
                version: '0.2',
                artifacts: {
                    'base-directory': sourceCodeBaseDirectory,
                    files: [
                        '*.js'
                    ],
                },
            }),
            environment: {
                buildImage: LinuxBuildImage.UBUNTU_14_04_NODEJS_10_1_0,
            },
        })
    }
}

In this piece of code you’ll see that we create one pipeline with three stages:



Source stage


In this stage we create a source action that fetches the code from a GitHub repository. To know where to get the code from, we store the repository owner and name in the AWS Systems Manager Parameter Store.

aws ssm put-parameter --name /automatic-aws-db-shutdown-cdk/github/repo --description "Github Repository name for Pipeline Stack" --type String --value automatic-aws-db-shutdown-cdk

aws ssm put-parameter --name /automatic-aws-db-shutdown-cdk/github/owner --description "Github Owner for Pipeline Stack" --type String --value HatchSoftware

Our source action also needs permission to download the source code from the repository. We grant this by creating a new GitHub personal access token and storing it in AWS Secrets Manager.

aws secretsmanager create-secret --name /automatic-aws-db-shutdown-cdk/github/token --secret-string '{"github-token":"<YOUR GITHUB TOKEN>"}'


Build stage


The build stage will have 3 build actions.

  • One action will install and use the AWS CDK to synthesize the CloudFormation template of our lambda stack.

  • The other two actions will take the code of our lambda functions, create output artifacts from them and store them in an S3 bucket. This bucket needs to be referenced by the lambda stack.



Deploy stage


This stage contains a CloudFormation deploy action that uses the synthesized template of our lambda stack to build the infrastructure. It also uses parameter overrides to specify the S3 location where our lambda function code was published during the build stage.



Deploy


We’re almost there, but before we can deploy we still need to modify the ‘bin/automatic-aws-db-shutdown-cdk.ts’ file.


automatic-aws-db-shutdown-cdk.ts

import 'source-map-support/register';
import cdk = require('@aws-cdk/core');
import {LambdaStack} from '../lib/lambda-stack';
import {PipelineStack} from "../lib/pipeline-stack";

const accountId = '[YOUR AWS ACCOUNT ID]';
const region = '[YOUR AWS REGION]';
const instanceId = '[YOUR RDS DB INSTANCE ID]';
const instanceARN = '[YOUR RDS DB INSTANCE ARN]';

const app = new cdk.App();
const lambdaStack = new LambdaStack(app, 'LambdaStack', {
    env: {
        account: accountId,
        region: region
    },
    instanceId: instanceId,
    instanceARN: instanceARN
});

new PipelineStack(app, 'PipelineStack', {
    env: {
        account: accountId,
        region: region
    },
    startUpLambdaCode: lambdaStack.startUpLambdaCode,
    shutDownLambdaCode: lambdaStack.shutDownLambdaCode,
});

app.synth();

In this file we create our two stacks and pass in the correct stack properties. Be aware that you will need to change the variables at the top of the file to values that match your AWS account and database instance.

We can now start the actual deployment of our pipeline stack.

npm run build

This command compiles our TypeScript code to JavaScript.

cdk synth

This will synthesize CloudFormation templates from our code. You can have a look at the templates in the ‘cdk.out’ folder that has been created. You’ll see that our 200 lines of TypeScript have been translated into around 1800 lines of JSON.

cdk deploy PipelineStack

This command lists all resources that are about to be created.



Type ‘y’ and hit enter. Over the next few minutes, the infrastructure specified in our pipeline stack will be created. You can follow the progress in the terminal window.



If you log in to the AWS Management Console and go to CloudFormation, you can also see that our Pipeline stack is being created.



After a few minutes, our pipeline infrastructure creation will be finished.



The created CodePipeline will be triggered automatically upon a code update. The deploy step of our pipeline will create the lambda stack.
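
A small detail on how that trigger works: by default the GitHubSourceAction registers a webhook on the repository, so a push to master starts the pipeline almost immediately (this does require the personal access token to be allowed to create webhooks). If that is not an option in your setup, you could fall back to polling instead. A minimal sketch of the change, with everything else staying as in the pipeline stack above (GitHubTrigger is exported by @aws-cdk/aws-codepipeline-actions, just like GitHubSourceAction):

const sourceAction = new GitHubSourceAction({
    actionName: 'Source',
    owner: githubOwner,
    repo: githubRepo,
    branch: 'master',
    oauthToken: oauthToken,
    output: sourceOutput,
    trigger: GitHubTrigger.POLL    // the default is GitHubTrigger.WEBHOOK
});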



When you check CloudFormation again, you will see the newly created stack called ‘LambdaDeploymentStack’.



When you check the lambda functions, you’ll see that they have successfully been created.





Clean up


Never forget to clean up resources you aren’t using anymore if you don’t want any surprises on your monthly AWS bill.

To delete the CloudFormation stack that was created automatically by the deployment pipeline, just run the command below.

aws cloudformation delete-stack --stack-name LambdaDeploymentStack

Afterwards, destroy the pipeline stack we created using the CDK.

cdk destroy PipelineStack

Also don’t forget to delete the test database, as well as the SSM parameters and the Secrets Manager secret we created for the pipeline if you no longer need them.

aws rds delete-db-instance --db-instance-identifier testdb --skip-final-snapshot


Final code


The final code of this project can be found on GitHub. If you want to use it directly or as a baseline to start from, don’t forget to specify the correct AWS account, region and RDS database instance ID and ARN.



Conclusion


Although the AWS CDK is not completely where it should be yet, it has already come a long way since the developer preview.

Being able to use patterns and practices we are all familiar with (conditions, loops, methods, classes, …), together with your favorite programming language and IDE (giving you code completion and refactoring capabilities), makes me think the CDK has a bright future ahead.

Thanks for reading!

