
Setting up CI/CD for Lambda Functions using AWS CodePipeline

28 Oct 2019

6 min. read


A step-by-step guide on building a CI/CD pipeline for template-based Lambda Functions using AWS Developer Tools



In my previous post I showed you how to set up a fully automated way to shut down RDS instances using Lambda functions that were built with AWS SAM.

Back then we packaged and deployed these functions from our local machine. To make our lives easier, we will now set up a CI/CD pipeline that automatically executes these steps every time we make a code change.

We’ll take the source code of our previous project as a base to start from, but you can apply the same logic for any Lambda Functions that have been created using an AWS SAM or an AWS CloudFormation template.



What are we going to change?


We will configure AWS CodePipeline to execute the package and deploy steps automatically on every update of our code repository.

A typical code pipeline has 3 stages:

  • Source : In this step, the latest version of our source code will be fetched from our repository and uploaded to an S3 bucket.

  • Build : During the build step we will use this uploaded source code and automate our manual packaging step using a CodeBuild project.

  • Deploy : In the deployment step we will use CloudFormation to create and execute the change set that will eventually build the entire infrastructure.



Preparing the build step


Buildspec file


We add a buildspec.yaml file to the root folder of the project. This file is a collection of commands and related settings that CodeBuild uses to build a project (see the AWS CodeBuild documentation for more details).



In our case, the buildspec file is rather simple, as we only need to define an installation phase. For this phase, we specify the desired runtime version and the command to be executed:

aws cloudformation package --template-file template.yaml --s3-bucket auto-aws-db-shutdown-deployment-artifacts --output-template-file outputTemplate.yaml

This command uploads the packaged artifacts to the S3 bucket we specified and outputs a new template file called outputTemplate.yaml.

As you can see, we are using the command ‘aws cloudformation package’ instead of the ‘sam package’ command we used when packaging from our local machine. We do this because AWS SAM is not available by default in the CodeBuild runtime and the SAM command is actually just an alias for the CloudFormation command.

In the artifacts section, we specify the artifacts that should be available as the output of our build step.
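Putting it all together, the buildspec could look something like the sketch below. The Python runtime version is an assumption based on the Lambda functions from the previous post; use whatever runtime your functions need.

buildspec.yaml

version: 0.2

phases:
  install:
    runtime-versions:
      # Assumption: the Lambda functions from the previous post run on Python 3.7
      python: 3.7
    commands:
      # Package the SAM template and upload the code artifacts to S3
      - aws cloudformation package --template-file template.yaml --s3-bucket auto-aws-db-shutdown-deployment-artifacts --output-template-file outputTemplate.yaml

artifacts:
  files:
    # Expose the packaged template to the next stage of the pipeline
    - outputTemplate.yaml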



Setting up the pipeline


Now that we've prepared our build step, we're ready to create the actual pipeline. We'll do this using the AWS Management Console.


Create pipeline



Go to the CodePipeline service and click the ‘Create pipeline’ button.



Fill out the pipeline name and click ‘Next’.



Source stage


In this stage we need to specify the source provider. In our case this is GitHub, but you can also get your sources from other providers like S3, CodeCommit, …



Select the repository and branch and click ‘Next’.



Build stage


In the build stage, we set up AWS CodeBuild as our build provider.



When you click ‘Create project’, a new window opens. Here we’ll use the wizard to create a new CodeBuild project.



Fill out the project name and specify the environment settings.



  • Operating system : Ubuntu

  • Runtime : Standard

  • Runtime version : aws/codebuild/standard:2.0

  • Image version : Always use the latest image for this runtime version

Select ‘Use a buildspec file’ and click ‘Next’.



In the CodePipeline window, you will see that the CodeBuild project has successfully been created.

Now let’s step back for a second. In our buildspec file we defined a command that packages our template and pushes the resulting artifacts to the S3 bucket we specified. As our ‘auto-aws-db-shutdown-deployment-artifacts’ bucket is not public, the CodeBuild service will never be able to upload these artifacts unless we assign its role the proper permissions to do so.

Open the IAM service and go to ‘Roles’.



Select the role that was created during the setup of the CodeBuild project.



Click ‘Attach policies’ and in the next screen ‘Create policy’.

Switch to the JSON tab and create a policy that only gives access to the S3 bucket where the deployment artifacts of our Lambda functions are stored.
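A minimal sketch of such a policy could look like this; the policy file name and the exact set of S3 actions are choices you can adapt to your own setup:

codebuild-s3-policy.json

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowDeploymentArtifactAccess",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:GetObjectVersion"
            ],
            "Resource": "arn:aws:s3:::auto-aws-db-shutdown-deployment-artifacts/*"
        }
    ]
}

s3:PutObject is the important one here, as the package command needs to upload the packaged code artifacts to the bucket.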



Specify a name for the policy and click ‘Create policy’.



Back in the screen of the CodeBuild role, search for our newly created policy and tick the checkbox in front of it.



Click ‘Attach policy’ to assign the policy to the CodeBuild role.



Now that the CodeBuild role has all proper permissions, we can go to the deployment step.



Deploy stage


For this stage we will use CloudFormation to build the entire infrastructure defined in our AWS SAM template. In order to do that, we need to create a role for it that has the proper permissions.

Go to the ‘Roles’ page of the IAM service and click ‘Create role’.



Select the CloudFormation service and click ‘Next: Permissions’.



Click ‘Create policy’ and switch to the JSON tab.



We need to make sure that our CloudFormation role has all the access rights needed to fetch our build artifacts and create the entire infrastructure. The allowed actions are therefore specific to the infrastructure we want to create (CloudWatch event triggers, Lambda functions, SNS dead letter queue, …).


cfn-policy.json

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "events:EnableRule",
                "events:PutRule",
                "iam:CreateRole",
                "iam:AttachRolePolicy",
                "iam:PutRolePolicy",
                "cloudformation:CreateChangeSet",
                "iam:PassRole",
                "iam:DetachRolePolicy",
                "events:ListRuleNamesByTarget",
                "iam:DeleteRolePolicy",
                "events:ListRules",
                "events:RemoveTargets",
                "events:ListTargetsByRule",
                "events:DisableRule",
                "sns:*",
                "events:PutEvents",
                "iam:GetRole",
                "events:DescribeRule",
                "iam:DeleteRole",
                "s3:GetBucketVersioning",
                "events:TestEventPattern",
                "events:PutPermission",
                "events:DescribeEventBus",
                "events:TagResource",
                "events:PutTargets",
                "events:DeleteRule",
                "s3:GetObject",
                "lambda:*",
                "events:ListTagsForResource",
                "events:RemovePermission",
                "iam:GetRolePolicy",
                "s3:GetObjectVersion",
                "events:UntagResource"
            ],
            "Resource": "*"
        }
    ]
}

Click ‘Review Policy’ to get a summary of all allowed actions.



Click ‘Create Policy’.

Back in the screen of the CloudFormation role, search for our newly created policy and tick the checkbox in front of it.



Click ‘Next: Tags’ and on the next screen ‘Next: Review’.



Specify a name for the role and click ‘Create Role’.

Now let’s switch back to our CodePipeline to configure the last stage of our pipeline.



  • Deploy provider : AWS CloudFormation

  • Action mode : Create or replace a change set

  • Stack name : auto-db-shutdown-stack

  • Change set name : auto-db-shutdown-changeset

  • Template : BuildArtifact => outputTemplate.yaml

  • Capabilities : CAPABILITY_IAM

  • Role name : The newly created role for CloudFormation

Click ‘Next’, review all pipeline settings and click ‘Create Pipeline’.

You probably noticed that the default wizard only lets us create one deployment action. We have configured this one to create or replace a changeset. Because this changeset needs to be executed as well, we will need to add another deployment action for this.

To keep things clear, let’s first rename the existing deployment action.

Click on ‘Edit’ at the top of the pipeline.



Click the ‘Edit stage’ button in the Deploy stage and then click on the pencil icon of the Deploy action.



Change the action name to ‘Create-Changeset’ and click ‘Done’.

Now we will add the extra deployment action to execute our changeset.

Click ‘+ Add action group’ below the ‘Create-Changeset’ action.



  • Action name : Execute-Changeset

  • Action provider : AWS CloudFormation

  • Input artifacts : BuildArtifact

  • Action mode : Execute a change set

  • Stack name : auto-db-shutdown-stack

  • Change set name : auto-db-shutdown-changeset

Click ‘Done’.



Our pipeline is now complete. Click ‘Save’ and wait for the first pipeline run to finish. If the pipeline has already run automatically and failed because the configuration wasn’t complete yet, just hit ‘Release change’ to rerun it.



See if it works


Let’s have a look at the current state of our Lambda functions.



We can see that our shut-down function is scheduled to run at 5 pm UTC on weekdays.



And our start-up function is scheduled to run at 5 am UTC on weekdays.



Now modify the event schedule in the AWS SAM template to stop and start the database an hour later. Afterwards, commit and push the code. This will automatically trigger CodePipeline.
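For the shut-down function, that change could look roughly like the excerpt below. The event name and surrounding structure are assumptions; adapt them to how the schedule is defined in your own template.yaml.

template.yaml (excerpt)

      Events:
        ShutdownSchedule:            # hypothetical event name
          Type: Schedule
          Properties:
            # Weekdays at 6 pm UTC instead of 5 pm UTC
            Schedule: cron(0 18 ? * MON-FRI *)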



Wait for the pipeline to execute successfully.



When we check back on our Lambda functions, the shut-down function now runs at 6 pm UTC.



And the start-up function now runs at 6 am UTC.



Conclusion


We have set up an automated CI/CD pipeline that updates our Lambda infrastructure on every code change, so you no longer need to package and deploy manually.

Moreover, if your colleagues want to make a change (e.g., modify the database downtime schedule), they no longer need to set up their local machine (install and configure the AWS CLI, AWS SAM, …) in order to do so.

The updated code of this project can be found on GitHub.

Thanks for reading!
