This post is a follow-up to my authenticated API Gateway post.
I've written a Lambda function and exposed it using API Gateway. The API uses Amazon's IAM keys for authentication. The next step is to commit the code to source control (Git on AWS CodeCommit) and have it automatically deploy.
Note: This official guide on AWS is really good, and I copied big parts of it. I use Python instead of JavaScript in this post, and include my own experience from following the guide.
Open the S3 Console and create a bucket to store your build artifacts.
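If you prefer the CLI, creating the bucket is a one-liner (the bucket name below is just a placeholder; pick your own globally unique name):
aws s3 mb s3://my-lambda-cicd-artifacts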
I'm using a dedicated repository on AWS CodeCommit to store the code, and Python3 for the language. Here's a very simple Lambda function I use for testing:
# cicd-test.py
import json


def lambda_handler(event, context):
    """ run a hello function """
    body = 'NO HTTP METHOD'
    if 'httpMethod' in event and event['httpMethod'] == 'POST':
        request_body = event['body']
        request_user = event['requestContext']['identity']['userArn']
        body = f'{request_user} sent a POST with body: {request_body}'
    elif 'httpMethod' in event and event['httpMethod'] == 'GET':
        request_user = event['requestContext']['identity']['userArn']
        body = f'{request_user} sent a GET'
    return {
        'statusCode': 200,
        'body': json.dumps(body)
    }
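If you want a quick local sanity check of the handler before wiring up any AWS pieces, you can feed it a minimal fake event. This is just a sketch: the userArn is made up, and only the fields the handler actually reads are included. Run it from the project root so cicd-test.py is importable.
python3 - <<'EOF'
import importlib

# cicd-test.py must be in the current directory
handler = importlib.import_module('cicd-test').lambda_handler
event = {
    'httpMethod': 'GET',
    'requestContext': {'identity': {'userArn': 'arn:aws:iam::123456789012:user/example'}},
}
print(handler(event, None))
EOF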
This was my first time using AWS SAM, so consider these my observations and not necessarily fact.
SAM stands for Serverless Application Model.
The fields I've changed from Amazon's example for this post are inside the AWS::Serverless::Function resource:
# template.yml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Sample continuously deployed function
Resources:
  TestCicdFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: cicd-test.lambda_handler
      Runtime: python3.8
      CodeUri: ./cicd-test.py
      Events:
        TestCicdAPI:
          Type: Api
          Properties:
            Path: /Test
            Method: GET
Install the AWS SAM CLI application.
pip install --user aws-sam-cli
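With the CLI installed, you can optionally sanity-check the template before packaging; this catches basic template mistakes early:
sam validate --template template.yml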
Build the package and upload it to your S3 bucket. From your project root, set your bucket name, then run the following.
# Set the bucket name first
BUCKET=''
aws cloudformation package \
--template-file template.yml \
--s3-bucket $BUCKET \
--output-template-file outputtemplate.yml
The package command will upload your Python file to S3 and output a new template file with the S3 path replacing the CodeUri.
Uploading to <ID> 381 / 381.0 (100.00%)
Successfully packaged artifacts and wrote output template to file outputtemplate.yml.
Execute the following command to deploy the packaged template
aws cloudformation deploy --template-file /home/vagrant/arcus-lambda/outputtemplate.yml --stack-name <YOUR STACK NAME>
Now deploy it. Since there's no CloudFormation stack created yet, I make a new one named test-cicd-lambda. Note that the example deploy command in the output above is incomplete: you need to include the --capabilities CAPABILITY_IAM argument for this template too.
aws cloudformation deploy \
--capabilities CAPABILITY_IAM \
--template-file /home/vagrant/arcus-lambda/outputtemplate.yml \
--stack-name test-cicd-lambda
The application is now deployed.
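If you want to confirm what the stack actually created (the generated function and API names show up as the physical IDs), you can list its resources:
aws cloudformation describe-stack-resources \
    --stack-name test-cicd-lambda \
    --query 'StackResources[].[LogicalResourceId,PhysicalResourceId]' \
    --output table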
I recently did a whole post on this here, so I'm going to be light on the details. Head over to that post to fill in any blanks. In that post I was building Docker images and pushing them to ECR; this is similar, except we package Python files for Lambda instead.
Put this in the project root; it defines what CodeBuild will do. We'll use an environment variable for the BUCKET definition.
If you're not sure what to pick for a runtime, check this list.
# buildspec.yml
version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.8
  build:
    commands:
      - aws cloudformation package --template-file template.yml --s3-bucket $BUCKET --output-template-file outputtemplate.yml
artifacts:
  type: zip
  files:
    - template.yml
    - outputtemplate.yml
Commit the file to source control.
I use the role I defined for CodeBuild in my previous post. It's a bit overly permissive but it works. Other than the policies which get created automatically, mine has these:
AWSCodeCommitFullAccess
AmazonEC2ContainerRegistryFullAccess
AmazonS3FullAccess
CloudWatchLogsFullAccess
AWSCodeBuildAdminAccess
Test the build before moving on and make sure it works.
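One way to kick off a test build from the CLI and supply the bucket at the same time is shown below; the project and bucket names are placeholders, and you can instead just set BUCKET in the project's environment variables in the console.
aws codebuild start-build \
    --project-name my-lambda-cicd-build \
    --environment-variables-override name=BUCKET,value=my-lambda-cicd-artifacts,type=PLAINTEXT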
Open the IAM console and create a new policy. Define the policy using the following JSON. The wildcards cover everything you need, though the result is a bit permissive.
{
"Statement": [
{
"Action": [
"apigateway:*",
"codedeploy:*",
"lambda:*",
"cloudformation:CreateChangeSet",
"iam:GetRole",
"iam:CreateRole",
"iam:DeleteRole",
"iam:PutRolePolicy",
"iam:AttachRolePolicy",
"iam:DeleteRolePolicy",
"iam:DetachRolePolicy",
"iam:PassRole",
"s3:GetObject",
"s3:GetObjectVersion",
"s3:GetBucketVersioning"
],
"Resource": "*",
"Effect": "Allow"
}
],
"Version": "2012-10-17"
}
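If you prefer the CLI here too, save the JSON above to a file and create the policy from it (the policy and file names below are arbitrary):
aws iam create-policy \
    --policy-name cicd-lambda-deploy \
    --policy-document file://cicd-deploy-policy.json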
In the role wizard of the IAM page, choose CloudFormation as the service that will use the role, then attach the AWSLambdaExecute managed policy along with the policy you just created. A CLI sketch follows below.
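The CLI equivalent looks roughly like this, reusing the policy name from the previous step; the role name and account ID are placeholders, and the trust policy is what lets CloudFormation assume the role.
aws iam create-role \
    --role-name cicd-cloudformation-deploy \
    --assume-role-policy-document '{
      "Version": "2012-10-17",
      "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "cloudformation.amazonaws.com"},
        "Action": "sts:AssumeRole"
      }]
    }'

aws iam attach-role-policy \
    --role-name cicd-cloudformation-deploy \
    --policy-arn arn:aws:iam::aws:policy/AWSLambdaExecute

aws iam attach-role-policy \
    --role-name cicd-cloudformation-deploy \
    --policy-arn arn:aws:iam::<ACCOUNT_ID>:policy/cicd-lambda-deploy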
When code is committed, you want CodeBuild to run and re-deploy your serverless application. The key settings in the deploy step: the input artifact is BuildArtifact, the template file is outputtemplate.yml, and the capabilities field needs CAPABILITY_IAM.
That should do it. Now whenever you push code to your master branch, the serverless application will re-deploy.