CI/CD with GitHub Actions in Serverless

As discussed in my previous post, setting up a Serverless project is very easy and fast. In this post I will discuss how we can achieve Continuous Integration and Continuous Deployment (aka CI/CD) for such projects using GitHub Actions.

This is the setup I use in my own Serverless projects: it is easy to set up, fully defined as code, and works reliably.

You can check how to get started with GitHub Actions here.

Overview

Stack

  • python 3.9
  • serverless V3
  • pylint 2.16.2
  • coverage 7.2.2
  • coverage-badge 1.1.0
  • anybadge 1.14.0
  • pytest 7.2.1

Code Structure

├── .github
│   ├── CODEOWNERS
│   └── workflows
│       └── deploy.yaml
├── images
│   ├── coverage.svg
│   └── pylint.svg
├── src
│   └── main.py
├── tests
│   ├── unit
│   │   └── test_main.py
│   └── e2e
│       └── test_e2e.py
├── requirements.txt
├── serverless.yml
├── serverless.doc.yml
└── README.md

The .github folder contains the GitHub workflow files, which drive the CI/CD setup.
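The CODEOWNERS file shown in the tree is optional but pairs well with the branch protection discussed later, as it forces reviews from the right people. A minimal sketch (the org and team names are placeholders):

```
# Request a review from this team for every change in the repo
* @my-org/my-team
```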

Before proceeding further, here’s the GitHub Actions workflow file.

name: deploy

on:
  push:
    branches:
      - main
    paths:
      - "src/**"
      - "tests/**"
      - "requirements.txt"
      - "serverless.*"
  pull_request:
    branches:
      - main
    paths:
      - "src/**"
      - "tests/**"
      - "requirements.txt"
      - "serverless.*"

permissions:
  id-token: write
  contents: write
  pull-requests: write

jobs:
  deploy:
    runs-on: ubuntu-latest

    strategy:
      matrix:
        python-version: [3.9]
        node-version: [19.7.0]

    steps:
      - name: Checkout Source
        uses: actions/checkout@v3
        with:
          token: ${{ github.token }}

      - name: Setup Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install Python Dependencies
        run: |
          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v3
        with:
          node-version: ${{ matrix.node-version }}

      - name: Install Serverless Framework
        run: sudo npm install -g serverless

      - name: Lint Code in Stage
        if: github.ref != 'refs/heads/main'
        run: |
          score=$(pylint src/* tests/* | sed -n 's/^Your code has been rated at \([-0-9.]*\)\/.*/\1/p')
          echo "PyLint score is: ${score}"
          if awk -v s="${score}" 'BEGIN { exit !(s < 8) }'; then echo "[ERROR] PyLint score is less than 8! Failing build..."; exit 1; fi
          anybadge -o -l pylint -v $score --file 'images/pylint.svg' 8=red 8.5=orange 9=yellow 10=green

      # AWS AuthN (GitHub OIDC)

      - name: Configure AWS Credentials in Stage
        if: github.ref != 'refs/heads/main'
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-region: eu-west-1
          role-to-assume: arn:aws:iam::${{ secrets.AWS_ACCOUNT_ID }}:role/${{ secrets.SLS_DEPLOY_ROLE_NAME }}Stage
          role-session-name: ${{ secrets.SLS_DEPLOY_ROLE_NAME }}Stage

      - name: Configure AWS Credentials in Prod
        if: github.ref == 'refs/heads/main'
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-region: eu-west-1
          role-to-assume: arn:aws:iam::${{ secrets.AWS_ACCOUNT_ID }}:role/${{ secrets.SLS_DEPLOY_ROLE_NAME }}Prod
          role-session-name: ${{ secrets.SLS_DEPLOY_ROLE_NAME }}Prod

      - name: Serverless AWS AuthN
        run: sls config credentials --provider aws --key ${{ env.AWS_ACCESS_KEY_ID }} --secret ${{ env.AWS_SECRET_ACCESS_KEY }}

      # Stage deployment

      - name: Run Unit Tests in Stage
        if: github.ref != 'refs/heads/main'
        run: |
          coverage run -m pytest tests/unit/* --color=yes --verbose
          coverage xml -i --skip-empty
          coverage-badge -f -o images/coverage.svg

      - name: Post Code Coverage in PR
        if: github.ref != 'refs/heads/main'
        uses: orgoro/coverage@v3
        with:
          coverageFile: coverage.xml
          token: ${{github.token}}
          thresholdAll: 0.80
          thresholdNew: 0.90

      - name: Install Serverless Plugins in Stage
        if: github.ref != 'refs/heads/main'
        run: |
          sls plugin install -n serverless-python-requirements --stage stage
          sls plugin install -n serverless-openapi-documenter --stage stage

      - name: Deploy in Stage
        if: github.ref != 'refs/heads/main'
        run: sls deploy --stage stage

      - name: Run E2E Tests in Stage
        if: github.ref != 'refs/heads/main'
        run: env=stage pytest tests/e2e/* --color=yes --verbose

      # OpenApi Spec

      - name: Detect Changes in API Spec in Stage
        if: github.ref != 'refs/heads/main'
        uses: dorny/paths-filter@v2
        id: filter
        with:
          filters: |
            apidoc:
              - 'serverless.doc.yml'

      - name: Generate OpenAPI Spec in Stage
        if: github.ref != 'refs/heads/main' && steps.filter.outputs.apidoc == 'true'
        run: serverless openapi generate -f yaml --stage stage

      # Update PR in Stage

      - name: Update PR in Stage
        uses: stefanzweifel/git-auto-commit-action@v4
        if: github.ref != 'refs/heads/main'
        with:
          file_pattern: 'images/* openapi.yml'
          commit_message: '[DEPLOYER] auto: Update API spec & lint/coverage scores'
          commit_user_name: GitHub Actions <[email protected]>

      # Prod deployment

      - name: Run Unit Tests in Prod
        if: github.ref == 'refs/heads/main'
        run: python -m pytest tests/unit/* --color=yes --verbose

      - name: Install Serverless Plugins in Prod
        if: github.ref == 'refs/heads/main'
        run: |
          sls plugin install -n serverless-python-requirements --stage prod
          sls plugin install -n serverless-openapi-documenter --stage prod

      - name: Deploy in Prod
        if: github.ref == 'refs/heads/main'
        run: |
          sls create_domain --stage prod
          sls deploy --stage prod

      - name: Run E2E Tests in Prod
        if: github.ref == 'refs/heads/main'
        run: env=prod pytest tests/e2e/* --color=yes --verbose

The steps are mostly self-explanatory; however, I would like to highlight a few things:

  1. Post Code Coverage
    • This step uses the orgoro/coverage GitHub Action to calculate the coverage of the Python project and post the report as a PR comment.
    • This is very useful: the committer and reviewers can immediately see whether the PR’s code changes alter the coverage percentage. Additionally, thresholds can be configured to fail the build when the required coverage is not met, ensuring quality.
  2. Generate OpenAPI Spec
    • For a RESTful Serverless project, the API spec is of great importance, and it can be maintained as code using the serverless-openapi-documenter plugin.
    • Here, the API spec file is regenerated whenever serverless.doc.yml is updated (the change is detected using the dorny/paths-filter action).
    • In the “Update PR in Stage“ step, the newly generated API spec is automatically committed to the PR (using the git-auto-commit action), along with the Pylint and coverage badges (referenced in the README file to indicate the project’s overall health).
  3. Authentication
    • AWS authentication uses GitHub’s OIDC provider, which is the recommended approach; alternatively, you can store AWS user access keys as GitHub secrets and use those.

The result is a fully automated CI/CD pipeline with automatic API spec updates and code coverage/lint badges pushed to the PR automatically.

Looking at the GitHub Action, there are steps dedicated to stage and prod, ensuring the required steps and tests run only in the relevant environment.

This setup works reliably, and with GitHub branch protection enabled and the build status check required, code can only be merged with a passing build, keeping buggy code out of the main branch and ensuring quality.

Terraform is used to provision the static infrastructure (secrets, IAM, etc.); however, that is not discussed in this post.

I will discuss setting up a Serverless Python REST service in detail in the next post, covering all aspects 🙂

Serverless with AWS APIGW and Lambda

Recently I was working on automating a web-hook workflow and ended up using the Serverless framework to implement the solution with AWS API Gateway and Lambda.

It was a breeze to set up, deploy, and manage the infra with Serverless, and the benefits it brings are worth mentioning: scalable, fully managed, and no overhead of managing the infrastructure!

The use case was as follows:

Set environment variables in a Terraform Cloud workspace automatically (using notification web-hooks). This required an API backend to authenticate the web-hook request, process it, fetch variables from AWS Secrets Manager, and invoke the Terraform REST API to update these secrets.

Below is the architecture that was implemented:

Similar use cases involving web-hooks are very common, and a serverless setup makes something like the above really simple to achieve.

One challenge I faced was with authorization of the web-hook request. AWS API Gateway offers Lambda Authorizers, but they can only validate AuthN tokens taken from the request headers. Terraform web-hooks, and Git web-hooks for that matter, authenticate differently: they send a keyed-hash message authentication code (HMAC) signature of the request body. The body is hashed using a shared key, and the resulting signature is passed as a request header by the web-hook. The consumer hashes the request body with the same key and compares the two signatures to determine the authenticity of the request. This requires access to the request body, which is not available to Lambda Authorizers. The only option left is to include the HMAC signature comparison as part of the main Lambda function.
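The signature comparison itself is only a few lines of Python inside the main Lambda function. A minimal sketch, assuming HMAC-SHA512 (which is what Terraform Cloud notifications use) and a function name of my own choosing:

```python
import hashlib
import hmac


def verify_signature(secret: str, body: bytes, received_signature: str) -> bool:
    """Recompute the HMAC of the raw request body with the shared key
    and compare it against the signature sent in the web-hook header."""
    expected = hmac.new(secret.encode(), body, hashlib.sha512).hexdigest()
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(expected, received_signature)
```

Inside the handler, `received_signature` would come from the request header (for Terraform Cloud, the notification signature header) of the API Gateway event, and `body` is the raw request body before any JSON parsing.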

Other routine settings like API Gateway request validation and API throttling can be set in the serverless YAML itself. To implement the above, the serverless.yml is as below:

service: set-terraform-secrets

frameworkVersion: '2'

package:
  patterns:
    - 'node_modules/**'

plugins:
  - serverless-python-requirements
  - serverless-api-gateway-throttling
  
custom:
  pythonRequirements:
    dockerizePip: non-linux
    slim: true
    
  apiGatewayThrottling:
    maxRequestsPerSecond: 10
    maxConcurrentRequests: 5
  
provider:
  name: aws
  runtime: python3.8
  stage: ${sls:stage}
  region: eu-west-1
  lambdaHashingVersion: '20201221'
  iamRoleStatements:
    - Effect: Allow
      Action:
        - secretsmanager:Get*
        - secretsmanager:List*
      Resource: "arn:aws:secretsmanager:*:*:secret:prod/secrets/*"

functions:
  lambda:
    handler: main.set_tfe_secrets # Set based on your Lambda function handler name
    description: Lambda to set secret variables for Terraform workspaces
    events:
      - http:
          path: /secrets
          method: post
          throttling:
            maxRequestsPerSecond: 100
            maxConcurrentRequests: 50
          request:
            parameters:
              headers:
                X-Notification-Signature: true

Note the X-Notification-Signature header validation being done, so that only requests containing this header are accepted. This header is specific to the Terraform Cloud web-hook.

The serverless-python-requirements plugin installs the Python dependencies from the requirements.txt file provided in the same source-code location, while serverless-api-gateway-throttling applies the API rate-limiting settings.

The IAM role is required by the Lambda function to fetch secrets and is specific to my use case; the idea is that all IAM permissions the function needs can be specified right here.

Note – Here, the secrets are created manually and not with serverless as they are static resources.
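For context, pushing a variable to Terraform Cloud boils down to sending a small JSON document to the workspace variables endpoint of its v2 API. A hypothetical helper the Lambda function might use to build that body (the function name is mine, and the attribute choices are assumptions based on the v2 API):

```python
def build_var_payload(key: str, value: str, sensitive: bool = True) -> dict:
    """Build the JSON body for creating or updating a Terraform Cloud
    workspace variable via the v2 API."""
    return {
        "data": {
            "type": "vars",
            "attributes": {
                "key": key,
                "value": value,
                "category": "env",       # environment variable, not a Terraform input
                "sensitive": sensitive,  # mask the value in the Terraform Cloud UI
            },
        }
    }
```

The handler would POST (or PATCH, for an existing variable) this payload to the workspace's `/vars` endpoint with the API token in the Authorization header.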

Once this is in place, all that is required to deploy the solution is to run:

sls deploy --stage prod
