JSON Logging in CloudWatch with AWS Lambda Powertools


Introduction

In this post we will look at how to implement JSON logging in CloudWatch using AWS Lambda Powertools in Python. Logging in JSON will allow you to search through your logs using filter patterns like the ones described in the AWS documentation.
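For instance, once your entries are structured JSON, a CloudWatch Logs filter pattern like the following (field names match the sample output below; the service name is illustrative) matches every INFO entry for a given service:

```
{ $.level = "INFO" && $.service = "my-service" }
```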

Using the Python module aws-lambda-powertools, your logs will appear like this:

{
    "level": "INFO",
    "location": "lambda_handler:21",
    "message": {
        "function_name": "test_api_backend",
        "aws_request_id": "0123456789-1234-1234-1234-0123456789",
        "event": {
            "resource": "/test-api-method",
            "path": "/test-api-method",
            "httpMethod": "POST",
            "headers": null,
            "multiValueHeaders": null,
            "queryStringParameters": null,
            "multiValueQueryStringParameters": null,
            "pathParameters": null,
            "stageVariables": null,
            "requestContext": {
                "resourceId": "0123456789resourceId",
                "resourcePath": "/test-api-method",
                "httpMethod": "POST",
                "extendedRequestId": "0123456789extendedRequestId",
                "requestTime": "19/Nov/2020:14:18:29 +0000",
                "path": "/test-api-method",
                "accountId": "0123456789accountId",
                "protocol": "HTTP/1.1",
                "stage": "test-invoke-stage",
                "domainPrefix": "testPrefix",
                "requestTimeEpoch": 1605795509026,
                "requestId": "0123456789-1234-1234-1234-0123456789",
                "identity": {
                    "cognitoIdentityPoolId": null,
                    "cognitoIdentityId": null,
                    "apiKey": "test-invoke-api-key",
                    "principalOrgId": null,
                    "cognitoAuthenticationType": null,
                    "userArn": "userARN",
                    "apiKeyId": "test-invoke-api-key-id",
                    "userAgent": "aws-internal/3 aws-sdk-java/1.11.864 Linux/4.9.217-0.3.ac.206.84.332.metal1.x86_64 OpenJDK_64-Bit_Server_VM/25.262-b10 java/1.8.0_262 vendor/Oracle_Corporation",
                    "accountId": "0123456789accountId",
                    "caller": "mycaller",
                    "sourceIp": "test-invoke-source-ip",
                    "accessKey": "0123456789accessKey",
                    "cognitoAuthenticationProvider": null,
                    "user": "myuser"
                },
                "domainName": "testPrefix.testDomainName",
                "apiId": "0123456789apiId"
            },
            "body": "[\n  {\n    \"key\": \"0123456789\",\n    \"key2\": \"0123456789\"\n  },\n  {\n    \"key\": \"1234\",\n    \"key2\": \"1234\"\n  }\n]",
            "isBase64Encoded": false
        }
    },
    "timestamp": "2020-11-19 14:18:29,084",
    "service": "my-service",
    "sampling_rate": 0,
    "xray_trace_id": "0123456789xray_trace_id"
}
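In the function itself you typically create a `Logger` and decorate the handler with `@logger.inject_lambda_context`. Under the hood, Powertools swaps Python's default log formatter for one that serialises each record as JSON. Here is a minimal, stdlib-only sketch of that idea (field names mirror the sample output above; the service name is illustrative):

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Serialise each log record as a single JSON object, Powertools-style."""

    def __init__(self, service):
        super().__init__()
        self.service = service

    def format(self, record):
        return json.dumps({
            "level": record.levelname,
            "location": f"{record.funcName}:{record.lineno}",
            "message": record.msg,  # dicts are emitted as nested JSON
            "timestamp": self.formatTime(record),
            "service": self.service,
        })


# Wire the formatter into a standard logger
logger = logging.getLogger("my-service")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter(service="my-service"))
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info({"function_name": "test_api_backend"})
```

The real Powertools `Logger` adds more context for you (X-Ray trace id, sampling rate, and the Lambda context via `inject_lambda_context`); with Powertools you would simply write `logger = Logger(service="my-service")` and log dictionaries directly.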

One way of doing it…

Using SAM, you can easily build and package your Lambda functions together with the modules they need. Use the following directory structure:

- handlers
 |_ my_function
   |_ __init__.py
   |_ app.py
   |_ requirements.txt

Your Lambda logic lives in app.py, __init__.py can be empty, and requirements.txt lists the modules your Lambda needs. If you only want to import Powertools, its contents would simply be:

aws_lambda_powertools

Then, using sam-cli, you can build, package and deploy your Lambdas, and all the modules listed in requirements.txt will be available to your function. For more information on sam-cli, refer to the AWS SAM CLI docs.

Another way of doing it…

The above method works just fine. In fact, it’s what you’ll be using most of the time. Just bear in mind that if the modules you import are large, your deployments will start taking slightly longer, and if you have lots of Lambdas this can add up and become an issue. To avoid this, remember that each Lambda should do one thing and do it well; don’t try to pack an entire application into a single Lambda, even if you are within the time limits Lambda allows.

Powertools is a large module, though, and even imported on its own it can slow down your deployments. In this particular case, you can instead use AWS’s Serverless Application Repository (SAR), where Powertools is published as an application that makes it available as a Layer for your Lambdas. This is explained briefly in the AWS Labs docs.

Including this resource in your template:

AwsLambdaPowertoolsPythonLayer:
  Type: AWS::Serverless::Application
  Properties:
    Location:
      ApplicationId: arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer
      SemanticVersion: 1.7.0

will create a nested AWS stack for you. In each of your Lambdas, simply reference the Layer using the output from the nested stack:

Layers:
  - !GetAtt AwsLambdaPowertoolsPythonLayer.Outputs.LayerVersionArn
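In context, a function resource using the Layer might look like this (the function name, handler path and runtime are illustrative):

```yaml
MyFunction:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: handlers/my_function/
    Handler: app.lambda_handler
    Runtime: python3.9
    Layers:
      - !GetAtt AwsLambdaPowertoolsPythonLayer.Outputs.LayerVersionArn
```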

Something that can be confusing in the docs is that the ApplicationId property in the SAR resource above stays the same even if your service or application needs to be deployed in multiple AWS regions. Don’t try to change the region using something like:

...
  ApplicationId: !Sub arn:aws:serverlessrepo:${AWS::Region}:057560766410:applications/aws-lambda-powertools-python-layer
...

You will get weird permissions issues. That’s because this SAR application is only published in eu-west-1 (Ireland).

Once the nested stack is created, your application should benefit from faster deployments, as it no longer needs to package Powertools for your Lambdas.

Happy serverless deployments :)

Vasileios Vlachos
Cloud Engineer

I am a value-driven engineer, helping my clients maximise the ROI from their cloud deployments.
