Containerizing Lambda Deployments with OCI Container Images


This article is contributed by Chanci Turner, Senior Software Architect at Tech Innovations.

Developers who wanted to run their code on AWS in a serverless manner previously faced a choice between two distinct runtime models, each with its own packaging and deployment methods: run functions as a service in AWS Lambda, with its Lambda-specific packaging approach, or use container-based workflows in AWS Fargate.

The recent introduction of Lambda support for OCI container images has expanded packaging choices for customers. Developers can now leverage the event-driven runtime model and cost benefits provided by AWS Lambda while enjoying the predictability and control of a container-based development and deployment workflow.

Lambda functions created with containers share a similar architecture with traditional Lambda functions. The primary distinction is that the Lambda process runs inside a container pulled from an OCI container image hosted in Amazon ECR.

Why Opt for Containers in Lambda?

There are several compelling reasons for developers to opt for containers instead of the earlier Lambda packaging and deployment tools. Lambda functions built with containers provide much finer control over runtimes and packages, which is particularly valuable for dependencies that are difficult or impossible to bundle into a Lambda layer. This approach also simplifies things for developers working with packages that are hard to build from a non-Linux development environment.

Utilizing an OCI container image allows developers to create a comprehensive suite of test cases against a Lambda container image, which can be executed as part of the build pipeline. These test cases can assess not only the function code but also the environment setup—something that can be quite tricky to achieve with Lambda layers.
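As a rough sketch of what that can look like: if the image is built on an AWS base image (which bundles the Lambda Runtime Interface Emulator), the container can be started locally with docker run -p 9000:8080 <image> and exercised with ordinary test code. The port mapping, payload, and assertions below are illustrative, not prescriptive.

import requests

# Local endpoint exposed by the Runtime Interface Emulator when the image is
# run with: docker run -p 9000:8080 <image>
INVOKE_URL = 'http://localhost:9000/2015-03-31/functions/function/invocations'

def test_function_invocation():
    payload = {'ping': 'pong'}
    response = requests.post(INVOKE_URL, json=payload, timeout=10)

    # A 200 response with a JSON body tells us both the function code and the
    # container environment (dependencies, entry point) are wired up correctly.
    assert response.status_code == 200
    assert response.json()

A test like this can run in the same CI job that builds and pushes the image, gating deployment on the container actually being invokable.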

For the development teams I collaborate with, we prefer serverless technologies like Lambda and Fargate over traditional EC2 instances due to their cost-effectiveness and enhanced security. Our guidance has always been to deploy event-triggered workloads, such as application integration APIs or data analytics jobs, on Lambda while reserving Fargate for long-running tasks like stateful web servers. When faced with a workload suitable for either model, the recommendation at Tech Innovations is to opt for Lambda to capitalize on its scaling and cost advantages.

This shift has, however, created a disconnect in the development processes for the two technologies. By transitioning to container-based Lambda, we hope to align most of our development efforts towards containers while retaining the flexibility to choose between Lambda and Fargate for deployment.

Building Your First Lambda Function Container

To construct a Lambda-compatible container image, AWS provides several pre-configured base images and a runtime interface client for popular runtimes. Most production use cases for Lambda function containers should take advantage of these resources.
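As a point of reference, a Dockerfile built on one of the AWS-provided base images can be as short as the sketch below; the Python version tag and handler name are illustrative choices rather than requirements.

FROM public.ecr.aws/lambda/python:3.12

# Copy the handler code into the task root defined by the base image.
COPY app.py ${LAMBDA_TASK_ROOT}

# Tell the bundled runtime interface client which handler to invoke.
CMD ["app.lambda_handler"]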

Nevertheless, creating an image from scratch is quite straightforward. The container image must include, at minimum, the function code and a bootstrap executable that connects to the Lambda event loop. Below is a simple example to illustrate how to build your own container-based Lambda functions.

Creating the Function Image

For this illustration, we will create the following files:

├── /content
│   ├── app.py
│   ├── bootstrap.py
│   └── requirements.txt
└── Dockerfile

These files will be bundled into a container image, pushed to Amazon ECR, and run in a sample Lambda function; the build and deployment commands are sketched at the end of this article.

Bootstrap Code

The first file we need is bootstrap.py, which acts as the runtime entry point for our function: it establishes an event loop that polls the Lambda Runtime API for invocations and forwards each event to our application code.

import os
import sys
import traceback

import requests


def run_loop():
    # Lambda injects the Runtime API endpoint into the environment for custom runtimes.
    aws_lambda_runtime_api = os.environ['AWS_LAMBDA_RUNTIME_API']
    import app

    while True:
        request_id = None
        try:
            # Long-poll the Runtime API for the next invocation.
            invocation_response = requests.get(f'http://{aws_lambda_runtime_api}/2018-06-01/runtime/invocation/next')

            # Invocation metadata arrives in the response headers.
            request_id = invocation_response.headers['Lambda-Runtime-Aws-Request-Id']
            invoked_function_arn = invocation_response.headers['Lambda-Runtime-Invoked-Function-Arn']
            trace_id = invocation_response.headers['Lambda-Runtime-Trace-Id']
            os.environ['_X_AMZN_TRACE_ID'] = trace_id

            context = {
                'request_id': request_id,
                'invoked_function_arn': invoked_function_arn,
                'trace_id': trace_id
            }

            # The response body is the event payload that triggered this invocation.
            event = invocation_response.json()

            response_url = f'http://{aws_lambda_runtime_api}/2018-06-01/runtime/invocation/{request_id}/response'

            # Hand the event to the application code.
            result = app.lambda_handler(event, context)

            # Post the handler's result back to the Runtime API.
            sys.stdout.flush()
            requests.post(response_url, json=result)

        except Exception:
            # Report the failure to the Runtime API so Lambda records the invocation error.
            if request_id is not None:
                try:
                    exc_type, exc_value, exc_traceback = sys.exc_info()
                    exception_message = {
                        'errorType': exc_type.__name__,
                        'errorMessage': str(exc_value),
                        'stackTrace': traceback.format_exception(exc_type, exc_value, exc_traceback)
                    }

                    error_url = f'http://{aws_lambda_runtime_api}/2018-06-01/runtime/invocation/{request_id}/error'
                    sys.stdout.flush()

                    requests.post(error_url, json=exception_message)
                except Exception:
                    # Never let error reporting break the event loop.
                    pass


run_loop()

In this code, the execution loop retrieves an event from the Lambda Runtime API, passes it to the function code, and posts the result (or any error) back to the Runtime API.

Application Code

Now that we have a bootstrap function, we need to create the application code that will reside in app.py. For the sake of this example, it will simply respond with an echo of the triggering event.

def lambda_handler(event, context):
    # Echo the triggering event back to the caller.
    return {
        'statusCode': 200,
        'body': 'Hello from Lambda Containers',
        'event': event
    }

This code should be familiar to anyone accustomed to building Lambda functions in Python, and it is easy to see how additional helper functions could turn this simple example into something ready for production deployment. Unlike zip- and layer-based Lambda functions, we will not specify the handler module and function in the Lambda function definition; instead, the command or entry point in the container definition file controls which application code runs, as the Dockerfile later in this article shows.

Dependencies File

The next essential file for this example is our requirements.txt. In our case, this file will ensure that all necessary dependencies are installed during the container build process.
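For this walkthrough the only third-party dependency is the requests library used by bootstrap.py, so requirements.txt can be a single line (pin a version if your build policy calls for it):

requests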

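Container Definition File

The last file is the Dockerfile that assembles everything above into an image. The sketch below assumes a public python:3.12-slim base image and the conventional /var/task working directory; both are illustrative choices, and any image that provides a Python interpreter for bootstrap.py will work.

FROM python:3.12-slim

# Copy the bootstrap, application code, and dependency manifest into the image.
COPY content/ /var/task/
WORKDIR /var/task

# Install third-party dependencies (here, just requests) at build time.
RUN pip install --no-cache-dir -r requirements.txt

# The entry point decides which code runs when Lambda starts the container.
ENTRYPOINT ["python", "bootstrap.py"]

Building and Deploying the Image

With the files in place, the image can be built, pushed to Amazon ECR, and attached to a new Lambda function. The commands below are a sketch: the repository and function names are arbitrary, and <account-id>, <region>, and <lambda-execution-role> are placeholders you would replace with your own values.

aws ecr create-repository --repository-name lambda-container-demo

docker build -t lambda-container-demo .

aws ecr get-login-password --region <region> | \
    docker login --username AWS --password-stdin <account-id>.dkr.ecr.<region>.amazonaws.com

docker tag lambda-container-demo:latest <account-id>.dkr.ecr.<region>.amazonaws.com/lambda-container-demo:latest
docker push <account-id>.dkr.ecr.<region>.amazonaws.com/lambda-container-demo:latest

aws lambda create-function \
    --function-name lambda-container-demo \
    --package-type Image \
    --code ImageUri=<account-id>.dkr.ecr.<region>.amazonaws.com/lambda-container-demo:latest \
    --role arn:aws:iam::<account-id>:role/<lambda-execution-role>

Once created, the function behaves like any other Lambda function: it can be invoked from the console, the CLI, or an event source, and it responds with the echo payload defined in app.py.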

Chanci Turner