Containerizing Lambda Deployments with OCI Container Images

This article is brought to you by the AWS Editorial Team, featuring insights from Chanci Turner, a Senior Software Architect at Tech Solutions.

For developers aiming to run their code in a serverless environment on AWS, the choice has traditionally come down to two distinct runtime models, each with its own packaging and deployment methods: AWS Lambda for functions as a service, packaged and deployed with Lambda-specific tooling, or container-based workflows on AWS Fargate.

The recent addition of Lambda support for OCI container images gives customers a new packaging option. Developers can now combine the event-driven capabilities and cost efficiency of AWS Lambda with the reliability and control of a containerized development and deployment cycle.

Lambda functions built from containers share largely the same architecture as other Lambda functions. The primary distinction is that the Lambda process runs inside a container created from an OCI container image stored in Amazon ECR.

Why Opt for Containers in Lambda?

There are several reasons a developer might prefer containers over the traditional Lambda packaging and deployment tools. Building Lambda functions from containers gives significantly more precise control over runtimes and packages, which becomes crucial when dealing with packages that are difficult or impossible to bundle into a Lambda layer. It also simplifies packaging for developers on non-Linux machines, where building Linux-compatible dependencies can be a struggle.

Utilizing an OCI container image enables developers to establish a comprehensive suite of test cases against a Lambda container image, which can be integrated into the build pipeline. Besides testing the function code, these test cases can also validate the environment setup—something not easily achievable with Lambda layers.
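
For example, a build pipeline might start the image locally and invoke it over HTTP. The sketch below is illustrative only: it assumes the image bundles the Lambda Runtime Interface Emulator (included in the AWS-provided base images) and was started with docker run -p 9000:8080 my-lambda-image.

import requests

def test_lambda_container_invocation():
    # Assumes the image is running locally, for example:
    #   docker run -p 9000:8080 my-lambda-image
    # and that it bundles the Lambda Runtime Interface Emulator, which
    # exposes this local invocation endpoint.
    url = 'http://localhost:9000/2015-03-31/functions/function/invocations'
    event = {'ping': 'pong'}

    response = requests.post(url, json=event, timeout=10)

    # A successful invocation returns HTTP 200 with the handler's result;
    # an unhandled exception surfaces as an errorType field in the body.
    assert response.status_code == 200
    assert 'errorType' not in response.json()

Because a test like this exercises the full container, a missing system library or misconfigured runtime surfaces in the pipeline rather than after deployment.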

In my experience working with development teams, we favor serverless technologies like Lambda and Fargate over EC2 instances due to the associated cost and security advantages. Our standard recommendation has been to run event-driven workloads—such as application integration APIs, on-demand data analytics, or event-triggered data transformations—on Lambda, reserving Fargate for long-running tasks like hosted stateful web servers. In scenarios where workloads could be executed using either model (for example, a stateless website), we recommend Lambda to capitalize on its scaling and cost benefits.

However, this has created a gap between the development processes for these two technologies. By transitioning to container-based Lambda, we aim to consolidate our development efforts around containers while retaining the flexibility to choose between Lambda and Fargate for hosting.

Building Your First Lambda Function Container

To develop a Lambda-compatible container image, AWS offers various pre-configured base images, along with a runtime interface client for popular runtimes. Most production scenarios involving Lambda function containers should leverage these tools.
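
As an illustration, a container definition built on one of the AWS-provided base images can be quite short. The sketch below is one possible layout, assuming a Python handler named lambda_handler in app.py; the base image tag and file names are illustrative rather than prescriptive.

FROM public.ecr.aws/lambda/python:3.12

# Install third-party dependencies into the directory the base image loads code from.
COPY requirements.txt /tmp/requirements.txt
RUN pip install --no-cache-dir -r /tmp/requirements.txt -t "${LAMBDA_TASK_ROOT}"

# Copy the handler code alongside its dependencies.
COPY app.py ${LAMBDA_TASK_ROOT}/

# The base image's runtime interface client invokes this module.function handler.
CMD ["app.lambda_handler"]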

Nonetheless, creating an image from scratch is relatively straightforward. At a minimum, the container image must include the function code along with a bootstrap executable that integrates with the Lambda event loop. Below, we will outline a basic example to provide insight and foundational elements for constructing your container-based Lambda functions.

Creating the Function Image

For our example, we will create the following files:

├── /content
│   ├── app.py
│   ├── bootstrap.py
│   └── requirements.txt
└── Dockerfile

These components will be packaged into a container image, uploaded to ECR, and executed within a sample Lambda function.

Bootstrap Code

The first file we need is bootstrap.py, which will act as the core application for our function. It will establish an event loop to receive events from Lambda and forward them to our application code.

import os
import requests
import sys
import traceback

def run_loop():
    # Lambda supplies the runtime API endpoint to the container through this environment variable.
    aws_lambda_runtime_api = os.environ['AWS_LAMBDA_RUNTIME_API']
    
    # Import the handler module once the environment is available.
    import app
    
    while True:
        request_id = None
        try:
            # Long-poll the runtime API for the next invocation event.
            invocation_response = requests.get(f'http://{aws_lambda_runtime_api}/2018-06-01/runtime/invocation/next')

            # Invocation metadata is returned in the response headers.
            request_id = invocation_response.headers['Lambda-Runtime-Aws-Request-Id']
            invoked_function_arn = invocation_response.headers['Lambda-Runtime-Invoked-Function-Arn']
            trace_id = invocation_response.headers['Lambda-Runtime-Trace-Id']
            os.environ['_X_AMZN_TRACE_ID'] = trace_id
            
            context = {
                'request_id': request_id,
                'invoked_function_arn': invoked_function_arn,
                'trace_id': trace_id
            }
            
            event = invocation_response.json()
            
            response_url = f'http://{aws_lambda_runtime_api}/2018-06-01/runtime/invocation/{request_id}/response'
            
            # Invoke the handler with the event payload and context metadata.
            result = app.lambda_handler(event, context)
            
            sys.stdout.flush()

            # Report the handler's result back to the runtime API.
            requests.post(response_url, json=result)
        
        except Exception:
            # Report unhandled exceptions for this invocation back to the runtime API.
            if request_id is not None:
                try:
                    exc_type, exc_value, exc_traceback = sys.exc_info()
                    exception_message = {
                        'errorType': exc_type.__name__,
                        'errorMessage': str(exc_value),
                        'stackTrace': traceback.format_exception(exc_type, exc_value, exc_traceback)
                    }
                    
                    error_url = f'http://{aws_lambda_runtime_api}/2018-06-01/runtime/invocation/{request_id}/error'
                    sys.stdout.flush()
                    
                    requests.post(error_url, json=exception_message)
                except Exception:
                    pass

run_loop()

This code outlines the execution loop, which entails retrieving an event from the Lambda API, passing that event to the function code, and subsequently responding to the Lambda API with the results.

Application Code

Next, we will develop the application logic to be placed in the app.py file. For the purpose of this example, our implementation will simply return an echo of the triggering event.

def lambda_handler(event, context):
    return {
        'statusCode': 200,
        'body': 'Hello from Lambda Containers',
        'event': event
    }

This code should be familiar to those experienced with constructing Lambda functions in Python. It is easy to envision how additional helper functions could be integrated to evolve this simple template into a production-ready application.

Unlike traditional layer-based Lambda functions, we do not specify the handler module and function in the Lambda function definition. Instead, the container image's command or entry point determines which application code executes. (We will demonstrate this in the container definition file shortly.)

Dependencies File

The next essential element for this example is the requirements.txt file. This file will list the necessary dependencies for our application.
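
For this example, the only third-party package is the requests library used by bootstrap.py to talk to the Lambda runtime API, so requirements.txt contains a single entry:

requests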

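Container Definition File

The last piece is the Dockerfile that assembles the image. The sketch below is one minimal way to do it; the python:3.12-slim base image and the /app path are illustrative choices, and any Linux image that satisfies the runtime API contract will work. The entry point launches bootstrap.py, which is how this container-based function selects which application code to execute.

FROM python:3.12-slim

# Copy the bootstrap, handler, and dependency manifest into the image.
COPY content/ /app/
WORKDIR /app

# Install the dependencies used by the bootstrap (see requirements.txt above).
RUN pip install --no-cache-dir -r requirements.txt

# The entry point starts the custom event loop, which in turn calls app.lambda_handler.
ENTRYPOINT ["python", "bootstrap.py"]

Once built, the image can be pushed to an Amazon ECR repository and referenced as the image URI when creating the Lambda function.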

