Sandeep Kumar P
4 min read · Nov 21, 2023

What happens inside an AWS Lambda

AWS Lambda is a serverless, event-driven service that lets you run code without provisioning or managing servers or any related infrastructure. AWS Lambda released a major feature called response streaming for Node.js and custom runtimes. I had a use case where I needed a Python runtime with streaming enabled for my AI-enabled application. Since then, I have been trying to wrap my head around Lambda’s flow of events. By the time I got it working, I had a pretty good idea of what happens when an AWS Lambda function is invoked, and I’m sharing what I learned in this blog. Note that the explanation is from a custom runtime perspective, as it gives an additional layer of control that will help you understand the intricacies of AWS Lambda.

A custom runtime allows you to create a Dockerfile and bring in your own code, dependencies, and configuration for the Lambda runtime. This means you can build a Lambda function in any language of your choice, including specific language versions that AWS Lambda does not support natively. Now, let’s understand how a custom runtime works.
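To make this concrete, here is a minimal sketch of what such a Dockerfile could look like. The base image tag, the file names (bootstrap, app.py), and the handler string are illustrative assumptions rather than anything Lambda mandates:

```dockerfile
# A minimal custom runtime image - a sketch, not a drop-in build.
# Base image tag, file names and handler string are illustrative assumptions.
FROM public.ecr.aws/lambda/provided:al2023

# Bring in your own interpreter and dependencies here
# (package manager, a copied virtualenv, a prebuilt layer, etc.).

# The bootstrap script is what Lambda runs once the container is up.
COPY bootstrap /var/runtime/bootstrap
COPY app.py /var/task/app.py
RUN chmod +x /var/runtime/bootstrap

# The CMD value is made available to the bootstrap via the _HANDLER environment variable.
CMD ["app.handler"]
```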

Here’s a prerequisite before we get into the details. It is important to understand this concept because the flow of events varies between a cold start and a warm start.

Warm Start, Cold Start

A quick intro to cold starts and warm starts in AWS Lambda. When a Lambda function is invoked for the first time, there is a delay between the invocation event and the start of the handler function’s execution. This is called a cold start. AWS Lambda needs this time to initialise the environment and the other dependencies required to execute the handler. Once initialization is done, subsequent requests can invoke the handler function without this delay. This is called a warm start. Note that AWS Lambda billing includes this delay, which is one of the reasons cold starts get so much attention.

Now let’s get to the crux of the blog.

AWS Lambda Cold Start

We will now break down what happens during a cold start when a custom runtime Lambda function is invoked.

[Diagram: Cold Start - Request flow]

1. We invoke the Lambda API endpoint, either directly or through an integration such as API Gateway or EventBridge. This in turn invokes the Lambda service.
2. The Lambda service has a mechanism to identify a cold start vs a warm start. On a cold start, it initialises the container based on the Dockerfile we have provided. At this point, the Lambda service also posts the invocation event to its internal runtime API.
3. When a Lambda function is configured with a custom runtime, the entry point after container initialization is the bootstrap file. The bootstrap file acts as an interface between the Lambda service and the runtime API.
4. Once the bootstrap file is initialised, it runs a loop that continuously polls for the next Lambda event using this API: http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime/invocation/next. This call fetches the event posted in step 2.
5. When an event is received, the bootstrap file invokes the Lambda handler function and waits for its execution to complete (a minimal sketch of this polling loop follows below).
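To make steps 4 and 5 concrete, here is a minimal Python sketch of the request side of such a bootstrap loop. The handler shown is a placeholder; a real bootstrap would import the function named by the _HANDLER environment variable instead, and would also handle errors:

```python
#!/usr/bin/env python3
# bootstrap (request side) - a minimal sketch, not production code.
import json
import os
import urllib.request

# Host:port of the runtime API, injected by the Lambda service.
RUNTIME_API = os.environ["AWS_LAMBDA_RUNTIME_API"]
NEXT_URL = f"http://{RUNTIME_API}/2018-06-01/runtime/invocation/next"

def handler(event):
    # Placeholder application code; a real bootstrap would load the module
    # and function named in the _HANDLER environment variable.
    return {"statusCode": 200, "body": json.dumps({"echo": event})}

while True:
    # Long-poll the runtime API; this blocks until the Lambda service has an
    # invocation event for this execution environment.
    with urllib.request.urlopen(NEXT_URL) as resp:
        request_id = resp.headers["Lambda-Runtime-Aws-Request-Id"]
        event = json.loads(resp.read())

    # Invoke the handler and wait for it to complete; posting the result back
    # is shown in the response section below.
    result = handler(event)
```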

This concludes the flow of events up to the invocation of the Lambda handler. Now let’s see how the response is propagated.

[Diagram: Cold Start - Response flow]

1. The handler function returns its response to the bootstrap file after execution.
2. The bootstrap file sends the response of the invocation to http://${AWS_LAMBDA_RUNTIME_API}/2018-06-01/runtime/invocation/AWSLambdaInvocationId/response, where AWSLambdaInvocationId is replaced with the invocation ID of the event.
3. The runtime API then sends the response to the Lambda service, which also takes care of other concerns such as logging.
4. The response is finally sent back to the client that made the request.
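Continuing the sketch from the request section, the response side could look something like this. The post_response helper is a hypothetical name; a real bootstrap would also report handler failures to the corresponding .../error endpoint:

```python
# bootstrap (response side) - a minimal sketch continuing the loop above.
import json
import os
import urllib.request

RUNTIME_API = os.environ["AWS_LAMBDA_RUNTIME_API"]

def post_response(request_id, result):
    # POST the handler's result back to the runtime API, substituting the
    # invocation ID received from the /next call into the URL.
    url = f"http://{RUNTIME_API}/2018-06-01/runtime/invocation/{request_id}/response"
    req = urllib.request.Request(
        url,
        data=json.dumps(result).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # The runtime API acknowledges with a 2xx status; the Lambda service
        # then forwards the response to the caller.
        return resp.status
```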

AWS Lambda Warm Start

As you would expect by now, the steps remain more or less the same. The difference is that the container is already initialised and the bootstrap file is already polling for the next invocation event, so the container initialization and other prerequisites from the request flow of the cold start are skipped. The response flow remains the same as in the cold start.

Coming Soon

Building a custom Python runtime that leverages AWS Lambda’s response streaming functionality.
