AWS Lambda
- A serverless compute service.
- Lambda executes your code only when needed and scales automatically.
- Lambda functions are stateless – no affinity to the underlying infrastructure.
- You choose the amount of memory you want to allocate to your functions and AWS Lambda allocates proportional CPU power, network bandwidth, and disk I/O.
- AWS Lambda is SOC, HIPAA, PCI, and ISO compliant.
- Supports the following languages:
- Node.js
- Java
- C#
- Go
- Python
Components of a Lambda Application
- Function – a script or program that runs in Lambda. Lambda passes invocation events to your function. The function processes an event and returns a response (a minimal handler sketch follows this list).
- Runtimes – Lambda runtimes allow functions in different languages to run in the same base execution environment. The runtime sits in-between the Lambda service and your function code, relaying invocation events, context information, and responses between the two.
- Layers – Lambda layers are a distribution mechanism for libraries, custom runtimes, and other function dependencies. Layers let you manage your in-development function code independently from the unchanging code and resources that it uses.
- Event source – an AWS service or a custom service that triggers your function and executes its logic.
- Downstream resources – an AWS service that your Lambda function calls once it is triggered.
- Log streams – While Lambda automatically monitors your function invocations and reports metrics to CloudWatch, you can annotate your function code with custom logging statements that allow you to analyze the execution flow and performance of your Lambda function.
- AWS Serverless Application Model (AWS SAM) – an open-source framework, built on AWS CloudFormation, for defining and deploying serverless applications.
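Since the Function and Log streams components above describe a handler that receives an event, returns a response, and can emit custom log statements, here is a minimal Python sketch. The handler name `lambda_handler` and the event's `name` field are illustrative assumptions, not part of any specific application.

```python
import json
import logging

# Lambda routes handler log output to a CloudWatch Logs log stream.
logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    # Lambda passes the invocation event and a context object to the handler.
    logger.info("Received event: %s", json.dumps(event))

    # Process the event and return a response; for synchronous invocations
    # the return value is serialized and sent back to the caller.
    name = event.get("name", "world")  # assumed event field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```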
Lambda Functions
- You upload your application code in the form of one or more Lambda functions. Lambda stores code in Amazon S3 and encrypts it at rest.
- To create a Lambda function, you first package your code and dependencies in a deployment package. Then, you upload the deployment package to create your Lambda function (see the sketch after this list).
- After your Lambda function is in production, Lambda automatically monitors functions on your behalf, reporting metrics through Amazon CloudWatch.
- Basic function settings include the description, runtime, and execution role that you specify when you create a function.
- Environment variables are always encrypted at rest, and can be encrypted in transit as well.
- Versions and aliases are secondary resources that you can create to manage function deployment and invocation.
- A layer is a ZIP archive that contains libraries, a custom runtime, or other dependencies. Use layers to manage your function’s dependencies independently and keep your deployment package small.
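A hedged sketch of the workflow above using boto3: create a function from a deployment package (with environment variables), publish a version, and point an alias at it. The function name, role ARN, file name, and environment variable are placeholder assumptions.

```python
import boto3

lambda_client = boto3.client("lambda")

# Deployment package: a ZIP archive containing the handler and its dependencies.
with open("function.zip", "rb") as f:  # assumed file name
    zip_bytes = f.read()

lambda_client.create_function(
    FunctionName="my-function",                          # assumed name
    Runtime="python3.9",
    Role="arn:aws:iam::123456789012:role/lambda-role",   # placeholder execution role
    Handler="app.lambda_handler",
    Code={"ZipFile": zip_bytes},
    MemorySize=256,
    Timeout=30,
    # Environment variables are encrypted at rest by Lambda.
    Environment={"Variables": {"STAGE": "prod"}},
)

# Versions and aliases are secondary resources for managing deployment.
version = lambda_client.publish_version(FunctionName="my-function")
lambda_client.create_alias(
    FunctionName="my-function",
    Name="live",
    FunctionVersion=version["Version"],
)
```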
Invoking Functions
- Lambda supports synchronous and asynchronous invocation of a Lambda function. You can control the invocation type only when you invoke a Lambda function yourself (referred to as on-demand invocation); a sketch of both invocation types follows this list.
- An event source is the entity that publishes events, and a Lambda function is the custom code that processes the events.
- Event source mapping maps an event source to a Lambda function. It enables automatic invocation of your Lambda function when events occur.
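A sketch of on-demand invocation (synchronous vs. asynchronous) and of an event source mapping, using boto3. The function name and SQS queue ARN are placeholder assumptions.

```python
import json
import boto3

lambda_client = boto3.client("lambda")

# Synchronous invocation: Lambda runs the function and returns its response.
response = lambda_client.invoke(
    FunctionName="my-function",
    InvocationType="RequestResponse",
    Payload=json.dumps({"name": "Alice"}).encode(),
)
print(json.load(response["Payload"]))

# Asynchronous invocation: Lambda queues the event and returns immediately.
lambda_client.invoke(
    FunctionName="my-function",
    InvocationType="Event",
    Payload=json.dumps({"name": "Bob"}).encode(),
)

# Event source mapping: Lambda reads from the source (here, an assumed SQS
# queue) and invokes the function automatically when events occur.
lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:sqs:us-east-1:123456789012:my-queue",  # placeholder
    FunctionName="my-function",
    BatchSize=10,
)
```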
Lambda@Edge
- Lets you run Lambda functions to customize content that CloudFront delivers, executing the functions in AWS locations closer to the viewer. The functions run in response to CloudFront events, without provisioning or managing servers.
- You can use Lambda functions to change CloudFront requests and responses at the following points (see the sample handler at the end of this section):
- After CloudFront receives a request from a viewer (viewer request)
- Before CloudFront forwards the request to the origin (origin request)
- After CloudFront receives the response from the origin (origin response)
- Before CloudFront forwards the response to the viewer (viewer response)
- You can automate your serverless application’s release process using AWS CodePipeline and AWS CodeDeploy.
- Lambda automatically tracks the behavior of your function invocations and provides feedback that you can monitor. In addition, it provides metrics that let you analyze the full function invocation spectrum, including event source integration and whether downstream resources perform as expected.
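A minimal Lambda@Edge sketch for a viewer response event that adds a response header before CloudFront returns the response to the viewer. The header name and value are illustrative assumptions.

```python
def lambda_handler(event, context):
    # CloudFront passes the response in the Lambda@Edge event record.
    response = event["Records"][0]["cf"]["response"]

    # Headers are keyed by lowercase name, each mapping to a list of
    # {"key": ..., "value": ...} pairs.
    response["headers"]["strict-transport-security"] = [
        {"key": "Strict-Transport-Security", "value": "max-age=63072000"}
    ]

    # Returning the (modified) response lets CloudFront continue processing.
    return response
```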
Pricing
- You are charged based on the total number of requests for your functions and the duration, i.e., the time it takes for your code to execute.
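A rough cost sketch in Python. The rates used here are example assumptions ($0.20 per million requests and $0.0000166667 per GB-second); actual Lambda pricing varies by region and changes over time, and the free tier is ignored.

```python
requests = 3_000_000        # invocations per month (assumed)
avg_duration_s = 0.120      # average execution time in seconds (assumed)
memory_gb = 512 / 1024      # memory allocated to the function

# Request charge: billed per million requests.
request_charge = requests / 1_000_000 * 0.20

# Duration charge: billed in GB-seconds (memory allocated x execution time).
gb_seconds = requests * avg_duration_s * memory_gb
duration_charge = gb_seconds * 0.0000166667

print(f"Requests: ${request_charge:.2f}, Duration: ${duration_charge:.2f}")
```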
Limits
Resource | Default Limit
--- | ---
Concurrent executions | 1,000
Function and layer storage | 75 GB
Function memory allocation | 128 MB to 3008 MB, in 64 MB increments
Function timeout | 900 seconds (15 minutes)
Function environment variables | 4 KB
Function layers | 5 layers
Deployment package size | 50 MB (zipped); 250 MB (unzipped, including layers); 3 MB (console editor)