This post is written by Beau Gosse, Senior Software Engineer and Paras Jain, Senior Technical Account Manager.
AWS Lambda now supports .NET 8 as both a managed runtime and a container base image. With this release, Lambda developers can benefit from .NET 8 features including API enhancements, improved Native Ahead-of-Time (Native AOT) compilation support, and improved performance. .NET 8 supports C# 12, F# 8, and PowerShell 7.4. You can develop Lambda functions in .NET 8 using the AWS Toolkit for Visual Studio, the AWS Extensions for .NET CLI, AWS Serverless Application Model (AWS SAM), AWS CDK, and other infrastructure-as-code tools.
(Image: Creating a .NET 8 function in the console)
What's new

Upgraded operating system

The .NET 8 runtime is built on the Amazon Linux 2023 (AL2023) minimal container image. This provides a smaller deployment footprint than earlier Amazon Linux 2 (AL2)-based runtimes and updated versions of common libraries such as glibc 2.34 and OpenSSL 3.

The new image also uses microdnf as a package manager, symlinked as dnf. This replaces the yum package manager used in earlier AL2-based images. If you deploy your Lambda functions as container images, you must update your Dockerfiles to use dnf instead of yum when upgrading to the .NET 8 base image. For more information, see Introducing the Amazon Linux 2023 runtime for AWS Lambda.
Performance

Several language performance improvements are available as part of .NET 8. Initialization time can impact performance, as Lambda creates new execution environments to scale your function automatically. You can optimize performance for Lambda-based .NET workloads in several ways, including using source generators in System.Text.Json or using Native AOT.
Lambda has increased the default memory size from 256 MB to 512 MB in the blueprints and templates for improved performance with .NET 8. Perform your own functional and performance tests on your .NET 8 applications. You can use AWS Compute Optimizer or AWS Lambda Power Tuning for performance profiling.
At launch, new Lambda runtimes receive less usage than existing established runtimes. This can result in longer cold start times due to reduced cache residency within internal Lambda subsystems. Cold start times typically improve in the weeks following launch as usage increases. As a result, AWS recommends not drawing performance comparisons with other Lambda runtimes until performance has stabilized.
Native AOT

Lambda introduced .NET Native AOT support in November 2022. Benchmarks show up to 86% improvement in cold start times by eliminating JIT compilation. Deploying .NET 8 Native AOT functions using the managed dotnet8 runtime rather than the OS-only provided.al2023 runtime gives your function access to .NET system libraries. For example, libicu, which is used for globalization, is not included by default in the provided.al2023 runtime but is included in the dotnet8 runtime.
While Native AOT is not suitable for all .NET functions, .NET 8 improves trimming support, which allows you to more easily run ASP.NET APIs. Improved trimming support helps eliminate build-time trimming warnings, which highlight possible runtime errors. This can give you confidence that your Native AOT function behaves like a JIT-compiled function. Trimming support has been added to the Lambda runtime libraries, the AWS SDK for .NET, the .NET Lambda Annotations framework, and .NET 8 itself.
Using .NET 8 with Lambda

To use .NET 8 with Lambda, you must update your tools. For the .NET Lambda CLI tooling (Amazon.Lambda.Tools), install the CLI extension and templates. You can upgrade existing tools with dotnet tool update -g Amazon.Lambda.Tools and existing templates with dotnet new install Amazon.Lambda.Templates.

You can also use .NET 8 with Powertools for AWS Lambda (.NET), a developer toolkit to implement serverless best practices such as observability, batch processing, retrieving parameters, idempotency, and feature flags.
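As a minimal sketch only, the following shows how Powertools structured logging might be wired into a .NET 8 handler. It assumes the AWS.Lambda.Powertools.Logging, Amazon.Lambda.APIGatewayEvents, and Amazon.Lambda.Serialization.SystemTextJson packages are referenced; the namespace, handler name, and event shape are illustrative and not taken from this post.

using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;
using AWS.Lambda.Powertools.Logging;

// Register the default JSON serializer for the handler input and output types.
[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]

namespace ExampleApi;

public class Function
{
    // [Logging] emits structured JSON logs; LogEvent = true also logs the incoming event.
    [Logging(LogEvent = true)]
    public APIGatewayHttpApiV2ProxyResponse FunctionHandler(
        APIGatewayHttpApiV2ProxyRequest request, ILambdaContext context)
    {
        // Custom keys are appended to every subsequent log entry from this invocation.
        Logger.AppendKey("RouteKey", request.RouteKey);
        Logger.LogInformation("Processing request");

        return new APIGatewayHttpApiV2ProxyResponse { StatusCode = 200, Body = "ok" };
    }
}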
Building new .NET 8 functions

Using AWS SAM

Run sam init to create a new AWS SAM application and follow the prompts, selecting dotnet8 as the runtime. The dotnet8 Hello World Example also includes a Native AOT template option.

(Image: AWS SAM .NET 8 init options)

You can amend the generated function code and use sam deploy --guided to deploy the function.
Using the .NET CLI

Run dotnet new list --tag Lambda to get a list of available Lambda templates. Create a project from a template with dotnet new <template name>. To build a function using Native AOT, use dotnet new lambda.NativeAOT, or dotnet new serverless.NativeAOT when using the .NET Lambda Annotations Framework; a minimal sketch of the Native AOT pattern follows this section. Navigate to the src directory, which contains the .csproj file. You can amend the generated function code. To deploy, run dotnet lambda deploy-function and follow the prompts. Test the function with dotnet lambda invoke-function or by using the test functionality in the Lambda console.

You can also build and deploy .NET Lambda functions using container images. Follow the instructions in the documentation.
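For orientation, here is a minimal sketch of the pattern the Native AOT templates follow: an executable project whose Main starts the Lambda bootstrap with a source-generated serializer. The namespace, handler logic, and serializer context name are assumptions for illustration and may differ from the generated template code.

using System;
using System.Threading.Tasks;
using System.Text.Json.Serialization;
using Amazon.Lambda.Core;
using Amazon.Lambda.RuntimeSupport;
using Amazon.Lambda.Serialization.SystemTextJson;

namespace HelloNativeAot;

public class Function
{
    // With Native AOT the project is an executable, so it hosts its own Main
    // and starts the Lambda runtime client directly.
    private static async Task Main()
    {
        Func<string, ILambdaContext, string> handler = FunctionHandler;
        await LambdaBootstrapBuilder.Create(handler,
                new SourceGeneratorLambdaJsonSerializer<CustomSerializerContext>())
            .Build()
            .RunAsync();
    }

    public static string FunctionHandler(string input, ILambdaContext context)
    {
        // Illustrative handler logic only.
        return input.ToUpperInvariant();
    }
}

// Source generator context: list every type that crosses the handler boundary so that
// serialization code is generated at compile time instead of using reflection at runtime.
[JsonSerializable(typeof(string))]
public partial class CustomSerializerContext : JsonSerializerContext
{
}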
Migrating from .NET 6 to .NET 8 without Native AOT

Using AWS SAM

1. Open the template.yaml file.
2. Update the Runtime property of your function resources to dotnet8.
3. Build with sam build.
4. Use sam deploy to deploy the changes.

Using the .NET CLI

1. In your project file, update TargetFramework to net8.0. Update NuGet packages for your Lambda functions to the latest version to pull in .NET 8 updates.
2. Update the aws-lambda-tools-defaults.json file if it exists. Set framework to net8.0. If unspecified, the value is inferred from the project file. Set function-runtime to dotnet8.
3. Update the serverless.template file if it exists. For any AWS::Lambda::Function or AWS::Serverless::Function resources, set the Runtime property to dotnet8.

Migrating from .NET 6 to .NET 8 Native AOT

The following example migrates a .NET 6 class library function to a .NET 8 Native AOT executable function. This uses the optional Lambda Annotations framework, which provides idiomatic .NET coding patterns.
Update your project file

1. Set TargetFramework to net8.0.
2. Set OutputType to exe.
3. Remove PublishReadyToRun if it exists.
4. Add PublishAot and set it to true.
5. Add or update the NuGet package references for Amazon.Lambda.Annotations and Amazon.Lambda.RuntimeSupport. You can update using the NuGet UI in your IDE, manually, or by running dotnet add package Amazon.Lambda.RuntimeSupport and dotnet add package Amazon.Lambda.Annotations from your project directory.

Your project file should look similar to the following:
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>exe</OutputType>
    <TargetFramework>net8.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <AWSProjectType>Lambda</AWSProjectType>
    <CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>
    <!-- Generate native AOT images during publishing to improve cold start time. -->
    <PublishAot>true</PublishAot>
    <!-- StripSymbols tells the compiler to strip debugging symbols from the final executable if we're on Linux and put them into their own file.
         This will greatly reduce the final executable's size. -->
    <StripSymbols>true</StripSymbols>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Amazon.Lambda.Core" Version="2.2.0" />
    <PackageReference Include="Amazon.Lambda.RuntimeSupport" Version="1.10.0" />
    <PackageReference Include="Amazon.Lambda.Serialization.SystemTextJson" Version="2.4.0" />
  </ItemGroup>
</Project>
Updating your function code
1. Reference the annotations library with using Amazon.Lambda.Annotations;
2. Add [assembly: LambdaGlobalProperties(GenerateMain = true)] to allow the annotations framework to create the main method. This is required as the project is now an executable instead of a library.
3. Add a partial class that derives from JsonSerializerContext and add a JsonSerializable attribute for each type that you need to serialize, including your function input and output. This partial class is used at build time to generate reflection-free code dedicated to serializing the listed types. The following is an example:

/// <summary>
/// This class is used to register the input event and return type for the FunctionHandler method with the System.Text.Json source generator.
/// There must be a JsonSerializable attribute for each type used as the input and return type, or a runtime error will occur
/// because the JSON serializer is unable to find the serialization information for unknown types.
/// </summary>
[JsonSerializable(typeof(APIGatewayHttpApiV2ProxyRequest))]
[JsonSerializable(typeof(APIGatewayHttpApiV2ProxyResponse))]
public partial class MyCustomJsonSerializerContext : JsonSerializerContext
{
    // By using this partial class derived from JsonSerializerContext, we can generate reflection-free JSON serializer code at compile time
    // that can deserialize our classes and properties. However, we must attribute this class to tell it what types to generate serialization code for.
    // See https://docs.microsoft.com/en-us/dotnet/standard/serialization/system-text-json-source-generation
}

4. Register the source generator serializer by adding the following assembly-level attribute after the using statements:

[assembly: LambdaSerializer(typeof(SourceGeneratorLambdaJsonSerializer<LambdaFunctionJsonSerializerContext>))]

Swap LambdaFunctionJsonSerializerContext for your context if you are not using the partial class from the previous step.
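To make the next step concrete, here is a rough sketch of what a migrated handler using the annotations framework might look like; the namespace, class name, route, and method name are placeholders invented for illustration. The handler method name is what the ANNOTATIONS_HANDLER environment variable in the next step refers to.

using Amazon.Lambda.Annotations;
using Amazon.Lambda.Annotations.APIGateway;
using Amazon.Lambda.Core;

namespace MyNativeAotFunction;

public class Functions
{
    // The annotations framework generates the wrapper code and, with GenerateMain = true,
    // the Main method that dispatches to this handler.
    [LambdaFunction]
    [HttpApi(LambdaHttpMethod.Get, "/hello")]
    public string FunctionHandler(ILambdaContext context)
    {
        context.Logger.LogInformation("Handling request");
        return "Hello from .NET 8 Native AOT";
    }
}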
If you are using aws-lambda-tools-defaults.json:

1. Set function-runtime to dotnet8.
2. Set function-architecture to match your build machine – either x86_64 or arm64.
3. Set environment-variables to include ANNOTATIONS_HANDLER=<YourFunctionHandler>. Replace <YourFunctionHandler> with the method name of your function handler, so the annotations framework knows which method to call from the generated main method.
4. Set function-handler to the name of the executable assembly in your bin directory. By default, this is your project name, which tells the .NET Lambda bootstrap script to run your native binary instead of starting the .NET runtime. If your project file has an AssemblyName, use that value for the function handler.

{
  "function-architecture": "x86_64",
  "function-runtime": "dotnet8",
  "function-handler": "<your-assembly-name>",
  "environment-variables": "ANNOTATIONS_HANDLER=<your-function-handler>"
}
Deploy and test
If you are using Amazon.Lambda.Tools, run dotnet lambda deploy-function. Check for trim warnings during build and refactor to eliminate them.

Conclusion

Lambda is introducing the new .NET 8 managed runtime. This post highlights the new features in .NET 8. You can create new Lambda functions or migrate existing functions to .NET 8 or .NET 8 Native AOT.
For more information, see the AWS Lambda for .NET repository, documentation, and .NET on Serverless Land.
For more serverless learning resources, visit Serverless Land.