AWS Lambda Event Sources

An event source mapping is an AWS Lambda resource that reads from an event source and invokes a Lambda function. You can use event source mappings to process items from a stream or queue in services that don't invoke Lambda functions directly. Lambda provides event source mappings for the services described below. Read more about Lambda event sources in the AWS Lambda Developer Guide.

This module includes classes that allow using various AWS services as event sources for AWS Lambda via the high-level lambda.addEventSource(source) API.

NOTE: In most cases, it is also possible to use the resource APIs to invoke an AWS Lambda function. This library provides a uniform API for all Lambda event sources regardless of the underlying mechanism they use.
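
For example, the Amazon S3 construct library can wire a bucket notification straight to a function without this module. A minimal sketch of that alternative, assuming an existing bucket and function:

import * as s3 from 'aws-cdk-lib/aws-s3';
import * as s3n from 'aws-cdk-lib/aws-s3-notifications';
import * as lambda from 'aws-cdk-lib/aws-lambda';

declare const bucket: s3.Bucket;
declare const fn: lambda.Function;

// Equivalent wiring through the S3 resource API instead of an event source class
bucket.addEventNotification(s3.EventType.OBJECT_CREATED, new s3n.LambdaDestination(fn));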

The following code sets up a Lambda function with an SQS queue event source:

import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as sqs from 'aws-cdk-lib/aws-sqs';
import { SqsEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

declare const fn: lambda.Function;
const queue = new sqs.Queue(this, 'MyQueue');
const eventSource = new SqsEventSource(queue);
fn.addEventSource(eventSource);

const eventSourceId = eventSource.eventSourceMappingId;
const eventSourceMappingArn = eventSource.eventSourceMappingArn;

The eventSourceId property contains the event source id. This will be a token that will resolve to the final value at the time of deployment.

The eventSourceMappingArn property contains the event source mapping ARN. This will be a token that will resolve to the final value at the time of deployment.
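
Because both properties are tokens, you can pass them anywhere a string is expected and they resolve during deployment. As a minimal sketch, continuing the example above, you could surface them as stack outputs:

import { CfnOutput } from 'aws-cdk-lib';

// The tokens resolve to the concrete mapping id and ARN when the stack is deployed
new CfnOutput(this, 'MappingId', { value: eventSource.eventSourceMappingId });
new CfnOutput(this, 'MappingArn', { value: eventSource.eventSourceMappingArn });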

SQS

Amazon Simple Queue Service (Amazon SQS) allows you to build asynchronous workflows. For more information about Amazon SQS, see Amazon Simple Queue Service. You can configure AWS Lambda to poll for these messages as they arrive and then pass the event to a Lambda function invocation. To view a sample event, see Amazon SQS Event.

To set up Amazon Simple Queue Service as an event source for AWS Lambda, you first create or update an Amazon SQS queue and select custom values for the queue parameters. The parameters shown in the following example impact Amazon SQS's polling behavior:

import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as sqs from 'aws-cdk-lib/aws-sqs';
import { Duration } from 'aws-cdk-lib';
import { SqsEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

const queue = new sqs.Queue(this, 'MyQueue', {
  visibilityTimeout: Duration.seconds(30), // default: 30 seconds
});
declare const fn: lambda.Function;

fn.addEventSource(new SqsEventSource(queue, {
  batchSize: 10, // default: 10
  maxBatchingWindow: Duration.minutes(5), // maximum batching window: 5 minutes
  reportBatchItemFailures: true, // default: false
}));

S3

You can write Lambda functions to process S3 bucket events, such as the object-created or object-deleted events. For example, when a user uploads a photo to a bucket, you might want Amazon S3 to invoke your Lambda function so that it reads the image and creates a thumbnail for the photo.

You can use the bucket notification configuration feature in Amazon S3 to configure the event source mapping, identifying the bucket events that you want Amazon S3 to publish and which Lambda function to invoke.

import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as s3 from 'aws-cdk-lib/aws-s3';
import { S3EventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

const bucket = new s3.Bucket(this, 'mybucket');
declare const fn: lambda.Function;

fn.addEventSource(new S3EventSource(bucket, {
  events: [ s3.EventType.OBJECT_CREATED, s3.EventType.OBJECT_REMOVED ],
  filters: [ { prefix: 'subdir/' } ], // optional: only notify for objects under this prefix
}));

In the example above, S3EventSource accepts a Bucket as its parameter. However, functions like fromBucketName and fromBucketArn return an IBucket, which is not compatible with S3EventSource. In that case, consider using S3EventSourceV2 instead; this class accepts an IBucket.

import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as s3 from 'aws-cdk-lib/aws-s3';
import { S3EventSourceV2 } from 'aws-cdk-lib/aws-lambda-event-sources';

const bucket = s3.Bucket.fromBucketName(this, 'Bucket', 'amzn-s3-demo-bucket');
declare const fn: lambda.Function;

fn.addEventSource(new S3EventSourceV2(bucket, {
  events: [ s3.EventType.OBJECT_CREATED, s3.EventType.OBJECT_REMOVED ],
  filters: [ { prefix: 'subdir/' } ], // optional: only notify for objects under this prefix
}));

SNS

You can write Lambda functions to process Amazon Simple Notification Service notifications. When a message is published to an Amazon SNS topic, the service can invoke your Lambda function by passing the message payload as a parameter. Your Lambda function code can then process the event, for example, by publishing the message to other Amazon SNS topics or sending it to other AWS services.

This also enables you to trigger a Lambda function in response to Amazon CloudWatch alarms and other AWS services that use Amazon SNS.
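
As a minimal sketch of that pattern (the topic and alarm here are illustrative assumptions), route a CloudWatch alarm through an SNS topic into the function:

import * as cloudwatch from 'aws-cdk-lib/aws-cloudwatch';
import * as cloudwatchActions from 'aws-cdk-lib/aws-cloudwatch-actions';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as sns from 'aws-cdk-lib/aws-sns';
import { SnsEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

declare const alarm: cloudwatch.Alarm;
declare const fn: lambda.Function;

// Publish alarm state changes to a topic, then process them with the Lambda function
const alarmTopic = new sns.Topic(this, 'AlarmTopic');
alarm.addAlarmAction(new cloudwatchActions.SnsAction(alarmTopic));
fn.addEventSource(new SnsEventSource(alarmTopic));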

For an example event, see Appendix: Message and JSON Formats and Amazon SNS Sample Event. For an example use case, see Using AWS Lambda with Amazon SNS from Different Accounts.

import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as sns from 'aws-cdk-lib/aws-sns';
import * as sqs from 'aws-cdk-lib/aws-sqs';
import { SnsEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

declare const topic: sns.Topic;
const deadLetterQueue = new sqs.Queue(this, 'deadLetterQueue');

declare const fn: lambda.Function;
fn.addEventSource(new SnsEventSource(topic, {
  filterPolicy: { }, // optional: SNS subscription filter policy
  deadLetterQueue: deadLetterQueue,
}));

When a user calls the SNS Publish API on a topic that your Lambda function is subscribed to, Amazon SNS will call Lambda to invoke your function asynchronously. Lambda will then return a delivery status. If there was an error calling Lambda, Amazon SNS will retry invoking the Lambda function up to three times. After three tries, if Amazon SNS still could not successfully invoke the Lambda function, then Amazon SNS will send a delivery status failure message to CloudWatch.

DynamoDB Streams

You can write Lambda functions to process change events from a DynamoDB Table. An event is emitted to a DynamoDB stream (if configured) whenever a write (Put, Delete, Update) operation is performed against the table. See Using AWS Lambda with Amazon DynamoDB for more information about configuring Lambda function event sources with DynamoDB.

To process events with a Lambda function, first create or update a DynamoDB table and enable a stream specification. Then, create a DynamoEventSource and add it to your Lambda function. The parameters shown in the following example impact Amazon DynamoDB's polling behavior:

import * as dynamodb from 'aws-cdk-lib/aws-dynamodb';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as sqs from 'aws-cdk-lib/aws-sqs';
import { DynamoEventSource, SqsDlq } from 'aws-cdk-lib/aws-lambda-event-sources';

declare const table: dynamodb.Table;

const deadLetterQueue = new sqs.Queue(this, 'deadLetterQueue');

declare const fn: lambda.Function;
fn.addEventSource(new DynamoEventSource(table, {
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  batchSize: 5,
  bisectBatchOnError: true,
  onFailure: new SqsDlq(deadLetterQueue),
  retryAttempts: 10,
}));

The following code sets up a Lambda function with a DynamoDB event source. A filter is applied to only send DynamoDB events to the Lambda function when the id attribute is a boolean that equals true.

import * as dynamodb from 'aws-cdk-lib/aws-dynamodb';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import { DynamoEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

declare const table: dynamodb.Table;

declare const fn: lambda.Function;
fn.addEventSource(new DynamoEventSource(table, {
  startingPosition: lambda.StartingPosition.LATEST,
  filters: [
    lambda.FilterCriteria.filter({
      eventName: lambda.FilterRule.isEqual('INSERT'),
      dynamodb: {
        NewImage: {
          id: { BOOL: lambda.FilterRule.isEqual(true) },
        },
      },
    }),
  ],
}));

Kinesis

You can write Lambda functions to process streaming data in Amazon Kinesis Streams. For more information about Amazon Kinesis, see Amazon Kinesis Service. To learn more about configuring Lambda function event sources with Kinesis and view a sample event, see Amazon Kinesis Event.

To set up Amazon Kinesis as an event source for AWS Lambda, you first create or update an Amazon Kinesis stream and select custom values for the event source parameters. The parameters shown in the following example impact Amazon Kinesis's polling behavior:

import * as kinesis from 'aws-cdk-lib/aws-kinesis';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import { KinesisEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

const stream = new kinesis.Stream(this, 'MyStream');

declare const myFunction: lambda.Function;
myFunction.addEventSource(new KinesisEventSource(stream, {
  batchSize: 100, // default: 100
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
}));

To use a dedicated-throughput consumer with enhanced fan-out:

import * as kinesis from 'aws-cdk-lib/aws-kinesis';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import { KinesisConsumerEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

const stream = new kinesis.Stream(this, 'MyStream');
const streamConsumer = new kinesis.StreamConsumer(this, 'MyStreamConsumer', {
  stream,
  streamConsumerName: 'MyStreamConsumer',
});

declare const myFunction: lambda.Function;
myFunction.addEventSource(new KinesisConsumerEventSource(streamConsumer, {
  batchSize: 100, // default: 100
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
}));

Kafka

You can write Lambda functions to process data either from Amazon MSK or from a self-managed Kafka cluster. The parameters shown in the examples below impact the polling behavior.

The following code sets up Amazon MSK as an event source for a Lambda function. Credentials will need to be configured to access the MSK cluster, as described in Username/Password authentication.

import * as lambda from 'aws-cdk-lib/aws-lambda';
import { Secret } from 'aws-cdk-lib/aws-secretsmanager';
import { ManagedKafkaEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

// The ARN of your MSK cluster
const clusterArn = 'arn:aws:kafka:us-east-1:0123456789019:cluster/SalesCluster/abcd1234-abcd-cafe-abab-9876543210ab-4';

// The Kafka topic you want to subscribe to
const topic = 'some-cool-topic';

// The secret that allows access to your MSK cluster
// You still have to make sure that it is associated with your cluster as described in the documentation
const secret = new Secret(this, 'Secret', { secretName: 'AmazonMSK_KafkaSecret' });

declare const myFunction: lambda.Function;
myFunction.addEventSource(new ManagedKafkaEventSource({
  clusterArn,
  topic: topic,
  secret: secret,
  batchSize: 100, // default: 100
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
}));

The following code sets up a self-managed Kafka cluster as an event source. Username- and password-based authentication will need to be set up as described in Managing access and permissions.

import * as lambda from 'aws-cdk-lib/aws-lambda';
import { Secret } from 'aws-cdk-lib/aws-secretsmanager';
import { SelfManagedKafkaEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

// The list of Kafka brokers
const bootstrapServers = ['kafka-broker:9092'];

// The Kafka topic you want to subscribe to
const topic = 'some-cool-topic';

// The secret that allows access to your self-managed Kafka cluster
declare const secret: Secret;

// (Optional) The consumer group id to use when connecting to the Kafka broker. If omitted the UUID of the event source mapping will be used.
const consumerGroupId = "my-consumer-group-id";

declare const myFunction: lambda.Function;
myFunction.addEventSource(new SelfManagedKafkaEventSource({
  bootstrapServers: bootstrapServers,
  topic: topic,
  consumerGroupId: consumerGroupId,
  secret: secret,
  batchSize: 100, // default: 100
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
}));

If your self-managed Kafka cluster is only reachable via VPC, also configure vpc, vpcSubnets and securityGroup, as shown in the sketch below.
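
A minimal sketch of that VPC wiring, assuming an existing VPC, security group, and secret (the names here are illustrative):

import * as ec2 from 'aws-cdk-lib/aws-ec2';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import { Secret } from 'aws-cdk-lib/aws-secretsmanager';
import { SelfManagedKafkaEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

declare const vpc: ec2.IVpc;
declare const securityGroup: ec2.ISecurityGroup;
declare const secret: Secret;
declare const myFunction: lambda.Function;

myFunction.addEventSource(new SelfManagedKafkaEventSource({
  bootstrapServers: ['kafka-broker:9092'],
  topic: 'some-cool-topic',
  secret: secret,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  // Networking so Lambda can reach brokers that are only routable inside the VPC
  vpc: vpc,
  vpcSubnets: { subnetType: ec2.SubnetType.PRIVATE_WITH_EGRESS },
  securityGroup: securityGroup,
}));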

You can specify event filtering for managed and self-managed Kafka clusters using the filters property:

import * as lambda from 'aws-cdk-lib/aws-lambda';
import { ManagedKafkaEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

// The ARN of your MSK cluster
const clusterArn = 'arn:aws:kafka:us-east-1:0123456789019:cluster/SalesCluster/abcd1234-abcd-cafe-abab-9876543210ab-4';

// The Kafka topic you want to subscribe to
const topic = 'some-cool-topic';

declare const myFunction: lambda.Function;
myFunction.addEventSource(new ManagedKafkaEventSource({
  clusterArn,
  topic,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  filters: [
    lambda.FilterCriteria.filter({
      stringEquals: lambda.FilterRule.isEqual('test'),
    }),
  ],
}));

By default, Lambda encrypts filter criteria using AWS managed keys, but if you want to use a self-managed KMS key to encrypt the filters, you can specify the key using the filterEncryption property.

import * as lambda from 'aws-cdk-lib/aws-lambda';
import { ManagedKafkaEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';
import { Key } from 'aws-cdk-lib/aws-kms';

// The ARN of your MSK cluster
const clusterArn = 'arn:aws:kafka:us-east-1:0123456789019:cluster/SalesCluster/abcd1234-abcd-cafe-abab-9876543210ab-4';

// The Kafka topic you want to subscribe to
const topic = 'some-cool-topic';

// Your self-managed KMS key
const myKey = Key.fromKeyArn(
  this,
  'SourceBucketEncryptionKey',
  'arn:aws:kms:us-east-1:123456789012:key/<key-id>',
);

declare const myFunction: lambda.Function;
myFunction.addEventSource(new ManagedKafkaEventSource({
  clusterArn,
  topic,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  filters: [
    lambda.FilterCriteria.filter({
      stringEquals: lambda.FilterRule.isEqual('test'),
    }),
  ],
  filterEncryption: myKey,
}));

You can also specify an S3 bucket as an "on failure" destination:

import * as lambda from 'aws-cdk-lib/aws-lambda';
import { ManagedKafkaEventSource, S3OnFailureDestination } from 'aws-cdk-lib/aws-lambda-event-sources';
import { IBucket } from 'aws-cdk-lib/aws-s3';

// The ARN of your MSK cluster
const clusterArn = 'arn:aws:kafka:us-east-1:0123456789019:cluster/SalesCluster/abcd1234-abcd-cafe-abab-9876543210ab-4';

// The Kafka topic you want to subscribe to
const topic = 'some-cool-topic';

declare const bucket: IBucket;
declare const myFunction: lambda.Function;

// Records that cannot be processed are sent to this bucket
const s3OnFailureDestination = new S3OnFailureDestination(bucket);

myFunction.addEventSource(new ManagedKafkaEventSource({
  clusterArn,
  topic,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  onFailure: s3OnFailureDestination,
}));

You can also set the configuration for provisioned pollers that read from the event source:

import * as lambda from 'aws-cdk-lib/aws-lambda';
import { ManagedKafkaEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

// The ARN of your MSK cluster
declare const clusterArn: string;

// The Kafka topic you want to subscribe to
const topic = 'some-cool-topic';

declare const myFunction: lambda.Function;
myFunction.addEventSource(new ManagedKafkaEventSource({
  clusterArn,
  topic,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  provisionedPollerConfig: {
    minimumPollers: 1,
    maximumPollers: 3,
  },
}));

Set a Confluent or self-managed schema registry to deserialize events from the event source. Note that this also works for SelfManagedKafkaEventSource; the example only shows the setup for ManagedKafkaEventSource.

import * as lambda from 'aws-cdk-lib/aws-lambda';
import { ManagedKafkaEventSource, ConfluentSchemaRegistry } from 'aws-cdk-lib/aws-lambda-event-sources';
import { Secret } from 'aws-cdk-lib/aws-secretsmanager';

// The ARN of your MSK cluster
declare const clusterArn: string;

// The Kafka topic you want to subscribe to
const topic = 'some-cool-topic';

// The secret that allows access to the schema registry
const secret = new Secret(this, 'Secret', { secretName: 'AmazonMSK_KafkaSecret' });

declare const myFunction: lambda.Function;
myFunction.addEventSource(new ManagedKafkaEventSource({
  clusterArn,
  topic,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  provisionedPollerConfig: {
    minimumPollers: 1,
    maximumPollers: 3,
  },
  schemaRegistryConfig: new ConfluentSchemaRegistry({
    schemaRegistryUri: 'https://example.com',
    eventRecordFormat: lambda.EventRecordFormat.JSON,
    authenticationType: lambda.KafkaSchemaRegistryAccessConfigType.BASIC_AUTH,
    secret: secret,
    schemaValidationConfigs: [{ attribute: lambda.KafkaSchemaValidationAttribute.KEY }],
  }),
}));

Set a Glue schema registry to deserialize events from the event source. Note that this also works for SelfManagedKafkaEventSource; the example only shows the setup for ManagedKafkaEventSource.

import * as lambda from 'aws-cdk-lib/aws-lambda';
import { CfnRegistry } from 'aws-cdk-lib/aws-glue';
import { ManagedKafkaEventSource, GlueSchemaRegistry } from 'aws-cdk-lib/aws-lambda-event-sources';

// The ARN of your MSK cluster
declare const clusterArn: string;

// The Kafka topic you want to subscribe to
const topic = 'some-cool-topic';

// Your Glue schema registry
const glueRegistry = new CfnRegistry(this, 'Registry', {
  name: 'schema-registry',
  description: 'Schema registry for event source',
});

declare const myFunction: lambda.Function;
myFunction.addEventSource(new ManagedKafkaEventSource({
  clusterArn,
  topic,
  startingPosition: lambda.StartingPosition.TRIM_HORIZON,
  provisionedPollerConfig: {
    minimumPollers: 1,
    maximumPollers: 3,
  },
  schemaRegistryConfig: new GlueSchemaRegistry({
    schemaRegistry: glueRegistry,
    eventRecordFormat: lambda.EventRecordFormat.JSON,
    schemaValidationConfigs: [{ attribute: lambda.KafkaSchemaValidationAttribute.KEY }],
  }),
}));

Roadmap

Eventually, this module will support all the event sources described under Supported Event Sources in the AWS Lambda Developer Guide.

