Package-level declarations

Types

sealed class AssignPublicIp

This structure specifies the VPC subnets and security groups for the task, and whether a public IP address is to be used. This structure is relevant only for ECS tasks that use the awsvpc network mode.

The array properties for the submitted job, such as the size of the array. The array size can be between 2 and 10,000. If you specify array properties for a job, it becomes an array job. This parameter is used only if the target is a Batch job.

The overrides that are sent to a container.

The environment variables to send to the container. You can add new environment variables, which are added to the container at launch, or you can override the existing environment variables from the Docker image or the task definition.

An object that represents a Batch job dependency.

The type and amount of a resource to assign to a container. The supported resources include GPU, MEMORY, and VCPU.
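As an illustration, a requirement that assigns one GPU to the container could be expressed in the Batch request JSON roughly as follows (field casing follows the Batch API; the values are illustrative):

```json
{
  "resourceRequirements": [
    { "type": "GPU", "value": "1" }
  ]
}
```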

The retry strategy that's associated with a job. For more information, see Automated job retries in the Batch User Guide.

The details of a capacity provider strategy. To learn more, see CapacityProviderStrategyItem in the Amazon ECS API Reference.

The Amazon CloudWatch Logs logging configuration settings for the pipe.

The Amazon CloudWatch Logs logging configuration settings for the pipe.

An action you attempted resulted in an exception.

A DeadLetterConfig object that contains information about a dead-letter queue configuration.

Maps source data to a dimension in the target Timestream for LiveAnalytics table.

sealed class DimensionValueType

The overrides that are sent to a container. An empty container override can be passed in. An example of an empty container override is {"containerOverrides": [ ] }. If a non-empty container override is specified, the name parameter must be included.
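Beyond the empty override shown above, a non-empty override names the container it applies to. A minimal sketch (the container name app and the variable are hypothetical):

```json
{
  "containerOverrides": [
    {
      "name": "app",
      "environment": [
        { "name": "STAGE", "value": "prod" }
      ]
    }
  ]
}
```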

A list of files containing the environment variables to pass to a container. You can specify up to ten environment files. The file must have a .env file extension. Each line in an environment file should contain an environment variable in VARIABLE=VALUE format. Lines beginning with # are treated as comments and are ignored. For more information about the environment variable file syntax, see Declare default environment variables in file.
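A small environment file in the format described above might look like this (the variable names are illustrative):

```
# Lines beginning with # are comments and are ignored
DB_HOST=db.example.com
DB_PORT=5432
```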

The environment variables to send to the container. You can add new environment variables, which are added to the container at launch, or you can override the existing environment variables from the Docker image or the task definition. You must also specify a container name.

The amount of ephemeral storage to allocate for the task. This parameter is used to expand the total amount of ephemeral storage available, beyond the default amount, for tasks hosted on Fargate. For more information, see Fargate task storage in the Amazon ECS User Guide for Fargate.

Details on an Elastic Inference accelerator task override. This parameter is used to override the Elastic Inference accelerator specified in the task definition. For more information, see Working with Amazon Elastic Inference on Amazon ECS in the Amazon Elastic Container Service Developer Guide.

The type and amount of a resource to assign to a container. The supported resource types are GPUs and Elastic Inference accelerators. For more information, see Working with GPUs on Amazon ECS or Working with Amazon Elastic Inference on Amazon ECS in the Amazon Elastic Container Service Developer Guide.

The overrides that are associated with a task.

sealed class EpochTimeUnit
class Filter

Filter events using an event pattern. For more information, see Events and Event Patterns in the Amazon EventBridge User Guide.
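For instance, an event pattern is a small JSON document matched against incoming events; a pattern that matches EC2 instance state-change events (standard EventBridge pattern syntax):

```json
{
  "source": ["aws.ec2"],
  "detail-type": ["EC2 Instance State-change Notification"]
}
```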

The collection of event patterns used to filter events.

The Amazon Data Firehose logging configuration settings for the pipe.

The Amazon Data Firehose logging configuration settings for the pipe.

This exception occurs due to unexpected causes.

sealed class LaunchType
sealed class LogLevel
sealed class MeasureValueType

The Secrets Manager secret that stores your broker credentials.

The Secrets Manager secret that stores your stream credentials.

sealed class MskStartPosition

A mapping of a source event data field to a measure in a Timestream for LiveAnalytics record.

Maps multiple measures from the source event to the same Timestream for LiveAnalytics record.

This structure specifies the network configuration for an Amazon ECS task.

An entity that you specified does not exist.

class Pipe

An object that represents a pipe. Amazon EventBridge Pipes connects event sources to targets and reduces the need for specialized knowledge and integration code.

These are custom parameters to be used when the target is an API Gateway REST API or an EventBridge ApiDestination. In the latter case, they are merged with any InvocationParameters specified on the Connection, with any values from the Connection taking precedence.

The parameters required to set up enrichment on your pipe.

The logging configuration settings for the pipe.

Specifies the logging configuration settings for the pipe.

Base class for all service-related exceptions thrown by the Pipes client.

The parameters for using an Active MQ broker as a source.

The parameters for using a DynamoDB stream as a source.

The parameters for using a Kinesis stream as a source.

The parameters for using an MSK stream as a source.

The parameters required to set up a source for your pipe.

The parameters for using a Rabbit MQ broker as a source.

The parameters for using a self-managed Apache Kafka stream as a source.

The parameters for using an Amazon SQS queue as a source.

sealed class PipeState

The parameters for using a Batch job as a target.

The parameters for using a CloudWatch Logs log stream as a target.

The parameters for using an Amazon ECS task as a target.

The parameters for using an EventBridge event bus as a target.

These are custom parameters to be used when the target is an API Gateway REST API or an EventBridge ApiDestination.

The parameters for using a Kinesis stream as a target.

The parameters for using a Lambda function as a target.

The parameters required to set up a target for your pipe.

These are custom parameters to be used when the target is an Amazon Redshift cluster, to invoke the Amazon Redshift Data API BatchExecuteStatement operation.

The parameters for using a SageMaker pipeline as a target.

The parameters for using an Amazon SQS queue as a target.

The parameters for using a Step Functions state machine as a target.

The parameters for using a Timestream for LiveAnalytics table as a target.

An object representing a constraint on task placement. To learn more, see Task Placement Constraints in the Amazon Elastic Container Service Developer Guide.

The task placement strategy for a task or service. To learn more, see Task Placement Strategies in the Amazon Elastic Container Service Developer Guide.

sealed class PropagateTags
sealed class RequestedPipeState

The Amazon S3 logging configuration settings for the pipe.

The Amazon S3 logging configuration settings for the pipe.

sealed class S3OutputFormat

Name/Value pair of a parameter to start execution of a SageMaker Model Building Pipeline.

The Secrets Manager secret that stores your stream credentials.

This structure specifies the VPC subnets and security groups for the stream, and whether a public IP address is to be used.

A quota has been exceeded.

Maps a single source data field to a single record in the specified Timestream for LiveAnalytics table.

class Tag

A key-value pair associated with an Amazon Web Services resource. In EventBridge, rules and event buses support tagging.

An action was throttled.

sealed class TimeFieldType

The parameters for using an Active MQ broker as a source.

The parameters for using a DynamoDB stream as a source.

The parameters for using a Kinesis stream as a source.

The parameters for using an MSK stream as a source.

The parameters required to set up a source for your pipe.

The parameters for using a Rabbit MQ broker as a source.

The parameters for using a self-managed Apache Kafka stream as a source.

The parameters for using an Amazon SQS queue as a source.

Indicates that an error has occurred while performing a validate operation.

Indicates that an error has occurred while performing a validate operation.