InferenceComponentContainerSpecification
Defines a container that provides the runtime environment for a model that you deploy with an inference component.
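For context, here is a minimal sketch of constructing this specification with the AWS SDK for Kotlin builder DSL; the import path, the placeholder S3 URI, and the environment values are assumptions to verify against your SDK version.

import aws.sdk.kotlin.services.sagemaker.model.InferenceComponentContainerSpecification

// Minimal sketch: build the container specification with the Kotlin builder DSL.
// The bucket, key, and environment values below are placeholders.
val containerSpec = InferenceComponentContainerSpecification {
    // Must point to a single gzip-compressed tar archive (.tar.gz)
    artifactUrl = "s3://amzn-s3-demo-bucket/models/model.tar.gz"
    // String-to-string map; up to 16 entries, each key and value up to 1024 characters
    environment = mapOf(
        "MODEL_SERVER_WORKERS" to "2",
        "LOG_LEVEL" to "info"
    )
}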
Types
Properties
artifactUrl
The Amazon S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip-compressed tar archive (.tar.gz suffix).
environment
The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can have a length of up to 1024 characters. The map supports up to 16 entries.
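As a hedged illustration of the documented limits, the helper below (a hypothetical function, not part of the SDK) checks an environment map before it is passed to the builder.

// Hypothetical helper: enforce the documented limits on the environment map
// (at most 16 entries; each key and value at most 1024 characters).
fun validateEnvironment(environment: Map<String, String>) {
    require(environment.size <= 16) { "Environment supports at most 16 entries, got ${environment.size}" }
    for ((key, value) in environment) {
        require(key.length <= 1024) { "Environment key exceeds 1024 characters: $key" }
        require(value.length <= 1024) { "Environment value for key $key exceeds 1024 characters" }
    }
}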
Functions
copy
inline fun copy(block: InferenceComponentContainerSpecification.Builder.() -> Unit = {}): InferenceComponentContainerSpecification
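A short sketch of the copy function above: it derives a new specification from an existing one, overriding only the properties set in the block. The containerSpec value and the new S3 URI are placeholders carried over from the earlier sketch.

// Sketch: create a modified copy, replacing only the model artifact location.
// All other properties (such as the environment map) carry over unchanged.
val updatedSpec = containerSpec.copy {
    artifactUrl = "s3://amzn-s3-demo-bucket/models/model-v2.tar.gz"
}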