
We recommend new projects start with resources from the AWS provider.

AWS Native v0.126.0 published on Monday, Sep 30, 2024 by Pulumi

aws-native.sagemaker.getInferenceExperiment


    Resource Type definition for AWS::SageMaker::InferenceExperiment

    Using getInferenceExperiment

    Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.

    function getInferenceExperiment(args: GetInferenceExperimentArgs, opts?: InvokeOptions): Promise<GetInferenceExperimentResult>
    function getInferenceExperimentOutput(args: GetInferenceExperimentOutputArgs, opts?: InvokeOptions): Output<GetInferenceExperimentResult>
    def get_inference_experiment(name: Optional[str] = None,
                                 opts: Optional[InvokeOptions] = None) -> GetInferenceExperimentResult
    def get_inference_experiment_output(name: Optional[pulumi.Input[str]] = None,
                                        opts: Optional[InvokeOptions] = None) -> Output[GetInferenceExperimentResult]
    func LookupInferenceExperiment(ctx *Context, args *LookupInferenceExperimentArgs, opts ...InvokeOption) (*LookupInferenceExperimentResult, error)
    func LookupInferenceExperimentOutput(ctx *Context, args *LookupInferenceExperimentOutputArgs, opts ...InvokeOption) LookupInferenceExperimentResultOutput

    > Note: This function is named LookupInferenceExperiment in the Go SDK.

    public static class GetInferenceExperiment 
    {
        public static Task<GetInferenceExperimentResult> InvokeAsync(GetInferenceExperimentArgs args, InvokeOptions? opts = null)
        public static Output<GetInferenceExperimentResult> Invoke(GetInferenceExperimentInvokeArgs args, InvokeOptions? opts = null)
    }
    public static CompletableFuture<GetInferenceExperimentResult> getInferenceExperiment(GetInferenceExperimentArgs args, InvokeOptions options)
    // Output-based functions aren't available in Java yet
    
    fn::invoke:
      function: aws-native:sagemaker:getInferenceExperiment
      arguments:
        # arguments dictionary
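
For example, the `fn::invoke` form above could be filled in as follows (the experiment name `my-experiment` is a placeholder, not a value from this page):

```yaml
variables:
  experiment:
    fn::invoke:
      function: aws-native:sagemaker:getInferenceExperiment
      arguments:
        name: my-experiment
```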

    The following arguments are supported:

    Name string
    The name for the inference experiment.
    Name string
    The name for the inference experiment.
    name String
    The name for the inference experiment.
    name string
    The name for the inference experiment.
    name str
    The name for the inference experiment.
    name String
    The name for the inference experiment.

    getInferenceExperiment Result

    The following output properties are available:

    Arn string
    The Amazon Resource Name (ARN) of the inference experiment.
    CreationTime string
    The timestamp at which you created the inference experiment.
    DataStorageConfig Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentDataStorageConfig
    The Amazon S3 location and configuration for storing inference request and response data.
    Description string
    The description of the inference experiment.
    DesiredState Pulumi.AwsNative.SageMaker.InferenceExperimentDesiredState
    The desired state of the experiment after starting or stopping operation.
    EndpointMetadata Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentEndpointMetadata
    LastModifiedTime string
    The timestamp at which you last modified the inference experiment.
    ModelVariants List<Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentModelVariantConfig>
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    Schedule Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentSchedule

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    ShadowModeConfig Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentShadowModeConfig
    The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
    Status Pulumi.AwsNative.SageMaker.InferenceExperimentStatus
    The status of the inference experiment.
    StatusReason string
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    Tags List<Pulumi.AwsNative.Outputs.Tag>
    An array of key-value pairs to apply to this resource.
    Arn string
    The Amazon Resource Name (ARN) of the inference experiment.
    CreationTime string
    The timestamp at which you created the inference experiment.
    DataStorageConfig InferenceExperimentDataStorageConfig
    The Amazon S3 location and configuration for storing inference request and response data.
    Description string
    The description of the inference experiment.
    DesiredState InferenceExperimentDesiredState
    The desired state of the experiment after starting or stopping operation.
    EndpointMetadata InferenceExperimentEndpointMetadata
    LastModifiedTime string
    The timestamp at which you last modified the inference experiment.
    ModelVariants []InferenceExperimentModelVariantConfig
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    Schedule InferenceExperimentSchedule

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    ShadowModeConfig InferenceExperimentShadowModeConfig
    The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
    Status InferenceExperimentStatus
    The status of the inference experiment.
    StatusReason string
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    Tags []Tag
    An array of key-value pairs to apply to this resource.
    arn String
    The Amazon Resource Name (ARN) of the inference experiment.
    creationTime String
    The timestamp at which you created the inference experiment.
    dataStorageConfig InferenceExperimentDataStorageConfig
    The Amazon S3 location and configuration for storing inference request and response data.
    description String
    The description of the inference experiment.
    desiredState InferenceExperimentDesiredState
    The desired state of the experiment after starting or stopping operation.
    endpointMetadata InferenceExperimentEndpointMetadata
    lastModifiedTime String
    The timestamp at which you last modified the inference experiment.
    modelVariants List<InferenceExperimentModelVariantConfig>
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    schedule InferenceExperimentSchedule

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    shadowModeConfig InferenceExperimentShadowModeConfig
    The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
    status InferenceExperimentStatus
    The status of the inference experiment.
    statusReason String
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags List<Tag>
    An array of key-value pairs to apply to this resource.
    arn string
    The Amazon Resource Name (ARN) of the inference experiment.
    creationTime string
    The timestamp at which you created the inference experiment.
    dataStorageConfig InferenceExperimentDataStorageConfig
    The Amazon S3 location and configuration for storing inference request and response data.
    description string
    The description of the inference experiment.
    desiredState InferenceExperimentDesiredState
    The desired state of the experiment after starting or stopping operation.
    endpointMetadata InferenceExperimentEndpointMetadata
    lastModifiedTime string
    The timestamp at which you last modified the inference experiment.
    modelVariants InferenceExperimentModelVariantConfig[]
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    schedule InferenceExperimentSchedule

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    shadowModeConfig InferenceExperimentShadowModeConfig
    The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
    status InferenceExperimentStatus
    The status of the inference experiment.
    statusReason string
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags Tag[]
    An array of key-value pairs to apply to this resource.
    arn str
    The Amazon Resource Name (ARN) of the inference experiment.
    creation_time str
    The timestamp at which you created the inference experiment.
    data_storage_config InferenceExperimentDataStorageConfig
    The Amazon S3 location and configuration for storing inference request and response data.
    description str
    The description of the inference experiment.
    desired_state InferenceExperimentDesiredState
    The desired state of the experiment after starting or stopping operation.
    endpoint_metadata InferenceExperimentEndpointMetadata
    last_modified_time str
    The timestamp at which you last modified the inference experiment.
    model_variants Sequence[InferenceExperimentModelVariantConfig]
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    schedule InferenceExperimentSchedule

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    shadow_mode_config InferenceExperimentShadowModeConfig
    The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
    status InferenceExperimentStatus
    The status of the inference experiment.
    status_reason str
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags Sequence[root_Tag]
    An array of key-value pairs to apply to this resource.
    arn String
    The Amazon Resource Name (ARN) of the inference experiment.
    creationTime String
    The timestamp at which you created the inference experiment.
    dataStorageConfig Property Map
    The Amazon S3 location and configuration for storing inference request and response data.
    description String
    The description of the inference experiment.
    desiredState "Running" | "Completed" | "Cancelled"
    The desired state of the experiment after starting or stopping operation.
    endpointMetadata Property Map
    lastModifiedTime String
    The timestamp at which you last modified the inference experiment.
    modelVariants List<Property Map>
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    schedule Property Map

    The duration for which the inference experiment ran or will run.

    The maximum duration that you can set for an inference experiment is 30 days.

    shadowModeConfig Property Map
    The configuration of ShadowMode inference experiment type, which shows the production variant that takes all the inference requests, and the shadow variant to which Amazon SageMaker replicates a percentage of the inference requests. For the shadow variant it also shows the percentage of requests that Amazon SageMaker replicates.
    status "Creating" | "Created" | "Updating" | "Starting" | "Stopping" | "Running" | "Completed" | "Cancelled"
    The status of the inference experiment.
    statusReason String
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags List<Property Map>
    An array of key-value pairs to apply to this resource.
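
The `Arn` output above can be parsed to recover the experiment name. The ARN format sketched here, `arn:aws:sagemaker:<region>:<account>:inference-experiment/<name>`, is our assumption about the usual SageMaker shape, not something stated on this page; verify it against your actual ARNs before relying on it:

```python
def experiment_name_from_arn(arn: str) -> str:
    """Extract the experiment name from an inference-experiment ARN.

    Assumes the format
    arn:aws:sagemaker:<region>:<account>:inference-experiment/<name>
    (an assumption about the ARN shape, not confirmed by this page).
    """
    # The resource part is the sixth colon-separated field.
    resource = arn.split(":", 5)[5]
    prefix, _, name = resource.partition("/")
    if prefix != "inference-experiment":
        raise ValueError(f"not an inference-experiment ARN: {arn}")
    return name

print(experiment_name_from_arn(
    "arn:aws:sagemaker:us-east-1:123456789012:inference-experiment/my-experiment"
))  # my-experiment
```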

    Supporting Types

    InferenceExperimentCaptureContentTypeHeader

    CsvContentTypes List<string>
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    JsonContentTypes List<string>
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
    CsvContentTypes []string
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    JsonContentTypes []string
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
    csvContentTypes List<String>
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    jsonContentTypes List<String>
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
    csvContentTypes string[]
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    jsonContentTypes string[]
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
    csv_content_types Sequence[str]
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    json_content_types Sequence[str]
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
    csvContentTypes List<String>
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    jsonContentTypes List<String>
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
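
The capture rule these lists configure can be sketched as a small classifier: a content-type header listed in `csvContentTypes` or `jsonContentTypes` is captured as CSV or JSON, and anything else is base64-encoded by default, as the DataStorageConfig description below notes. The helper name here is ours, not part of the SDK:

```python
def capture_mode(content_type: str,
                 csv_content_types: list[str],
                 json_content_types: list[str]) -> str:
    """Mirror the documented capture rule: content types listed as CSV or
    JSON are captured as such; everything else is base64-encoded."""
    if content_type in csv_content_types:
        return "csv"
    if content_type in json_content_types:
        return "json"
    return "base64"

print(capture_mode("text/plain", ["text/csv"], ["application/json"]))  # base64
```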

    InferenceExperimentDataStorageConfig

    Destination string
    The Amazon S3 bucket where the inference request and response data is stored.
    ContentType Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentCaptureContentTypeHeader
    Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
    KmsKey string
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
    Destination string
    The Amazon S3 bucket where the inference request and response data is stored.
    ContentType InferenceExperimentCaptureContentTypeHeader
    Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
    KmsKey string
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
    destination String
    The Amazon S3 bucket where the inference request and response data is stored.
    contentType InferenceExperimentCaptureContentTypeHeader
    Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
    kmsKey String
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
    destination string
    The Amazon S3 bucket where the inference request and response data is stored.
    contentType InferenceExperimentCaptureContentTypeHeader
    Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
    kmsKey string
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
    destination str
    The Amazon S3 bucket where the inference request and response data is stored.
    content_type InferenceExperimentCaptureContentTypeHeader
    Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
    kms_key str
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
    destination String
    The Amazon S3 bucket where the inference request and response data is stored.
    contentType Property Map
    Configuration specifying how to treat different headers. If no headers are specified SageMaker will by default base64 encode when capturing the data.
    kmsKey String
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.

    InferenceExperimentDesiredState

    InferenceExperimentEndpointMetadata

    EndpointName string
    The name of the endpoint.
    EndpointConfigName string
    The name of the endpoint configuration.
    EndpointStatus Pulumi.AwsNative.SageMaker.InferenceExperimentEndpointMetadataEndpointStatus
    The status of the endpoint. Possible values are listed under InferenceExperimentEndpointMetadataEndpointStatus.
    EndpointName string
    The name of the endpoint.
    EndpointConfigName string
    The name of the endpoint configuration.
    EndpointStatus InferenceExperimentEndpointMetadataEndpointStatus
    The status of the endpoint. Possible values are listed under InferenceExperimentEndpointMetadataEndpointStatus.
    endpointName String
    The name of the endpoint.
    endpointConfigName String
    The name of the endpoint configuration.
    endpointStatus InferenceExperimentEndpointMetadataEndpointStatus
    The status of the endpoint. Possible values are listed under InferenceExperimentEndpointMetadataEndpointStatus.
    endpointName string
    The name of the endpoint.
    endpointConfigName string
    The name of the endpoint configuration.
    endpointStatus InferenceExperimentEndpointMetadataEndpointStatus
    The status of the endpoint. Possible values are listed under InferenceExperimentEndpointMetadataEndpointStatus.
    endpoint_name str
    The name of the endpoint.
    endpoint_config_name str
    The name of the endpoint configuration.
    endpoint_status InferenceExperimentEndpointMetadataEndpointStatus
    The status of the endpoint. Possible values are listed under InferenceExperimentEndpointMetadataEndpointStatus.
    endpointName String
    The name of the endpoint.
    endpointConfigName String
    The name of the endpoint configuration.
    endpointStatus "Creating" | "Updating" | "SystemUpdating" | "RollingBack" | "InService" | "OutOfService" | "Deleting" | "Failed"
    The status of the endpoint.

    InferenceExperimentEndpointMetadataEndpointStatus

    InferenceExperimentModelInfrastructureConfig

    InfrastructureType Pulumi.AwsNative.SageMaker.InferenceExperimentModelInfrastructureConfigInfrastructureType
    The type of the inference experiment that you want to run.
    RealTimeInferenceConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentRealTimeInferenceConfig
    The infrastructure configuration for deploying the model to real-time inference.
    InfrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
    The type of the inference experiment that you want to run.
    RealTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
    The infrastructure configuration for deploying the model to real-time inference.
    infrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
    The type of the inference experiment that you want to run.
    realTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
    The infrastructure configuration for deploying the model to real-time inference.
    infrastructureType InferenceExperimentModelInfrastructureConfigInfrastructureType
    The type of the inference experiment that you want to run.
    realTimeInferenceConfig InferenceExperimentRealTimeInferenceConfig
    The infrastructure configuration for deploying the model to real-time inference.
    infrastructure_type InferenceExperimentModelInfrastructureConfigInfrastructureType
    The type of the inference experiment that you want to run.
    real_time_inference_config InferenceExperimentRealTimeInferenceConfig
    The infrastructure configuration for deploying the model to real-time inference.
    infrastructureType "RealTimeInference"
    The type of the inference experiment that you want to run.
    realTimeInferenceConfig Property Map
    The infrastructure configuration for deploying the model to real-time inference.

    InferenceExperimentModelInfrastructureConfigInfrastructureType

    InferenceExperimentModelVariantConfig

    InfrastructureConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentModelInfrastructureConfig
    The configuration for the infrastructure that the model will be deployed to.
    ModelName string
    The name of the Amazon SageMaker Model entity.
    VariantName string
    The name of the variant.
    InfrastructureConfig InferenceExperimentModelInfrastructureConfig
    The configuration for the infrastructure that the model will be deployed to.
    ModelName string
    The name of the Amazon SageMaker Model entity.
    VariantName string
    The name of the variant.
    infrastructureConfig InferenceExperimentModelInfrastructureConfig
    The configuration for the infrastructure that the model will be deployed to.
    modelName String
    The name of the Amazon SageMaker Model entity.
    variantName String
    The name of the variant.
    infrastructureConfig InferenceExperimentModelInfrastructureConfig
    The configuration for the infrastructure that the model will be deployed to.
    modelName string
    The name of the Amazon SageMaker Model entity.
    variantName string
    The name of the variant.
    infrastructure_config InferenceExperimentModelInfrastructureConfig
    The configuration for the infrastructure that the model will be deployed to.
    model_name str
    The name of the Amazon SageMaker Model entity.
    variant_name str
    The name of the variant.
    infrastructureConfig Property Map
    The configuration for the infrastructure that the model will be deployed to.
    modelName String
    The name of the Amazon SageMaker Model entity.
    variantName String
    The name of the variant.

    InferenceExperimentRealTimeInferenceConfig

    InstanceCount int
    The number of instances of the type specified by InstanceType.
    InstanceType string
    The instance type the model is deployed to.
    InstanceCount int
    The number of instances of the type specified by InstanceType.
    InstanceType string
    The instance type the model is deployed to.
    instanceCount Integer
    The number of instances of the type specified by InstanceType.
    instanceType String
    The instance type the model is deployed to.
    instanceCount number
    The number of instances of the type specified by InstanceType.
    instanceType string
    The instance type the model is deployed to.
    instance_count int
    The number of instances of the type specified by InstanceType.
    instance_type str
    The instance type the model is deployed to.
    instanceCount Number
    The number of instances of the type specified by InstanceType.
    instanceType String
    The instance type the model is deployed to.

    InferenceExperimentSchedule

    EndTime string
    The timestamp at which the inference experiment ended or will end.
    StartTime string
    The timestamp at which the inference experiment started or will start.
    EndTime string
    The timestamp at which the inference experiment ended or will end.
    StartTime string
    The timestamp at which the inference experiment started or will start.
    endTime String
    The timestamp at which the inference experiment ended or will end.
    startTime String
    The timestamp at which the inference experiment started or will start.
    endTime string
    The timestamp at which the inference experiment ended or will end.
    startTime string
    The timestamp at which the inference experiment started or will start.
    end_time str
    The timestamp at which the inference experiment ended or will end.
    start_time str
    The timestamp at which the inference experiment started or will start.
    endTime String
    The timestamp at which the inference experiment ended or will end.
    startTime String
    The timestamp at which the inference experiment started or will start.
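
The 30-day maximum noted under Schedule can be checked client-side before creating or updating an experiment. A minimal sketch, assuming the timestamps are ISO 8601 strings (an assumption about the representation, not confirmed by this page):

```python
from datetime import datetime, timedelta

# Documented maximum duration for an inference experiment.
MAX_DURATION = timedelta(days=30)

def schedule_within_limit(start_time: str, end_time: str) -> bool:
    """Return True if the schedule spans a non-negative duration of at most
    30 days. Adjust the parsing if your timestamps use another format."""
    start = datetime.fromisoformat(start_time)
    end = datetime.fromisoformat(end_time)
    return timedelta(0) <= end - start <= MAX_DURATION

print(schedule_within_limit("2024-09-01T00:00:00", "2024-09-15T00:00:00"))  # True
```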

    InferenceExperimentShadowModeConfig

    ShadowModelVariants List<Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentShadowModelVariantConfig>
    List of shadow variant configurations.
    SourceModelVariantName string
    The name of the production variant, which takes all the inference requests.
    ShadowModelVariants []InferenceExperimentShadowModelVariantConfig
    List of shadow variant configurations.
    SourceModelVariantName string
    The name of the production variant, which takes all the inference requests.
    shadowModelVariants List<InferenceExperimentShadowModelVariantConfig>
    List of shadow variant configurations.
    sourceModelVariantName String
    The name of the production variant, which takes all the inference requests.
    shadowModelVariants InferenceExperimentShadowModelVariantConfig[]
    List of shadow variant configurations.
    sourceModelVariantName string
    The name of the production variant, which takes all the inference requests.
    shadow_model_variants Sequence[InferenceExperimentShadowModelVariantConfig]
    List of shadow variant configurations.
    source_model_variant_name str
    The name of the production variant, which takes all the inference requests.
    shadowModelVariants List<Property Map>
    List of shadow variant configurations.
    sourceModelVariantName String
    The name of the production variant, which takes all the inference requests.

    InferenceExperimentShadowModelVariantConfig

    SamplingPercentage int
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    ShadowModelVariantName string
    The name of the shadow variant.
    SamplingPercentage int
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    ShadowModelVariantName string
    The name of the shadow variant.
    samplingPercentage Integer
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    shadowModelVariantName String
    The name of the shadow variant.
    samplingPercentage number
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    shadowModelVariantName string
    The name of the shadow variant.
    sampling_percentage int
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    shadow_model_variant_name str
    The name of the shadow variant.
    samplingPercentage Number
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    shadowModelVariantName String
    The name of the shadow variant.
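
Since the sampling percentage determines what fraction of production traffic SageMaker replicates to the shadow variant, the expected shadow request volume for capacity planning is just that fraction of production traffic. A rough estimate (the helper is ours, for illustration only):

```python
def expected_shadow_requests(production_requests: int,
                             sampling_percentage: int) -> int:
    """Estimate how many requests the shadow variant receives when
    sampling_percentage percent of production traffic is replicated."""
    if not 0 <= sampling_percentage <= 100:
        raise ValueError("sampling_percentage must be between 0 and 100")
    return production_requests * sampling_percentage // 100

print(expected_shadow_requests(10_000, 25))  # 2500
```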

    InferenceExperimentStatus

    Tag

    Key string
    The key name of the tag
    Value string
    The value of the tag
    Key string
    The key name of the tag
    Value string
    The value of the tag
    key String
    The key name of the tag
    value String
    The value of the tag
    key string
    The key name of the tag
    value string
    The value of the tag
    key str
    The key name of the tag
    value str
    The value of the tag
    key String
    The key name of the tag
    value String
    The value of the tag
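
Since tags are returned as an array of key/value objects, flattening them into a plain mapping is a common convenience. A sketch, assuming each tag is represented as a dict with "key" and "value" fields (the exact SDK representation varies by language):

```python
def tags_to_dict(tags: list[dict]) -> dict:
    """Collapse a list of {"key": ..., "value": ...} tag objects into a
    plain dict. If a key repeats, the later entry wins."""
    return {t["key"]: t["value"] for t in tags}

print(tags_to_dict([{"key": "team", "value": "ml"},
                    {"key": "env", "value": "prod"}]))
```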

    Package Details

    Repository
    AWS Native pulumi/pulumi-aws-native
    License
    Apache-2.0