We recommend new projects start with resources from the AWS provider.
aws-native.appflow.getFlow
Resource schema for AWS::AppFlow::Flow.
Using getFlow
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getFlow(args: GetFlowArgs, opts?: InvokeOptions): Promise<GetFlowResult>
function getFlowOutput(args: GetFlowOutputArgs, opts?: InvokeOptions): Output<GetFlowResult>
def get_flow(flow_name: Optional[str] = None,
opts: Optional[InvokeOptions] = None) -> GetFlowResult
def get_flow_output(flow_name: Optional[pulumi.Input[str]] = None,
opts: Optional[InvokeOptions] = None) -> Output[GetFlowResult]
func LookupFlow(ctx *Context, args *LookupFlowArgs, opts ...InvokeOption) (*LookupFlowResult, error)
func LookupFlowOutput(ctx *Context, args *LookupFlowOutputArgs, opts ...InvokeOption) LookupFlowResultOutput
> Note: This function is named LookupFlow in the Go SDK.
public static class GetFlow
{
public static Task<GetFlowResult> InvokeAsync(GetFlowArgs args, InvokeOptions? opts = null)
public static Output<GetFlowResult> Invoke(GetFlowInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetFlowResult> getFlow(GetFlowArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
fn::invoke:
function: aws-native:appflow:getFlow
arguments:
# arguments dictionary
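As a sketch, a complete YAML invocation might look like the following; the flow name `my-flow` and the variable/output names are placeholders, not values from this schema:

```yaml
variables:
  myFlow:
    fn::invoke:
      function: aws-native:appflow:getFlow
      arguments:
        flowName: my-flow   # hypothetical flow name
outputs:
  # flowArn is one of the result properties documented below
  flowArn: ${myFlow.flowArn}
```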
The following arguments are supported:
- FlowName string - Name of the flow.
- FlowName string - Name of the flow.
- flowName String - Name of the flow.
- flowName string - Name of the flow.
- flow_name str - Name of the flow.
- flowName String - Name of the flow.
getFlow Result
The following output properties are available:
- Description string - Description of the flow.
- DestinationFlowConfigList List&lt;Pulumi.AwsNative.AppFlow.Outputs.FlowDestinationFlowConfig&gt; - List of Destination connectors of the flow.
- FlowArn string - ARN identifier of the flow.
- FlowStatus Pulumi.AwsNative.AppFlow.FlowStatus - Flow activation status for Scheduled- and Event-triggered flows
- MetadataCatalogConfig Pulumi.AwsNative.AppFlow.Outputs.FlowMetadataCatalogConfig - Configurations of metadata catalog of the flow.
- SourceFlowConfig Pulumi.AwsNative.AppFlow.Outputs.FlowSourceFlowConfig - Configurations of Source connector of the flow.
- Tags List&lt;Pulumi.AwsNative.Outputs.Tag&gt; - List of Tags.
- Tasks List&lt;Pulumi.AwsNative.AppFlow.Outputs.FlowTask&gt; - List of tasks for the flow.
- TriggerConfig Pulumi.AwsNative.AppFlow.Outputs.FlowTriggerConfig - Trigger settings of the flow.
- Description string - Description of the flow.
- DestinationFlowConfigList []FlowDestinationFlowConfig - List of Destination connectors of the flow.
- FlowArn string - ARN identifier of the flow.
- FlowStatus FlowStatus - Flow activation status for Scheduled- and Event-triggered flows
- MetadataCatalogConfig FlowMetadataCatalogConfig - Configurations of metadata catalog of the flow.
- SourceFlowConfig FlowSourceFlowConfig - Configurations of Source connector of the flow.
- Tags []Tag - List of Tags.
- Tasks []FlowTask - List of tasks for the flow.
- TriggerConfig FlowTriggerConfig - Trigger settings of the flow.
- description String - Description of the flow.
- destinationFlowConfigList List&lt;FlowDestinationFlowConfig&gt; - List of Destination connectors of the flow.
- flowArn String - ARN identifier of the flow.
- flowStatus FlowStatus - Flow activation status for Scheduled- and Event-triggered flows
- metadataCatalogConfig FlowMetadataCatalogConfig - Configurations of metadata catalog of the flow.
- sourceFlowConfig FlowSourceFlowConfig - Configurations of Source connector of the flow.
- tags List&lt;Tag&gt; - List of Tags.
- tasks List&lt;FlowTask&gt; - List of tasks for the flow.
- triggerConfig FlowTriggerConfig - Trigger settings of the flow.
- description string - Description of the flow.
- destinationFlowConfigList FlowDestinationFlowConfig[] - List of Destination connectors of the flow.
- flowArn string - ARN identifier of the flow.
- flowStatus FlowStatus - Flow activation status for Scheduled- and Event-triggered flows
- metadataCatalogConfig FlowMetadataCatalogConfig - Configurations of metadata catalog of the flow.
- sourceFlowConfig FlowSourceFlowConfig - Configurations of Source connector of the flow.
- tags Tag[] - List of Tags.
- tasks FlowTask[] - List of tasks for the flow.
- triggerConfig FlowTriggerConfig - Trigger settings of the flow.
- description str - Description of the flow.
- destination_flow_config_list Sequence[FlowDestinationFlowConfig] - List of Destination connectors of the flow.
- flow_arn str - ARN identifier of the flow.
- flow_status FlowStatus - Flow activation status for Scheduled- and Event-triggered flows
- metadata_catalog_config FlowMetadataCatalogConfig - Configurations of metadata catalog of the flow.
- source_flow_config FlowSourceFlowConfig - Configurations of Source connector of the flow.
- tags Sequence[root_Tag] - List of Tags.
- tasks Sequence[FlowTask] - List of tasks for the flow.
- trigger_config FlowTriggerConfig - Trigger settings of the flow.
- description String - Description of the flow.
- destinationFlowConfigList List&lt;Property Map&gt; - List of Destination connectors of the flow.
- flowArn String - ARN identifier of the flow.
- flowStatus "Active" | "Suspended" | "Draft" - Flow activation status for Scheduled- and Event-triggered flows
- metadataCatalogConfig Property Map - Configurations of metadata catalog of the flow.
- sourceFlowConfig Property Map - Configurations of Source connector of the flow.
- tags List&lt;Property Map&gt; - List of Tags.
- tasks List&lt;Property Map&gt; - List of tasks for the flow.
- triggerConfig Property Map - Trigger settings of the flow.
Supporting Types
FlowAggregationConfig
- AggregationType Pulumi.AwsNative.AppFlow.FlowAggregationType - Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- TargetFileSize int - The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
- AggregationType FlowAggregationType - Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- TargetFileSize int - The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
- aggregationType FlowAggregationType - Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- targetFileSize Integer - The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
- aggregationType FlowAggregationType - Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- targetFileSize number - The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
- aggregation_type FlowAggregationType - Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- target_file_size int - The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
- aggregationType "None" | "SingleFile" - Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- targetFileSize Number - The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
FlowAggregationType
FlowAmplitudeConnectorOperator
FlowAmplitudeSourceProperties
- Object string
- The object specified in the Amplitude flow source.
- Object string
- The object specified in the Amplitude flow source.
- object String
- The object specified in the Amplitude flow source.
- object string
- The object specified in the Amplitude flow source.
- object str
- The object specified in the Amplitude flow source.
- object String
- The object specified in the Amplitude flow source.
FlowConnectorOperator
- Amplitude Pulumi.AwsNative.AppFlow.FlowAmplitudeConnectorOperator - The operation to be performed on the provided Amplitude source fields.
- CustomConnector Pulumi.AwsNative.AppFlow.FlowOperator - Operators supported by the custom connector.
- Datadog Pulumi.AwsNative.AppFlow.FlowDatadogConnectorOperator - The operation to be performed on the provided Datadog source fields.
- Dynatrace Pulumi.AwsNative.AppFlow.FlowDynatraceConnectorOperator - The operation to be performed on the provided Dynatrace source fields.
- GoogleAnalytics Pulumi.AwsNative.AppFlow.FlowGoogleAnalyticsConnectorOperator - The operation to be performed on the provided Google Analytics source fields.
- InforNexus Pulumi.AwsNative.AppFlow.FlowInforNexusConnectorOperator - The operation to be performed on the provided Infor Nexus source fields.
- Marketo Pulumi.AwsNative.AppFlow.FlowMarketoConnectorOperator - The operation to be performed on the provided Marketo source fields.
- Pardot Pulumi.AwsNative.AppFlow.FlowPardotConnectorOperator - The operation to be performed on the provided Salesforce Pardot source fields.
- S3 Pulumi.AwsNative.AppFlow.FlowS3ConnectorOperator - The operation to be performed on the provided Amazon S3 source fields.
- Salesforce Pulumi.AwsNative.AppFlow.FlowSalesforceConnectorOperator - The operation to be performed on the provided Salesforce source fields.
- SapoData Pulumi.AwsNative.AppFlow.FlowSapoDataConnectorOperator - The operation to be performed on the provided SAPOData source fields.
- ServiceNow Pulumi.AwsNative.AppFlow.FlowServiceNowConnectorOperator - The operation to be performed on the provided ServiceNow source fields.
- Singular Pulumi.AwsNative.AppFlow.FlowSingularConnectorOperator - The operation to be performed on the provided Singular source fields.
- Slack Pulumi.AwsNative.AppFlow.FlowSlackConnectorOperator - The operation to be performed on the provided Slack source fields.
- Trendmicro Pulumi.AwsNative.AppFlow.FlowTrendmicroConnectorOperator - The operation to be performed on the provided Trend Micro source fields.
- Veeva Pulumi.AwsNative.AppFlow.FlowVeevaConnectorOperator - The operation to be performed on the provided Veeva source fields.
- Zendesk Pulumi.AwsNative.AppFlow.FlowZendeskConnectorOperator - The operation to be performed on the provided Zendesk source fields.
- Amplitude FlowAmplitudeConnectorOperator - The operation to be performed on the provided Amplitude source fields.
- CustomConnector FlowOperator - Operators supported by the custom connector.
- Datadog FlowDatadogConnectorOperator - The operation to be performed on the provided Datadog source fields.
- Dynatrace FlowDynatraceConnectorOperator - The operation to be performed on the provided Dynatrace source fields.
- GoogleAnalytics FlowGoogleAnalyticsConnectorOperator - The operation to be performed on the provided Google Analytics source fields.
- InforNexus FlowInforNexusConnectorOperator - The operation to be performed on the provided Infor Nexus source fields.
- Marketo FlowMarketoConnectorOperator - The operation to be performed on the provided Marketo source fields.
- Pardot FlowPardotConnectorOperator - The operation to be performed on the provided Salesforce Pardot source fields.
- S3 FlowS3ConnectorOperator - The operation to be performed on the provided Amazon S3 source fields.
- Salesforce FlowSalesforceConnectorOperator - The operation to be performed on the provided Salesforce source fields.
- SapoData FlowSapoDataConnectorOperator - The operation to be performed on the provided SAPOData source fields.
- ServiceNow FlowServiceNowConnectorOperator - The operation to be performed on the provided ServiceNow source fields.
- Singular FlowSingularConnectorOperator - The operation to be performed on the provided Singular source fields.
- Slack FlowSlackConnectorOperator - The operation to be performed on the provided Slack source fields.
- Trendmicro FlowTrendmicroConnectorOperator - The operation to be performed on the provided Trend Micro source fields.
- Veeva FlowVeevaConnectorOperator - The operation to be performed on the provided Veeva source fields.
- Zendesk FlowZendeskConnectorOperator - The operation to be performed on the provided Zendesk source fields.
- amplitude FlowAmplitudeConnectorOperator - The operation to be performed on the provided Amplitude source fields.
- customConnector FlowOperator - Operators supported by the custom connector.
- datadog FlowDatadogConnectorOperator - The operation to be performed on the provided Datadog source fields.
- dynatrace FlowDynatraceConnectorOperator - The operation to be performed on the provided Dynatrace source fields.
- googleAnalytics FlowGoogleAnalyticsConnectorOperator - The operation to be performed on the provided Google Analytics source fields.
- inforNexus FlowInforNexusConnectorOperator - The operation to be performed on the provided Infor Nexus source fields.
- marketo FlowMarketoConnectorOperator - The operation to be performed on the provided Marketo source fields.
- pardot FlowPardotConnectorOperator - The operation to be performed on the provided Salesforce Pardot source fields.
- s3 FlowS3ConnectorOperator - The operation to be performed on the provided Amazon S3 source fields.
- salesforce FlowSalesforceConnectorOperator - The operation to be performed on the provided Salesforce source fields.
- sapoData FlowSapoDataConnectorOperator - The operation to be performed on the provided SAPOData source fields.
- serviceNow FlowServiceNowConnectorOperator - The operation to be performed on the provided ServiceNow source fields.
- singular FlowSingularConnectorOperator - The operation to be performed on the provided Singular source fields.
- slack FlowSlackConnectorOperator - The operation to be performed on the provided Slack source fields.
- trendmicro FlowTrendmicroConnectorOperator - The operation to be performed on the provided Trend Micro source fields.
- veeva FlowVeevaConnectorOperator - The operation to be performed on the provided Veeva source fields.
- zendesk FlowZendeskConnectorOperator - The operation to be performed on the provided Zendesk source fields.
- amplitude FlowAmplitudeConnectorOperator - The operation to be performed on the provided Amplitude source fields.
- customConnector FlowOperator - Operators supported by the custom connector.
- datadog FlowDatadogConnectorOperator - The operation to be performed on the provided Datadog source fields.
- dynatrace FlowDynatraceConnectorOperator - The operation to be performed on the provided Dynatrace source fields.
- googleAnalytics FlowGoogleAnalyticsConnectorOperator - The operation to be performed on the provided Google Analytics source fields.
- inforNexus FlowInforNexusConnectorOperator - The operation to be performed on the provided Infor Nexus source fields.
- marketo FlowMarketoConnectorOperator - The operation to be performed on the provided Marketo source fields.
- pardot FlowPardotConnectorOperator - The operation to be performed on the provided Salesforce Pardot source fields.
- s3 FlowS3ConnectorOperator - The operation to be performed on the provided Amazon S3 source fields.
- salesforce FlowSalesforceConnectorOperator - The operation to be performed on the provided Salesforce source fields.
- sapoData FlowSapoDataConnectorOperator - The operation to be performed on the provided SAPOData source fields.
- serviceNow FlowServiceNowConnectorOperator - The operation to be performed on the provided ServiceNow source fields.
- singular FlowSingularConnectorOperator - The operation to be performed on the provided Singular source fields.
- slack FlowSlackConnectorOperator - The operation to be performed on the provided Slack source fields.
- trendmicro FlowTrendmicroConnectorOperator - The operation to be performed on the provided Trend Micro source fields.
- veeva FlowVeevaConnectorOperator - The operation to be performed on the provided Veeva source fields.
- zendesk FlowZendeskConnectorOperator - The operation to be performed on the provided Zendesk source fields.
- amplitude FlowAmplitudeConnectorOperator - The operation to be performed on the provided Amplitude source fields.
- custom_connector FlowOperator - Operators supported by the custom connector.
- datadog FlowDatadogConnectorOperator - The operation to be performed on the provided Datadog source fields.
- dynatrace FlowDynatraceConnectorOperator - The operation to be performed on the provided Dynatrace source fields.
- google_analytics FlowGoogleAnalyticsConnectorOperator - The operation to be performed on the provided Google Analytics source fields.
- infor_nexus FlowInforNexusConnectorOperator - The operation to be performed on the provided Infor Nexus source fields.
- marketo FlowMarketoConnectorOperator - The operation to be performed on the provided Marketo source fields.
- pardot FlowPardotConnectorOperator - The operation to be performed on the provided Salesforce Pardot source fields.
- s3 FlowS3ConnectorOperator - The operation to be performed on the provided Amazon S3 source fields.
- salesforce FlowSalesforceConnectorOperator - The operation to be performed on the provided Salesforce source fields.
- sapo_data FlowSapoDataConnectorOperator - The operation to be performed on the provided SAPOData source fields.
- service_now FlowServiceNowConnectorOperator - The operation to be performed on the provided ServiceNow source fields.
- singular FlowSingularConnectorOperator - The operation to be performed on the provided Singular source fields.
- slack FlowSlackConnectorOperator - The operation to be performed on the provided Slack source fields.
- trendmicro FlowTrendmicroConnectorOperator - The operation to be performed on the provided Trend Micro source fields.
- veeva FlowVeevaConnectorOperator - The operation to be performed on the provided Veeva source fields.
- zendesk FlowZendeskConnectorOperator - The operation to be performed on the provided Zendesk source fields.
- amplitude "BETWEEN" - The operation to be performed on the provided Amplitude source fields.
- customConnector "PROJECTION" | "LESS_THAN" | "GREATER_THAN" | "CONTAINS" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - Operators supported by the custom connector.
- datadog "PROJECTION" | "BETWEEN" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided Datadog source fields.
- dynatrace "PROJECTION" | "BETWEEN" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided Dynatrace source fields.
- googleAnalytics "PROJECTION" | "BETWEEN" - The operation to be performed on the provided Google Analytics source fields.
- inforNexus "PROJECTION" | "BETWEEN" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided Infor Nexus source fields.
- marketo "PROJECTION" | "LESS_THAN" | "GREATER_THAN" | "BETWEEN" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided Marketo source fields.
- pardot "PROJECTION" | "EQUAL_TO" | "NO_OP" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" - The operation to be performed on the provided Salesforce Pardot source fields.
- s3 "PROJECTION" | "LESS_THAN" | "GREATER_THAN" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided Amazon S3 source fields.
- salesforce "PROJECTION" | "LESS_THAN" | "CONTAINS" | "GREATER_THAN" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided Salesforce source fields.
- sapoData "PROJECTION" | "LESS_THAN" | "CONTAINS" | "GREATER_THAN" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided SAPOData source fields.
- serviceNow "PROJECTION" | "LESS_THAN" | "CONTAINS" | "GREATER_THAN" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided ServiceNow source fields.
- singular "PROJECTION" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided Singular source fields.
- slack "PROJECTION" | "BETWEEN" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided Slack source fields.
- trendmicro "PROJECTION" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided Trend Micro source fields.
- veeva "PROJECTION" | "LESS_THAN" | "GREATER_THAN" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided Veeva source fields.
- zendesk "PROJECTION" | "GREATER_THAN" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP" - The operation to be performed on the provided Zendesk source fields.
FlowConnectorType
FlowCustomConnectorDestinationProperties
- EntityName string - The entity specified in the custom connector as a destination in the flow.
- CustomProperties Dictionary&lt;string, string&gt; - The custom properties that are specific to the connector when it's used as a destination in the flow.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- IdFieldNames List&lt;string&gt; - List of fields used as ID when performing a write operation.
- WriteOperationType Pulumi.AwsNative.AppFlow.FlowWriteOperationType - Specifies the type of write operation to be performed in the custom connector when it's used as destination.
- EntityName string - The entity specified in the custom connector as a destination in the flow.
- CustomProperties map[string]string - The custom properties that are specific to the connector when it's used as a destination in the flow.
- ErrorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- IdFieldNames []string - List of fields used as ID when performing a write operation.
- WriteOperationType FlowWriteOperationType - Specifies the type of write operation to be performed in the custom connector when it's used as destination.
- entityName String - The entity specified in the custom connector as a destination in the flow.
- customProperties Map&lt;String,String&gt; - The custom properties that are specific to the connector when it's used as a destination in the flow.
- errorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- idFieldNames List&lt;String&gt; - List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType - Specifies the type of write operation to be performed in the custom connector when it's used as destination.
- entityName string - The entity specified in the custom connector as a destination in the flow.
- customProperties {[key: string]: string} - The custom properties that are specific to the connector when it's used as a destination in the flow.
- errorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- idFieldNames string[] - List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType - Specifies the type of write operation to be performed in the custom connector when it's used as destination.
- entity_name str - The entity specified in the custom connector as a destination in the flow.
- custom_properties Mapping[str, str] - The custom properties that are specific to the connector when it's used as a destination in the flow.
- error_handling_config FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- id_field_names Sequence[str] - List of fields used as ID when performing a write operation.
- write_operation_type FlowWriteOperationType - Specifies the type of write operation to be performed in the custom connector when it's used as destination.
- entityName String - The entity specified in the custom connector as a destination in the flow.
- customProperties Map&lt;String&gt; - The custom properties that are specific to the connector when it's used as a destination in the flow.
- errorHandlingConfig Property Map - The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- idFieldNames List&lt;String&gt; - List of fields used as ID when performing a write operation.
- writeOperationType "INSERT" | "UPSERT" | "UPDATE" | "DELETE" - Specifies the type of write operation to be performed in the custom connector when it's used as destination.
FlowCustomConnectorSourceProperties
- EntityName string - The entity specified in the custom connector as a source in the flow.
- CustomProperties Dictionary&lt;string, string&gt; - Custom properties that are required to use the custom connector as a source.
- DataTransferApi Pulumi.AwsNative.AppFlow.Inputs.FlowCustomConnectorSourcePropertiesDataTransferApiProperties - The API of the connector application that Amazon AppFlow uses to transfer your data.
- EntityName string - The entity specified in the custom connector as a source in the flow.
- CustomProperties map[string]string - Custom properties that are required to use the custom connector as a source.
- DataTransferApi FlowCustomConnectorSourcePropertiesDataTransferApiProperties - The API of the connector application that Amazon AppFlow uses to transfer your data.
- entityName String - The entity specified in the custom connector as a source in the flow.
- customProperties Map&lt;String,String&gt; - Custom properties that are required to use the custom connector as a source.
- dataTransferApi FlowCustomConnectorSourcePropertiesDataTransferApiProperties - The API of the connector application that Amazon AppFlow uses to transfer your data.
- entityName string - The entity specified in the custom connector as a source in the flow.
- customProperties {[key: string]: string} - Custom properties that are required to use the custom connector as a source.
- dataTransferApi FlowCustomConnectorSourcePropertiesDataTransferApiProperties - The API of the connector application that Amazon AppFlow uses to transfer your data.
- entity_name str - The entity specified in the custom connector as a source in the flow.
- custom_properties Mapping[str, str] - Custom properties that are required to use the custom connector as a source.
- data_transfer_api FlowCustomConnectorSourcePropertiesDataTransferApiProperties - The API of the connector application that Amazon AppFlow uses to transfer your data.
- entityName String - The entity specified in the custom connector as a source in the flow.
- customProperties Map&lt;String&gt; - Custom properties that are required to use the custom connector as a source.
- dataTransferApi Property Map - The API of the connector application that Amazon AppFlow uses to transfer your data.
FlowCustomConnectorSourcePropertiesDataTransferApiProperties
FlowCustomConnectorSourcePropertiesDataTransferApiPropertiesType
FlowDataTransferApi
FlowDatadogConnectorOperator
FlowDatadogSourceProperties
- object (string) - The object specified in the Datadog flow source.
FlowDestinationConnectorProperties
- customConnector (FlowCustomConnectorDestinationProperties) - The properties that are required to query the custom connector.
- eventBridge (FlowEventBridgeDestinationProperties) - The properties required to query Amazon EventBridge.
- lookoutMetrics (FlowLookoutMetricsDestinationProperties) - The properties required to query Amazon Lookout for Metrics.
- marketo (FlowMarketoDestinationProperties) - The properties required to query Marketo.
- redshift (FlowRedshiftDestinationProperties) - The properties required to query Amazon Redshift.
- s3 (FlowS3DestinationProperties) - The properties required to query Amazon S3.
- salesforce (FlowSalesforceDestinationProperties) - The properties required to query Salesforce.
- sapoData (FlowSapoDataDestinationProperties) - The properties required to query SAPOData.
- snowflake (FlowSnowflakeDestinationProperties) - The properties required to query Snowflake.
- upsolver (FlowUpsolverDestinationProperties) - The properties required to query Upsolver.
- zendesk (FlowZendeskDestinationProperties) - The properties required to query Zendesk.
FlowDestinationFlowConfig
- connectorType (FlowConnectorType) - Destination connector type. One of "SAPOData", "Salesforce", "Pardot", "Singular", "Slack", "Redshift", "S3", "Marketo", "Googleanalytics", "Zendesk", "Servicenow", "Datadog", "Trendmicro", "Snowflake", "Dynatrace", "Infornexus", "Amplitude", "Veeva", "CustomConnector", "EventBridge", "Upsolver", "LookoutMetrics".
- destinationConnectorProperties (FlowDestinationConnectorProperties) - Destination connector details.
- apiVersion (string) - The API version that the destination connector uses.
- connectorProfileName (string) - Name of the destination connector profile.
FlowDynatraceConnectorOperator
FlowDynatraceSourceProperties
- object (string) - The object specified in the Dynatrace flow source.
FlowErrorHandlingConfig
- bucketName (string) - Specifies the name of the Amazon S3 bucket.
- bucketPrefix (string) - Specifies the Amazon S3 bucket prefix.
- failOnFirstError (boolean) - Specifies whether the flow should fail after the first instance of a failure when attempting to place data in the destination.
FlowEventBridgeDestinationProperties
- object (string) - The object specified in the Amazon EventBridge flow destination.
- errorHandlingConfig (FlowErrorHandlingConfig) - The settings that determine how Amazon AppFlow handles an error when placing data in the destination.
FlowFileType
FlowGlueDataCatalog
- databaseName (string) - The name of the AWS Glue Data Catalog database in which Amazon AppFlow stores the metadata tables that it creates for the flow.
- roleArn (string) - The ARN of the IAM role that grants Amazon AppFlow the permissions it needs to create Data Catalog tables, databases, and partitions.
- tablePrefix (string) - A naming prefix for each Data Catalog table that Amazon AppFlow creates for the flow.
FlowGoogleAnalyticsConnectorOperator
FlowGoogleAnalyticsSourceProperties
- object (string) - The object specified in the Google Analytics flow source.
FlowIncrementalPullConfig
- datetimeTypeFieldName (string) - A field that specifies the date-time or timestamp field to use as the criteria when importing incremental records from the source.
FlowInforNexusConnectorOperator
FlowInforNexusSourceProperties
- object (string) - The object specified in the Infor Nexus flow source.
FlowLookoutMetricsDestinationProperties
- object (string) - The object specified in the Amazon Lookout for Metrics flow destination.
FlowMarketoConnectorOperator
FlowMarketoDestinationProperties
- object (string) - The object specified in the Marketo flow destination.
- errorHandlingConfig (FlowErrorHandlingConfig) - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine whether the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
FlowMarketoSourceProperties
- object (string) - The object specified in the Marketo flow source.
FlowMetadataCatalogConfig
- glueDataCatalog (FlowGlueDataCatalog) - Configuration of the AWS Glue Data Catalog for the flow.
FlowOperator
FlowOperatorPropertiesKeys
FlowPardotConnectorOperator
FlowPardotSourceProperties
- object (string) - The object specified in the Salesforce Pardot flow source.
FlowPathPrefix
FlowPrefixConfig
- pathPrefixHierarchy (list of FlowPathPrefix) - Specifies whether the destination file path includes either or both of the following elements:
  EXECUTION_ID - The ID that Amazon AppFlow assigns to the flow run.
  SCHEMA_VERSION - The version number of your data schema. Amazon AppFlow assigns this version number. The version number increases by one when you change any of the following settings in your flow configuration: source-to-destination field mappings, field data types, or partition keys.
- prefixFormat (FlowPrefixFormat) - Determines the level of granularity for the date and time that's included in the prefix. One of "YEAR", "MONTH", "DAY", "HOUR", "MINUTE".
- prefixType (FlowPrefixType) - Determines the format of the prefix, and whether it applies to the file name, file path, or both. One of "FILENAME", "PATH", "PATH_AND_FILENAME".
FlowPrefixFormat
FlowPrefixType
FlowRedshiftDestinationProperties
- intermediateBucketName (string) - The intermediate bucket that Amazon AppFlow uses when moving data into Amazon Redshift.
- object (string) - The object specified in the Amazon Redshift flow destination.
- bucketPrefix (string) - The object key for the bucket in which Amazon AppFlow places the destination files.
- errorHandlingConfig (FlowErrorHandlingConfig) - The settings that determine how Amazon AppFlow handles an error when placing data in the Amazon Redshift destination. For example, this setting would determine whether the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
FlowS3ConnectorOperator
FlowS3DestinationProperties
- BucketName string - The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- BucketPrefix string - The object key for the destination bucket in which Amazon AppFlow places the files.
- S3OutputFormatConfig Pulumi.AwsNative.AppFlow.Inputs.FlowS3OutputFormatConfig - The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
- BucketName string - The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- BucketPrefix string - The object key for the destination bucket in which Amazon AppFlow places the files.
- S3OutputFormatConfig FlowS3OutputFormatConfig - The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
- bucketName String - The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- bucketPrefix String - The object key for the destination bucket in which Amazon AppFlow places the files.
- s3OutputFormatConfig FlowS3OutputFormatConfig - The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
- bucketName string - The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- bucketPrefix string - The object key for the destination bucket in which Amazon AppFlow places the files.
- s3OutputFormatConfig FlowS3OutputFormatConfig - The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
- bucket_name str - The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- bucket_prefix str - The object key for the destination bucket in which Amazon AppFlow places the files.
- s3_output_format_config FlowS3OutputFormatConfig - The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
- bucketName String - The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- bucketPrefix String - The object key for the destination bucket in which Amazon AppFlow places the files.
- s3OutputFormatConfig Property Map - The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
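Taken together, the fields above can be sketched as a plain Python dict using the snake_case names from the Python variant. The bucket name, prefix, and file type below are hypothetical placeholder values, not part of this schema:

```python
# Sketch of FlowS3DestinationProperties as a plain dict (snake_case field
# names as in the Python SDK variant). Bucket name and prefix are
# hypothetical placeholders.
s3_destination_properties = {
    "bucket_name": "my-appflow-bucket",   # bucket that receives the transferred data
    "bucket_prefix": "flow-output",       # object key prefix in the destination bucket
    "s3_output_format_config": {          # optional output formatting settings
        "file_type": "JSON",              # documented values: "CSV" | "JSON" | "PARQUET"
    },
}
```

In a real program this dict's shape would be supplied through the flow's destination connector properties rather than used on its own.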
FlowS3InputFormatConfig
- S3InputFileType Pulumi.AwsNative.AppFlow.FlowS3InputFormatConfigS3InputFileType - The file type that Amazon AppFlow gets from your Amazon S3 bucket.
- S3InputFileType FlowS3InputFormatConfigS3InputFileType - The file type that Amazon AppFlow gets from your Amazon S3 bucket.
- s3InputFileType FlowS3InputFormatConfigS3InputFileType - The file type that Amazon AppFlow gets from your Amazon S3 bucket.
- s3InputFileType FlowS3InputFormatConfigS3InputFileType - The file type that Amazon AppFlow gets from your Amazon S3 bucket.
- s3_input_file_type FlowS3InputFormatConfigS3InputFileType - The file type that Amazon AppFlow gets from your Amazon S3 bucket.
- s3InputFileType "CSV" | "JSON" - The file type that Amazon AppFlow gets from your Amazon S3 bucket.
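Because s3InputFileType accepts only the two documented values, a small guard can catch typos before deployment. The helper below is a hypothetical sketch, not part of any SDK:

```python
# Hypothetical helper: validate the s3_input_file_type enum value described
# above. Shown only to illustrate the allowed set of values.
ALLOWED_S3_INPUT_FILE_TYPES = {"CSV", "JSON"}

def validate_s3_input_format_config(config: dict) -> dict:
    file_type = config.get("s3_input_file_type")
    if file_type not in ALLOWED_S3_INPUT_FILE_TYPES:
        raise ValueError(f"unsupported s3_input_file_type: {file_type!r}")
    return config
```

For example, `validate_s3_input_format_config({"s3_input_file_type": "CSV"})` passes, while an unsupported value such as `"PARQUET"` raises a `ValueError`.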
FlowS3InputFormatConfigS3InputFileType
FlowS3OutputFormatConfig
- AggregationConfig Pulumi.AwsNative.AppFlow.Inputs.FlowAggregationConfig - The aggregation settings that you can use to customize the output format of your flow data.
- FileType Pulumi.AwsNative.AppFlow.FlowFileType - Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- PrefixConfig Pulumi.AwsNative.AppFlow.Inputs.FlowPrefixConfig - Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- PreserveSourceDataTyping bool - If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3.
- true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
- AggregationConfig FlowAggregationConfig - The aggregation settings that you can use to customize the output format of your flow data.
- FileType FlowFileType - Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- PrefixConfig FlowPrefixConfig - Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- PreserveSourceDataTyping bool - If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3.
- true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
- aggregationConfig FlowAggregationConfig - The aggregation settings that you can use to customize the output format of your flow data.
- fileType FlowFileType - Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- prefixConfig FlowPrefixConfig - Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- preserveSourceDataTyping Boolean - If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3.
- true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
- aggregationConfig FlowAggregationConfig - The aggregation settings that you can use to customize the output format of your flow data.
- fileType FlowFileType - Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- prefixConfig FlowPrefixConfig - Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- preserveSourceDataTyping boolean - If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3.
- true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
- aggregation_config FlowAggregationConfig - The aggregation settings that you can use to customize the output format of your flow data.
- file_type FlowFileType - Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- prefix_config FlowPrefixConfig - Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- preserve_source_data_typing bool - If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3.
- true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
- aggregationConfig Property Map - The aggregation settings that you can use to customize the output format of your flow data.
- fileType "CSV" | "JSON" | "PARQUET" - Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- prefixConfig Property Map - Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- preserveSourceDataTyping Boolean - If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3.
- true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
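As a concrete illustration of the preserveSourceDataTyping behavior above, a Parquet output configuration might be sketched as the following plain dict (field values are illustrative, using the Python variant's snake_case names):

```python
# Sketch of FlowS3OutputFormatConfig for Parquet output. As documented above,
# preserve_source_data_typing only applies when file_type is "PARQUET";
# True keeps an integer 1 an integer instead of converting it to the string "1".
s3_output_format_config = {
    "file_type": "PARQUET",                # "CSV" | "JSON" | "PARQUET"
    "preserve_source_data_typing": True,   # preserve source data types in the output
    "prefix_config": {},                   # folder-prefix settings, left empty here
}
```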
FlowS3SourceProperties
- BucketName string - The Amazon S3 bucket name where the source files are stored.
- BucketPrefix string - The object key for the Amazon S3 bucket in which the source files are stored.
- S3InputFormatConfig Pulumi.AwsNative.AppFlow.Inputs.FlowS3InputFormatConfig - When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
- BucketName string - The Amazon S3 bucket name where the source files are stored.
- BucketPrefix string - The object key for the Amazon S3 bucket in which the source files are stored.
- S3InputFormatConfig FlowS3InputFormatConfig - When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
- bucketName String - The Amazon S3 bucket name where the source files are stored.
- bucketPrefix String - The object key for the Amazon S3 bucket in which the source files are stored.
- s3InputFormatConfig FlowS3InputFormatConfig - When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
- bucketName string - The Amazon S3 bucket name where the source files are stored.
- bucketPrefix string - The object key for the Amazon S3 bucket in which the source files are stored.
- s3InputFormatConfig FlowS3InputFormatConfig - When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
- bucket_name str - The Amazon S3 bucket name where the source files are stored.
- bucket_prefix str - The object key for the Amazon S3 bucket in which the source files are stored.
- s3_input_format_config FlowS3InputFormatConfig - When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
- bucketName String - The Amazon S3 bucket name where the source files are stored.
- bucketPrefix String - The object key for the Amazon S3 bucket in which the source files are stored.
- s3InputFormatConfig Property Map - When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
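Mirroring the destination shape, an S3 source could be sketched as a plain dict; the bucket name and prefix are hypothetical placeholders:

```python
# Sketch of FlowS3SourceProperties: where the source files live and how the
# input is formatted (snake_case names as in the Python SDK variant).
s3_source_properties = {
    "bucket_name": "my-source-bucket",    # hypothetical source bucket
    "bucket_prefix": "incoming",          # hypothetical object key prefix
    "s3_input_format_config": {
        "s3_input_file_type": "CSV",      # documented values: "CSV" | "JSON"
    },
}
```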
FlowSalesforceConnectorOperator
FlowSalesforceDestinationProperties
- Object string - The object specified in the Salesforce flow destination.
- DataTransferApi Pulumi.AwsNative.AppFlow.FlowDataTransferApi - Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure.
ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames List<string> - List of fields used as ID when performing a write operation.
- WriteOperationType Pulumi.AwsNative.AppFlow.FlowWriteOperationType - This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
- Object string - The object specified in the Salesforce flow destination.
- DataTransferApi FlowDataTransferApi - Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.
- ErrorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure.
ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames []string - List of fields used as ID when performing a write operation.
- WriteOperationType FlowWriteOperationType - This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
- object String - The object specified in the Salesforce flow destination.
- dataTransferApi FlowDataTransferApi - Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.
- errorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure.
ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String> - List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType - This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
- object string - The object specified in the Salesforce flow destination.
- dataTransferApi FlowDataTransferApi - Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.
- errorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure.
ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames string[] - List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType - This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
- object str - The object specified in the Salesforce flow destination.
- data_transfer_api FlowDataTransferApi - Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.
- error_handling_config FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure.
ErrorHandlingConfig is a part of the destination connector details.
- id_field_names Sequence[str] - List of fields used as ID when performing a write operation.
- write_operation_type FlowWriteOperationType - This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
- object String - The object specified in the Salesforce flow destination.
- dataTransferApi "AUTOMATIC" | "BULKV2" | "REST_SYNC" - Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.
- errorHandlingConfig Property Map - The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure.
ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String> - List of fields used as ID when performing a write operation.
- writeOperationType "INSERT" | "UPSERT" | "UPDATE" | "DELETE" - This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
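To illustrate the UPSERT constraint above (idFieldNames is required when the write operation type is UPSERT), here is a plain-dict sketch; the Salesforce object and field names are hypothetical:

```python
# Sketch of FlowSalesforceDestinationProperties using an upsert write. As
# documented above, id_field_names must be set when write_operation_type is
# "UPSERT". "Account" and "External_Id__c" are hypothetical Salesforce names.
salesforce_destination_properties = {
    "object": "Account",
    "data_transfer_api": "BULKV2",          # "AUTOMATIC" | "BULKV2" | "REST_SYNC"
    "write_operation_type": "UPSERT",       # "INSERT" | "UPSERT" | "UPDATE" | "DELETE"
    "id_field_names": ["External_Id__c"],   # required because of UPSERT
}
```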
FlowSalesforceSourceProperties
- Object string - The object specified in the Salesforce flow source.
- DataTransferApi Pulumi.AwsNative.AppFlow.FlowDataTransferApi - Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.
- EnableDynamicFieldUpdate bool - The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- IncludeDeletedRecords bool - Indicates whether Amazon AppFlow includes deleted files in the flow run.
- Object string - The object specified in the Salesforce flow source.
- DataTransferApi FlowDataTransferApi - Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.
- EnableDynamicFieldUpdate bool - The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- IncludeDeletedRecords bool - Indicates whether Amazon AppFlow includes deleted files in the flow run.
- object String
- The object specified in the Salesforce flow source.
- data
Transfer FlowApi Data Transfer Api Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timed-out error.
- enableDynamicFieldUpdate Boolean - The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- includeDeletedRecords Boolean - Indicates whether Amazon AppFlow includes deleted files in the flow run.
- object string
- The object specified in the Salesforce flow source.
- dataTransferApi FlowDataTransferApi - Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timed-out error.
- enableDynamicFieldUpdate boolean - The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- includeDeletedRecords boolean - Indicates whether Amazon AppFlow includes deleted files in the flow run.
- object str
- The object specified in the Salesforce flow source.
- data_transfer_api FlowDataTransferApi - Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timed-out error.
- enable_dynamic_field_update bool - The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- include_deleted_records bool - Indicates whether Amazon AppFlow includes deleted files in the flow run.
- object String
- The object specified in the Salesforce flow source.
- dataTransferApi "AUTOMATIC" | "BULKV2" | "REST_SYNC" - Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce.
- AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields.
By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output.
- BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
Note that Bulk API 2.0 does not transfer Salesforce compound fields.
- REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timed-out error.
- enableDynamicFieldUpdate Boolean - The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- includeDeletedRecords Boolean - Indicates whether Amazon AppFlow includes deleted files in the flow run.
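The AUTOMATIC selection rule described above can be sketched as a small helper. This is purely illustrative: `select_salesforce_api` is a hypothetical name, not part of any SDK; only the 1,000,000-record threshold and the enum values come from the description.

```python
def select_salesforce_api(record_count: int) -> str:
    """Mirror the documented AUTOMATIC rule: fewer than 1,000,000 records
    uses Salesforce REST API; 1,000,000 or more uses Bulk API 2.0."""
    return "REST_SYNC" if record_count < 1_000_000 else "BULKV2"

# A recurring flow near the threshold can flip between APIs from one run
# to the next, which is why AUTOMATIC output formatting may vary:
print(select_salesforce_api(900_000))    # REST_SYNC
print(select_salesforce_api(1_100_000))  # BULKV2
```

This also makes the tradeoff concrete: pinning `BULKV2` or `REST_SYNC` keeps the output format stable at the cost of the adaptive behavior.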
FlowSapoDataConnectorOperator
FlowSapoDataDestinationProperties
- ObjectPath string - The object path specified in the SAPOData flow destination.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames List<string> - List of fields used as ID when performing a write operation.
- SuccessResponseHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowSuccessResponseHandlingConfig - Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- WriteOperationType Pulumi.AwsNative.AppFlow.FlowWriteOperationType - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- ObjectPath string - The object path specified in the SAPOData flow destination.
- ErrorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames []string - List of fields used as ID when performing a write operation.
- SuccessResponseHandlingConfig FlowSuccessResponseHandlingConfig - Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- WriteOperationType FlowWriteOperationType - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- objectPath String - The object path specified in the SAPOData flow destination.
- errorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String> - List of fields used as ID when performing a write operation.
- successResponseHandlingConfig FlowSuccessResponseHandlingConfig - Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- writeOperationType FlowWriteOperationType - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- objectPath string - The object path specified in the SAPOData flow destination.
- errorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames string[] - List of fields used as ID when performing a write operation.
- successResponseHandlingConfig FlowSuccessResponseHandlingConfig - Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- writeOperationType FlowWriteOperationType - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- object_path str - The object path specified in the SAPOData flow destination.
- error_handling_config FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- id_field_names Sequence[str] - List of fields used as ID when performing a write operation.
- success_response_handling_config FlowSuccessResponseHandlingConfig - Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- write_operation_type FlowWriteOperationType - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- objectPath String - The object path specified in the SAPOData flow destination.
- errorHandlingConfig Property Map - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String> - List of fields used as ID when performing a write operation.
- successResponseHandlingConfig Property Map - Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- writeOperationType "INSERT" | "UPSERT" | "UPDATE" | "DELETE" - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
FlowSapoDataPaginationConfig
- MaxPageSize int
- MaxPageSize int
- maxPageSize Integer
- maxPageSize number
- max_page_size int
- maxPageSize Number
FlowSapoDataParallelismConfig
- MaxParallelism int
- MaxParallelism int
- maxParallelism Integer
- maxParallelism number
- max_parallelism int
- maxParallelism Number
FlowSapoDataSourceProperties
- ObjectPath string - The object path specified in the SAPOData flow source.
- PaginationConfig Pulumi.AwsNative.AppFlow.Inputs.FlowSapoDataPaginationConfig
- ParallelismConfig Pulumi.AwsNative.AppFlow.Inputs.FlowSapoDataParallelismConfig
- ObjectPath string - The object path specified in the SAPOData flow source.
- PaginationConfig FlowSapoDataPaginationConfig
- ParallelismConfig FlowSapoDataParallelismConfig
- objectPath String - The object path specified in the SAPOData flow source.
- paginationConfig FlowSapoDataPaginationConfig
- parallelismConfig FlowSapoDataParallelismConfig
- objectPath string - The object path specified in the SAPOData flow source.
- paginationConfig FlowSapoDataPaginationConfig
- parallelismConfig FlowSapoDataParallelismConfig
- object_path str - The object path specified in the SAPOData flow source.
- pagination_config FlowSapoDataPaginationConfig
- parallelism_config FlowSapoDataParallelismConfig
- objectPath String - The object path specified in the SAPOData flow source.
- paginationConfig Property Map
- parallelismConfig Property Map
FlowScheduledTriggerProperties
- ScheduleExpression string - The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- DataPullMode Pulumi.AwsNative.AppFlow.FlowScheduledTriggerPropertiesDataPullMode - Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- FirstExecutionFrom double - Specifies the date range for the records to import from the connector in the first flow run.
- FlowErrorDeactivationThreshold int - Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- ScheduleEndTime double - The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- ScheduleOffset double - Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- ScheduleStartTime double - The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- TimeZone string - Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York timezone are -04:00 EDT and -05:00 EST.
- ScheduleExpression string - The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- DataPullMode FlowScheduledTriggerPropertiesDataPullMode - Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- FirstExecutionFrom float64 - Specifies the date range for the records to import from the connector in the first flow run.
- FlowErrorDeactivationThreshold int - Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- ScheduleEndTime float64 - The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- ScheduleOffset float64 - Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- ScheduleStartTime float64 - The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- TimeZone string - Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York timezone are -04:00 EDT and -05:00 EST.
- scheduleExpression String - The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- dataPullMode FlowScheduledTriggerPropertiesDataPullMode - Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- firstExecutionFrom Double - Specifies the date range for the records to import from the connector in the first flow run.
- flowErrorDeactivationThreshold Integer - Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- scheduleEndTime Double - The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- scheduleOffset Double - Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- scheduleStartTime Double - The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- timeZone String - Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York timezone are -04:00 EDT and -05:00 EST.
- scheduleExpression string - The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- dataPullMode FlowScheduledTriggerPropertiesDataPullMode - Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- firstExecutionFrom number - Specifies the date range for the records to import from the connector in the first flow run.
- flowErrorDeactivationThreshold number - Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- scheduleEndTime number - The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- scheduleOffset number - Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- scheduleStartTime number - The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- timeZone string - Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York timezone are -04:00 EDT and -05:00 EST.
- schedule_expression str - The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- data_pull_mode FlowScheduledTriggerPropertiesDataPullMode - Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- first_execution_from float - Specifies the date range for the records to import from the connector in the first flow run.
- flow_error_deactivation_threshold int - Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- schedule_end_time float - The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- schedule_offset float - Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- schedule_start_time float - The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- time_zone str - Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York timezone are -04:00 EDT and -05:00 EST.
- scheduleExpression String - The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- dataPullMode "Incremental" | "Complete" - Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- firstExecutionFrom Number - Specifies the date range for the records to import from the connector in the first flow run.
- flowErrorDeactivationThreshold Number - Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- scheduleEndTime Number - The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- scheduleOffset Number - Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- scheduleStartTime Number - The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- timeZone String - Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York timezone are -04:00 EDT and -05:00 EST.
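Since timeZone is descriptive only, the timestamps themselves must carry the UTC offset. A minimal Python sketch of producing such a timestamp (the helper name is hypothetical and is not part of the AppFlow API; it assumes the standard-library zoneinfo database is available):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def schedule_start_timestamp(year: int, month: int, day: int, hour: int,
                             tz: str = "America/New_York") -> str:
    """Render a wall-clock time in the given zone as an ISO 8601 timestamp
    with an explicit UTC offset, which is what AppFlow actually interprets."""
    return datetime(year, month, day, hour, tzinfo=ZoneInfo(tz)).isoformat()

# Late April in America/New_York is EDT, so the offset comes out as -04:00:
print(schedule_start_timestamp(2022, 4, 26, 13))  # 2022-04-26T13:00:00-04:00
```

A mid-January date in the same zone would render with -05:00 (EST), matching the offsets quoted above.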
FlowScheduledTriggerPropertiesDataPullMode
FlowServiceNowConnectorOperator
FlowServiceNowSourceProperties
- Object string
- The object specified in the ServiceNow flow source.
- Object string
- The object specified in the ServiceNow flow source.
- object String
- The object specified in the ServiceNow flow source.
- object string
- The object specified in the ServiceNow flow source.
- object str
- The object specified in the ServiceNow flow source.
- object String
- The object specified in the ServiceNow flow source.
FlowSingularConnectorOperator
FlowSingularSourceProperties
- Object string
- The object specified in the Singular flow source.
- Object string
- The object specified in the Singular flow source.
- object String
- The object specified in the Singular flow source.
- object string
- The object specified in the Singular flow source.
- object str
- The object specified in the Singular flow source.
- object String
- The object specified in the Singular flow source.
FlowSlackConnectorOperator
FlowSlackSourceProperties
- Object string
- The object specified in the Slack flow source.
- Object string
- The object specified in the Slack flow source.
- object String
- The object specified in the Slack flow source.
- object string
- The object specified in the Slack flow source.
- object str
- The object specified in the Slack flow source.
- object String
- The object specified in the Slack flow source.
FlowSnowflakeDestinationProperties
- IntermediateBucketName string - The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- Object string - The object specified in the Snowflake flow destination.
- BucketPrefix string - The object key for the destination bucket in which Amazon AppFlow places the files.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IntermediateBucketName string - The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- Object string - The object specified in the Snowflake flow destination.
- BucketPrefix string - The object key for the destination bucket in which Amazon AppFlow places the files.
- ErrorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediateBucketName String - The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- object String - The object specified in the Snowflake flow destination.
- bucketPrefix String - The object key for the destination bucket in which Amazon AppFlow places the files.
- errorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediateBucketName string - The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- object string - The object specified in the Snowflake flow destination.
- bucketPrefix string - The object key for the destination bucket in which Amazon AppFlow places the files.
- errorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediate_bucket_name str - The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- object str - The object specified in the Snowflake flow destination.
- bucket_prefix str - The object key for the destination bucket in which Amazon AppFlow places the files.
- error_handling_config FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediateBucketName String - The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- object String - The object specified in the Snowflake flow destination.
- bucketPrefix String - The object key for the destination bucket in which Amazon AppFlow places the files.
- errorHandlingConfig Property Map - The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
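A plain-dict sketch of the Snowflake destination shape, purely illustrative: the helper name and bucket value are hypothetical, the top-level keys mirror the table above, and the nested errorHandlingConfig field name is an assumption about that type's shape, not confirmed by this page.

```python
def snowflake_destination_props(object_name, intermediate_bucket,
                                bucket_prefix=None, fail_on_first_error=True):
    """Illustrative sketch of FlowSnowflakeDestinationProperties.
    AppFlow stages files in the intermediate S3 bucket before loading
    them into the Snowflake object."""
    props = {
        "object": object_name,
        "intermediateBucketName": intermediate_bucket,
        # Assumed shape: stop after the first insertion error, or keep going.
        "errorHandlingConfig": {"failOnFirstError": fail_on_first_error},
    }
    if bucket_prefix is not None:
        props["bucketPrefix"] = bucket_prefix  # optional S3 object key prefix
    return props

props = snowflake_destination_props("Account", "appflow-staging-example")
```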
FlowSourceConnectorProperties
- Amplitude Pulumi.AwsNative.AppFlow.Inputs.FlowAmplitudeSourceProperties - Specifies the information that is required for querying Amplitude.
- CustomConnector Pulumi.AwsNative.AppFlow.Inputs.FlowCustomConnectorSourceProperties - The properties that are applied when the custom connector is being used as a source.
- Datadog Pulumi.AwsNative.AppFlow.Inputs.FlowDatadogSourceProperties - Specifies the information that is required for querying Datadog.
- Dynatrace Pulumi.AwsNative.AppFlow.Inputs.FlowDynatraceSourceProperties - Specifies the information that is required for querying Dynatrace.
- GoogleAnalytics Pulumi.AwsNative.AppFlow.Inputs.FlowGoogleAnalyticsSourceProperties - Specifies the information that is required for querying Google Analytics.
- InforNexus Pulumi.AwsNative.AppFlow.Inputs.FlowInforNexusSourceProperties - Specifies the information that is required for querying Infor Nexus.
- Marketo Pulumi.AwsNative.AppFlow.Inputs.FlowMarketoSourceProperties - Specifies the information that is required for querying Marketo.
- Pardot Pulumi.AwsNative.AppFlow.Inputs.FlowPardotSourceProperties - Specifies the information that is required for querying Salesforce Pardot.
- S3 Pulumi.AwsNative.AppFlow.Inputs.FlowS3SourceProperties - Specifies the information that is required for querying Amazon S3.
- Salesforce Pulumi.AwsNative.AppFlow.Inputs.FlowSalesforceSourceProperties - Specifies the information that is required for querying Salesforce.
- SapoData Pulumi.AwsNative.AppFlow.Inputs.FlowSapoDataSourceProperties - The properties that are applied when using SAPOData as a flow source.
- ServiceNow Pulumi.AwsNative.AppFlow.Inputs.FlowServiceNowSourceProperties - Specifies the information that is required for querying ServiceNow.
- Singular Pulumi.AwsNative.AppFlow.Inputs.FlowSingularSourceProperties - Specifies the information that is required for querying Singular.
- Slack Pulumi.AwsNative.AppFlow.Inputs.FlowSlackSourceProperties - Specifies the information that is required for querying Slack.
- Trendmicro Pulumi.AwsNative.AppFlow.Inputs.FlowTrendmicroSourceProperties - Specifies the information that is required for querying Trend Micro.
- Veeva Pulumi.AwsNative.AppFlow.Inputs.FlowVeevaSourceProperties - Specifies the information that is required for querying Veeva.
- Zendesk Pulumi.AwsNative.AppFlow.Inputs.FlowZendeskSourceProperties - Specifies the information that is required for querying Zendesk.
- Amplitude FlowAmplitudeSourceProperties - Specifies the information that is required for querying Amplitude.
- CustomConnector FlowCustomConnectorSourceProperties - The properties that are applied when the custom connector is being used as a source.
- Datadog FlowDatadogSourceProperties - Specifies the information that is required for querying Datadog.
- Dynatrace FlowDynatraceSourceProperties - Specifies the information that is required for querying Dynatrace.
- GoogleAnalytics FlowGoogleAnalyticsSourceProperties - Specifies the information that is required for querying Google Analytics.
- InforNexus FlowInforNexusSourceProperties - Specifies the information that is required for querying Infor Nexus.
- Marketo FlowMarketoSourceProperties - Specifies the information that is required for querying Marketo.
- Pardot FlowPardotSourceProperties - Specifies the information that is required for querying Salesforce Pardot.
- S3 FlowS3SourceProperties - Specifies the information that is required for querying Amazon S3.
- Salesforce FlowSalesforceSourceProperties - Specifies the information that is required for querying Salesforce.
- SapoData FlowSapoDataSourceProperties - The properties that are applied when using SAPOData as a flow source.
- ServiceNow FlowServiceNowSourceProperties - Specifies the information that is required for querying ServiceNow.
- Singular FlowSingularSourceProperties - Specifies the information that is required for querying Singular.
- Slack FlowSlackSourceProperties - Specifies the information that is required for querying Slack.
- Trendmicro FlowTrendmicroSourceProperties - Specifies the information that is required for querying Trend Micro.
- Veeva FlowVeevaSourceProperties - Specifies the information that is required for querying Veeva.
- Zendesk FlowZendeskSourceProperties - Specifies the information that is required for querying Zendesk.
- amplitude FlowAmplitudeSourceProperties - Specifies the information that is required for querying Amplitude.
- customConnector FlowCustomConnectorSourceProperties - The properties that are applied when the custom connector is being used as a source.
- datadog FlowDatadogSourceProperties - Specifies the information that is required for querying Datadog.
- dynatrace FlowDynatraceSourceProperties - Specifies the information that is required for querying Dynatrace.
- googleAnalytics FlowGoogleAnalyticsSourceProperties - Specifies the information that is required for querying Google Analytics.
- inforNexus FlowInforNexusSourceProperties - Specifies the information that is required for querying Infor Nexus.
- marketo FlowMarketoSourceProperties - Specifies the information that is required for querying Marketo.
- pardot FlowPardotSourceProperties - Specifies the information that is required for querying Salesforce Pardot.
- s3 FlowS3SourceProperties - Specifies the information that is required for querying Amazon S3.
- salesforce FlowSalesforceSourceProperties - Specifies the information that is required for querying Salesforce.
- sapoData FlowSapoDataSourceProperties - The properties that are applied when using SAPOData as a flow source.
- serviceNow FlowServiceNowSourceProperties - Specifies the information that is required for querying ServiceNow.
- singular FlowSingularSourceProperties - Specifies the information that is required for querying Singular.
- slack FlowSlackSourceProperties - Specifies the information that is required for querying Slack.
- trendmicro FlowTrendmicroSourceProperties - Specifies the information that is required for querying Trend Micro.
- veeva FlowVeevaSourceProperties - Specifies the information that is required for querying Veeva.
- zendesk FlowZendeskSourceProperties - Specifies the information that is required for querying Zendesk.
- amplitude FlowAmplitudeSourceProperties - Specifies the information that is required for querying Amplitude.
- customConnector FlowCustomConnectorSourceProperties - The properties that are applied when the custom connector is being used as a source.
- datadog FlowDatadogSourceProperties - Specifies the information that is required for querying Datadog.
- dynatrace FlowDynatraceSourceProperties - Specifies the information that is required for querying Dynatrace.
- googleAnalytics FlowGoogleAnalyticsSourceProperties - Specifies the information that is required for querying Google Analytics.
- inforNexus FlowInforNexusSourceProperties - Specifies the information that is required for querying Infor Nexus.
- marketo FlowMarketoSourceProperties - Specifies the information that is required for querying Marketo.
- pardot FlowPardotSourceProperties - Specifies the information that is required for querying Salesforce Pardot.
- s3 FlowS3SourceProperties - Specifies the information that is required for querying Amazon S3.
- salesforce FlowSalesforceSourceProperties - Specifies the information that is required for querying Salesforce.
- sapoData FlowSapoDataSourceProperties - The properties that are applied when using SAPOData as a flow source.
- serviceNow FlowServiceNowSourceProperties - Specifies the information that is required for querying ServiceNow.
- singular FlowSingularSourceProperties - Specifies the information that is required for querying Singular.
- slack FlowSlackSourceProperties - Specifies the information that is required for querying Slack.
- trendmicro FlowTrendmicroSourceProperties - Specifies the information that is required for querying Trend Micro.
- veeva FlowVeevaSourceProperties - Specifies the information that is required for querying Veeva.
- zendesk FlowZendeskSourceProperties - Specifies the information that is required for querying Zendesk.
- amplitude FlowAmplitudeSourceProperties - Specifies the information that is required for querying Amplitude.
- custom_connector FlowCustomConnectorSourceProperties - The properties that are applied when the custom connector is being used as a source.
- datadog FlowDatadogSourceProperties - Specifies the information that is required for querying Datadog.
- dynatrace FlowDynatraceSourceProperties - Specifies the information that is required for querying Dynatrace.
- google_analytics FlowGoogleAnalyticsSourceProperties - Specifies the information that is required for querying Google Analytics.
- infor_nexus FlowInforNexusSourceProperties - Specifies the information that is required for querying Infor Nexus.
- marketo FlowMarketoSourceProperties - Specifies the information that is required for querying Marketo.
- pardot FlowPardotSourceProperties - Specifies the information that is required for querying Salesforce Pardot.
- s3 FlowS3SourceProperties - Specifies the information that is required for querying Amazon S3.
- salesforce FlowSalesforceSourceProperties - Specifies the information that is required for querying Salesforce.
- sapo_data FlowSapoDataSourceProperties - The properties that are applied when using SAPOData as a flow source.
- service_now FlowServiceNowSourceProperties - Specifies the information that is required for querying ServiceNow.
- singular FlowSingularSourceProperties - Specifies the information that is required for querying Singular.
- slack FlowSlackSourceProperties - Specifies the information that is required for querying Slack.
- trendmicro FlowTrendmicroSourceProperties - Specifies the information that is required for querying Trend Micro.
- veeva FlowVeevaSourceProperties - Specifies the information that is required for querying Veeva.
- zendesk FlowZendeskSourceProperties - Specifies the information that is required for querying Zendesk.
- amplitude Property Map - Specifies the information that is required for querying Amplitude.
- customConnector Property Map - The properties that are applied when the custom connector is being used as a source.
- datadog Property Map - Specifies the information that is required for querying Datadog.
- dynatrace Property Map - Specifies the information that is required for querying Dynatrace.
- googleAnalytics Property Map - Specifies the information that is required for querying Google Analytics.
- inforNexus Property Map - Specifies the information that is required for querying Infor Nexus.
- marketo Property Map - Specifies the information that is required for querying Marketo.
- pardot Property Map - Specifies the information that is required for querying Salesforce Pardot.
- s3 Property Map - Specifies the information that is required for querying Amazon S3.
- salesforce Property Map - Specifies the information that is required for querying Salesforce.
- sapoData Property Map - The properties that are applied when using SAPOData as a flow source.
- serviceNow Property Map - Specifies the information that is required for querying ServiceNow.
- singular Property Map - Specifies the information that is required for querying Singular.
- slack Property Map - Specifies the information that is required for querying Slack.
- trendmicro Property Map - Specifies the information that is required for querying Trend Micro.
- veeva Property Map - Specifies the information that is required for querying Veeva.
- zendesk Property Map - Specifies the information that is required for querying Zendesk.
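As a sketch of how these properties fit together (plain Python dictionaries standing in for the SDK's input classes, with a hypothetical Salesforce object name), exactly one of the connector keys above is populated for a given flow source:

```python
# Sketch only: this dict mirrors the FlowSourceConnectorProperties shape
# listed above; it is not the real pulumi_aws_native input class.
# "Account" is a hypothetical Salesforce object name.
source_connector_properties = {
    "salesforce": {"object": "Account"},
}

# Exactly one source connector key is populated per flow source.
populated = [name for name, props in source_connector_properties.items() if props]
assert populated == ["salesforce"]
```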
FlowSourceFlowConfig
- ConnectorType Pulumi.AwsNative.AppFlow.FlowConnectorType - Type of source connector
- SourceConnectorProperties Pulumi.AwsNative.AppFlow.Inputs.FlowSourceConnectorProperties - Source connector details required to query a connector
- ApiVersion string - The API version that the connector uses.
- ConnectorProfileName string - Name of source connector profile
- IncrementalPullConfig Pulumi.AwsNative.AppFlow.Inputs.FlowIncrementalPullConfig - Configuration for scheduled incremental data pull
- ConnectorType FlowConnectorType - Type of source connector
- SourceConnectorProperties FlowSourceConnectorProperties - Source connector details required to query a connector
- ApiVersion string - The API version that the connector uses.
- ConnectorProfileName string - Name of source connector profile
- IncrementalPullConfig FlowIncrementalPullConfig - Configuration for scheduled incremental data pull
- connectorType FlowConnectorType - Type of source connector
- sourceConnectorProperties FlowSourceConnectorProperties - Source connector details required to query a connector
- apiVersion String - The API version that the connector uses.
- connectorProfileName String - Name of source connector profile
- incrementalPullConfig FlowIncrementalPullConfig - Configuration for scheduled incremental data pull
- connectorType FlowConnectorType - Type of source connector
- sourceConnectorProperties FlowSourceConnectorProperties - Source connector details required to query a connector
- apiVersion string - The API version that the connector uses.
- connectorProfileName string - Name of source connector profile
- incrementalPullConfig FlowIncrementalPullConfig - Configuration for scheduled incremental data pull
- connector_type FlowConnectorType - Type of source connector
- source_connector_properties FlowSourceConnectorProperties - Source connector details required to query a connector
- api_version str - The API version that the connector uses.
- connector_profile_name str - Name of source connector profile
- incremental_pull_config FlowIncrementalPullConfig - Configuration for scheduled incremental data pull
- connectorType "SAPOData" | "Salesforce" | "Pardot" | "Singular" | "Slack" | "Redshift" | "S3" | "Marketo" | "Googleanalytics" | "Zendesk" | "Servicenow" | "Datadog" | "Trendmicro" | "Snowflake" | "Dynatrace" | "Infornexus" | "Amplitude" | "Veeva" | "CustomConnector" | "EventBridge" | "Upsolver" | "LookoutMetrics" - Type of source connector
- sourceConnectorProperties Property Map - Source connector details required to query a connector
- apiVersion String - The API version that the connector uses.
- connectorProfileName String - Name of source connector profile
- incrementalPullConfig Property Map - Configuration for scheduled incremental data pull
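A hedged sketch of a complete source flow configuration (plain dicts, not the real SDK classes; the profile name and object are hypothetical):

```python
# Sketch: a FlowSourceFlowConfig-shaped dict. The connector type selects
# which key inside source_connector_properties is populated.
source_flow_config = {
    "connector_type": "Salesforce",
    "connector_profile_name": "my-salesforce-profile",  # hypothetical
    "source_connector_properties": {"salesforce": {"object": "Account"}},
}

# The connector type should agree with the populated source properties key.
assert source_flow_config["connector_type"].lower() \
    in source_flow_config["source_connector_properties"]
```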
FlowStatus
FlowSuccessResponseHandlingConfig
- BucketName string - The name of the Amazon S3 bucket.
- BucketPrefix string - The Amazon S3 bucket prefix.
- BucketName string - The name of the Amazon S3 bucket.
- BucketPrefix string - The Amazon S3 bucket prefix.
- bucketName String - The name of the Amazon S3 bucket.
- bucketPrefix String - The Amazon S3 bucket prefix.
- bucketName string - The name of the Amazon S3 bucket.
- bucketPrefix string - The Amazon S3 bucket prefix.
- bucket_name str - The name of the Amazon S3 bucket.
- bucket_prefix str - The Amazon S3 bucket prefix.
- bucketName String - The name of the Amazon S3 bucket.
- bucketPrefix String - The Amazon S3 bucket prefix.
FlowTask
- SourceFields List<string> - Source fields on which particular task will be applied
- TaskType Pulumi.AwsNative.AppFlow.FlowTaskType - Type of task
- ConnectorOperator Pulumi.AwsNative.AppFlow.Inputs.FlowConnectorOperator - Operation to be performed on provided source fields
- DestinationField string - A field value on which source field should be validated
- TaskProperties List<Pulumi.AwsNative.AppFlow.Inputs.FlowTaskPropertiesObject> - A map used to store task-related info
- SourceFields []string - Source fields on which particular task will be applied
- TaskType FlowTaskType - Type of task
- ConnectorOperator FlowConnectorOperator - Operation to be performed on provided source fields
- DestinationField string - A field value on which source field should be validated
- TaskProperties []FlowTaskPropertiesObject - A map used to store task-related info
- sourceFields List<String> - Source fields on which particular task will be applied
- taskType FlowTaskType - Type of task
- connectorOperator FlowConnectorOperator - Operation to be performed on provided source fields
- destinationField String - A field value on which source field should be validated
- taskProperties List<FlowTaskPropertiesObject> - A map used to store task-related info
- sourceFields string[] - Source fields on which particular task will be applied
- taskType FlowTaskType - Type of task
- connectorOperator FlowConnectorOperator - Operation to be performed on provided source fields
- destinationField string - A field value on which source field should be validated
- taskProperties FlowTaskPropertiesObject[] - A map used to store task-related info
- source_fields Sequence[str] - Source fields on which particular task will be applied
- task_type FlowTaskType - Type of task
- connector_operator FlowConnectorOperator - Operation to be performed on provided source fields
- destination_field str - A field value on which source field should be validated
- task_properties Sequence[FlowTaskPropertiesObject] - A map used to store task-related info
- sourceFields List<String> - Source fields on which particular task will be applied
- taskType "Arithmetic" | "Filter" | "Map" | "Map_all" | "Mask" | "Merge" | "Passthrough" | "Truncate" | "Validate" | "Partition" - Type of task
- connectorOperator Property Map - Operation to be performed on provided source fields
- destinationField String - A field value on which source field should be validated
- taskProperties List<Property Map> - A map used to store task-related info
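A hedged sketch of a single mapping task (plain dicts mirroring the FlowTask shape above; the field names are hypothetical, and the property keys come from the FlowOperatorPropertiesKeys enum listed below):

```python
# Sketch: one "Map" task. task_properties is a list of key/value pairs,
# not a dict, matching the FlowTaskPropertiesObject shape.
task = {
    "source_fields": ["Id"],
    "task_type": "Map",
    "destination_field": "Id",
    "task_properties": [
        {"key": "SOURCE_DATA_TYPE", "value": "id"},
        {"key": "DESTINATION_DATA_TYPE", "value": "id"},
    ],
}

# Collapse the key/value list into an ordinary dict for easy lookup.
props = {p["key"]: p["value"] for p in task["task_properties"]}
assert props["SOURCE_DATA_TYPE"] == "id"
```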
FlowTaskPropertiesObject
- Key Pulumi.AwsNative.AppFlow.FlowOperatorPropertiesKeys - The task property key.
- Value string - The task property value.
- Key FlowOperatorPropertiesKeys - The task property key.
- Value string - The task property value.
- key FlowOperatorPropertiesKeys - The task property key.
- value String - The task property value.
- key FlowOperatorPropertiesKeys - The task property key.
- value string - The task property value.
- key FlowOperatorPropertiesKeys - The task property key.
- value str - The task property value.
- key "VALUE" | "VALUES" | "DATA_TYPE" | "UPPER_BOUND" | "LOWER_BOUND" | "SOURCE_DATA_TYPE" | "DESTINATION_DATA_TYPE" | "VALIDATION_ACTION" | "MASK_VALUE" | "MASK_LENGTH" | "TRUNCATE_LENGTH" | "MATH_OPERATION_FIELDS_ORDER" | "CONCAT_FORMAT" | "SUBFIELD_CATEGORY_MAP" | "EXCLUDE_SOURCE_FIELDS_LIST" | "INCLUDE_NEW_FIELDS" | "ORDERED_PARTITION_KEYS_LIST" - The task property key.
- value String - The task property value.
FlowTaskType
FlowTrendmicroConnectorOperator
FlowTrendmicroSourceProperties
- Object string - The object specified in the Trend Micro flow source.
- Object string - The object specified in the Trend Micro flow source.
- object String - The object specified in the Trend Micro flow source.
- object string - The object specified in the Trend Micro flow source.
- object str - The object specified in the Trend Micro flow source.
- object String - The object specified in the Trend Micro flow source.
FlowTriggerConfig
- TriggerType Pulumi.AwsNative.AppFlow.FlowTriggerType - Trigger type of the flow
- TriggerProperties Pulumi.AwsNative.AppFlow.Inputs.FlowScheduledTriggerProperties - Details required based on the type of trigger
- TriggerType FlowTriggerType - Trigger type of the flow
- TriggerProperties FlowScheduledTriggerProperties - Details required based on the type of trigger
- triggerType FlowTriggerType - Trigger type of the flow
- triggerProperties FlowScheduledTriggerProperties - Details required based on the type of trigger
- triggerType FlowTriggerType - Trigger type of the flow
- triggerProperties FlowScheduledTriggerProperties - Details required based on the type of trigger
- trigger_type FlowTriggerType - Trigger type of the flow
- trigger_properties FlowScheduledTriggerProperties - Details required based on the type of trigger
- triggerType "Scheduled" | "Event" | "OnDemand" - Trigger type of the flow
- triggerProperties Property Map - Details required based on the type of trigger
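A hedged sketch of a scheduled trigger (plain dicts mirroring FlowTriggerConfig; the schedule expression is a hypothetical example of AppFlow's rate syntax):

```python
# Sketch: a "Scheduled" trigger. trigger_properties holds the
# FlowScheduledTriggerProperties details; OnDemand flows omit it.
trigger_config = {
    "trigger_type": "Scheduled",
    "trigger_properties": {
        "schedule_expression": "rate(1hours)",  # hypothetical expression
    },
}

# Scheduled and Event triggers carry extra details; OnDemand does not.
needs_properties = trigger_config["trigger_type"] != "OnDemand"
assert needs_properties
```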
FlowTriggerType
FlowUpsolverDestinationProperties
- BucketName string - The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- S3OutputFormatConfig Pulumi.AwsNative.AppFlow.Inputs.FlowUpsolverS3OutputFormatConfig - The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- BucketPrefix string - The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
- BucketName string - The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- S3OutputFormatConfig FlowUpsolverS3OutputFormatConfig - The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- BucketPrefix string - The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
- bucketName String - The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- s3OutputFormatConfig FlowUpsolverS3OutputFormatConfig - The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- bucketPrefix String - The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
- bucketName string - The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- s3OutputFormatConfig FlowUpsolverS3OutputFormatConfig - The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- bucketPrefix string - The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
- bucket_name str - The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- s3_output_format_config FlowUpsolverS3OutputFormatConfig - The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- bucket_prefix str - The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
- bucketName String - The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- s3OutputFormatConfig Property Map - The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- bucketPrefix String - The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
FlowUpsolverS3OutputFormatConfig
- PrefixConfig Pulumi.AwsNative.AppFlow.Inputs.FlowPrefixConfig - Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- AggregationConfig Pulumi.AwsNative.AppFlow.Inputs.FlowAggregationConfig - The aggregation settings that you can use to customize the output format of your flow data.
- FileType Pulumi.AwsNative.AppFlow.FlowFileType - Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
- PrefixConfig FlowPrefixConfig - Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- AggregationConfig FlowAggregationConfig - The aggregation settings that you can use to customize the output format of your flow data.
- FileType FlowFileType - Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
- prefixConfig FlowPrefixConfig - Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- aggregationConfig FlowAggregationConfig - The aggregation settings that you can use to customize the output format of your flow data.
- fileType FlowFileType - Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
- prefixConfig FlowPrefixConfig - Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- aggregationConfig FlowAggregationConfig - The aggregation settings that you can use to customize the output format of your flow data.
- fileType FlowFileType - Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
- prefix_config FlowPrefixConfig - Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- aggregation_config FlowAggregationConfig - The aggregation settings that you can use to customize the output format of your flow data.
- file_type FlowFileType - Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
- prefixConfig Property Map - Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- aggregationConfig Property Map - The aggregation settings that you can use to customize the output format of your flow data.
- fileType "CSV" | "JSON" | "PARQUET" - Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
FlowVeevaConnectorOperator
FlowVeevaSourceProperties
- Object string - The object specified in the Veeva flow source.
- DocumentType string - The document type specified in the Veeva document extract flow.
- IncludeAllVersions bool - Boolean value to include all versions of files in the Veeva document extract flow.
- IncludeRenditions bool - Boolean value to include file renditions in the Veeva document extract flow.
- IncludeSourceFiles bool - Boolean value to include source files in the Veeva document extract flow.
- Object string - The object specified in the Veeva flow source.
- DocumentType string - The document type specified in the Veeva document extract flow.
- IncludeAllVersions bool - Boolean value to include all versions of files in the Veeva document extract flow.
- IncludeRenditions bool - Boolean value to include file renditions in the Veeva document extract flow.
- IncludeSourceFiles bool - Boolean value to include source files in the Veeva document extract flow.
- object String - The object specified in the Veeva flow source.
- documentType String - The document type specified in the Veeva document extract flow.
- includeAllVersions Boolean - Boolean value to include all versions of files in the Veeva document extract flow.
- includeRenditions Boolean - Boolean value to include file renditions in the Veeva document extract flow.
- includeSourceFiles Boolean - Boolean value to include source files in the Veeva document extract flow.
- object string - The object specified in the Veeva flow source.
- documentType string - The document type specified in the Veeva document extract flow.
- includeAllVersions boolean - Boolean value to include all versions of files in the Veeva document extract flow.
- includeRenditions boolean - Boolean value to include file renditions in the Veeva document extract flow.
- includeSourceFiles boolean - Boolean value to include source files in the Veeva document extract flow.
- object str - The object specified in the Veeva flow source.
- document_type str - The document type specified in the Veeva document extract flow.
- include_all_versions bool - Boolean value to include all versions of files in the Veeva document extract flow.
- include_renditions bool - Boolean value to include file renditions in the Veeva document extract flow.
- include_source_files bool - Boolean value to include source files in the Veeva document extract flow.
- object String - The object specified in the Veeva flow source.
- documentType String - The document type specified in the Veeva document extract flow.
- includeAllVersions Boolean - Boolean value to include all versions of files in the Veeva document extract flow.
- includeRenditions Boolean - Boolean value to include file renditions in the Veeva document extract flow.
- includeSourceFiles Boolean - Boolean value to include source files in the Veeva document extract flow.
FlowWriteOperationType
FlowZendeskConnectorOperator
FlowZendeskDestinationProperties
- Object string - The object specified in the Zendesk flow destination.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames List<string> - List of fields used as ID when performing a write operation.
- WriteOperationType Pulumi.AwsNative.AppFlow.FlowWriteOperationType - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- Object string - The object specified in the Zendesk flow destination.
- ErrorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames []string - List of fields used as ID when performing a write operation.
- WriteOperationType FlowWriteOperationType - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- object String - The object specified in the Zendesk flow destination.
- errorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String> - List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- object string - The object specified in the Zendesk flow destination.
- errorHandlingConfig FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames string[] - List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- object str - The object specified in the Zendesk flow destination.
- error_handling_config FlowErrorHandlingConfig - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- id_field_names Sequence[str] - List of fields used as ID when performing a write operation.
- write_operation_type FlowWriteOperationType - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- object String - The object specified in the Zendesk flow destination.
- errorHandlingConfig Property Map - The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String> - List of fields used as ID when performing a write operation.
- writeOperationType "INSERT" | "UPSERT" | "UPDATE" | "DELETE" - The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
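A hedged sketch of the write-operation default described above (plain dicts, not the real SDK classes; "tickets" is a hypothetical Zendesk object name):

```python
# Sketch: a Zendesk destination that omits write_operation_type.
zendesk_destination = {
    "object": "tickets",        # hypothetical Zendesk object
    "id_field_names": ["id"],   # fields used as ID for write operations
    # write_operation_type intentionally omitted
}

# Per the schema above, an omitted write operation defaults to INSERT.
write_op = zendesk_destination.get("write_operation_type", "INSERT")
assert write_op == "INSERT"
```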
FlowZendeskSourceProperties
- Object string - The object specified in the Zendesk flow source.
- Object string - The object specified in the Zendesk flow source.
- object String - The object specified in the Zendesk flow source.
- object string - The object specified in the Zendesk flow source.
- object str - The object specified in the Zendesk flow source.
- object String - The object specified in the Zendesk flow source.
Tag
Package Details
- Repository
- AWS Native pulumi/pulumi-aws-native
- License
- Apache-2.0