Interface KafkaStreamingSourceOptions.Builder

  • Method Details

    • bootstrapServers

      KafkaStreamingSourceOptions.Builder bootstrapServers(String bootstrapServers)

      A list of bootstrap server URLs, for example, b-1.vpc-test-2.o4q88o.c6.kafka.us-east-1.amazonaws.com:9094. This option must be specified in the API call or defined in the table metadata in the Data Catalog.

      Parameters:
      bootstrapServers - A list of bootstrap server URLs, for example, b-1.vpc-test-2.o4q88o.c6.kafka.us-east-1.amazonaws.com:9094. This option must be specified in the API call or defined in the table metadata in the Data Catalog.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • securityProtocol

      KafkaStreamingSourceOptions.Builder securityProtocol(String securityProtocol)

      The protocol used to communicate with brokers. The possible values are "SSL" or "PLAINTEXT".

      Parameters:
      securityProtocol - The protocol used to communicate with brokers. The possible values are "SSL" or "PLAINTEXT".
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • connectionName

      KafkaStreamingSourceOptions.Builder connectionName(String connectionName)

      The name of the connection.

      Parameters:
      connectionName - The name of the connection.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • topicName

      KafkaStreamingSourceOptions.Builder topicName(String topicName)

      The topic name as specified in Apache Kafka. You must specify at least one of "topicName", "assign" or "subscribePattern".

      Parameters:
      topicName - The topic name as specified in Apache Kafka. You must specify at least one of "topicName", "assign" or "subscribePattern".
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • assign

      KafkaStreamingSourceOptions.Builder assign(String assign)

      The specific TopicPartitions to consume. You must specify at least one of "topicName", "assign" or "subscribePattern".

      Parameters:
      assign - The specific TopicPartitions to consume. You must specify at least one of "topicName", "assign" or "subscribePattern".
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • subscribePattern

      KafkaStreamingSourceOptions.Builder subscribePattern(String subscribePattern)

      A Java regex string that identifies the topic list to subscribe to. You must specify at least one of "topicName", "assign" or "subscribePattern".

      Parameters:
      subscribePattern - A Java regex string that identifies the topic list to subscribe to. You must specify at least one of "topicName", "assign" or "subscribePattern".
      Returns:
      Returns a reference to this object so that method calls can be chained together.
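      Because "subscribePattern" is interpreted as a standard Java regular expression, you can preview which topics a pattern would select using java.util.regex before wiring it into the source options. The pattern and topic names below are hypothetical and used only for illustration.

```java
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class SubscribePatternDemo {
    public static void main(String[] args) {
        // The same Java regex semantics apply to the subscribePattern option.
        Pattern pattern = Pattern.compile("orders-.*");

        // Hypothetical topic names, for illustration only.
        List<String> topics = List.of("orders-us", "orders-eu", "payments");

        List<String> matched = topics.stream()
                .filter(t -> pattern.matcher(t).matches())
                .collect(Collectors.toList());

        System.out.println(matched); // [orders-us, orders-eu]
    }
}
```

      Note that the pattern must match the entire topic name, not a substring: "orders" alone would not select "orders-us".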
    • classification

      KafkaStreamingSourceOptions.Builder classification(String classification)

      An optional classification.

      Parameters:
      classification - An optional classification.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • delimiter

      KafkaStreamingSourceOptions.Builder delimiter(String delimiter)

      Specifies the delimiter character.

      Parameters:
      delimiter - Specifies the delimiter character.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • startingOffsets

      KafkaStreamingSourceOptions.Builder startingOffsets(String startingOffsets)

      The starting position in the Kafka topic to read data from. The possible values are "earliest" or "latest". The default value is "latest".

      Parameters:
      startingOffsets - The starting position in the Kafka topic to read data from. The possible values are "earliest" or "latest". The default value is "latest".
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • endingOffsets

      KafkaStreamingSourceOptions.Builder endingOffsets(String endingOffsets)

      The end point at which a batch query ends. Possible values are either "latest" or a JSON string that specifies an ending offset for each TopicPartition.

      Parameters:
      endingOffsets - The end point at which a batch query ends. Possible values are either "latest" or a JSON string that specifies an ending offset for each TopicPartition.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
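      As a sketch of the per-TopicPartition JSON form, the helper below builds an "endingOffsets" string. The topic name is hypothetical, and the use of -1 to mean "latest" follows the Spark Structured Streaming Kafka source convention, which is an assumption here rather than something stated in this reference.

```java
public class EndingOffsetsDemo {
    // Build an endingOffsets JSON string with one offset per partition.
    // -1 conventionally means "latest" in the Spark Kafka source (assumption).
    static String endingOffsets(String topic, long... offsets) {
        StringBuilder sb = new StringBuilder("{\"").append(topic).append("\":{");
        for (int p = 0; p < offsets.length; p++) {
            if (p > 0) sb.append(",");
            sb.append("\"").append(p).append("\":").append(offsets[p]);
        }
        return sb.append("}}").toString();
    }

    public static void main(String[] args) {
        // Hypothetical topic with two partitions: stop at offset 23 on
        // partition 0, and read to the latest offset on partition 1.
        System.out.println(endingOffsets("test-topic", 23, -1));
        // {"test-topic":{"0":23,"1":-1}}
    }
}
```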
    • pollTimeoutMs

      KafkaStreamingSourceOptions.Builder pollTimeoutMs(Long pollTimeoutMs)

      The timeout in milliseconds to poll data from Kafka in Spark job executors. The default value is 512.

      Parameters:
      pollTimeoutMs - The timeout in milliseconds to poll data from Kafka in Spark job executors. The default value is 512.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • numRetries

      KafkaStreamingSourceOptions.Builder numRetries(Integer numRetries)

      The number of times to retry before failing to fetch Kafka offsets. The default value is 3.

      Parameters:
      numRetries - The number of times to retry before failing to fetch Kafka offsets. The default value is 3.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • retryIntervalMs

      KafkaStreamingSourceOptions.Builder retryIntervalMs(Long retryIntervalMs)

      The time in milliseconds to wait before retrying to fetch Kafka offsets. The default value is 10.

      Parameters:
      retryIntervalMs - The time in milliseconds to wait before retrying to fetch Kafka offsets. The default value is 10.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • maxOffsetsPerTrigger

      KafkaStreamingSourceOptions.Builder maxOffsetsPerTrigger(Long maxOffsetsPerTrigger)

      The rate limit on the maximum number of offsets that are processed per trigger interval. The specified total number of offsets is proportionally split across topicPartitions of different volumes. The default value is null, which means that the consumer reads all offsets until the known latest offset.

      Parameters:
      maxOffsetsPerTrigger - The rate limit on the maximum number of offsets that are processed per trigger interval. The specified total number of offsets is proportionally split across topicPartitions of different volumes. The default value is null, which means that the consumer reads all offsets until the known latest offset.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • minPartitions

      KafkaStreamingSourceOptions.Builder minPartitions(Integer minPartitions)

      The desired minimum number of partitions to read from Kafka. The default value is null, which means that the number of spark partitions is equal to the number of Kafka partitions.

      Parameters:
      minPartitions - The desired minimum number of partitions to read from Kafka. The default value is null, which means that the number of spark partitions is equal to the number of Kafka partitions.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
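      To make the "proportionally split" behavior of maxOffsetsPerTrigger concrete, here is a minimal sketch of dividing an offset cap across partitions in proportion to each partition's backlog. This illustrates the idea only, not Glue's actual allocation algorithm, and the backlog numbers are invented.

```java
public class OffsetSplitDemo {
    // Illustrative only: split a per-trigger offset cap across partitions
    // in proportion to each partition's backlog (not Glue's exact algorithm).
    static long[] split(long maxOffsets, long[] backlog) {
        long total = 0;
        for (long b : backlog) total += b;
        long[] share = new long[backlog.length];
        for (int i = 0; i < backlog.length; i++) {
            // A partition never receives more than it actually has buffered.
            share[i] = Math.min(backlog[i], maxOffsets * backlog[i] / total);
        }
        return share;
    }

    public static void main(String[] args) {
        // Backlogs of 300, 100, and 600 offsets with a cap of 500 per trigger:
        // each partition gets half of its backlog, preserving the proportions.
        long[] share = split(500, new long[]{300, 100, 600});
        System.out.println(java.util.Arrays.toString(share)); // [150, 50, 300]
    }
}
```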
    • includeHeaders

      KafkaStreamingSourceOptions.Builder includeHeaders(Boolean includeHeaders)

      Whether to include the Kafka headers. When the option is set to "true", the data output will contain an additional column named "glue_streaming_kafka_headers" with type Array[Struct(key: String, value: String)]. The default value is "false". This option is available in Glue version 3.0 or later only.

      Parameters:
      includeHeaders - Whether to include the Kafka headers. When the option is set to "true", the data output will contain an additional column named "glue_streaming_kafka_headers" with type Array[Struct(key: String, value: String)]. The default value is "false". This option is available in Glue version 3.0 or later only.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • addRecordTimestamp

      KafkaStreamingSourceOptions.Builder addRecordTimestamp(String addRecordTimestamp)

      When this option is set to 'true', the data output will contain an additional column named "__src_timestamp" that indicates the time when the corresponding record was received by the topic. The default value is 'false'. This option is supported in Glue version 4.0 or later.

      Parameters:
      addRecordTimestamp - When this option is set to 'true', the data output will contain an additional column named "__src_timestamp" that indicates the time when the corresponding record was received by the topic. The default value is 'false'. This option is supported in Glue version 4.0 or later.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • emitConsumerLagMetrics

      KafkaStreamingSourceOptions.Builder emitConsumerLagMetrics(String emitConsumerLagMetrics)

      When this option is set to 'true', for each batch it emits a metric to CloudWatch measuring the duration between the oldest record received by the topic and the time it arrives in Glue. The metric's name is "glue.driver.streaming.maxConsumerLagInMs". The default value is 'false'. This option is supported in Glue version 4.0 or later.

      Parameters:
      emitConsumerLagMetrics - When this option is set to 'true', for each batch it emits a metric to CloudWatch measuring the duration between the oldest record received by the topic and the time it arrives in Glue. The metric's name is "glue.driver.streaming.maxConsumerLagInMs". The default value is 'false'. This option is supported in Glue version 4.0 or later.
      Returns:
      Returns a reference to this object so that method calls can be chained together.
    • startingTimestamp

      KafkaStreamingSourceOptions.Builder startingTimestamp(Instant startingTimestamp)

      The timestamp of the record in the Kafka topic to start reading data from. The value must be a timestamp string in UTC format of the pattern yyyy-mm-ddTHH:MM:SSZ, where Z represents a UTC timezone offset with a +/- (for example, "2023-04-04T08:00:00+08:00").

      Only one of StartingTimestamp or StartingOffsets must be set.

      Parameters:
      startingTimestamp - The timestamp of the record in the Kafka topic to start reading data from. The value must be a timestamp string in UTC format of the pattern yyyy-mm-ddTHH:MM:SSZ, where Z represents a UTC timezone offset with a +/- (for example, "2023-04-04T08:00:00+08:00").

      Only one of StartingTimestamp or StartingOffsets must be set.

      Returns:
      Returns a reference to this object so that method calls can be chained together.
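      A sketch of producing a conformant timestamp string with java.time; the instant and the +08:00 offset are simply the example values from the description above.

```java
import java.time.Instant;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class StartingTimestampDemo {
    public static void main(String[] args) {
        // Render an Instant in the yyyy-mm-ddTHH:MM:SSZ pattern described
        // above, with the UTC offset written explicitly as +/-HH:MM.
        Instant instant = Instant.parse("2023-04-04T00:00:00Z");
        String formatted = OffsetDateTime
                .ofInstant(instant, ZoneOffset.of("+08:00"))
                .format(DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ssXXX"));
        System.out.println(formatted); // 2023-04-04T08:00:00+08:00
    }
}
```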