Class KafkaConnectSources


  • public final class KafkaConnectSources
    extends java.lang.Object
    Contains factory methods to create a Kafka Connect source.
    • Method Summary

      Modifier and Type Method Description
      static StreamSource<org.apache.kafka.connect.source.SourceRecord> connect(java.util.Properties properties)
      A generic Kafka Connect source that lets you plug any Kafka Connect source into a Jet pipeline for data ingestion.
      static <T> StreamSource<T> connect(java.util.Properties properties, FunctionEx<org.apache.kafka.connect.source.SourceRecord,T> projectionFn)
      A generic Kafka Connect source that lets you plug any Kafka Connect source into a Jet pipeline for data ingestion.
      • Methods inherited from class java.lang.Object

        clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
    • Method Detail

      • connect

        @Nonnull
        public static <T> StreamSource<T> connect(@Nonnull
                                                  java.util.Properties properties,
                                                  @Nonnull
                                                  FunctionEx<org.apache.kafka.connect.source.SourceRecord,T> projectionFn)
        A generic Kafka Connect source that lets you plug any Kafka Connect source into a Jet pipeline for data ingestion.

        You need to add the Kafka Connect connector JARs, or a ZIP file containing the JARs, as a job resource via JobConfig.addJar(URL) or JobConfig.addJarsInZip(URL), respectively.

        After that you can use the Kafka Connect connector with the same configuration parameters as you would use with Kafka. Hazelcast Jet drives the Kafka Connect connector from the pipeline, and the records become available to your pipeline as a stream of custom objects created by projectionFn.

        This source keeps track of the source partition offsets. In case of a failure it restores the offsets and resumes consumption from where it left off.

        Hazelcast Jet instantiates the tasks on a random cluster member and uses local parallelism for scaling. The property tasks.max is not allowed; use StreamStage.setLocalParallelism(int) in the pipeline instead. This limitation may be lifted in the future.

        Parameters:
        properties - Kafka Connect properties
        projectionFn - function to create output objects from the Kafka SourceRecords. If the projection returns null for an item, that item is filtered out.
        Returns:
        a source to use in Pipeline.readFrom(StreamSource)
        Since:
        5.3
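The configuration and projection contract described above can be sketched with plain JDK types. This is a minimal sketch, not the real wiring: com.example.CustomSourceConnector is a hypothetical connector class ("name" and "connector.class" are standard Kafka Connect keys), and the plain Function below is a stand-in for the FunctionEx<SourceRecord, T> argument, illustrating the rule that returning null filters the item out.

```java
import java.util.Properties;
import java.util.function.Function;

public class ConnectSourceSketch {

    // Properties for a hypothetical connector. Note that "tasks.max" is
    // deliberately absent: this source rejects it, and scaling is
    // controlled with StreamStage.setLocalParallelism(int) instead.
    static Properties connectorProperties() {
        Properties props = new Properties();
        props.setProperty("name", "custom-source");                                // standard Kafka Connect key
        props.setProperty("connector.class", "com.example.CustomSourceConnector"); // hypothetical class
        return props;
    }

    // Stand-in for projectionFn: returning null for an item causes that
    // item to be filtered out of the resulting stream.
    static Function<String, String> projection() {
        return value -> (value == null || value.isEmpty()) ? null : value.toUpperCase();
    }
}
```

With the real API, the source would be created as KafkaConnectSources.connect(connectorProperties(), projectionFn) and passed to Pipeline.readFrom(StreamSource).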
      • connect

        @Nonnull
        public static StreamSource<org.apache.kafka.connect.source.SourceRecord> connect(@Nonnull
                                                                                         java.util.Properties properties)
        A generic Kafka Connect source that lets you plug any Kafka Connect source into a Jet pipeline for data ingestion.

        You need to add the Kafka Connect connector JARs, or a ZIP file containing the JARs, as a job resource via JobConfig.addJar(URL) or JobConfig.addJarsInZip(URL), respectively.

        After that you can use the Kafka Connect connector with the same configuration parameters as you would use with Kafka. Hazelcast Jet drives the Kafka Connect connector from the pipeline, and the records become available to your pipeline as SourceRecords.

        This source keeps track of the source partition offsets. In case of a failure it restores the offsets and resumes consumption from where it left off.

        Hazelcast Jet instantiates the tasks on a random cluster member and uses local parallelism for scaling. The property tasks.max is not allowed; use StreamStage.setLocalParallelism(int) in the pipeline instead.

        Parameters:
        properties - Kafka Connect properties
        Returns:
        a source to use in Pipeline.readFrom(StreamSource)
        Since:
        5.3
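The JAR-packaging step mentioned above (a ZIP file containing the connector JARs, passed to JobConfig.addJarsInZip(URL)) needs only the JDK. A minimal sketch of building such a ZIP, with hypothetical file names:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ConnectorZipBuilder {

    // Packs the given JAR files into a single ZIP at the top level,
    // suitable for JobConfig.addJarsInZip(zip.toUri().toURL()).
    static Path buildZip(Path zip, List<Path> jars) throws IOException {
        try (ZipOutputStream out = new ZipOutputStream(Files.newOutputStream(zip))) {
            for (Path jar : jars) {
                out.putNextEntry(new ZipEntry(jar.getFileName().toString()));
                Files.copy(jar, out);
                out.closeEntry();
            }
        }
        return zip;
    }
}
```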