Class KafkaConnectSources
Method Summary
static StreamSource&lt;org.apache.kafka.connect.source.SourceRecord&gt;
    connect(Properties properties)
        A generic Kafka Connect source that provides the ability to plug any Kafka Connect source into a Jet pipeline for data ingestion.

static &lt;T&gt; StreamSource&lt;T&gt;
    connect(Properties properties, FunctionEx&lt;org.apache.kafka.connect.source.SourceRecord, T&gt; projectionFn)
        A generic Kafka Connect source that provides the ability to plug any Kafka Connect source into a Jet pipeline for data ingestion.

static &lt;T&gt; StreamSource&lt;T&gt;
    connect(Properties properties, FunctionEx&lt;org.apache.kafka.connect.source.SourceRecord, T&gt; projectionFn, RetryStrategy retryStrategy)
        A generic Kafka Connect source that provides the ability to plug any Kafka Connect source into a Jet pipeline for data ingestion.
Method Details
connect

@Nonnull
public static &lt;T&gt; StreamSource&lt;T&gt; connect(@Nonnull Properties properties,
                                          @Nonnull FunctionEx&lt;org.apache.kafka.connect.source.SourceRecord, T&gt; projectionFn)

A generic Kafka Connect source that provides the ability to plug any Kafka Connect source into a Jet pipeline for data ingestion. You need to add the Kafka Connect connector JARs, or a ZIP file containing the JARs, as a job resource via JobConfig.addJar(URL) or JobConfig.addJarsInZip(URL) respectively.

After that you can use the Kafka Connect connector with the same configuration parameters you would use with Kafka. Hazelcast Jet drives the Kafka Connect connector from the pipeline, and the records become available to your pipeline as a stream of custom-type objects created by projectionFn.

In case of a failure, this source keeps track of the source partition offsets; it restores the partition offsets and resumes consumption from where it left off.

Hazelcast Jet instantiates tasks on a random cluster member and uses local parallelism for scaling. The property tasks.max is not allowed; use StreamStage.setLocalParallelism(int) in the pipeline instead. This limitation may be lifted in the future.

Parameters:
    properties - Kafka Connect properties
    projectionFn - function to create output objects from the Kafka SourceRecords. If the projection returns null for an item, that item is filtered out.
Returns:
    a source to use in Pipeline.readFrom(StreamSource)
Since:
    5.3
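As an illustrative sketch of this overload (the connector class, file path, topic name, and JAR path below are hypothetical, not part of the API), a pipeline can read from a Kafka Connect connector and project each SourceRecord to a String:

```java
import java.util.Properties;

import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.jet.config.JobConfig;
import com.hazelcast.jet.kafka.connect.KafkaConnectSources;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;

public class KafkaConnectProjectionExample {
    public static void main(String[] args) {
        // Hypothetical connector configuration; use the properties of the
        // actual Kafka Connect connector you deploy.
        Properties props = new Properties();
        props.setProperty("name", "file-source");
        props.setProperty("connector.class",
                "org.apache.kafka.connect.file.FileStreamSourceConnector");
        props.setProperty("file", "/tmp/input.txt");
        props.setProperty("topic", "lines");

        Pipeline pipeline = Pipeline.create();
        pipeline.readFrom(KafkaConnectSources.connect(props,
                        // Project each SourceRecord to its String value;
                        // returning null would filter the item out.
                        rec -> rec.value() == null ? null : rec.value().toString()))
                .withoutTimestamps()
                .setLocalParallelism(2) // tasks.max is not allowed; scale this way
                .writeTo(Sinks.logger());

        JobConfig jobConfig = new JobConfig();
        // The connector JAR must be shipped as a job resource (path is hypothetical):
        jobConfig.addJar("/opt/connectors/connect-file.jar");

        HazelcastInstance hz = Hazelcast.bootstrappedInstance();
        hz.getJet().newJob(pipeline, jobConfig).join();
    }
}
```

Running this requires a Hazelcast cluster and the connector JAR on the submitting side, so it is a sketch rather than a standalone program.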
connect

@Nonnull
public static &lt;T&gt; StreamSource&lt;T&gt; connect(@Nonnull Properties properties,
                                          @Nonnull FunctionEx&lt;org.apache.kafka.connect.source.SourceRecord, T&gt; projectionFn,
                                          @Nullable RetryStrategy retryStrategy)

A generic Kafka Connect source that provides the ability to plug any Kafka Connect source into a Jet pipeline for data ingestion. You need to add the Kafka Connect connector JARs, or a ZIP file containing the JARs, as a job resource via JobConfig.addJar(URL) or JobConfig.addJarsInZip(URL) respectively.

After that you can use the Kafka Connect connector with the same configuration parameters you would use with Kafka. Hazelcast Jet drives the Kafka Connect connector from the pipeline, and the records become available to your pipeline as a stream of custom-type objects created by projectionFn.

In case of a failure, this source keeps track of the source partition offsets; it restores the partition offsets and resumes consumption from where it left off.

Hazelcast Jet instantiates tasks on a random cluster member and uses local parallelism for scaling. The property tasks.max is not allowed; use StreamStage.setLocalParallelism(int) in the pipeline instead. This limitation may be lifted in the future.

Parameters:
    properties - Kafka Connect properties
    projectionFn - function to create output objects from the Kafka SourceRecords. If the projection returns null for an item, that item is filtered out.
    retryStrategy - strategy used to perform reconnection retries after the connection is lost. You may use RetryStrategies to provide a custom strategy. By default, SourceConnectorWrapper.DEFAULT_RECONNECT_BEHAVIOR is used.
Returns:
    a source to use in Pipeline.readFrom(StreamSource)
Since:
    5.4
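A sketch of supplying a custom RetryStrategy via RetryStrategies, as this overload allows (the connector class and its properties below are hypothetical):

```java
import java.util.Properties;

import com.hazelcast.jet.kafka.connect.KafkaConnectSources;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.retry.IntervalFunction;
import com.hazelcast.jet.retry.RetryStrategies;
import com.hazelcast.jet.retry.RetryStrategy;

public class KafkaConnectRetryExample {
    public static void main(String[] args) {
        // Hypothetical connector configuration; substitute your connector's properties.
        Properties props = new Properties();
        props.setProperty("name", "my-source");
        props.setProperty("connector.class", "com.example.MySourceConnector");

        // Reconnect up to 10 times, with exponential backoff starting at 200 ms.
        RetryStrategy retry = RetryStrategies.custom()
                .maxAttempts(10)
                .intervalFunction(IntervalFunction.exponentialBackoff(200, 2.0))
                .build();

        Pipeline pipeline = Pipeline.create();
        pipeline.readFrom(KafkaConnectSources.connect(props,
                        rec -> String.valueOf(rec.value()), retry))
                .withoutTimestamps()
                .writeTo(Sinks.logger());
        // Submit with a JobConfig carrying the connector JARs, as with the
        // other overloads.
    }
}
```

Passing null for retryStrategy falls back to SourceConnectorWrapper.DEFAULT_RECONNECT_BEHAVIOR.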
connect

@Nonnull
public static StreamSource&lt;org.apache.kafka.connect.source.SourceRecord&gt; connect(@Nonnull Properties properties)

A generic Kafka Connect source that provides the ability to plug any Kafka Connect source into a Jet pipeline for data ingestion. You need to add the Kafka Connect connector JARs, or a ZIP file containing the JARs, as a job resource via JobConfig.addJar(URL) or JobConfig.addJarsInZip(URL) respectively.

After that you can use the Kafka Connect connector with the same configuration parameters you would use with Kafka. Hazelcast Jet drives the Kafka Connect connector from the pipeline, and the records become available to your pipeline as SourceRecords.

In case of a failure, this source keeps track of the source partition offsets; it restores the partition offsets and resumes consumption from where it left off.

Hazelcast Jet instantiates tasks on a random cluster member and uses local parallelism for scaling. The property tasks.max is not allowed; use StreamStage.setLocalParallelism(int) in the pipeline instead.

Parameters:
    properties - Kafka Connect properties
Returns:
    a source to use in Pipeline.readFrom(StreamSource)
Since:
    5.3
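A sketch of the no-projection overload, whose stream carries raw SourceRecords, together with shipping the connector JARs as a ZIP (the connector class and ZIP path are hypothetical):

```java
import java.util.Properties;

import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.jet.config.JobConfig;
import com.hazelcast.jet.kafka.connect.KafkaConnectSources;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;

public class KafkaConnectRawExample {
    public static void main(String[] args) {
        // Hypothetical connector configuration.
        Properties props = new Properties();
        props.setProperty("name", "my-source");
        props.setProperty("connector.class", "com.example.MySourceConnector");

        Pipeline pipeline = Pipeline.create();
        // Without a projection function the stream items are SourceRecords;
        // map them early to a simpler type.
        pipeline.readFrom(KafkaConnectSources.connect(props))
                .withoutTimestamps()
                .map(rec -> rec.topic() + ": " + rec.value())
                .writeTo(Sinks.logger());

        JobConfig jobConfig = new JobConfig();
        // Ship all connector JARs in one ZIP (path is hypothetical):
        jobConfig.addJarsInZip("/opt/connectors/my-connector.zip");

        HazelcastInstance hz = Hazelcast.bootstrappedInstance();
        hz.getJet().newJob(pipeline, jobConfig).join();
    }
}
```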