Uses of Interface
org.apache.camel.builder.endpoint.dsl.KafkaEndpointBuilderFactory.KafkaEndpointProducerBuilder
Packages that use KafkaEndpointBuilderFactory.KafkaEndpointProducerBuilder:
    org.apache.camel.builder.endpoint.dsl
Uses of KafkaEndpointBuilderFactory.KafkaEndpointProducerBuilder in org.apache.camel.builder.endpoint.dsl
Subinterfaces of KafkaEndpointBuilderFactory.KafkaEndpointProducerBuilder in org.apache.camel.builder.endpoint.dsl

static interface  KafkaEndpointBuilderFactory.KafkaEndpointBuilder
    Builder for endpoint for the Kafka component.

Methods in org.apache.camel.builder.endpoint.dsl that return KafkaEndpointBuilderFactory.KafkaEndpointProducerBuilder

All methods below are declared on KafkaEndpointBuilderFactory.KafkaEndpointProducerBuilder unless noted otherwise. Overloads sharing a description are grouped.

additionalProperties(String key, Object value)
additionalProperties(Map values)
    Sets additional properties for either the kafka consumer or kafka producer in case they can't be set directly on the camel configurations (e.g. new Kafka properties that are not reflected yet in Camel configurations); the properties have to be prefixed with additionalProperties..

KafkaEndpointBuilderFactory.AdvancedKafkaEndpointProducerBuilder.basic()

batchWithIndividualHeaders(boolean batchWithIndividualHeaders)
batchWithIndividualHeaders(String batchWithIndividualHeaders)
    If this feature is enabled and a single element of a batch is an Exchange or Message, the producer will generate individual kafka header values for it by using the batch Message to determine the values.

brokers(String brokers)
    URL of the Kafka brokers to use.

bufferMemorySize(Integer bufferMemorySize)
bufferMemorySize(String bufferMemorySize)
    The total bytes of memory the producer can use to buffer records waiting to be sent to the server.

clientId(String clientId)
    The client id is a user-specified string sent in each request to help trace calls.

compressionCodec(String compressionCodec)
    This parameter allows you to specify the compression codec for all data generated by this producer.

connectionMaxIdleMs(Integer connectionMaxIdleMs)
connectionMaxIdleMs(String connectionMaxIdleMs)
    Close idle connections after the number of milliseconds specified by this config.

deliveryTimeoutMs(Integer deliveryTimeoutMs)
deliveryTimeoutMs(String deliveryTimeoutMs)
    An upper bound on the time to report success or failure after a call to send() returns.

enableIdempotence(boolean enableIdempotence)
enableIdempotence(String enableIdempotence)
    When set to 'true', the producer will ensure that exactly one copy of each message is written in the stream.

headerFilterStrategy(String headerFilterStrategy)
headerFilterStrategy(org.apache.camel.spi.HeaderFilterStrategy headerFilterStrategy)
    To use a custom HeaderFilterStrategy to filter headers to and from the Camel message.

headerSerializer(String headerSerializer)
headerSerializer(org.apache.camel.component.kafka.serde.KafkaHeaderSerializer headerSerializer)
    To use a custom KafkaHeaderSerializer to serialize kafka header values.

interceptorClasses(String interceptorClasses)
    Sets interceptors for producers or consumers.

kerberosBeforeReloginMinTime(Integer kerberosBeforeReloginMinTime)
kerberosBeforeReloginMinTime(String kerberosBeforeReloginMinTime)
    Login thread sleep time between refresh attempts.

kerberosConfigLocation(String kerberosConfigLocation)
    Location of the kerberos config file.

kerberosInitCmd(String kerberosInitCmd)
    Kerberos kinit command path.

kerberosPrincipalToLocalRules(String kerberosPrincipalToLocalRules)
    A list of rules for mapping from principal names to short names (typically operating system usernames).

kerberosRenewJitter(Double kerberosRenewJitter)
kerberosRenewJitter(String kerberosRenewJitter)
    Percentage of random jitter added to the renewal time.

kerberosRenewWindowFactor(Double kerberosRenewWindowFactor)
kerberosRenewWindowFactor(String kerberosRenewWindowFactor)
    The login thread will sleep until the specified window factor of time from the last refresh to the ticket's expiry has been reached, at which time it will try to renew the ticket.

key(String key)
    The record key (or null if no key is specified).

keySerializer(String keySerializer)
    The serializer class for keys (defaults to the same as for messages if nothing is given).

lingerMs(Integer lingerMs)
lingerMs(String lingerMs)
    The producer groups together any records that arrive in between request transmissions into a single batched request.

maxBlockMs(Integer maxBlockMs)
maxBlockMs(String maxBlockMs)
    This configuration controls how long the KafkaProducer's send(), partitionsFor(), initTransactions(), sendOffsetsToTransaction(), commitTransaction() and abortTransaction() methods will block.

maxInFlightRequest(Integer maxInFlightRequest)
maxInFlightRequest(String maxInFlightRequest)
    The maximum number of unacknowledged requests the client will send on a single connection before blocking.

maxRequestSize(Integer maxRequestSize)
maxRequestSize(String maxRequestSize)
    The maximum size of a request.

metadataMaxAgeMs(Integer metadataMaxAgeMs)
metadataMaxAgeMs(String metadataMaxAgeMs)
    The period of time in milliseconds after which we force a refresh of metadata even if we haven't seen any partition leadership changes, to proactively discover any new brokers or partitions.

metricReporters(String metricReporters)
    A list of classes to use as metrics reporters.

metricsSampleWindowMs(Integer metricsSampleWindowMs)
metricsSampleWindowMs(String metricsSampleWindowMs)
    The window of time a metrics sample is computed over.

noOfMetricsSample(Integer noOfMetricsSample)
noOfMetricsSample(String noOfMetricsSample)
    The number of samples maintained to compute metrics.

partitioner(String partitioner)
    The partitioner class for partitioning messages amongst sub-topics.

partitionerIgnoreKeys(boolean partitionerIgnoreKeys)
partitionerIgnoreKeys(String partitionerIgnoreKeys)
    Whether the message keys should be ignored when computing the partition.

partitionKey(Integer partitionKey)
partitionKey(String partitionKey)
    The partition to which the record will be sent (or null if no partition was specified).

producerBatchSize(Integer producerBatchSize)
producerBatchSize(String producerBatchSize)
    The producer will attempt to batch records together into fewer requests whenever multiple records are being sent to the same partition.

queueBufferingMaxMessages(Integer queueBufferingMaxMessages)
queueBufferingMaxMessages(String queueBufferingMaxMessages)
    The maximum number of unsent messages that can be queued up in the producer when using async mode before either the producer must be blocked or data must be dropped.

receiveBufferBytes(Integer receiveBufferBytes)
receiveBufferBytes(String receiveBufferBytes)
    The size of the TCP receive buffer (SO_RCVBUF) to use when reading data.

reconnectBackoffMaxMs(Integer reconnectBackoffMaxMs)
reconnectBackoffMaxMs(String reconnectBackoffMaxMs)
    The maximum amount of time in milliseconds to wait when reconnecting to a broker that has repeatedly failed to connect.

reconnectBackoffMs(Integer reconnectBackoffMs)
reconnectBackoffMs(String reconnectBackoffMs)
    The amount of time to wait before attempting to reconnect to a given host.

recordMetadata(boolean recordMetadata)
recordMetadata(String recordMetadata)
    Whether the producer should store the RecordMetadata results from sending to Kafka.

requestRequiredAcks(String requestRequiredAcks)
    The number of acknowledgments the producer requires the leader to have received before considering a request complete.

requestTimeoutMs(Integer requestTimeoutMs)
requestTimeoutMs(String requestTimeoutMs)
    The amount of time the broker will wait trying to meet the request.required.acks requirement before sending back an error to the client.

retries(Integer retries)
retries(String retries)
    Setting a value greater than zero will cause the client to resend any record whose send fails with a potentially transient error.

retryBackoffMs(Integer retryBackoffMs)
retryBackoffMs(String retryBackoffMs)
    Before each retry, the producer refreshes the metadata of relevant topics to see if a new leader has been elected.

saslJaasConfig(String saslJaasConfig)
    Expose the kafka sasl.jaas.config parameter. Example: org.apache.kafka.common.security.plain.PlainLoginModule required username=USERNAME password=PASSWORD;.

saslKerberosServiceName(String saslKerberosServiceName)
    The Kerberos principal name that Kafka runs as.

saslMechanism(String saslMechanism)
    The Simple Authentication and Security Layer (SASL) mechanism used.

schemaRegistryURL(String schemaRegistryURL)
    URL of the schema registry servers to use.

securityProtocol(String securityProtocol)
    Protocol used to communicate with brokers.

sendBufferBytes(Integer sendBufferBytes)
sendBufferBytes(String sendBufferBytes)
    Socket write buffer size.

shutdownTimeout(int shutdownTimeout)
shutdownTimeout(String shutdownTimeout)
    Timeout in milliseconds to wait gracefully for the consumer or producer to shut down and terminate its worker threads.

sslCipherSuites(String sslCipherSuites)
    A list of cipher suites.

sslContextParameters(String sslContextParameters)
sslContextParameters(org.apache.camel.support.jsse.SSLContextParameters sslContextParameters)
    SSL configuration using a Camel SSLContextParameters object.

sslEnabledProtocols(String sslEnabledProtocols)
    The list of protocols enabled for SSL connections.

sslEndpointAlgorithm(String sslEndpointAlgorithm)
    The endpoint identification algorithm used to validate the server hostname against the server certificate.

sslKeymanagerAlgorithm(String sslKeymanagerAlgorithm)
    The algorithm used by the key manager factory for SSL connections.

sslKeyPassword(String sslKeyPassword)
    The password of the private key in the key store file or the PEM key specified in sslKeystoreKey.

sslKeystoreLocation(String sslKeystoreLocation)
    The location of the key store file.

sslKeystorePassword(String sslKeystorePassword)
    The store password for the key store file.

sslKeystoreType(String sslKeystoreType)
    The file format of the key store file.

sslProtocol(String sslProtocol)
    The SSL protocol used to generate the SSLContext.

sslProvider(String sslProvider)
    The name of the security provider used for SSL connections.

sslTrustmanagerAlgorithm(String sslTrustmanagerAlgorithm)
    The algorithm used by the trust manager factory for SSL connections.

sslTruststoreLocation(String sslTruststoreLocation)
    The location of the trust store file.

sslTruststorePassword(String sslTruststorePassword)
    The password for the trust store file.

sslTruststoreType(String sslTruststoreType)
    The file format of the trust store file.

valueSerializer(String valueSerializer)
    The serializer class for messages.

workerPool(String workerPool)
workerPool(ExecutorService workerPool)
    To use a custom worker pool for continuing to route the Exchange after the kafka server has acknowledged the message sent to it from the KafkaProducer, using asynchronous non-blocking processing.

workerPoolCoreSize(Integer workerPoolCoreSize)
workerPoolCoreSize(String workerPoolCoreSize)
    Number of core threads for the worker pool that continues routing the Exchange after the kafka server has acknowledged the message sent to it from the KafkaProducer, using asynchronous non-blocking processing.

workerPoolMaxSize(Integer workerPoolMaxSize)
workerPoolMaxSize(String workerPoolMaxSize)
    Maximum number of threads for the worker pool that continues routing the Exchange after the kafka server has acknowledged the message sent to it from the KafkaProducer, using asynchronous non-blocking processing.
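The builder methods above are meant to be chained fluently from the type-safe endpoint DSL inside a route. The following is a minimal sketch, not a definitive implementation: it assumes camel-endpointdsl and camel-kafka are on the classpath, and the topic name, broker address, and client id are illustrative placeholders.

```java
import org.apache.camel.builder.endpoint.EndpointRouteBuilder;

// Sketch: a route that sends a message to Kafka every 5 seconds,
// configuring the producer through KafkaEndpointProducerBuilder methods.
public class KafkaProducerRoute extends EndpointRouteBuilder {
    @Override
    public void configure() {
        from(timer("tick").period(5000))
            .setBody(constant("hello"))
            .to(kafka("my-topic")                // returns a Kafka endpoint builder
                .brokers("localhost:9092")       // URL of the Kafka brokers to use
                .clientId("camel-producer")      // trace string sent in each request
                .lingerMs(5)                     // group records arriving within 5 ms
                .retries(3)                      // resend on transient send failures
                .enableIdempotence(true));       // exactly one copy per message
    }
}
```

Each builder method corresponds to an endpoint URI query parameter of the same name, so the endpoint above is equivalent to the plain URI form `kafka:my-topic?brokers=localhost:9092&clientId=camel-producer&lingerMs=5&retries=3&enableIdempotence=true`.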