
Confluent

Overview

Confluent offers a fully managed cloud-native Apache Kafka service (Confluent Cloud) as well as an enterprise-grade distribution for Apache Kafka (Confluent Platform).

Setting up Confluent

1. Add integration

Select Confluent from the integrations page.

2. Configure settings

Fill out the form with the following settings:

| Setting | Required | Description |
| --- | --- | --- |
| Name | TRUE | Name that will be displayed to users when selecting this integration in Superblocks |
| Brokers | TRUE | Comma-separated list of broker endpoints |
| SASL Mechanism | TRUE | Authentication mechanism. Choose from PLAIN, SCRAM-SHA-256, or SCRAM-SHA-512. |
| Username | TRUE | Username to connect to the broker |
| Password | TRUE | Password for the broker username |
| Enable SSL | FALSE | Connect via SSL if selected (SSL encryption should always be used when the SASL mechanism is PLAIN) |
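
For reference, these form fields correspond to standard Kafka client properties. The sketch below shows how a hypothetical set of values maps onto a confluent-kafka Python client configuration; the broker endpoint and credentials are placeholders, not values from your cluster.

```python
# Hypothetical values for illustration only; substitute your own cluster details.
kafka_settings = {
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # Brokers (comma-separated)
    "sasl.mechanism": "PLAIN",                                            # SASL Mechanism
    "sasl.username": "<API_KEY>",                                         # Username
    "sasl.password": "<API_SECRET>",                                      # Password
    "security.protocol": "SASL_SSL",                                      # Enable SSL (recommended with PLAIN)
}
```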

3. Test and save

Click Test Connection to check that Superblocks can connect to the data source.

info

If using Superblocks Cloud, add these Superblocks IPs to your allowlist (this is not necessary for the On-Premise Agent).

After connecting successfully, click Create to save the integration.

4. Set profiles

Optionally, configure different profiles for separate development environments.

success

Confluent Connected

You can now consume and produce messages through Confluent in any Application, Workflow, or Scheduled Job.

Use Confluent in APIs

Connect to your Confluent integration from Superblocks by creating steps in Application backend APIs, Workflows, and Scheduled Jobs. A Confluent step can perform a Consume or Produce action.
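
As a rough mental model, a Produce step behaves like a standard Kafka producer writing a record to a topic. The sketch below uses the confluent-kafka Python client with hypothetical topic, endpoint, and credential values; it illustrates the underlying operation, not Superblocks' internal implementation.

```python
from confluent_kafka import Producer

# Hypothetical connection details; reuse the settings configured for the integration.
producer = Producer({
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

# Write one record to a hypothetical "orders" topic, then block until it is delivered.
producer.produce("orders", key="order-123", value='{"status": "created"}')
producer.flush()
```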

info

For Kafka-based integrations, Consume steps must be added inside a Stream block, whereas Produce steps cannot be. For general concepts around streaming in Superblocks, see Streaming Applications.
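
For comparison, a Consume action corresponds to a standard Kafka consumer polling a topic in a loop, which is why it must live inside a Stream block. The sketch below is a minimal, hypothetical confluent-kafka consumer loop (topic name, group ID, and credentials are placeholders), again not the Superblocks implementation.

```python
from confluent_kafka import Consumer

# Hypothetical connection details; group.id identifies this consumer group to Kafka.
consumer = Consumer({
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
    "group.id": "superblocks-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1 second for the next record
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        print(f"{msg.topic()} [{msg.partition()}] @ {msg.offset()}: {msg.value()}")
finally:
    consumer.close()
```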

Consume from topic