
Confluent
Overview
Confluent offers a fully managed, cloud-native Apache Kafka service (Confluent Cloud) as well as an enterprise-grade distribution of Apache Kafka (Confluent Platform).
Setting up Confluent
1. Add integration
Select Confluent from the integrations page.
2. Configure settings
Fill out the form with the following settings:
| Setting | Required | Description |
|---|---|---|
| Name | Yes | Name displayed to users when selecting this integration in Superblocks |
| Brokers | Yes | Comma-separated list of broker endpoints |
| SASL Mechanism | Yes | Authentication mechanism. Choose between `PLAIN`, `SCRAM-SHA-256`, and `SCRAM-SHA-512` |
| Username | Yes | Username to connect to the brokers |
| Password | Yes | Password for the broker username |
| Enable SSL | No | Connect via SSL if selected (SSL encryption should always be used if the SASL mechanism is `PLAIN`, since `PLAIN` otherwise sends credentials in cleartext) |
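For reference, the form fields above correspond to standard Kafka client configuration. A minimal sketch of that mapping, assuming librdkafka-style key names (used by most Kafka clients); the broker endpoints and credentials below are placeholders, not values from this document:

```python
# Sketch: mapping the integration settings above onto librdkafka-style
# Kafka client configuration keys. All values are placeholders.

def to_client_config(brokers: str, mechanism: str, username: str,
                     password: str, enable_ssl: bool) -> dict:
    """Translate the integration form settings into a client config dict."""
    if mechanism == "PLAIN" and not enable_ssl:
        # Mirrors the note in the table: PLAIN sends credentials in
        # cleartext, so it should always be combined with SSL.
        raise ValueError("Enable SSL when using the PLAIN mechanism")
    return {
        "bootstrap.servers": brokers,          # "Brokers": comma-separated list
        "security.protocol": "SASL_SSL" if enable_ssl else "SASL_PLAINTEXT",
        "sasl.mechanism": mechanism,           # PLAIN / SCRAM-SHA-256 / SCRAM-SHA-512
        "sasl.username": username,             # "Username"
        "sasl.password": password,             # "Password"
    }

cfg = to_client_config("b-1.example.com:9092,b-2.example.com:9092",
                       "SCRAM-SHA-512", "user", "secret", enable_ssl=True)
```

Superblocks builds the equivalent configuration for you from the form; the sketch is only meant to show what each field controls.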
3. Test and save
Click Test Connection to check that Superblocks can connect to the data source.
Info: If using Superblocks Cloud, add these Superblocks IPs to your allowlist (not necessary for the On-Premise Agent).
After connecting successfully, click Create to save the integration.
4. Set profiles
Optionally, configure different profiles for separate development environments.
Success: Confluent connected. You can now consume and produce messages through Confluent in any Application, Workflow, or Scheduled Job.
Use Confluent in APIs
Connect to your Confluent integration from Superblocks by creating steps in Application backend APIs, Workflows, and Scheduled Jobs. A Confluent step can perform a Consume or Produce action.
Info: For Kafka-based integrations, Consume steps must be added inside a Stream block, whereas Produce steps cannot be. For general concepts around streaming in Superblocks, see Streaming Applications.
- Consume
- Produce
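Outside Superblocks, the two actions correspond to a standard Kafka client flow. A hedged sketch using the confluent-kafka Python package (broker address, topic, group id, and credentials are placeholders; the third-party import is deferred so the sketch stays loadable without the package installed):

```python
# Sketch: the two actions a Confluent step can perform, expressed with the
# confluent-kafka Python client. All connection values are placeholders.

BASE_CONFIG = {
    "bootstrap.servers": "broker.example.com:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "user",
    "sasl.password": "secret",
}

def produce_once(topic: str, value: bytes) -> None:
    """Produce: fire a single message and block until it is delivered."""
    from confluent_kafka import Producer  # deferred third-party import
    p = Producer(BASE_CONFIG)
    p.produce(topic, value=value)
    p.flush()

def consume_loop(topic: str, handle) -> None:
    """Consume: a long-running poll loop. This open-ended nature is why
    Consume steps live inside a Stream block in Superblocks."""
    from confluent_kafka import Consumer  # deferred third-party import
    c = Consumer({**BASE_CONFIG,
                  "group.id": "demo",
                  "auto.offset.reset": "earliest"})
    c.subscribe([topic])
    try:
        while True:
            msg = c.poll(timeout=1.0)
            if msg is None or msg.error():
                continue
            handle(msg.value())  # pass each message to the caller
    finally:
        c.close()
```

The Produce helper completes and returns, while the Consume helper runs until stopped, which matches the step placement rule described above.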

