Confluent
Overview
Confluent offers a fully managed, cloud-native Apache Kafka service (Confluent Cloud) as well as an enterprise-grade distribution of Apache Kafka (Confluent Platform).
Setting up Confluent
1. Add integration
Select Confluent from the integrations page.
2. Configure settings
Fill out the form with the following settings:
Setting | Required | Description |
---|---|---|
Name | TRUE | Name that will be displayed to users when selecting this integration in Superblocks |
Brokers | TRUE | Comma-separated list of Confluent Cloud bootstrap servers |
SASL Mechanism | TRUE | SASL authentication mechanism to use when connecting (Confluent Cloud API keys use PLAIN) |
Username | TRUE | Confluent Cloud API key. Learn how to create an API key. |
Password | TRUE | Secret value for the API key provided as the username. |
Enable SSL | FALSE | Connect via SSL if selected (SSL encryption should always be used if the SASL mechanism is PLAIN) |
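As an illustration, a Brokers value for a Confluent Cloud cluster typically looks like `pkc-abc12.us-east-1.aws.confluent.cloud:9092` (the hostname here is a placeholder; use the bootstrap server shown in your cluster's settings). Separate multiple brokers with commas.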
3. Test and save
Click Test Connection to check that Superblocks can connect to the data source.
If using Superblocks Cloud, add the Superblocks IPs to your allowlist (not necessary for the On-Premise Agent).
After connecting successfully, click Create to save the integration.
4. Set profiles
Optionally, configure different profiles for separate development environments.
Confluent Connected
You can now consume and produce messages through Confluent in any Application, Workflow, or Scheduled Job.
Use Confluent in APIs
Once your Confluent integration is created, you can start creating steps in Application backend APIs, Workflows, and Scheduled Jobs. Confluent steps can be used to either Consume or Produce.
Learn more about building internal tools with streaming in our Streaming Applications guide.
Consume
To consume data from Confluent:
- Add a Stream block to your API
- Add a block for your new Confluent integration with the action set to Consume
- Set the topic to consume from, how you want to consume, and any advanced settings desired
- Optionally, configure Process steps to process each message read off the stream (see the sketch below)
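The following is a minimal sketch of the kind of logic a Process step might apply to each consumed record. It assumes the record is exposed to the step as a `message` object with Kafka-style fields, and the payload shape (`orderId`, `amount`) is purely hypothetical; adjust both to match how your Stream block actually binds messages.

```typescript
// Hypothetical shape of a consumed record; field names and payload are assumptions.
interface ConsumedRecord {
  topic: string;
  partition: number;
  key?: string;
  value: { orderId: string; amount: number }; // example payload shape
  timestamp: number; // milliseconds since epoch
}

// Drop low-value orders and reshape the rest for downstream steps.
function processRecord(message: ConsumedRecord) {
  if (message.value.amount < 100) {
    return null;
  }
  return {
    id: message.value.orderId,
    amount: message.value.amount,
    receivedAt: new Date(message.timestamp).toISOString(),
  };
}
```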
Produce
Write data to Confluent just by adding a step to your API using the Produce action.
Superblocks supports JSON-formatted messages that follow the schema below. Schema Registries are currently unsupported.
Key | Required | Description |
---|---|---|
topic | True | The topic this record will be sent to. |
value | True | Record contents. |
partition | False | The partition that the record should be sent to. |
key | False | Key used to deterministically map messages to a partition based on the hash of the key. |
timestamp | False | The timestamp of the record, in milliseconds since epoch. |
headers | False | Headers to be included with the record, sent as an Object with key-value pairs. |
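For illustration, a record passed to a Produce step might look like the following. The field names match the schema above; the topic, payload, key, and header values are placeholders for your own data.

```typescript
// Illustrative Produce record; only `topic` and `value` are required.
const record = {
  topic: "orders",                          // required: destination topic
  value: { orderId: "1234", amount: 250 },  // required: record contents
  partition: 0,                             // optional: explicit partition
  key: "customer-42",                       // optional: partitioning key (hashed)
  timestamp: Date.now(),                    // optional: ms since epoch
  headers: { source: "superblocks" },       // optional: key-value pairs
};
```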