Apache Kafka is an open-source distributed event streaming platform. The project aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds.

Setting up Kafka

1. Add integration

Select Kafka from the integrations page.

2. Configure settings

Fill out the form with the following settings:
| Setting | Description |
| --- | --- |
| Name | Name that will be displayed to users when selecting this integration in Superblocks |
| Brokers | Comma-separated list of broker endpoints |
| SASL Mechanism | Authentication mechanism. Choose between PLAIN, SCRAM-SHA-256, SCRAM-SHA-512, and AWS. |
| Username | Username to connect to the broker |
| Password | Password for the broker username |
| Enable SSL | Connect via SSL if selected (SSL encryption should always be used when the SASL mechanism is PLAIN) |
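For reference, the same form fields map directly onto a standard Kafka client configuration. A minimal sketch of that mapping, using kafka-python option names (the broker addresses, username, and password below are placeholders, not values from this guide):

```python
# Sketch: how the settings above map to Kafka client options.
# All concrete values here are placeholders.
config = {
    "bootstrap_servers": [                  # Brokers (comma-separated in the form)
        "broker1.example.com:9092",
        "broker2.example.com:9092",
    ],
    "security_protocol": "SASL_SSL",        # Enable SSL checked + SASL auth
    "sasl_mechanism": "SCRAM-SHA-512",      # SASL Mechanism
    "sasl_plain_username": "my-user",       # Username
    "sasl_plain_password": "my-password",   # Password
}

# With the kafka-python package installed, this config could be passed
# straight to a client, e.g.:
# from kafka import KafkaConsumer
# consumer = KafkaConsumer("my-topic", **config)
```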

3. Test and save

Click Test Connection to check that Superblocks can connect to the data source.
If using Superblocks Cloud, add these Superblocks IPs to your allowlist (not necessary for Hybrid or Cloud-Prem deployments).
After connecting successfully, click Create to save the integration.
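If Test Connection fails, a quick local reachability check against each broker endpoint can help separate network or allowlist problems from credential problems. A small stdlib-only sketch (the endpoints are placeholders):

```python
# Sketch: parse the comma-separated Brokers setting and check TCP reachability.
import socket

def parse_brokers(brokers: str) -> list[tuple[str, int]]:
    """Split the comma-separated Brokers setting into (host, port) pairs."""
    pairs = []
    for endpoint in brokers.split(","):
        host, _, port = endpoint.strip().rpartition(":")
        pairs.append((host, int(port)))
    return pairs

def reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to the endpoint succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example usage (placeholder endpoints):
# for host, port in parse_brokers("broker1.example.com:9092,broker2.example.com:9092"):
#     print(host, port, "reachable" if reachable(host, port) else "unreachable")
```

A successful TCP connection does not prove the credentials are valid, but a refused or timed-out connection points at firewalls or allowlists rather than SASL settings.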

4. Set profiles

Optionally, configure different profiles for separate development environments.

Kafka connected!
You can now consume and produce messages through Kafka in any Application, Workflow, or Scheduled Job.
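Messages produced through the integration are ordinary Kafka records, so payloads are typically serialized to bytes, commonly as UTF-8 JSON. A minimal sketch of that serialization step (the topic name and broker address in the comments are placeholders):

```python
# Sketch: JSON value serialization as a Kafka producer would apply it.
import json

def serialize(message: dict) -> bytes:
    """Encode a message payload as UTF-8 JSON bytes."""
    return json.dumps(message).encode("utf-8")

# With the kafka-python package installed, this could be wired into a producer:
# from kafka import KafkaProducer
# producer = KafkaProducer(
#     bootstrap_servers="broker1.example.com:9092",  # placeholder
#     value_serializer=serialize,
# )
# producer.send("orders", {"id": 1, "status": "created"})  # placeholder topic
# producer.flush()
```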