Kafka



Apache Kafka is an open-source distributed event streaming platform. The project aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds.

Setting up Kafka

1. Add integration

Select Kafka from the integrations page.

2. Configure settings

Fill out the form with the following settings:

| Setting | Required | Description |
| --- | --- | --- |
| Name | Yes | Name that will be displayed to users when selecting this integration in Superblocks |
| Brokers | Yes | Comma-separated list of broker endpoints |
| SASL Mechanism | Yes | Authentication mechanism. Choose between PLAIN, SCRAM-SHA-256, SCRAM-SHA-512, and AWS |
| Username | Yes | Username to connect to the broker |
| Password | Yes | Password for the broker username |
| Enable SSL | No | Connect via SSL if selected (SSL encryption should always be used if the SASL mechanism is PLAIN) |
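For orientation, the settings above map onto a standard Kafka client configuration. The following is a minimal sketch in plain Python; the helper and the config key names are illustrative assumptions, not Superblocks' internal API:

```python
# Hypothetical helper: translate the integration form fields into a
# kafka-client-style configuration dict. Key names are illustrative.

def build_kafka_config(brokers, sasl_mechanism, username, password, enable_ssl):
    """Build a client config from the integration settings."""
    return {
        # "Brokers" is a comma-separated list of host:port endpoints
        "bootstrap.servers": [b.strip() for b in brokers.split(",")],
        "sasl.mechanism": sasl_mechanism,
        "sasl.username": username,
        "sasl.password": password,
        # SSL should always be on when the SASL mechanism is PLAIN,
        # since PLAIN otherwise sends credentials unencrypted
        "security.protocol": "SASL_SSL" if enable_ssl else "SASL_PLAINTEXT",
    }

cfg = build_kafka_config(
    "broker1:9092, broker2:9092",
    "SCRAM-SHA-512",
    "svc-user",
    "s3cret",
    enable_ssl=True,
)
print(cfg["bootstrap.servers"])  # ['broker1:9092', 'broker2:9092']
```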

3. Test and save

Click Test Connection to check that Superblocks can connect to the data source.


If using Superblocks Cloud, add the Superblocks IPs to your allowlist (not necessary for the On-Premise Agent).

After connecting successfully, click Create to save the integration.

4. Set profiles

Optionally, configure different profiles for separate development environments.


Kafka connected. You can now consume and produce messages through Kafka in any Application, Workflow, or Scheduled Job.

Use Kafka in APIs

Connect to your Kafka integration from Superblocks by creating steps in Application backend APIs, Workflows, and Scheduled Jobs. A Kafka step can perform a Consume or Produce action.
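To make the two actions concrete, here is a small in-memory sketch in plain Python. It uses no real broker and no Superblocks API; it only illustrates the semantics a Consume or Produce step has against the topics on your configured brokers:

```python
from collections import defaultdict, deque

# In-memory stand-in for a Kafka cluster: topic name -> queue of messages.
# Purely illustrative; a real step talks to the configured brokers.
topics = defaultdict(deque)

def produce(topic, messages):
    """Produce action: append one or more messages to a topic."""
    for msg in messages:
        topics[topic].append(msg)

def consume(topic, max_messages=10):
    """Consume action: read up to max_messages waiting on a topic."""
    out = []
    while topics[topic] and len(out) < max_messages:
        out.append(topics[topic].popleft())
    return out

produce("orders", [{"id": 1}, {"id": 2}])
print(consume("orders"))  # [{'id': 1}, {'id': 2}]
```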


For Kafka-based integrations, Consume steps must be added inside a Stream block, whereas Produce steps cannot be. For general concepts around streaming in Superblocks, see Streaming Applications.
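The Stream-block requirement can be pictured as a loop that keeps polling the Consume step and hands each batch of messages to the downstream steps. A rough sketch in plain Python, with illustrative names only (a real Stream block runs until the stream is stopped):

```python
def stream_consume(poll, process, max_batches=3):
    """Sketch of a Stream block around a Consume step: repeatedly poll
    for messages (poll) and pass each non-empty batch to downstream
    steps (process). Stops here after max_batches or an empty poll,
    purely so the example terminates."""
    batches = 0
    while batches < max_batches:
        batch = poll()
        if not batch:
            break  # nothing waiting on the topic; a real block would keep polling
        process(batch)
        batches += 1

seen = []
pending = [["a", "b"], ["c"]]
stream_consume(lambda: pending.pop(0) if pending else [], seen.append)
print(seen)  # [['a', 'b'], ['c']]
```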
