Apache Kafka is an open-source distributed event streaming platform. The project aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds.
Setting up Kafka
1. Add integration
Select Kafka from the integrations page.
2. Configure settings
Fill out the form with the following settings:
| Setting | Description |
| --- | --- |
| Name | Name that will be displayed to users when selecting this integration in Superblocks |
| Brokers | Comma-separated list of broker endpoints |
| SASL Mechanism | Authentication mechanism. Choose between `PLAIN`, `SCRAM-SHA-256`, and `SCRAM-SHA-512` |
| Username | Username to connect to the broker |
| Password | Password for the broker username |
| Enable SSL | Connect via SSL if selected (SSL encryption should always be used if the SASL mechanism is `PLAIN`, since `PLAIN` sends credentials in cleartext) |
3. Test and save
Click Test Connection to check that Superblocks can connect to the data source.
After connecting successfully, click Create to save the integration.
4. Set profiles
Optionally, configure different profiles for separate development environments.
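Under the hood, the form fields above correspond to standard Kafka client connection properties. As a rough sketch of that mapping (using `kafka-python` parameter names as an assumption; the client Superblocks actually uses may differ):

```python
# Illustrative mapping of the integration form fields to common Kafka
# client connection settings. Parameter names follow kafka-python and
# are an assumption, not Superblocks' internal implementation.
def build_kafka_config(brokers, sasl_mechanism, username, password, enable_ssl):
    """Translate the integration form fields into a client config dict."""
    return {
        # "Brokers" field: comma-separated list of host:port endpoints
        "bootstrap_servers": [b.strip() for b in brokers.split(",")],
        # SASL over SSL when "Enable SSL" is checked, plain SASL otherwise
        "security_protocol": "SASL_SSL" if enable_ssl else "SASL_PLAINTEXT",
        "sasl_mechanism": sasl_mechanism,
        "sasl_plain_username": username,
        "sasl_plain_password": password,
    }

config = build_kafka_config(
    brokers="broker1.example.com:9092, broker2.example.com:9092",
    sasl_mechanism="PLAIN",
    username="svc-superblocks",
    password="secret",
    enable_ssl=True,
)
```

With profiles, you would typically keep one such set of settings per environment (for example, development brokers versus production brokers) and let the profile select which set is used.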
Use Kafka in APIs
Connect to your Kafka integration from Superblocks by creating steps in Application backend APIs, Workflows, and Scheduled Jobs. A Kafka step can perform a Consume or Produce action.
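Conceptually, a Produce step appends a message to the end of a topic, while a Consume step reads messages in offset order from where the consumer group last left off. A minimal in-memory sketch of these two actions (a toy model for illustration only, not the real Kafka client; all names here are hypothetical):

```python
class ToyTopic:
    """In-memory stand-in for a Kafka topic: an append-only log read by offset."""

    def __init__(self):
        self.log = []      # append-only message log
        self.offsets = {}  # consumer-group name -> next offset to read

    def produce(self, message):
        # A Produce step appends the message to the end of the topic log.
        self.log.append(message)

    def consume(self, group, max_messages=10):
        # A Consume step reads from the group's committed offset onward,
        # then advances (commits) that offset past what was read.
        start = self.offsets.get(group, 0)
        batch = self.log[start:start + max_messages]
        self.offsets[group] = start + len(batch)
        return batch

topic = ToyTopic()
topic.produce({"event": "signup", "user": "ada"})
topic.produce({"event": "login", "user": "ada"})
first = topic.consume("analytics")  # both messages, in production order
later = topic.consume("analytics")  # empty until something new is produced
```

Because each group tracks its own offset, two different consumer groups reading the same topic each receive the full stream independently, which mirrors how Kafka consumer groups behave.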