2.6.2 Install and configure your Kafka cluster
Download Apache Kafka
Go to the Apache Kafka downloads page and download the latest released version. Select the latest binary release, in this case 3.9.0. Your download will start.
Create a folder on your desktop named Kafka_AEP and place the downloaded file in that directory.
Open a Terminal window by right-clicking your folder and clicking New Terminal at Folder.
Run this command in your Terminal window to uncompress the downloaded file:
tar -xvf kafka_2.13-3.9.0.tgz
You'll then see this:
After uncompressing that file, you now have a directory like this one:
And in that directory, you'll see these subdirectories:
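If you prefer to verify this from the command line, you can list the extracted directory. In a standard Kafka binary distribution you should see at least the bin, config, and libs subdirectories (the exact contents may vary slightly per release):
ls kafka_2.13-3.9.0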
Go back to your Terminal window. Enter the following command:
cd kafka_2.13-3.9.0
Next, enter the command bin/kafka-topics.sh.
You should then see this response. This means that Kafka is properly installed and that Java is working fine. (Reminder: you need the Java 23 JDK installed for this to work! You can see which Java version you have installed by using the command java -version.)
Start Kafka
To start Kafka, you'll need to start ZooKeeper first and then the Kafka server, in that order.
Open a Terminal window by right-clicking your folder kafka_2.13-3.9.0 and clicking New Terminal at Folder.
Enter this command:
bin/zookeeper-server-start.sh config/zookeeper.properties
You'll then see this:
Keep this window open while you're going through these exercises!
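If you'd like to double-check that ZooKeeper is up before continuing, one quick option (a sketch, assuming the nc utility is available on your machine) is to test whether anything is listening on ZooKeeper's default port 2181:
nc -z localhost 2181 && echo "ZooKeeper is listening"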
Open another, new Terminal window by right-clicking your folder kafka_2.13-3.9.0 and clicking New Terminal at Folder.
Enter this command:
bin/kafka-server-start.sh config/server.properties
You'll then see this:
Keep this window open while you're going through these exercises!
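At this point the broker should be reachable on localhost:9092. As a quick sanity check, you can ask it for its list of topics from another Terminal window; on a fresh installation this returns an empty list:
bin/kafka-topics.sh --list --bootstrap-server localhost:9092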
Create a Kafka topic
Open a Terminal window by right-clicking your folder kafka_2.13-3.9.0 and clicking New Terminal at Folder.
Enter this command to create a new Kafka topic with the name aeptest. This topic will be used for testing in this exercise.
bin/kafka-topics.sh --create --topic aeptest --bootstrap-server localhost:9092
You'll then see a confirmation:
Enter this command to create a new Kafka topic with the name aep. This topic will be used by the Adobe Experience Platform Sink Connector that you'll configure in the next exercises.
bin/kafka-topics.sh --create --topic aep --bootstrap-server localhost:9092
You'll then see a similar confirmation:
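Optionally, you can confirm that both topics were created with the expected defaults for this single-broker setup (one partition, replication factor 1) by describing them:
bin/kafka-topics.sh --describe --topic aeptest --bootstrap-server localhost:9092
bin/kafka-topics.sh --describe --topic aep --bootstrap-server localhost:9092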
Produce events
Go back to the Terminal window in which you created your first Kafka topic and enter the following command:
bin/kafka-console-producer.sh --bootstrap-server 127.0.0.1:9092 --topic aeptest
You'll then see this. Every line you type, followed by pressing Enter, results in a new message being sent into the topic aeptest.
Enter Hello AEP and press Enter. Your first event has now been sent into your local Kafka instance, into the topic aeptest.
Next, enter Hello AEP again. and press Enter.
Finally, enter AEP Data Collection is the best. and press Enter.
You've now produced 3 events into the topic aeptest. These events can now be consumed by an application that might need that data.
On your keyboard, press Control and C at the same time to close your producer.
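As a side note, the console producer also reads from standard input, so the same three messages could be produced non-interactively with a single piped command (a sketch; running it would add another three messages to the topic):
printf 'Hello AEP\nHello AEP again.\nAEP Data Collection is the best.\n' | bin/kafka-console-producer.sh --bootstrap-server 127.0.0.1:9092 --topic aeptest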
Consume events
In the same Terminal window that you used to produce events, enter the following command:
bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic aeptest --from-beginning
You'll then see all messages that were produced in the previous exercise for the topic aeptest appear in the consumer. This is how Apache Kafka works: a producer writes events into a topic, and a consumer reads those events from it.
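Based on the messages produced earlier, the consumer output should look roughly like this, after which the consumer keeps waiting for new messages:
Hello AEP
Hello AEP again.
AEP Data Collection is the best.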
On your keyboard, press Control and C at the same time to close your consumer.
In this exercise, you've gone through all the basics of setting up a local Kafka cluster, creating a Kafka topic, producing events, and consuming events.
The goal of this module is to simulate what would happen when a real organization that has already implemented an Apache Kafka cluster wants to stream data from that Kafka cluster into Adobe Experience Platform.
To facilitate such an implementation, an Adobe Experience Platform Sink Connector was created, which can be deployed using Kafka Connect. You can find the documentation of that Adobe Experience Platform Sink Connector here: .
In the next exercises, you'll implement everything you need to use that Adobe Experience Platform Sink Connector from within your own local Kafka cluster.
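As a rough preview of how sink connectors are wired up with Kafka Connect in general, a connector instance is usually described by a small properties file that names the connector class, the number of tasks, and the topics to read from. The values below are purely illustrative placeholders and not the actual Adobe Experience Platform Sink Connector configuration, which is covered in the next exercises:
name=my-example-sink
connector.class=com.example.ExampleSinkConnector
tasks.max=1
topics=aep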
Close your terminal window.
You have finished this exercise.
Next Step: 2.6.3 Configure HTTP API endpoint in Adobe Experience Platform