
Kafka Event Exchange Between Local and Azure

source link: https://dzone.com/articles/kafka-event-exchange-between-local-and-azure

Learn about a simple set-up involving a local machine and Azure VM, and see the step-by-step procedure to produce events from the local machine to Kafka broker.

Setting up Kafka on a local machine, or within a single network, and producing/consuming messages is not a daunting task; the challenges usually start when people try to make it work across networks.

Let’s consider a hybrid scenario where your software solution is distributed across two different platforms (say AWS and Azure, or on-premise and Azure), and there is a need to route messages from the Kafka cluster hosted on one platform to the one hosted on the other. This is a valid business scenario: while consolidating your solution onto one cloud platform, you may need this routing in place until the migration is complete. Even in the long term, there may be a need to maintain the solution across multiple platforms for various business and technical reasons.

In this article, we will walk through a simple set-up involving a local machine (macOS) and an Azure VM. We’ll discuss the step-by-step procedure to produce events from the local machine to a Kafka broker hosted on the Azure VM and to consume those events back on the local machine. While this does not cover the exact scenario described above, it gives a fair idea of how Kafka messages can be exchanged across the network.

Prerequisites

  • Azure VM with Ubuntu server
  • macOS machine with Homebrew installed

Setting up Kafka on Azure VM

Update packages.

$ sudo apt-get update

Install Java.

$ sudo apt install default-jdk
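
Optionally, confirm the JDK is on the path before moving on (the exact version string depends on what default-jdk installed).

$ java -version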

Download Kafka. (If this particular release has rotated off the main download site, the same file is available from the Apache archive under archive.apache.org/dist/kafka/.)

$ wget https://downloads.apache.org/kafka/3.1.0/kafka_2.13-3.1.0.tgz

Unzip the downloaded file.

$ tar -xzf kafka_2.13-3.1.0.tgz

Change directory to kafka_2.13-3.1.0.

$ cd kafka_2.13-3.1.0

Start ZooKeeper service (in terminal 1).

$ bin/zookeeper-server-start.sh config/zookeeper.properties

Start Kafka broker service (in terminal 2).

$ bin/kafka-server-start.sh config/server.properties
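
If you would rather not keep a terminal open for each service, both start scripts also accept a -daemon flag to run in the background. A sketch of that variant:

$ bin/zookeeper-server-start.sh -daemon config/zookeeper.properties
$ bin/kafka-server-start.sh -daemon config/server.properties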

Create a topic (in terminal 3).

$ bin/kafka-topics.sh --create --topic azure-events --bootstrap-server localhost:9092
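
To confirm the topic was created as expected, it can be described with the same tool:

$ bin/kafka-topics.sh --describe --topic azure-events --bootstrap-server localhost:9092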

Start Kafka producer (in terminal 3) and write events to the above topic.

$ bin/kafka-console-producer.sh --topic azure-events --bootstrap-server localhost:9092

Start Kafka consumer (in terminal 4) and read events from the above topic.

$ bin/kafka-console-consumer.sh --topic azure-events --from-beginning --bootstrap-server localhost:9092

If you are able to produce and consume events seamlessly on the Azure VM, the above set-up is successful and we can move on.

Setting up Kafka on Local Machine (macOS)

Install Java. (Recent Homebrew versions use the brew install --cask syntax; the temurin cask provides an OpenJDK build.)

$ brew install --cask temurin

Install Kafka.

$ brew install kafka

Start ZooKeeper service (in terminal 1).

$ zookeeper-server-start /usr/local/etc/kafka/zookeeper.properties

Start Kafka broker service (in terminal 2).

$ kafka-server-start /usr/local/etc/kafka/server.properties

Create a topic (in terminal 3).

$ kafka-topics --create --topic local-events --bootstrap-server localhost:9092

Start Kafka producer (in terminal 3) and write events to the above topic.

$ kafka-console-producer --topic local-events --bootstrap-server localhost:9092

Start Kafka consumer (in terminal 4) and read events from the above topic.

$ kafka-console-consumer --topic local-events --from-beginning --bootstrap-server localhost:9092

If you are able to produce and consume events seamlessly on the local machine, the above set-up is successful and we can move on.

Connecting the Local Machine to the Azure VM to Exchange Kafka Messages

Open port 9092 on the Azure VM by creating an inbound security rule. This can be done through the Azure portal under the networking settings of the VM. Since the listener used here is unauthenticated PLAINTEXT, it is advisable to restrict the rule’s source to your own public IP rather than leaving the port open to the internet.
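
The same rule can also be created from the Azure CLI; the resource group and VM names below are placeholders for your own. A minimal sketch:

$ az vm open-port --resource-group <resource-group> --name <vm-name> --port 9092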

For the local producer and consumer to reach the broker over the public network, the broker they connect to (the one on the Azure VM) has to advertise an address that is reachable from outside the VM. Go to the config/server.properties file in the Kafka directory on the Azure VM. Uncomment and update advertised.listeners, adding the Azure VM public IP address in place of your.host.name:

advertised.listeners=PLAINTEXT://your.host.name:9092
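
With the placeholder filled in, the relevant part of config/server.properties might look like this (the IP below stands in for your VM’s public IP; listeners already binds to all interfaces by default, so only advertised.listeners strictly needs changing):

# what the broker binds to locally
listeners=PLAINTEXT://0.0.0.0:9092
# what the broker tells clients to connect back to
advertised.listeners=PLAINTEXT://<Azure VM Public IP>:9092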

Restart the Kafka broker service on the Azure VM (in terminal 2) so that the new advertised listener takes effect; the ZooKeeper service from terminal 1 can keep running as-is.

$ bin/kafka-server-start.sh config/server.properties

Restart the Kafka producer on the local machine (in terminal 3), but this time write events to the azure-events topic on the Azure VM instead of the local topic, using the Azure VM public IP instead of localhost.

$ kafka-console-producer --topic azure-events --bootstrap-server <Azure VM Public IP>:9092

Restart the Kafka consumer on the local machine (in terminal 4), but this time read events from the azure-events topic on the Azure VM instead of the local topic, again using the Azure VM public IP instead of localhost.

$ kafka-console-consumer --topic azure-events --from-beginning --bootstrap-server <Azure VM Public IP>:9092
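
If the producer or consumer cannot connect, a quick reachability check from the local machine helps narrow down whether the problem is the security rule or the broker configuration (netcat ships with macOS; the IP is a placeholder):

$ nc -vz <Azure VM Public IP> 9092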

Make sure the ZooKeeper and Kafka broker services are still running on the Azure VM with the updated configuration. If you are able to produce and consume the events seamlessly from the local machine, the set-up is successful.

This brings us to the end of this article. While an Azure VM with Ubuntu Server and a local machine running macOS have been used here, a similar exercise can be carried out with any other Linux flavor on the Azure VM and with Windows on the local machine; in that case, the commands will differ slightly.

