Using the KafkaProducer and KafkaConsumer nodes within a message flow

IBM Knowledge Center

Learn how to use the KafkaProducer and KafkaConsumer nodes to produce and consume messages using a Kafka topic.

Import projects

When you click Import, two applications will be created in your workspace: KafkaProducerApplication and KafkaConsumerApplication.

KafkaProducerApplication provides a single message flow containing HTTPInput, KafkaProducer, and HTTPReply nodes.

KafkaConsumerApplication provides a single message flow containing KafkaConsumer and FileOutput nodes.

In this tutorial, these two applications run in the same integration server. With additional configuration they can run on different servers, or even on different machines.

This tutorial can be run against any standalone Apache Kafka server (https://kafka.apache.org/) at version 0.10.0.1 or above. The IBM Bluemix Message Hub service provides an easy-to-use, cloud-based Kafka implementation which you can use with this tutorial if you don't have your own Kafka server available.
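If you are running your own standalone Kafka server, you can create a topic for the tutorial with the kafka-topics.sh script that ships with Apache Kafka. This is a minimal sketch assuming a local Kafka 0.10.x-era installation; the topic name and ZooKeeper address below are placeholders for your own environment.

```shell
# Create a tutorial topic (the name IIB.TUTORIAL.TOPIC is a placeholder).
# Kafka 0.10.x-era distributions address the cluster via ZooKeeper;
# adjust localhost:2181 to match your environment.
bin/kafka-topics.sh --create \
  --zookeeper localhost:2181 \
  --replication-factor 1 \
  --partitions 1 \
  --topic IIB.TUTORIAL.TOPIC

# Confirm that the topic now exists.
bin/kafka-topics.sh --list --zookeeper localhost:2181
```

Whatever topic name you choose here is the value you will set on the Kafka nodes in the Prepare steps below.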

Once you import the tutorial, you will see red crosses against the two Kafka applications. This is because a few setup steps (mandatory properties) are required to complete the configuration of the Kafka nodes. These are explained when you move on to the Prepare tab.

Imported projects

Before you can deploy the flows, there are some setup steps that need to be completed.

  1. Within your own Apache Kafka server (for example, the IBM Bluemix Message Hub service), record the Bootstrap server name. Create a topic if you don't already have one, and note its name for step 3.
  2. If you are using Message Hub, the connection from IIB must be secured using SASL_SSL. Open the Message Hub Service Credentials and note the user and password for step 5.
  3. Open message flows KafkaProducerFlow.msgflow and KafkaConsumerFlow.msgflow. Make the following changes to the Kafka nodes.
    1. Topic name*: set this to the topic that you created on your Kafka server.
    2. Bootstrap servers*: set this to the value listed in the Credentials above.
    3. If using Message Hub, switch to the Security tab, set the Security protocol to "SASL_SSL", and leave the SSL protocol at its default value of "TLSv1.2".
  4. Within KafkaConsumerFlow.msgflow you will also need to set the Directory and File name properties on the FileOutput node.
  5. If you are using IBM Bluemix Message Hub, or if your Kafka server needs to authenticate the connection from IIB, issue the following command from the IIB Command Console.
    • mqsisetdbparms integrationNodeName -n kafka::KAFKA::integrationServerName -u user -p password
    • Restart the Integration Node.
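As a concrete sketch of step 5, the commands below store the Kafka credentials and restart the integration node; the node name TESTNODE, the server name default, and the user and password are all placeholders for your own values.

```shell
# Store the Kafka credentials for integration server "default" on
# integration node "TESTNODE" (all names here are placeholders;
# substitute your own node, server, user, and password).
mqsisetdbparms TESTNODE -n kafka::KAFKA::default -u myMessageHubUser -p myMessageHubPassword

# Restart the integration node so the new credentials take effect.
mqsistop TESTNODE
mqsistart TESTNODE
```

Note that the security identity name takes the form kafka::KAFKA::integrationServerName, so the credentials apply only to that one integration server.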

You are now ready to exercise the flows; move on to the Run tab.

Follow these steps to complete the tutorial

Use the Flow Exerciser in the KafkaProducerFlow.msgflow and KafkaConsumerFlow.msgflow to run this tutorial.

  1. Open KafkaConsumerFlow.msgflow.
    • Click the Flow Exerciser icon to start testing the flow.
  2. Open KafkaProducerFlow.msgflow.
    • Click the Flow Exerciser icon to start testing the flow.
    • Click the Send Message icon.
    • Create a new message and click Send. Your message is sent to the HTTPInput node.
    • When the list of actions is shown, click the Received HTTP reply message... item to show the output from the flow. This will be the same as the message that you sent in.
  3. After you close the dialog, the path taken through the message flow is highlighted. Click the message icon on each connection to see how the message tree is updated by each node.
  4. Go to the output directory that you specified on the FileOutput node; you should see a file that contains the message that you sent in.
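From a command prompt, you can confirm the result of step 4; the directory and file name below are placeholders for whatever you configured on the FileOutput node in the Prepare steps.

```shell
# Both path elements are placeholders; substitute the Directory and
# File name properties you set on the FileOutput node.
ls -l /tmp/kafka-output
cat /tmp/kafka-output/output.txt
```

The file contents should match the message that you sent through the Flow Exerciser.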

This tutorial showed the KafkaConsumer and KafkaProducer nodes running in the same integration server. Remember that if you experiment with other topologies you may need to run the mqsisetdbparms command again, because the credentials for the Kafka nodes are scoped to the integration server.