Create Kafka action
To send PubNub data to Kafka, you must first configure your Kafka environment and then set up the Kafka action as described in the Configure Admin Portal section.
For Events & Actions, it doesn't matter how or where you host your Kafka instance as long as you correctly configure your action in the Admin Portal.
Receive Kafka events in PubNub
Apart from streaming PubNub events to Kafka through the Kafka action, PubNub also lets you receive Kafka events. For more details, read about the PubNub Kafka Sink Connector.
Existing Kafka cluster
You will need the following information to configure Events & Actions to connect to an Apache Kafka event streaming platform.
- Topic and key
- Broker URL and port
- Username and password
When you have all this information, go to the Configure Admin Portal section.
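Optionally, you can verify the broker address and credentials before creating the action. The sketch below is a minimal connectivity check, assuming the confluent-kafka Python package and a SASL_SSL listener; the broker address, mechanism, and credentials shown are placeholders, not values from your cluster.

```python
# Connectivity and credential check against an existing Kafka cluster.
# Assumes: pip install confluent-kafka, and a SASL_SSL listener.
# All values below are placeholders -- substitute your own.
from confluent_kafka.admin import AdminClient

admin = AdminClient({
    "bootstrap.servers": "broker.example.com:9092",  # your broker URL and port
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",  # or SCRAM-SHA-256 / SCRAM-SHA-512
    "sasl.username": "YOUR_USERNAME",
    "sasl.password": "YOUR_PASSWORD",
})

# list_topics() round-trips to the cluster; a timeout or auth failure raises KafkaException.
metadata = admin.list_topics(timeout=10)
print("Connected. Visible topics:", sorted(metadata.topics))
```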
New Kafka cluster
There are many ways to create and host Kafka clusters. The Kafka action within Events & Actions is technology-agnostic. Regardless of how you host your instance, the Kafka action will connect.
If you need guidance on creating a new Kafka cluster, read on.
- Self-hosted Kafka
- Amazon MSK cluster using Terraform
- Confluent Cloud
Refer to the Kafka Quick Start documentation to learn how to create a local Kafka cluster. Make sure you have the following information ready, as you will need it to create a Kafka action:
- Topic and key
- Broker URL and port
- Username and password
When you've created your Kafka cluster and have access to this information, create a Kafka action by following the steps described in the Configure Admin Portal section.
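If you'd rather create the action's topic from code than from the Kafka CLI, the following is a minimal sketch, assuming the confluent-kafka Python package and an unauthenticated local broker from the quick start; the topic name and partition count are illustrative.

```python
# Create the topic the Kafka action will write to, on a local quick-start broker.
# Assumes an unauthenticated listener on localhost:9092; "topic_0" is a placeholder.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "localhost:9092"})
futures = admin.create_topics([NewTopic("topic_0", num_partitions=3, replication_factor=1)])

for topic, future in futures.items():
    future.result()  # raises if creation failed (e.g., the topic already exists)
    print(f"Created topic: {topic}")
```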
Refer to the Terraform documentation to learn how to create an Amazon MSK cluster. Make sure you have the following information ready, as you will need it to create a Kafka action:
- Topic and key
- Broker URL and port
- Username and password
When you've created your Kafka cluster and have access to this information, create a Kafka action by following the steps described in the Configure Admin Portal section.
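Before configuring the action, you can optionally smoke-test the cluster's SASL/SCRAM credentials by producing a single message. This is a sketch, assuming SCRAM secrets are already attached to your MSK cluster and that you connect to its SASL/SCRAM listener (typically port 9096); every value shown is a placeholder.

```python
# Smoke-test SASL/SCRAM access to an MSK cluster by producing one message.
# Assumes SCRAM credentials are attached to the cluster and the target topic
# already exists (or topic auto-creation is enabled). Values are placeholders.
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "b-1.mycluster.kafka.us-east-1.amazonaws.com:9096",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "SCRAM-SHA-512",
    "sasl.username": "YOUR_USERNAME",
    "sasl.password": "YOUR_PASSWORD",
})

def report(err, msg):
    # Delivery report: called once per message with the final outcome.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

producer.produce("topic_0", key="healthcheck", value="ping", on_delivery=report)
producer.flush(10)  # wait up to 10 s for the delivery report
```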
This procedure uses the Confluent Cloud user interface to set up your Kafka environment. The procedure has three parts: configuring a Kafka cluster, creating a Kafka topic, and creating an API key.
For more information on creating clusters on Confluent Cloud, refer to Quick Start for Confluent Cloud.
Configure a Kafka cluster
Creating a cluster is the first step in getting your environment ready. To configure a Kafka cluster:
- Log into your Confluent Cloud account and create a Kafka cluster to send the PubNub messages to.
- Select the cloud services provider and cluster type, and complete the wizard steps to configure the cluster according to your needs.
- Once your cluster is launched, go to Cluster Settings and note the value of the Bootstrap server parameter, as you will need it in the Admin Portal configuration part.
Create a Kafka topic
You must create a Kafka topic before sending PubNub data to it. If you do not set up a topic before you enable the Kafka action, the action will fail.
To create a Kafka topic:
- On the cluster overview page of Confluent Cloud, click Topics.
- On the Topics page, click the Create topic button. Configure the topic according to your needs.
- Once your topic is created, note its name, as you will need it in the Admin Portal configuration part.
Create API key
PubNub doesn't allow unsecured connections to Kafka, so you must properly set up your Kafka credentials.
To create an API key:
- On the cluster overview page of Confluent Cloud, click API keys. The API keys page opens.
- On the API keys page, click the Create key button. Configure the key according to your needs.

  Secret visibility
  Do not click Download and continue yet, as this will hide the Secret credential.

- Note the values of the Key and Secret parameters, as you will need them in the Admin Portal configuration part.
- Click Download and continue to finish creating the key.
Configure Admin Portal
- In the Events & Actions view on the Admin Portal, create an action by clicking the + Add Action button.
- Select Apache Kafka as the action type.
- In the Routing Key field, type the name of the topic you created previously, for example, `topic_0`.
- Still in the Routing Key field, type your key after the topic in the following format: `topic:key`, for example, `topic_0:stamford`.

  Key and message order
  Each message loaded into a topic may have a key. Kafka, being a distributed system, divides topics into partitions, and message order is guaranteed only within the same partition. Adding a key to the topic name ensures the messages are sent to the same partition, preserving the correct order. The consumer sketch after these steps shows one way to observe this.

- In the Authentication Mechanism drop-down, select the authentication type used by your Kafka setup. Supported authentication types include SASL/PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
- In the Username field, type your Kafka username. In Confluent Cloud, this is the value of the Key credential you noted while creating an API key.

  Unsecured connections
  PubNub doesn't allow unsecured connections to Kafka.
- In the Password field, type your Kafka password. In Confluent Cloud, this is the value of the Secret credential you noted while creating an API key.
- In the Brokers field, provide the URL of the Kafka cluster to send the events to, for example, `574mf0rd-bridge.ldn-west:1905`. In Confluent Cloud, this is the value of the Bootstrap server parameter you noted while configuring a Kafka cluster.

  Broker list
  You can provide a list of URLs in the following format: `hostname:port, hostname2:port`.

- Optionally, enable and configure retries through the Kafka retry option.
- Pair your action with an event listener without leaving the Actions view. To do this, click the Add event listener button and select an existing event listener or create a new one.
- Save your newly created action by clicking the Save changes button.
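Once the action is saved and its event listener starts matching events, you can watch them arrive on the Kafka side. The consumer sketch below, again assuming the confluent-kafka Python package with placeholder connection details, prints each message's key and partition; messages carrying the key from your routing key (for example, `stamford`) should all report the same partition, confirming the ordering behavior described above.

```python
# Consume the PubNub events delivered by the Kafka action and print
# each message's key and partition. All connection values are placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker.example.com:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "YOUR_USERNAME",
    "sasl.password": "YOUR_PASSWORD",
    "group.id": "pubnub-events-verifier",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["topic_0"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # Messages produced with the same key (e.g. "stamford") should all
        # report the same partition, preserving their order.
        print(f"key={msg.key()} partition={msg.partition()} value={msg.value()}")
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
```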