I'm new to these resources, so please correct me if something here is wrong or not possible. I have an Event Hub connection string and a namespace; unfortunately, I don't have access to the Event Hub itself. As far as I understand, the Event Hub is connected to a database, and whenever there is a new entry or a change in the data, a message is sent and the Event Hub is triggered.
I am supposed to use Kafka to listen to the Event Hub and, whenever the Event Hub is triggered, have Kafka write the new entry to a blob storage account. Is this feasible with Kafka? And are the connection string and namespace enough to connect Kafka to the Event Hub?
I tried using the Event Hub Python library and received an authentication error; a simplified version of what I tried is below. Is there also a way to test the connection string?
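This is roughly the code, with placeholder values, since I only have the namespace-level connection string:

```python
from azure.eventhub import EventHubConsumerClient

# Placeholder values -- I was only given a connection string and a namespace
CONNECTION_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>"
EVENTHUB_NAME = "<event-hub-name>"

client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STR,
    consumer_group="$Default",
    eventhub_name=EVENTHUB_NAME,
)

def on_event(partition_context, event):
    # Just print whatever arrives
    print(event.body_as_str())

with client:
    # The authentication error is raised here
    client.receive(on_event=on_event, starting_position="-1")
```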
Thanks in advance!
Kafka doesn't integrate with Azure Event Hubs natively. Use the Kafka Connect framework along with the Kafka Connect Azure Event Hubs connector to achieve this integration. The Kafka Connect framework lets you connect Kafka to various data sources and sinks.
To connect Kafka to the Event Hub, replace `{YOUR.EVENTHUBS.CONNECTION.STRING}` with the connection string for the Event Hubs namespace in the Kafka Connect configuration file (a sample of the relevant settings is sketched after the steps below).

To set up Kafka Connect with Azure Event Hubs:
1. Install and set up Apache Kafka on your system.
2. Download and install the Kafka Connect Azure Event Hubs connector from Confluent Hub.
3. Once the connector is up and running, it listens to your Azure Event Hub and ingests the data into Kafka topics.
4. Consume the data from the Kafka topics and perform any further processing, or write it to a blob storage account.
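For reference, here is a minimal sketch of the security-related part of the Kafka Connect worker configuration (`connect-distributed.properties`), following the pattern in the Microsoft guide referenced at the end; the FQDN and connection string values are placeholders you need to fill in:

```properties
bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
group.id=connect-cluster-group

# Event Hubs requires TLS and SASL/PLAIN; the connection string is passed as the password
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";

# The embedded producer and consumer need the same settings
producer.security.protocol=SASL_SSL
producer.sasl.mechanism=PLAIN
producer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";

consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=PLAIN
consumer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
```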
You can test the connection string by using the Azure CLI or the Azure Portal.
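For example, assuming you have access to the namespace's resource group, you can list the namespace's connection strings with the Azure CLI and compare them with the string you were given (the resource group, namespace, and rule name below are placeholders):

```bash
az eventhubs namespace authorization-rule keys list \
    --resource-group <resource-group> \
    --namespace-name <namespace> \
    --name RootManageSharedAccessKey
```

If the strings match but authentication still fails, check whether the shared access key actually has the Listen claim, or whether it is scoped to a specific event hub rather than the whole namespace.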
References: Integrate with Apache Kafka Connect - Azure Event Hubs and Kafka with Azure - Streaming Unlimited Data Into Cloud.