Serverless Workflow eventing not working with SASL_SSL Kafka connection


I've been trying Kogito Serverless Workflow for a while in a local setup. I've defined workflows with states that make AsyncAPI calls over Kafka. This works fine locally, using a local Kafka setup without any auth mechanism.

When I tried to promote this further and run the same setup against a remote Kafka cluster, I hit an issue: authentication seems to succeed as expected, but the service does not poll for events to trigger the workflow. This happens with a Kafka cluster that uses the SASL_SSL auth mechanism. I also tried another remote Kafka cluster that uses certificate-based authentication, and there it works fine. I'm trying to understand whether there are any restrictions on the auth mechanisms supported for Kafka.

I concluded that authentication is working because when I put an incorrect username/password in the "kafka.sasl.jaas.config" value, I get a proper auth failure error; once I corrected it, there is no error and the logs shown at the bottom are printed. But even though the logs say the consumer started, no consumer group is registered for the topic in scope, so something is going wrong there. There is no issue with the Kafka broker itself: I connected and produced/consumed messages with the same properties and it works fine.
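(For reference, one way I check whether the group registered is Kafka's Java AdminClient. A minimal sketch, assuming the group id `sw-xyz-in8` from the logs below and placeholder brokers/credentials:)

```java
import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.ConsumerGroupDescription;
import org.apache.kafka.common.config.SaslConfigs;

public class GroupCheck {
    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties props = new Properties();
        // Same connection settings as the workflow service, without the Quarkus "kafka." prefix
        props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, "<brokers>");
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "SCRAM-SHA-512");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                        + "username=\"<username>\" password=\"<password>\";");

        try (AdminClient admin = AdminClient.create(props)) {
            // Describe the group the workflow service claims to have joined (name taken from the logs)
            ConsumerGroupDescription desc = admin
                    .describeConsumerGroups(java.util.List.of("sw-xyz-in8"))
                    .describedGroups()
                    .get("sw-xyz-in8")
                    .get();
            System.out.println("State: " + desc.state() + ", members: " + desc.members());
        }
    }
}
```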

Can someone have a look and share their thoughts?

Properties used for local Kafka connection in workflow service:

kafka.bootstrap.servers=127.0.0.1:9092

Properties used for remote Kafka connection where it is not working:

kafka.bootstrap.servers=<brokers>
kafka.security.protocol=SASL_SSL
kafka.sasl.mechanism=SCRAM-SHA-512
kafka.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="<username>" password="<password>";
# Tried with and without the properties below; same result in both cases
kafka.ssl.truststore.location=<truststorepath>
kafka.ssl.truststore.password=<truststorepassword>

Logs:

17:03:26 INFO  traceId=, parentId=, spanId=, sampled= [io.sm.re.me.kafka] (Quarkus Main Thread) SRMSG18229: Configured topics for channel '<channelName>': [<topicName>]

17:03:27 INFO  traceId=, parentId=, spanId=, sampled= [io.sm.re.me.kafka] (smallrye-kafka-consumer-thread-4) SRMSG18257: Kafka consumer kafka-consumer-<channelName/topicName>, connected to Kafka brokers '<brokersList>', belongs to the 'sw-xyz-in8' consumer group and is configured to poll records from [<topicName>]

17:03:27 INFO  traceId=, parentId=, spanId=, sampled= [or.ki.ko.ev.im.AbstractMessageConsumer] (Quarkus Main Thread) Consumer for <channelName/topicName> started

1 Answer

Francisco Javier Tirado Sarti answered:

The issue does not seem related to Kogito, but to SASL authentication. Can you try a regular Java client using the Kafka API? The idea is to verify that, using the same set of SSL properties (I guess you read, as I did, this post ;)), that client is able to publish and consume events from that broker. I did not skip the part where you mention you have already published and consumed events from that broker, but I guess you are using a non-Java client for that; this test will ensure the Kafka Java API is OK interacting with that broker using auth.
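(A minimal sketch of such a test consumer, reusing the SASL_SSL properties from the question; the topic name, group id, and credentials are placeholders:)

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.serialization.StringDeserializer;

public class SaslSslConsumerTest {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "<brokers>");
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "SCRAM-SHA-512");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                        + "username=\"<username>\" password=\"<password>\";");
        // Uncomment if the broker certificate is not in the default JVM truststore:
        // props.put(org.apache.kafka.common.config.SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "<truststorepath>");
        // props.put(org.apache.kafka.common.config.SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "<truststorepassword>");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "sasl-ssl-test");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("<topicName>"));
            // Poll a few times; if this receives records, the plain Kafka Java client
            // can authenticate and consume with exactly these settings
            for (int i = 0; i < 10; i++) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(2));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
                }
            }
        }
    }
}
```

If this client consumes fine but the workflow service still does not, the problem is narrowed down to how the service itself handles the SASL_SSL configuration rather than the broker or credentials.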