Kafka 2.2.0 with SASL-SCRAM - SSL peer is not authenticated, returning ANONYMOUS instead


Kafka logs an "SSL peer is not authenticated, returning ANONYMOUS instead" error when a client connects to the broker's SASL port, while connections on the PLAINTEXT and SSL ports work fine.

I have Kafka 2.2.0 on Windows with SSL enabled, where the broker's PLAINTEXT listener runs on 9092 and SSL on 9093. On top of that, I configured SASL with the SCRAM mechanism on listener port 9094, and end up with the error mentioned in the problem summary when running the producer as kafka-console-producer.bat --broker-list localhost:9094 --topic xxx

Here are the SASL-related configurations; the other settings, such as the basic and SSL configuration, are not shown.

zookeeper.properties

authProvider.1=org.apache.zookeeper.server.auth.SASLAuthenticationProvider
requireClientAuthScheme=sasl
jaasLoginRenew=3600000

server.properties

listeners=PLAINTEXT://0.0.0.0:9092,SSL://0.0.0.0:9093,SASL_SSL://0.0.0.0:9094
advertised.listeners=PLAINTEXT://localhost:9092,SSL://localhost:9093,SASL_SSL://localhost:9094
sasl.enabled.mechanisms=SCRAM-SHA-256
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-256
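
For completeness, SCRAM credentials also have to be created in ZooKeeper before any client can authenticate against the SASL_SSL listener. A minimal sketch of that step on Windows, assuming ZooKeeper on localhost:2181 and the admin user from the JAAS files below (the connection string must match the broker's zookeeper.connect, including any chroot):

kafka-configs.bat --zookeeper localhost:2181 --alter --add-config "SCRAM-SHA-256=[password=admin-pwd]" --entity-type users --entity-name admin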

zookeeper_server_jaas.conf

Server {
   org.apache.kafka.common.security.scram.ScramLoginModule required
   username="admin"
   password="admin-pwd"
   user_admin="admin-pwd"
   user_other1="other1-pwd"
   user_other2="other2-pwd";
};
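
For comparison, the ZooKeeper JAAS examples in the Kafka documentation use DigestLoginModule for the Server section rather than Kafka's ScramLoginModule; a sketch of that documented pattern, reusing the credentials above (it may or may not be related to the error here):

Server {
   org.apache.zookeeper.server.auth.DigestLoginModule required
   user_admin="admin-pwd"
   user_other1="other1-pwd";
};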

producer.properties

security.protocol=SSL
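
For reference, a client that connects to the SASL_SSL listener on 9094 would normally carry SASL settings alongside the SSL ones; a sketch of such a client properties file (the truststore path and password are placeholders, not taken from the original setup):

security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
# placeholder truststore - point this at your own client truststore
ssl.truststore.location=C:/Work/kafka_2.11-2.2.0-for-ssl/client.truststore.jks
ssl.truststore.password=client-truststore-pwd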

kafka_server_jaas.conf

KafkaServer {
   org.apache.kafka.common.security.scram.ScramLoginModule required
   username="admin"
   password="admin-pwd";
};
Client {
   org.apache.kafka.common.security.plain.PlainLoginModule required
   username="admin"
   password="admin-pwd";
};
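
For comparison again, the broker-side Client section (used for the broker's own ZooKeeper connection) is usually paired with the same DigestLoginModule as the ZooKeeper Server section sketched earlier, rather than PlainLoginModule; a sketch of that common pattern, reusing the admin credentials from above:

Client {
   org.apache.zookeeper.server.auth.DigestLoginModule required
   username="admin"
   password="admin-pwd";
};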

kafka_client_jaas.conf

KafkaClient {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="admin"
    password="admin-pwd";
};
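
As an alternative to a separate JAAS file plus KAFKA_OPTS, the same login can be supplied inline through the client properties via sasl.jaas.config; a sketch equivalent to the KafkaClient section above:

sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="admin" password="admin-pwd";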

Start the Zookeeper as

SET ZOO_LOG_DIR=C:/Work/kafka_2.11-2.2.0-for-ssl/zookeeper-data
SET KAFKA_HOME=C:/Work/kafka_2.11-2.2.0-for-ssl
set KAFKA_OPTS=-Djava.security.auth.login.config=%KAFKA_HOME%/config/zookeeper_server_jaas.conf
zookeeper-server-start.bat %KAFKA_HOME%/config/zookeeper.properties

Start the kafka as

set KAFKA_HOME=C:/Work/kafka_2.11-2.2.0-for-ssl
set KAFKA_OPTS=-Djava.security.auth.login.config=%KAFKA_HOME%/config/kafka_server_jaas.conf
kafka-server-start.bat %KAFKA_HOME%/config/server.properties

Start the Producer as

SET KAFKA_HOME=C:/Work/kafka_2.11-2.2.0-for-ssl
set KAFKA_OPTS=-Djava.security.auth.login.config=%KAFKA_HOME%/config/kafka_client_jaas.conf
kafka-console-producer.bat --broker-list localhost:9094 --topic xxx
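
Note that the console producer only picks up client-side security settings when they are passed explicitly; a sketch of wiring in the producer.properties shown earlier via --producer.config (assuming the file sits under the config directory):

kafka-console-producer.bat --broker-list localhost:9094 --topic xxx --producer.config %KAFKA_HOME%/config/producer.properties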

The producer only works if I use broker port 9092. Did I miss something and end up with a misconfiguration? Any inputs?

Updated:

Here is the error when the producer/consumer connects:

[2019-10-14 15:39:42,108] DEBUG [SslTransportLayer channelId=127.0.0.1:9094-127.0.0.1:63848-0 key=sun.nio.ch.SelectionKeyImpl@222a223c] SSL peer is not authenticated, returning ANONYMOUS instead (org.apache.kafka.common.network.SslTransportLayer)
[2019-10-14 15:39:42,108] DEBUG [SslTransportLayer channelId=127.0.0.1:9094-127.0.0.1:63848-0 key=sun.nio.ch.SelectionKeyImpl@222a223c] SSL handshake completed successfully with peerHost '127.0.0.1' peerPort 63848 peerPrincipal 'User:ANONYMOUS' cipherSuite 'TLS_DHE_DSS_WITH_AES_256_CBC_SHA256' (org.apache.kafka.common.network.SslTransportLayer)
[2019-10-14 15:39:42,108] DEBUG Set SASL server state to HANDSHAKE_OR_VERSIONS_REQUEST during authentication (org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)
[2019-10-14 15:39:42,108] DEBUG Handling Kafka request API_VERSIONS during authentication (org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)
[2019-10-14 15:39:42,108] DEBUG Set SASL server state to HANDSHAKE_REQUEST during authentication (org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)
[2019-10-14 15:39:42,108] DEBUG Set SASL server state to FAILED during authentication (org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)
[2019-10-14 15:39:42,108] INFO [SocketServer brokerId=0] Failed authentication with 127.0.0.1/127.0.0.1 (Unexpected Kafka request of type METADATA during SASL handshake.) (org.apache.kafka.common.network.Selector)

1 Answer

Answered by Danilo:

I had the same problem. Authentication with SASL SCRAM wasn't working on Kafka 2.2.x and 2.3.x; on 2.1 it was OK.

In the end I resolved the issue by providing the ZooKeeper chroot path (/kafkaTest) when creating the principals:

./kafka-configs --zookeeper zookeeper-01:2181/kafkaTest --alter --add-config 'SCRAM-SHA-256=[password=admin-secret],SCRAM-SHA-512=[password=admin-secret]' --entity-type users --entity-name admin

It seems that when the credentials are created under the ZooKeeper root path instead of the broker's chroot, Kafka can't find them to validate the authentication.
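
A quick way to check where the credentials ended up is to describe the user config against the same connection string (a sketch, assuming the /kafkaTest chroot from the command above):

./kafka-configs --zookeeper zookeeper-01:2181/kafkaTest --describe --entity-type users --entity-name admin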

I hope it will solve your issue as well!