Apache Spark and Cassandra: UnauthorizedException on Super-user Permission with Consistency Level QUORUM


I'm using Apache Spark to write data to a Cassandra cluster. The deployment is Kubernetes-based, and I'm using the Cassandra Helm chart. Sporadically, I encounter a SparkException that aborts the job, as detailed below:

...
Caused by: com.datastax.oss.driver.api.core.servererrors.UnauthorizedException: Unable to perform
authorization of permissions: Unable to perform authorization of super-user permission: Cannot
achieve consistency level QUORUM

Additional details on the Cassandra cluster:

Datacenter: datacenter1

Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
--  Address        Load       Tokens  Owns (effective)  Host ID                               Rack
UN  10.x.x.x  10.98 GiB  256     64.7%             blahblah-c2e0509a03                        rack1
UN  10.x.x.x  12.17 GiB  256     69.7%             blahblah-a617-4dfbcdb999aa                 rack1
UN  10.x.x.x  12.6 GiB   256     65.6%             blahblah-9d4f-9111f4ae55a3                 rack1

I have already ensured that the system_auth keyspace is replicated to all these nodes. However, the issue still appears intermittently. I'd appreciate any insight into why this might be happening and how to potentially resolve it.
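For reference, replicating system_auth to all nodes in a single-datacenter cluster like this one usually looks something like the following sketch (the datacenter name must match what `nodetool status` reports - datacenter1 here; the replication factor of 3 assumes the three nodes shown above):

```sql
-- Replicate auth data to all 3 nodes so QUORUM reads of
-- system_auth can succeed even when one node is slow or down.
ALTER KEYSPACE system_auth
  WITH replication = {'class': 'NetworkTopologyStrategy', 'datacenter1': 3};
```

After changing the replication settings, run `nodetool repair system_auth` on each node so existing credentials are actually copied to the new replicas.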

1 Answer

Answered by Alex Ott:

This is a well-known issue when using the built-in administrator role: for this role, Cassandra always uses QUORUM when reading authentication data, and there can be situations where only one replica is available while the others are busy with garbage collection or other activity and do not respond in time.

You can create another user and give it the necessary permissions (including full admin rights) - for this user, consistency level LOCAL_ONE will be used instead of QUORUM.

CREATE ROLE dba WITH SUPERUSER = true AND LOGIN = true AND PASSWORD = 'password';
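If you'd rather avoid a superuser altogether, you can instead create a role with only the grants the Spark job needs. A minimal sketch (`my_keyspace` and `spark_writer` are placeholder names, not from the original setup):

```sql
-- Non-superuser alternative: explicit grants on the application keyspace.
CREATE ROLE spark_writer WITH LOGIN = true AND PASSWORD = 'password';
GRANT ALL PERMISSIONS ON KEYSPACE my_keyspace TO spark_writer;
```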

P.S. It's even recommended to disable the built-in superuser entirely; see the linked docs for details.
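Once the replacement role is verified to work, disabling the default superuser could look like this (run while logged in as the new superuser, not as `cassandra` itself):

```sql
-- Lock down the default account so nothing authenticates as it anymore.
ALTER ROLE cassandra WITH SUPERUSER = false AND LOGIN = false;
```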