I want to read data from two different Cassandra keyspaces, hosted on different clusters, within a single Spark context in Java.

I have tried different approaches and I am stuck: when I read the data I get it as a List of com.datastax.driver.core.Row, which I can turn into an RDD of Row, but what I actually need is an RDD of String, and I cannot figure out how to obtain one. Can anyone suggest a different approach to move forward? Here is what I have tried so far:

// Connect to the first cluster and run the query on the driver.
CassandraConnector eventsConnector = CassandraConnector
    .apply(sc.getConf().set("spark.cassandra.connection.host", "127.0.0.1"));

// Collect all rows on the driver as a List<Row>.
List<com.datastax.driver.core.Row> rows = eventsConnector
    .withSessionDo(new AbstractFunction1<Session, ResultSet>() {
        public ResultSet apply(Session session) {
            return session.execute("SELECT * FROM segment_index.dimension_text_data");
        }
    }).all();

// This gives me a JavaRDD<Row>, but what I need is a JavaRDD<String>.
JavaRDD<com.datastax.driver.core.Row> myRDD = sc.parallelize(rows);
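
One direction I have been considering (not sure if it is the right way) is to convert each Row to a String on the driver before parallelizing. The row.toString() call and the "some_text_column" name below are only placeholders for however the actual columns should be extracted:

import java.util.ArrayList;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;

// Convert each Row to a String on the driver, then parallelize the strings.
// row.toString() is a placeholder; in practice I would read specific columns,
// e.g. row.getString("some_text_column").
List<String> stringRows = new ArrayList<String>();
for (com.datastax.driver.core.Row row : rows) {
    stringRows.add(row.toString());
}
JavaRDD<String> myStringRDD = sc.parallelize(stringRows);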

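For the second cluster, what I was planning to try (again, not sure this is correct) is to build a separate CassandraConnector from a cloned SparkConf with a different connection host; the 10.0.0.5 address below is just a placeholder for the second cluster's contact point:

import org.apache.spark.SparkConf;

import com.datastax.spark.connector.cql.CassandraConnector;

// Clone the context's conf for each cluster so the two hosts do not clash.
// "10.0.0.5" stands in for the second cluster's contact point.
SparkConf confCluster1 = sc.getConf().clone()
    .set("spark.cassandra.connection.host", "127.0.0.1");
SparkConf confCluster2 = sc.getConf().clone()
    .set("spark.cassandra.connection.host", "10.0.0.5");

CassandraConnector connector1 = CassandraConnector.apply(confCluster1);
CassandraConnector connector2 = CassandraConnector.apply(confCluster2);

Is this a workable way to read from both clusters in one Spark context, or is there a better approach?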