**I'm trying to stream data from Kafka and convert it into a DataFrame. I followed this link, but when I run both the producer and consumer applications, this is the output on my console:**
```
(0,[B@370ed56a) (1,[B@2edd3e63) (2,[B@3ba2944d) (3,[B@2eb669d1) (4,[B@49dd304c) (5,[B@4f6af565) (6,[B@7714e29e)
```
This is literally the output of the Kafka producer: `[B@...` is the JVM's default toString for a byte array, so the consumer is printing the undecoded payloads. The topic was empty before the messages were pushed.
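A quick check outside the app shows the same pattern, so these look like undecoded byte arrays rather than garbage (a minimal snippet of my own, not application output):

```scala
// Printing a JVM byte array uses Object.toString: "[B@" plus a hex hash code,
// which is exactly the shape in the console output above.
val bytes = Array[Byte](1, 2, 3)
println(bytes)                 // e.g. [B@370ed56a
println(bytes.mkString(","))   // 1,2,3 -- the actual contents
```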
Here is the producer code snippet:
```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import com.twitter.bijection.Injection;
import com.twitter.bijection.avro.GenericAvroCodecs;

Properties props = new Properties();
props.put("bootstrap.servers", "##########:9092");
props.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer",
        "org.apache.kafka.common.serialization.ByteArraySerializer");
// "producer.type" is an old-producer setting; the new KafkaProducer ignores it
// and sends asynchronously by default.
props.put("producer.type", "async");

Schema.Parser parser = new Schema.Parser();
Schema schema = parser.parse(EVENT_SCHEMA);
Injection<GenericRecord, byte[]> records = GenericAvroCodecs.toBinary(schema);
KafkaProducer<String, byte[]> producer = new KafkaProducer<String, byte[]>(props);

for (int i = 0; i < 100; i++) {
    GenericData.Record avroRecord = new GenericData.Record(schema);
    setEventValues(i, avroRecord);
    byte[] messages = records.apply(avroRecord);
    ProducerRecord<String, byte[]> producerRecord = new ProducerRecord<String, byte[]>(
            "topic", String.valueOf(i), messages);
    System.out.println(producerRecord);
    producer.send(producerRecord);
}
producer.close(); // flush pending async sends before exiting
```
And its output is:
```
key=0, value=[B@680387a key=1, value=[B@32bfb588 key=2, value=[B@2ac2e1b1 key=3, value=[B@606f4165 key=4, value=[B@282e7f59
```
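As a sanity check on the Avro encoding itself, one can round-trip a record through the same Injection locally, without Kafka involved (a minimal sketch reusing the producer's EVENT_SCHEMA and setEventValues from above):

```scala
import com.twitter.bijection.Injection
import com.twitter.bijection.avro.GenericAvroCodecs
import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericRecord}

val schema: Schema = new Schema.Parser().parse(EVENT_SCHEMA) // same schema string as the producer
val injection: Injection[GenericRecord, Array[Byte]] = GenericAvroCodecs.toBinary(schema)

val record = new GenericData.Record(schema)
setEventValues(0, record)                             // same helper the producer uses
val bytes: Array[Byte] = injection(record)            // serialize
val back: GenericRecord = injection.invert(bytes).get // deserialize
println(back)                                         // prints the record's fields, not [B@...
```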
Here is my consumer code snippet, written in Scala:
"group.id" -> "KafkaConsumer",
"zookeeper.connection.timeout.ms" -> "1000000"
val topicMaps = Map("topic" -> 1)
val messages = KafkaUtils.createStream[String, Array[Byte], StringDecoder, DefaultDecoder](ssc, kafkaConf, topicMaps, StorageLevel.MEMORY_ONLY_SER)
messages.print()
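For what it's worth, my understanding is that the DStream values arrive as raw Array[Byte], so they need to be inverted through the same Injection before printing or building a DataFrame. Something along these lines is what I'm aiming for (a sketch only: EVENT_SCHEMA is the same schema string as in the producer, and the "id" field name is just a stand-in for my real fields):

```scala
import com.twitter.bijection.Injection
import com.twitter.bijection.avro.GenericAvroCodecs
import org.apache.avro.Schema
import org.apache.avro.generic.GenericRecord
import org.apache.spark.sql.SQLContext

// Invert each payload back into a GenericRecord. The Injection is built inside
// the closure so nothing non-serializable is shipped from the driver.
val decoded = messages.map { case (key, bytes) =>
  val schema = new Schema.Parser().parse(EVENT_SCHEMA)
  val injection: Injection[GenericRecord, Array[Byte]] = GenericAvroCodecs.toBinary(schema)
  (key, injection.invert(bytes).get)
}
decoded.print() // should show readable records instead of [B@...

// Per batch, pull plain values out of each record and build a DataFrame.
decoded.foreachRDD { rdd =>
  val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
  import sqlContext.implicits._
  val df = rdd
    .map { case (key, record) => (key, record.get("id").toString) } // "id" is hypothetical
    .toDF("key", "id")
  df.show()
}
```

(Re-parsing the schema per record is wasteful; mapPartitions would amortize that, but I kept the sketch simple.)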
I've tried both StringDecoder and DefaultDecoder in createStream(). I'm sure the producer and consumer are consistent with each other. Any help from anybody?