How can I consume Avro data using Kafka Connect?


I have created a connector for handling Avro data. I am able to publish data into the input topic, but I am not getting any data in the output topic. I have checked the logs of the connector and the REST proxy, and no errors are shown.

{
    "name": "sink-elastic_avro_2_topic",
    "config": {
        "connector.class": "io.confluent.connect.http.HttpSinkConnector",
        "headers": "Content-Type:application/vnd.kafka.json.v2+json|Accept:application/vnd.kafka.v2+json",
        "batch.max.size": "3000",
        "confluent.topic.bootstrap.servers": "broker:9092",
        "tasks.max": "3",
        "http.api.url": "http://xxx.xxx.xxx:8090/topics/avro_output_topic",
        "topics": "avro_input_topic",
        "request.method": "POST",
        "reporter.bootstrap.servers": "broker:9092",
        "regex.patterns": "^~$",
        "regex.separator": "~",
        "reporter.error.topic.name": "error-responses",
        "regex.replacements": "{\"key\" : \"${key}\" ,\"value\":~}",
        "reporter.result.topic.name": "success-responses",
        "batch.prefix": "{\"records\":[",
        "reporter.error.topic.replication.factor": "1",
        "consumer.override.auto.offset.reset": "latest",
        "confluent.topic.replication.factor": "1",
        "value.converter.schemas.enable": "false",
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://schema-registry:8081",
        "batch.suffix": "]}",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "reporter.result.topic.replication.factor": "1"
    }
}
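For reference, the `regex.*` and `batch.*` settings above assemble each batch of records into a REST Proxy v2-style JSON body. A minimal Python sketch of that transformation, assuming illustrative sample keys/values and the connector's default `","` batch separator (neither taken from the config itself):

```python
import json
import re

# Hypothetical sample records (key, value-as-JSON-string);
# real values would come from avro_input_topic after Avro deserialization.
records = [("k1", '{"f1": 1}'), ("k2", '{"f1": 2}')]

# Settings copied from the connector config above
regex_patterns = "^~$".split("~")                               # ["^", "$"]
regex_replacements = '{"key" : "${key}" ,"value":~}'.split("~")  # wrap each value
batch_prefix, batch_suffix = '{"records":[', "]}"
batch_separator = ","  # assumed default batch.separator

bodies = []
for key, value in records:
    body = value
    for pattern, replacement in zip(regex_patterns, regex_replacements):
        # Substitute ${key} before applying, so no regex metacharacters remain
        body = re.sub(pattern, replacement.replace("${key}", key), body)
    bodies.append(body)

# Final HTTP body the connector would POST to http.api.url
payload = batch_prefix + batch_separator.join(bodies) + batch_suffix
print(payload)
```

Note that the POST is sent with `Content-Type: application/vnd.kafka.json.v2+json` per the `headers` setting, i.e. the REST Proxy treats the embedded values as JSON, not Avro.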

How do I publish Avro data to a topic using the REST Proxy?


1 Answer

Answer by OneCricketeer:

> publish Avro data to a topic using Rest proxy.

Using a Sink Connector to point at the Kafka REST Proxy doesn't make sense. You would be consuming from Kafka to write back to Kafka.

You should instead use a stream processor such as Kafka Streams or ksqlDB to move data between topics within the same Kafka cluster.
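For example, a topic-to-topic copy in ksqlDB might look like the following (the stream names are illustrative; this assumes the input topic's Avro value schema is registered in Schema Registry):

```sql
-- Register the existing input topic as a stream
CREATE STREAM input_stream WITH (
  KAFKA_TOPIC = 'avro_input_topic',
  VALUE_FORMAT = 'AVRO'
);

-- Continuously copy every record into the output topic
CREATE STREAM output_stream WITH (
  KAFKA_TOPIC = 'avro_output_topic',
  VALUE_FORMAT = 'AVRO'
) AS SELECT * FROM input_stream;
```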

Between Kafka clusters, you can use tools like MirrorMaker.