I have a use case in which I receive my input data on a Kafka topic. This data has to be transformed, enriched, and written to a PostgreSQL table. My application uses Quarkus Kafka Streams to consume data from the input topic and process it. However, I cannot find any sink/connector from Kafka Streams to a database (in my case PostgreSQL). If I am not wrong, we can only send the transformed result from a Kafka Streams application (a KStream) to another Kafka topic or a KTable, and then use another service such as Kafka Connect or Flink to read the output topic and write the data into the target Postgres table.
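For context, my topology looks roughly like the sketch below (topic names and the enrichment step are simplified placeholders; Quarkus 3 / jakarta imports shown):

```java
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.inject.Produces;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

@ApplicationScoped
public class EnrichmentTopology {

    // Quarkus Kafka Streams picks up the Topology produced by this CDI method
    @Produces
    public Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();

        builder.stream("input-topic", Consumed.with(Serdes.String(), Serdes.String()))
               // placeholder for the real transformation/enrichment logic
               .mapValues(this::enrich)
               // the only "sink" I see is another Kafka topic
               .to("output-topic", Produced.with(Serdes.String(), Serdes.String()));

        return builder.build();
    }

    private String enrich(String value) {
        // hypothetical enrichment step
        return value.toUpperCase();
    }
}
```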
Is there a way to persist the data from a KStream directly to a PostgreSQL table? Since it is a streaming application, I do not want to hit the DB for each message and would like to batch-insert the data into the table.
Thanks a lot in advance for any pointers.
You are correct that Kafka Streams does not have any built-in way to push data into PostgreSQL. As you noted, Kafka Streams is meant for data processing with inputs and outputs in Kafka.
What I would recommend is to continue using Kafka Streams to process the data and massage it into the proper format, and then use Kafka Connect to sink the data into Postgres. There are several open-source Kafka Connect sink connectors that can write to Postgres, most of them JDBC-based, and they typically batch writes for you.
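As a rough sketch, a JDBC sink connector configuration could look something like the following. This uses the Confluent JDBC Sink Connector as one example (the exact property names depend on which connector you choose), and the connection details, topic, and table/key fields are placeholders; `batch.size` controls how many records the connector groups into a single insert, which addresses your concern about hitting the DB per message:

```json
{
  "name": "postgres-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "output-topic",
    "connection.url": "jdbc:postgresql://localhost:5432/mydb",
    "connection.user": "postgres",
    "connection.password": "postgres",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "pk.fields": "id",
    "auto.create": "true",
    "batch.size": "500",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true"
  }
}
```

You would POST this to the Kafka Connect REST API (or drop it into your Connect deployment's config), and Connect takes care of consuming the output topic and writing batched inserts/upserts into the Postgres table.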