Spark Structured Streaming


How can I run multiple streaming SQL queries on a Kafka stream from a single job, and is Structured Streaming a reliable way to do this? For example, say I'm running 10 queries on a stream in one job. If I later want to run only 9 of them, is there a way to dynamically change which queries run? I want the set of queries to be picked from a store on every run of the streaming (continuous) query.


1 Answer

Answer by Paul Leclercq:

If you want to run multiple queries in the same job, start each query and then call spark.streams.awaitAnyTermination() so the driver stays alive while the queries run:

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .getOrCreate()

// Define and start each streaming query; start() returns a StreamingQuery handle
val query1 = spark.readStream. ... .writeStream. ... .start()
val query2 = spark.readStream. ... .writeStream. ... .start()

// Block until any one of the active queries terminates
spark.streams.awaitAnyTermination()

Then you can read raw SQL query strings from a store and apply them to another stream without any problems.
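To make the dynamic part concrete, here is a minimal sketch of that idea: a single Kafka source registered as a temp view, a "store" of SQL definitions (modeled here as an in-memory list of name/SQL/enabled tuples; in practice it could be a database table or config service read at startup), and one StreamingQuery started per enabled entry. The broker address, topic name, view name, and the queryStore contents are all hypothetical, not from the original answer.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.StreamingQuery

object DynamicStreamingQueries {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("dynamic-streaming-queries")
      .getOrCreate()

    // Single Kafka source stream, exposed as a temp view so raw SQL can reference it
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // hypothetical broker
      .option("subscribe", "events")                       // hypothetical topic
      .load()
      .selectExpr("CAST(value AS STRING) AS value")
    events.createOrReplaceTempView("events")

    // Hypothetical query store: (name, sql, enabled). Disabling a query here
    // removes it from the next run without touching the job's code.
    val queryStore: Seq[(String, String, Boolean)] = Seq(
      ("counts",   "SELECT value, COUNT(*) AS n FROM events GROUP BY value", true),
      ("raw",      "SELECT value FROM events",                               true),
      ("disabled", "SELECT upper(value) FROM events",                        false)
    )

    // Start one StreamingQuery per enabled definition
    val running: Seq[StreamingQuery] = queryStore.collect {
      case (name, sql, true) =>
        spark.sql(sql)
          .writeStream
          .queryName(name)
          .outputMode("update") // valid for both aggregated and plain queries
          .format("console")
          .start()
    }

    // Keep the driver alive until any query terminates
    spark.streams.awaitAnyTermination()
  }
}
```

Because every query shares the same SparkSession and source view, adding or removing a row in the store changes what runs on the next job start, with no code change.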