DataFrame returned by getBatch from MQTTTextStreamSource did not have isStreaming=true


I am trying to use MQTT together with PySpark Structured Streaming.

from pyspark.sql import SparkSession
from pyspark.sql.functions import explode
from pyspark.sql.functions import split

spark = SparkSession \
    .builder \
    .appName("Test") \
    .master("local[4]") \
    .getOrCreate()

# Custom Structured Streaming receiver
lines = spark\
             .readStream\
             .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")\
             .option("topic","uwb/distances")\
             .option('brokerUrl', 'tcp://127.0.0.1:1883')\
             .load()

# Split the lines into words
words = lines.select(explode(split(lines.value, ' ')).alias('word'))

# Generate running word count
wordCounts = words.groupBy('word').count()

# Start running the query that prints the running counts to the console
query = wordCounts \
    .writeStream \
    .outputMode('complete') \
    .format('console') \
    .start()

query.awaitTermination()

Error message:

Logical Plan:
Aggregate [word#7], [word#7, count(1) AS count#11L]
+- Project [word#7]
   +- Generate explode(split(value#2,  )), false, [word#7]
      +- StreamingExecutionRelation org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource@383ccec1, [value#2, timestamp#3]

    at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:295)
    at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:189)
Caused by: java.lang.AssertionError: assertion failed: DataFrame returned by getBatch from org.apache.bahir.sql.streaming.mqtt.MQTTTextStreamSource@383ccec1 did not have isStreaming=true

I do not understand what is wrong with my code. Moreover, according to this post, Structured Streaming 2.1.0 is supported by Bahir MQTT. I also tried Spark 2.2.1 and got the same issue.
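
For reference, a minimal way to check which Spark runtime actually executes the job (a sketch using the spark session created above):

# Sketch: print the Spark version that actually runs the job
print(spark.version)               # e.g. '2.2.1'
print(spark.sparkContext.version)  # should report the same version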

This is how I run the code:

spark-submit \
  --jars lib/spark-streaming-mqtt_2.11-2.2.1.jar,lib/spark-sql-streaming-mqtt_2.11-2.2.1.jar,lib/org.eclipse.paho.client.mqttv3-1.2.0.jar \
  TestSpark.py

How can I solve this issue?

1 Answer

Answered by ScalaBoy:

I downloaded the Spark 2.2.1 binaries and executed the code as follows:

~/Downloads/spark-2.2.1-bin-hadoop2.7/bin/spark-submit \
    --jars lib/spark-streaming-mqtt_2.11-2.2.1.jar,lib/spark-sql-streaming-mqtt_2.11-2.2.1.jar,lib/org.eclipse.paho.client.mqttv3-1.2.0.jar \
    TestSpark.py

This solved the problem. Previously I had only been changing the versions of the MQTT jar files (e.g. spark-streaming-mqtt_2.11-2.2.1.jar), but apparently that was not enough: the Spark binaries running the job had to match the connector version as well.
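
A sketch of an equivalent invocation, assuming the matching connector artifact org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.2.1 is available on Maven Central, would let spark-submit resolve the connector and its Paho MQTT dependency itself instead of passing local jars:

~/Downloads/spark-2.2.1-bin-hadoop2.7/bin/spark-submit \
    --packages org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.2.1 \
    TestSpark.py

Either way, the key point is that the version of the spark-sql-streaming-mqtt artifact has to line up with the Spark binaries that actually run the job.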