We are using Spark 3.4.0 DStreams in our application to consume and process messages from Kafka.
I have a use case where I need to pause message consumption when a certain condition occurs and resume consuming once the condition is fulfilled.
Can this be achieved in Spark?
I tried skipping the processing step (ignoring the actual business logic) after consuming a message, and deliberately did not commit the offset back to Kafka. The requirement is that the same message should be consumed repeatedly until its offset is updated, but this does not happen because the consumer's position still advances.
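For context, here is a minimal sketch of the approach I tried, using the spark-streaming-kafka-0-10 direct stream with auto-commit disabled and manual commits via `CanCommitOffsets`. The topic, group id, and the `conditionFulfilled()` / `processBusinessLogic()` helpers are illustrative placeholders, not our real code:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object PauseResumeSketch {
  def main(args: Array[String]): Unit = {
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",          // illustrative
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "my-consumer-group",                // illustrative
      "auto.offset.reset" -> "earliest",
      // disable auto-commit so we control when offsets are stored
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val ssc = new StreamingContext(
      new SparkConf().setAppName("pause-resume"), Seconds(5))

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent,
      Subscribe[String, String](Seq("my-topic"), kafkaParams))

    stream.foreachRDD { rdd =>
      val offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
      if (conditionFulfilled()) {          // hypothetical predicate
        processBusinessLogic(rdd)          // hypothetical processing
        // commit only after successful processing
        stream.asInstanceOf[CanCommitOffsets].commitAsync(offsetRanges)
      }
      // else: skip processing and skip the commit, expecting a replay.
      // In practice the driver plans the next batch from its own tracked
      // position, so the next batch starts AFTER these offsets anyway;
      // committed offsets only matter when the consumer group restarts.
    }

    ssc.start()
    ssc.awaitTermination()
  }

  def conditionFulfilled(): Boolean = true                      // stub
  def processBusinessLogic(rdd: org.apache.spark.rdd.RDD[_]): Unit = () // stub
}
```

As the comments note, withholding the commit does not cause redelivery within a running application, which is exactly the behavior I am trying to work around.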