I have a dataframe and need to check whether it contains any null values. There are plenty of posts on the same topic, but nearly all of them rely on a `count` action, which is prohibitively expensive in my case because the data volume is large. The same is true of the other approaches suggested there.
Is there a way to ask Spark to look for null values and raise an error as soon as it encounters the first one?
The solutions in other posts give the count of missing values in each column. I don't need per-column counts; I just want to know whether any cell in the dataframe is null.