I want to connect two remote VMs and execute my PySpark program using the Spark cluster's resources:
- VM1: Standalone Spark
- VM2: Jupyter Notebook with PySpark code
I have used Spark Connect for remote connectivity between the standalone Spark cluster (VM1) and the node running the PySpark code in a Jupyter notebook (VM2), and I am getting the following error:
```
SparkConnectGrpcException: <_InactiveRpcError of RPC that terminated with:
    status = StatusCode.UNAVAILABLE
    details = "failed to connect to all addresses; last error: UNAVAILABLE: ipv4:spark://:port: Socket closed"
    debug_error_string = "UNKNOWN:failed to connect to all addresses; last error: UNAVAILABLE: ipv4:spark://:port: Socket closed {created_time:"2023-10-18T08:18:31.248037669+01:00", grpc_status:14}"
```
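For reference, this is roughly how the connection is set up; the hostname, port, and Spark version below are placeholders rather than my exact values, and I am assuming the default Spark Connect port 15002:

```python
from pyspark.sql import SparkSession

# On VM1, the Spark Connect server was started first, roughly:
#   ./sbin/start-connect-server.sh \
#       --packages org.apache.spark:spark-connect_2.12:3.5.0
# (the Spark/Scala versions are placeholders for whatever is installed)

# On VM2, in the notebook, connect to that server. "vm1-hostname"
# and 15002 are placeholders; 15002 is the default Spark Connect
# port, and the remote URL uses the sc:// scheme.
spark = (
    SparkSession.builder
    .remote("sc://vm1-hostname:15002")
    .getOrCreate()
)

# Simple sanity check on the remote session.
spark.range(5).show()
```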
Any assistance would be greatly appreciated. Thanks in advance.