Spark Connect doesn't support Py4J, so we can't register a JVM listener from the Python side to monitor task failures when running NDS over Spark Connect.
Instead, we can leverage `spark.extraListeners`, a configuration that specifies extra listener classes to be instantiated and registered during SparkContext initialization.
If the listener detects a task failure, it can write the failed-task information to disk on the Connect server; NDS can then read that file from the client side.
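
As a minimal sketch of this approach in Scala (the class name `TaskFailureListener`, the output path, and the log format are illustrative assumptions, not part of NDS):

```scala
import java.io.{FileWriter, PrintWriter}

import org.apache.spark.TaskFailedReason
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Registered via: --conf spark.extraListeners=TaskFailureListener
// (use the fully qualified name if the class lives in a package, and make
// sure the containing jar is on the Connect server's classpath).
class TaskFailureListener extends SparkListener {

  // Hypothetical output location on the Connect server; NDS would read this
  // file from the client side after the run finishes.
  private val outputPath = "/tmp/nds_task_failures.log"

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    taskEnd.reason match {
      case reason: TaskFailedReason =>
        // Append one line per failed task: stage id, task id, and the error.
        val writer = new PrintWriter(new FileWriter(outputPath, /* append = */ true))
        try {
          writer.println(
            s"stage=${taskEnd.stageId} task=${taskEnd.taskInfo.taskId} " +
              s"reason=${reason.toErrorString}")
        } finally {
          writer.close()
        }
      case _ => // task succeeded, nothing to record
    }
  }
}
```

Note that the listener needs a constructor Spark can call (zero-argument or taking a SparkConf) so it can be instantiated from the `spark.extraListeners` setting.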