I've set up Apache Spark on a Windows system and have a Java program that reads file.txt, processes the data, and then calls my first.dll. Every time I run it, I get an exception:
Exception in thread "main" org.apache.spark.SparkException: Job aborted due
to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure:
Lost task 0.0 in stage 0.0 (TID 0, localhost):
java.lang.UnsatisfiedLinkError:
main.java.SimpleApp$1$1someClass.SomeJniMethodName(Ljava/lang/String;[Ljava/lang/String;)I
Can anyone suggest why it is failing? The DLL itself works fine; I've tried and tested it with a simple standalone Java application.
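One thing worth checking: the stack trace shows the native method resolved against an inner/local class name (`main.java.SimpleApp$1$1someClass`). The JVM looks up a native symbol derived from the *declaring* class's fully qualified name, so a DLL built against a top-level class exports a different symbol than what this inner class requires, and that mismatch alone produces `UnsatisfiedLinkError`. A small sketch of the JNI short-name mangling rules illustrates the difference (the class `JniNameSketch` and method `mangle` below are illustrative helpers, not part of the original program):

```java
// Sketch of JNI short-name mangling (illustrative helper, not from the question's code).
public class JniNameSketch {

    // Per the JNI spec: '.' and '/' become '_', '_' becomes "_1",
    // and other characters such as '$' become "_0xxxx" (lowercase hex code point).
    static String mangleComponent(String s) {
        StringBuilder out = new StringBuilder();
        for (char c : s.toCharArray()) {
            if (c == '.' || c == '/') out.append('_');
            else if (c == '_') out.append("_1");
            else if (c < 128 && Character.isLetterOrDigit(c)) out.append(c);
            else out.append(String.format("_0%04x", (int) c));
        }
        return out.toString();
    }

    // The symbol the JVM looks up: "Java_" + mangled class + "_" + mangled method.
    public static String mangle(String className, String methodName) {
        return "Java_" + mangleComponent(className) + "_" + mangleComponent(methodName);
    }

    public static void main(String[] args) {
        // A top-level class yields the symbol a typical DLL would export:
        System.out.println(mangle("main.java.SimpleApp", "SomeJniMethodName"));
        // The inner/local class from the stack trace yields a different symbol,
        // which the DLL does not export -> UnsatisfiedLinkError:
        System.out.println(mangle("main.java.SimpleApp$1$1someClass", "SomeJniMethodName"));
    }
}
```

If the native method was moved into a closure or anonymous class when porting the code to Spark, moving the `native` declaration (and the `System.loadLibrary` call) back into a top-level class that matches the DLL's exported symbols may resolve this.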
Are you launching this with spark-submit on a multi-node cluster? If so, can you confirm that all worker nodes can access the shared library?
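Also note that even on a single machine, each Spark executor runs in its own JVM, so the DLL must be present and loadable in every executor process, not just the driver. A minimal sketch of how this is often wired up with spark-submit, assuming illustrative paths and jar name (`C:/libs`, `simpleapp.jar` are placeholders):

```shell
# Ship the DLL to executors and put it on the native library path.
# --files copies first.dll into each executor's working directory,
# so "." works as the executor-side library path.
spark-submit \
  --class main.java.SimpleApp \
  --files C:/libs/first.dll \
  --conf spark.driver.extraLibraryPath=C:/libs \
  --conf spark.executor.extraLibraryPath=. \
  simpleapp.jar
```

With this in place, `System.loadLibrary("first")` should resolve in both driver and executor JVMs.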