
I've set up Apache Spark on a Windows system and have a Java program that reads file.txt, gets the data, and then calls my first.dll file. Every time I run it, I get an exception:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due
to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure:
Lost task 0.0 in stage 0.0 (TID 0, localhost):
java.lang.UnsatisfiedLinkError:
main.java.SimpleApp$1$1someClass.SomeJniMethodName(Ljava/lang/String;[Ljava/lang/String;)I

Can anyone suggest why it is failing? The DLL itself works fine; I tried and tested it with a simple Java application.
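An UnsatisfiedLinkError like this usually means the JVM that actually executes the task (a Spark executor, not the driver) cannot find the native library on its java.library.path. A minimal sketch for checking what the JVM would look for and where, assuming the base library name "first" from first.dll (the class name LibPathCheck is hypothetical):

```java
// Minimal sketch: shows how the JVM resolves a native library name.
// Nothing here loads the DLL; it only prints the resolved file name
// and the directories System.loadLibrary would search.
public class LibPathCheck {
    public static void main(String[] args) {
        // Platform-specific name, e.g. "first.dll" on Windows,
        // "libfirst.so" on Linux.
        System.out.println(System.mapLibraryName("first"));
        // Directories this JVM searches for native libraries:
        System.out.println(System.getProperty("java.library.path"));
    }
}
```

Running this inside a Spark task (rather than in a plain Java application) would show the executor's library path, which may differ from the one your standalone test used.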

  • If you are submitting via spark-submit and have a multi-node cluster, can you confirm that all nodes can access the shared library? Commented Oct 6, 2015 at 12:12
  • Try adding the DLL under the folder pointed to by spark.executor.extraLibraryPath.
    – urug
    Commented Oct 6, 2015 at 13:26
  • I tried that, and it started working.
    – niyant
    Commented Oct 7, 2015 at 4:57
