
NoSuchMethodError when submitting to cluster #3

Open
julkw opened this issue Aug 10, 2020 · 0 comments

julkw commented Aug 10, 2020

When trying to submit this project on a cluster (with the jar having been built on the cluster), I get a java.lang.NoSuchMethodError for scala.App.

The jar was created with sbt assembly.
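
For context, the build is a standard sbt-assembly setup, roughly like the following (an illustrative sketch only, not the repository's actual build.sbt; the versions are inferred from the jar path and the Spark install path below):

// build.sbt (illustrative sketch, not the repository's actual file)
name := "SparkTutorialSBT"
version := "0.1"

// The jar ends up under target/scala-2.12/, so the project compiles with Scala 2.12.
scalaVersion := "2.12.10"

// Spark itself is typically marked "provided" so the cluster's own Spark jars are used at runtime.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.4" % "provided"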

It was submitted with:

/opt/spark/2.4.4/bin/spark-submit \
  --master spark://odin01:7077 \
  --driver-memory 28 \
  --executor-memory 28 \
  --num-executors 11 \
  --executor-cores 20 \
  --total-executor-cores 220 \
  target/scala-2.12/SparkTutorialSBT-assembly-0.1.jar
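
For reference, the Scala version a given Spark installation was built against can be checked from spark-submit's version banner (the sample output line in the comment below is illustrative, not captured from this cluster):

/opt/spark/2.4.4/bin/spark-submit --version
# The banner includes a line such as:
#   Using Scala version 2.11.12, OpenJDK 64-Bit Server VM, ...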

The resulting error message is:

2020-08-10 11:04:59,845 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoSuchMethodError: scala.App.$init$(Lscala/App;)V
        at de.hpi.spark_tutorial.SimpleSpark$.<init>(SimpleSpark.scala:20)
        at de.hpi.spark_tutorial.SimpleSpark$.<clinit>(SimpleSpark.scala)
        at de.hpi.spark_tutorial.SimpleSpark.main(SimpleSpark.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2020-08-10 11:05:00,021 INFO util.ShutdownHookManager: Shutdown hook called
2020-08-10 11:05:00,022 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-37bb2c26-f19d-4551-a9ee-29c989aa47ae
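
For context on the error itself: scala.App.$init$ is part of the Scala 2.12 trait encoding and does not exist in the Scala 2.11 scala-library. The stack trace is therefore the classic symptom of a jar compiled for Scala 2.12 (target/scala-2.12/...) running against a Spark distribution that ships Scala 2.11, which the default pre-built Spark 2.4.x downloads do. If that is what is happening on this cluster (an assumption, not verified here), the usual fix is to rebuild the assembly for Scala 2.11:

// build.sbt -- hypothetical change, only applicable if the cluster's Spark 2.4.4 ships scala-library 2.11
scalaVersion := "2.11.12"

// Keep Spark "provided" so the cluster's own jars are used at runtime.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.4" % "provided"

With that change, sbt assembly would produce target/scala-2.11/SparkTutorialSBT-assembly-0.1.jar, which would then be the path passed to spark-submit instead.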