
The following code results in:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.implicits()Lorg/apache/spark/sql/SQLContext$implicits$

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.log4j.Logger
import org.apache.log4j.Level

object Small {

  def main(args: Array[String]) {
    Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
    Logger.getLogger("org.eclipse.jetty.server").setLevel(Level.OFF)
    // set up environment
    val conf = new SparkConf()
      .setMaster("local[1]")
      .setAppName("Small")
      .set("spark.executor.memory", "2g")
    val sc = new SparkContext(conf)

    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._
    val df = sc.parallelize(Array((1,30),(2,10),(3,20),(1,10), (2,30))).toDF("books","readers")
    df.show
  }
}

The project was built with SBT:

name := "Small"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.3.1"

I run this with a submit script:

#!/bin/sh
/home/test/usr/spark-1.1.0/bin/spark-submit \
--class Small \
--master local[*] \
--driver-memory 2g \
/home/test/wks/Pairs/target/scala-2.10/small_2.10-1.0.jar 

Any ideas?

SBT compiles and packages this code without problems. Yet when I try to run it with sbt run I get another exception:

[error] (run-main-0) scala.reflect.internal.MissingRequirementError: class org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with java.net.URLClassLoader@210ce673 of type class java.net.URLClassLoader with classpath [file:/home/test/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.10.4.jar, ...

Is there any way to make sbt run include all dependencies when launching a Scala program?
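
One workaround that is often suggested for this kind of MissingRequirementError under sbt run is to fork a separate JVM for the run task, so that the program is not loaded through sbt's own classloader. A minimal sketch of the extra build.sbt setting, assuming sbt 0.13.x (it is not part of the build shown above):

// Run the application in a forked JVM rather than inside sbt's classloader
fork in run := true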

zork
  • Does your spark instance use version 1.3.x as well? A NoSuchMethodError almost always means that the class files available at runtime are different from those that were there when you compiled your program. – jarandaf Jun 02 '15 at 11:10
  • I have specified `libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"` in the sbt file, so I assume `sbt package` and `sbt run` should use Spark 1.3.1, shouldn't they? – zork Jun 02 '15 at 11:31
  • You are submitting your job to a Spark instance; `/home/test/usr/spark-1.1.0/bin/spark-submit` shows what I am referring to (the Spark instance version is 1.1.0 while you are using 1.3.1 at compile time). Please update your Spark version so that the two match (see the sketch after these comments). – jarandaf Jun 02 '15 at 11:45
  • Oh yes, you are quite right, thank you. But what about `sbt run`? Shouldn't it work anyway with `libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"`? – zork Jun 02 '15 at 12:51
  • Check http://stackoverflow.com/questions/27824281/sparksql-missingrequirementerror-when-registering-table; it might help. – jarandaf Jun 02 '15 at 14:10
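
To illustrate the mismatch pointed out above, here is a minimal sketch of a submit script that launches the jar with a Spark 1.3.1 installation, so that the runtime matches the version the code was compiled against. The /home/test/usr/spark-1.3.1 installation path is an assumption; substitute the actual location of a Spark 1.3.1 install.

#!/bin/sh
# Launch with a Spark 1.3.1 installation so the runtime classpath matches
# the spark-core/spark-sql 1.3.1 artifacts the jar was compiled against.
# The installation path below is illustrative; adjust it to the real location.
/home/test/usr/spark-1.3.1/bin/spark-submit \
--class Small \
--master local[*] \
--driver-memory 2g \
/home/test/wks/Pairs/target/scala-2.10/small_2.10-1.0.jar

Going the other way and downgrading the build to Spark 1.1.0 would not help, since SQLContext in 1.1.0 has no implicits object (and no toDF), as the answer below also notes.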

1 Answer

Your submit script points at /home/test/usr/spark-1.1.0/bin/spark-submit: the compile-time version is 1.3.1, which is different from the runtime version. SQLContext in version 1.1.0 does not define the implicits object, so the call fails at runtime with NoSuchMethodError.