error: not found: value spark



I am getting the errors below when starting the Spark shell:

java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
  at scala.Predef$.require(Predef.scala:224)
  at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2320)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96)
  ... 47 elided
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql

Did I miss anything in the setup?

vito Edited question June 4, 2021

You need to start the Spark daemons before launching spark-shell to resolve the issue. The stack trace shows SparkContext initialization failing inside SparkSession$Builder.getOrCreate, so the REPL never bound the spark session value, which is why the subsequent imports report "not found: value spark". Execute the commands below:

cd $SPARK_HOME/sbin
./start-all.sh
$SPARK_HOME/bin/spark-shell
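
Once spark-shell starts cleanly, the REPL binds a SparkSession to the name spark automatically, so you can sanity-check it directly. A minimal check (the range count below is just an arbitrary example):

spark.version                  // prints the running Spark version
spark.range(5).count()         // returns 5, confirming the session works
import spark.implicits._       // resolves now that `spark` exists
import spark.sql               // likewise

If the shell ever comes up without a session, you can also build one by hand. SparkSession.builder and getOrCreate are the standard API; the app name and local[*] master below are only illustrative and should match your setup:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("shell-recovery")   // hypothetical app name
  .master("local[*]")          // assumes a local run; point at your master otherwise
  .getOrCreate()
import spark.implicits._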

hiberstackers Answered question February 24, 2021