
Trouble Installing Pyspark

I want to run Spark on a local machine using pyspark. From here I use the commands:

$ sbt/sbt assembly
$ ./bin/pyspark

The install completes, but pyspark is unable to run, resulting in an error.

Solution 1:

The right solution is to set the SPARK_LOCAL_IP environment variable to localhost, or to whatever your host name is.
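For example, a minimal way to set it for a single session, assuming a Unix-like shell and that you launch pyspark from the Spark directory:

$ export SPARK_LOCAL_IP=localhost
$ ./bin/pyspark

To make the setting permanent, the same export line can go in conf/spark-env.sh or in your shell profile.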

Solution 2:

I had the same problem with Spark; it is related to your laptop's IP/hostname configuration.

My solution:

Open /etc/hosts as root (for example, sudo nano /etc/hosts).

Below the line

127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4

add the line

127.0.0.1 LAPTOPNAME

LAPTOPNAME is the host name you set up during installation; it is the name that appears in your Terminal prompt (e.g. root@LAPTOPNAME).
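A quick way to check the name and confirm it resolves after editing /etc/hosts, assuming a standard Unix shell:

$ hostname
$ ping -c 1 "$(hostname)"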

With this change, it will run with Java 1.7.

Solution 3:

It turns out the Java version I was using was 1.7. I'm using a MacBook Air running OS X 10.9.2.

$ java -version

gave me:

java version "1.7.0_25"
Java(TM) SE Runtime Environment (build 1.7.0_25-b15)
Java HotSpot(TM) 64-Bit Server VM (build 23.25-b01, mixed mode)

To downgrade to 1.6:

$ cd /Library/Java/JavaVirtualMachines
$ ls

returned:

jdk1.7.0_25.jdk

To delete that directory (downgrading Java and fixing my issue):

$ sudo rm -rf jdk1.7.0_25.jdk

Then I had:

$ java -version

Which gave the output:

java version "1.6.0_65"
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-462-11M4609)
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-462, mixed mode)
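As an alternative to deleting the 1.7 JDK, you could also leave it installed and point JAVA_HOME at the 1.6 runtime instead; a sketch assuming macOS's /usr/libexec/java_home helper is available:

$ export JAVA_HOME="$(/usr/libexec/java_home -v 1.6)"
$ "$JAVA_HOME/bin/java" -version

Spark's launch scripts should pick up JAVA_HOME when it is set.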

And finally, I am able to run Spark:

$ ./bin/pyspark

And all is happy:

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 0.9.1
      /_/
