Spark 1.6.1 Installation on Ubuntu 14.04 and Hadoop 2.6.0

Hi friends,

I have started learning Apache Spark, and it is time now to share a few things with you.
In this series on Spark and Scala, we are going to cover the essential things you should know about both.

So as I always begin, we are going to start with Spark Installation.
For this series, we are going to use Spark 1.6.1 pre-bundled with Hadoop 2.6.0 and compatible with Scala 2.11.7.

Please follow the steps below in order to install Spark 1.6.1 on Ubuntu 14.04.
Assumptions : username is hduser
Step 1 : JAVA Installation
If you have already installed JAVA, then you can skip this step.
If JAVA is not installed, then please run the following commands in order to install JAVA 1.7 (OpenJDK 7)
command 1 : sudo apt-get update
command 2 : sudo apt-get install openjdk-7-jdk
Once the above commands have executed successfully, you can run the following command for verification.
$java -version
The above command should give you output that looks something like this.
java version "1.7.0_71"
Java(TM) SE Runtime Environment (build 1.7.0_71-b13)
Java HotSpot(TM) Client VM (build 25.0-b02, mixed mode)
The exact JAVA version and vendor strings may differ based on your installation; an OpenJDK build will report "OpenJDK" rather than "Java(TM)".
Step 2 : Scala Installation
If you have already installed Scala, then you can skip this step; otherwise, you can install Scala 2.11.6 with the help of this link.
Step 3 : Download Spark Package
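The original post linked the download page directly; as a sketch, the same package can be fetched from the Apache archive (the URL below is assumed from the standard archive layout, so double-check it against the Spark downloads page):

```shell
# Sketch: fetch Spark 1.6.1 pre-built for Hadoop 2.6 from the Apache archive.
# The archive URL is assumed from the standard dist layout; mirrors may differ.
SPARK_VERSION=1.6.1
HADOOP_VERSION=2.6
SPARK_TGZ="spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz"
SPARK_URL="https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${SPARK_TGZ}"
echo "Downloading ${SPARK_URL}"
# wget "${SPARK_URL}"   # uncomment to actually download (the archive is a few hundred MB)
```

After the download finishes, you should have spark-1.6.1-bin-hadoop2.6.tgz in your current directory, which is what Step 4 extracts.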
Step 4 : Extracting Spark Package
sudo cp spark-1.6.1-bin-hadoop2.6.tgz /usr/local/
cd /usr/local/
sudo tar -xvf spark-1.6.1-bin-hadoop2.6.tgz
sudo mv spark-1.6.1-bin-hadoop2.6 spark
Step 5 : Update .bashrc file with following contents
export SPARK_HOME=/usr/local/spark
$nano .bashrc
press CTRL + X
press Y
press ENTER
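If you prefer not to edit .bashrc by hand, the same update can be scripted. This is a sketch that writes to a demo file so it is safe to run as-is; on your machine you would point RC_FILE at "$HOME/.bashrc" instead:

```shell
# Sketch: append the Spark environment variables to a shell rc file non-interactively.
# RC_FILE is a demo path here so the example is harmless to run; use "$HOME/.bashrc" for real.
RC_FILE="$HOME/.bashrc_spark_demo"
echo 'export SPARK_HOME=/usr/local/spark' >> "$RC_FILE"
echo 'export PATH=$PATH:$SPARK_HOME/bin' >> "$RC_FILE"
# Reload the file so the current shell picks up the variables.
. "$RC_FILE"
echo "$SPARK_HOME"
```

The SPARK_HOME path matches where Step 4 moved the extracted package; adjust it if you chose a different location.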
Step 6 : SPARK Installation Verification
For verifying the SPARK installation, you can run the following command.
$spark-shell
When you type the $spark-shell command, you should get output like the screenshot below.

Spark Shell Output

If you are getting the same output, then cheers, Spark is installed successfully.
This completes the SPARK installation.
Thanks for having a read.
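As a first sanity check inside the shell, you can try a tiny job. This is a sketch: the Scala one-liner below is what you would type at the scala> prompt, and sc is the SparkContext that spark-shell creates for you automatically.

```shell
# Sketch: a one-liner to type at the scala> prompt of spark-shell.
# sc is the SparkContext pre-created by spark-shell; summing 1 to 100 should print 5050.0.
SMOKE_TEST='sc.parallelize(1 to 100).sum'
echo "scala> ${SMOKE_TEST}"
```

If that prints 5050.0 in your spark-shell session, the cluster-side machinery (driver, executors, RDD API) is working end to end.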

Published by milindjagre
