Friday, December 5, 2014

How to: Install Hadoop (as Single Node) on Ubuntu Step by Step

How to Install Hadoop Step by Step

Environment

Ubuntu 14.04, Hadoop 2.6.0, OpenJDK 1.7

I. Installing JAVA JDK

Log in as root

sudo -s

Install Java Runtime

sudo apt-get install default-jre

Install Java JDK (OpenJDK 1.7 or newer)

sudo apt-get update
sudo apt-get install default-jdk
                Check the Java version
java -version
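The output should look something like this (the exact version and build strings are illustrative and will vary):

java version "1.7.0_75"
OpenJDK Runtime Environment (IcedTea 2.5.4) (7u75-2.5.4-1~trusty1)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)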

II. Install Secure Shell

Install (SSH and RSYNC)

sudo apt-get install ssh
sudo apt-get install rsync

Create and Setup SSH Certificates (Setup passphraseless ssh)

To enable password-less login, generate a new SSH key with an empty passphrase. Run these commands as the Hadoop user (not root):
ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
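Before moving on, you can confirm password-less login works; this should connect without asking for a password (you may be asked once to confirm the host key):

ssh localhost
exit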

III. Fetch and Install Hadoop

Fetch Hadoop (Stable Version)

wget http://apache.tradebit.com/pub/hadoop/common/current/hadoop-2.6.0.tar.gz

Extract File

tar xfz hadoop-2.6.0.tar.gz

Move to Local (Running Directory)

mv hadoop-2.6.0 /usr/local/hadoop

IV. Edit Hadoop Configuration Files

Get the Correct Java Path

update-alternatives --config java
                The reported path will look like /usr/lib/jvm/java-7-openjdk-i386/jre/bin/java; JAVA_HOME is everything before /jre/bin/java, i.e. /usr/lib/jvm/java-7-openjdk-i386
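Alternatively, a quick one-liner to print just the JAVA_HOME prefix (a sketch assuming the default OpenJDK layout, where the real binary resolves to a path ending in /jre/bin/java):

# resolve the java symlink chain, then strip the /jre/bin/java suffix
readlink -f /usr/bin/java | sed 's:/jre/bin/java::'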

Modify the Following Files

~/.bashrc
/usr/local/hadoop/etc/hadoop/hadoop-env.sh
/usr/local/hadoop/etc/hadoop/core-site.xml
/usr/local/hadoop/etc/hadoop/yarn-site.xml
/usr/local/hadoop/etc/hadoop/mapred-site.xml
/usr/local/hadoop/etc/hadoop/hdfs-site.xml

(1) ~/.bashrc

nano ~/.bashrc
Add the following to the end of the file. Change JAVA_HOME to the correct path you got above, and HADOOP_INSTALL to your Hadoop installation folder.
#HADOOP VARIABLES START
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
#HADOOP VARIABLES END
Save and close the file, then reload it with the following command
source ~/.bashrc
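To confirm the environment is loaded, check that the hadoop command is now found (as also suggested in the comments below, restart the terminal if it is not):

hadoop version

The first line of output should read Hadoop 2.6.0.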

(2) /usr/local/hadoop/etc/hadoop/hadoop-env.sh

nano /usr/local/hadoop/etc/hadoop/hadoop-env.sh
                Update the JAVA_HOME line
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386
                Save and close the file

(3) /usr/local/hadoop/etc/hadoop/core-site.xml

nano /usr/local/hadoop/etc/hadoop/core-site.xml
Add the following inside the <configuration> tag
<property>
   <name>fs.default.name</name>
   <value>hdfs://localhost:9000</value>
</property>
                Save and close the file
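Note: fs.default.name is the old name for this property; Hadoop 2.x deprecates it in favor of fs.defaultFS, and 2.6.0 accepts either, so the equivalent modern form would be:

<property>
   <name>fs.defaultFS</name>
   <value>hdfs://localhost:9000</value>
</property>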

(4) /usr/local/hadoop/etc/hadoop/yarn-site.xml

nano /usr/local/hadoop/etc/hadoop/yarn-site.xml
Add the following inside the <configuration> tag
<property>
   <name>yarn.nodemanager.aux-services</name>
   <value>mapreduce_shuffle</value>
</property>
<property>
   <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
   <value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
                Save and close the file

(5) /usr/local/hadoop/etc/hadoop/mapred-site.xml

                Copy the template
cp /usr/local/hadoop/etc/hadoop/mapred-site.xml.template /usr/local/hadoop/etc/hadoop/mapred-site.xml
                Modify the file
nano /usr/local/hadoop/etc/hadoop/mapred-site.xml
Add the following inside the <configuration> tag
<property>
   <name>mapreduce.framework.name</name>
   <value>yarn</value>
</property>
                Save and close the file

(6) /usr/local/hadoop/etc/hadoop/hdfs-site.xml

                Create two directories for Hadoop storage
mkdir -p /usr/local/hadoop_store/hdfs/namenode
mkdir -p /usr/local/hadoop_store/hdfs/datanode
                Modify the file
nano /usr/local/hadoop/etc/hadoop/hdfs-site.xml
Add the following inside the <configuration> tag
<property>
   <name>dfs.replication</name>
   <value>1</value>
 </property>
 <property>
   <name>dfs.namenode.name.dir</name>
   <value>file:/usr/local/hadoop_store/hdfs/namenode</value>
 </property>
 <property>
   <name>dfs.datanode.data.dir</name>
   <value>file:/usr/local/hadoop_store/hdfs/datanode</value>
 </property>
                Save and close the file

V. Change folder permission

                Replace qursaan:qursaan with your Hadoop user and group, which will own the folders

sudo chown qursaan:qursaan -R /usr/local/hadoop
sudo chown qursaan:qursaan -R /usr/local/hadoop_store

                Also give the folders full permissions

sudo chmod -R 777 /usr/local/hadoop
sudo chmod -R 777 /usr/local/hadoop_store

VI. Format the New Hadoop Filesystem

Using the Hadoop user (not the superuser), run the command below. Note that the option uses a plain ASCII hyphen (-format); a pasted dash character will produce a usage error.
hdfs namenode -format
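A successful format ends with output like the following (the host name and timestamps will differ):

INFO common.Storage: Storage directory /usr/local/hadoop_store/hdfs/namenode has been successfully formatted.
INFO util.ExitUtil: Exiting with status 0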

VII. Start Hadoop

                Using the Hadoop user (not the superuser)
start-dfs.sh
start-yarn.sh
Or, to start everything at once
start-all.sh
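After everything starts, verify the daemons with the JDK's jps tool; the output should list something like the following (process IDs will differ):

jps
28345 NameNode
28498 DataNode
28684 SecondaryNameNode
28840 ResourceManager
28971 NodeManager
29295 Jps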

VIII. Web View

NameNode @- http://localhost:50070/
ResourceManager @- http://localhost:8088/

76 comments:

  1. Not able to start start-dfs.sh.

    14/12/26 17:56:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

    How can I resolve this?

    Replies
    1. Did you format the namenode without any issues using the Hadoop user?

    2. Yes, I did. I was quite sure that it would run. It's a nice tutorial for beginners.
      It was not resolving the classpath. I am sending you that...
      STARTUP_MSG: Starting NameNode
      STARTUP_MSG: host = ruchi-Latitude-E4300/127.0.1.1
      STARTUP_MSG: args = [-format]
      STARTUP_MSG: version = 2.6.0
      STARTUP_MSG: classpath = /usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:

    3. 14/12/27 20:01:42 INFO common.Storage: Storage directory /usr/local/hadoop_store/hdfs/namenode has been successfully formatted.
      14/12/27 20:01:42 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
      14/12/27 20:01:42 INFO util.ExitUtil: Exiting with status 0
      14/12/27 20:01:42 INFO namenode.NameNode: SHUTDOWN_MSG:
      /************************************************************
      SHUTDOWN_MSG: Shutting down NameNode at ruchi-Latitude-E4300/127.0.1.1
      ************************************************************/
      At the end it was like this... after giving the command hduser@ruchi-Latitude-E4300:/usr/local/hadoop$ bin/hdfs namenode -format

      Sir, please guide me. I have downloaded from the stable apache releases folder, not from common

    4. Wow, it worked!!!!!!!!!! Happy finally... I uninstalled, downloaded from the common folder, and followed all the steps...
      Thanks a lot sir!!!
      Sir, now please suggest how to go about Hive installation, if you can point to any good source.

    5. That's good news. For Hive you can go to
      https://cwiki.apache.org/confluence/display/Hive/GettingStarted
      and soon I will post the final steps for it,
      with my best wishes

    6. Thanks sir!!!!! Please provide us the final steps. Your tutorial really works perfectly.

  2. Works perfectly.. Thank you so much brother.. Can you make a tutorial for integrating Eclipse with Hadoop?

  3. Works perfectly brother, thanks a lot!!
    ~Ajmal

  4. Everything is working fine, but I am getting this; please help me.
    14/12/26 17:56:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

    Replies
    1. Please check phase IV (1-2) and restart the terminal.
      Make sure the following command works fine: hadoop version

  5. Please check my output
    15/01/03 00:41:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Starting namenodes on [localhost]
    localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-teamsm-Inspiron-5521.out
    localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-teamsm-Inspiron-5521.out
    Starting secondary namenodes [0.0.0.0]
    0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-teamsm-Inspiron-5521.out
    15/01/03 00:41:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    starting yarn daemons
    starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-teamsm-Inspiron-5521.out
    localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-teamsm-Inspiron-5521.out

  6. All my nodes started, but this warning comes regularly..

    hduser@teamsm-Inspiron-5521:/usr/local/hadoop/sbin$ jps
    28840 ResourceManager
    28345 NameNode
    29295 Jps
    28684 SecondaryNameNode
    28971 NodeManager
    28498 DataNode

  7. When I am running Tomcat 7 I get this; is there anything extra I need to do?
    Jan 03, 2015 1:12:50 AM com.sun.jersey.spi.container.ContainerResponse mapMappableContainerException
    SEVERE: The exception contained within MappableContainerException could not be mapped to a response, re-throwing to the HTTP container
    java.lang.NoClassDefFoundError: com/google/common/cache/CacheBuilder
    at org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory.(DomainSocketFactory.java:93)
    at org.apache.hadoop.hdfs.ClientContext.(ClientContext.java:111)
    at org.apache.hadoop.hdfs.ClientContext.get(ClientContext.java:151)
    at org.apache.hadoop.hdfs.DFSClient.(DFSClient.java:690)
    at org.apache.hadoop.hdfs.DFSClient.(DFSClient.java:601)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
    at com.atsgrid.products.sandrokottos.services.ProfilePictureServices.updateProfilePic1(ProfilePictureServices.java:80)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
    at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
    at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
    at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)


  8. How to: Install Hadoop (as Multi Node) on Ubuntu Step by Step

    Replies
    1. A very interesting question; next week I will post the steps

  9. hi mahdi,

    can't access the job tracker (50030) or task tracker (50060).

    please suggest,

    best regards

  10. Hi Mahdi,

    I got an error running step VI, and the error is as follows:
    hduser@kunal-Vostro-3546:/home/kunal$ hdfs namenode –format
    No command 'hdfs' found, did you mean:
    Command 'hdfls' from package 'hdf4-tools' (universe)
    Command 'hfs' from package 'hfsutils-tcltk' (universe)
    hdfs: command not found

    Please help me out on this.

    Thanks and regards

    Replies
    1. If you finished step IV without any issues, try restarting your terminal

  11. Hi, all the steps completed successfully, but I can't access http://localhost:50070/. http://localhost:8088 is working properly. Could you please tell me the reason?

  12. Hello Ruchi and Mahdi,


    I installed the hadoop 2.6.0 version with Ubuntu 12.04 LTS 64 bit and java "openjdk-7-jdk". I used the above procedure as mentioned by sir mahdi, but still I am getting the same warning message "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable"

    Ruchi, I saw your comment on it as well, but I didn't understand your statement:

    "I uninstalled and downloaded from common folder and followed all the steps..."


    Kindly help me resolve the issue and find the root cause of the warning message.

    Thanks a tonne :)

    Replies
    1. Please tell me which command printed this message

    2. Hi
      I am also getting the same error.
      What was the solution to remove it?
      On the net it says to build the native library, but here we have not downloaded the source, so how do we build it?

    3. Hi
      I first uninstalled and then installed again from the common folder, not from the stable folder. I followed all the steps... opening 2 windows separately for root and the hadoop user. The video given with this tutorial by Mahdi sir is ultimate.......

    4. In step (1), export HADOOP_OPTS as
      export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"

  13. Hi! Congrats on this straightforward and very helpful video. I still need help with step V, can you please help... thank you so much!
    Below is what I got when I entered the step V command ''hadoop namenode -format'' and ''hadoop version''

    15/01/30 23:09:06 INFO util.ExitUtil: Exiting with status 1
    15/01/30 23:09:06 INFO namenode.NameNode: SHUTDOWN_MSG:
    /************************************************************
    SHUTDOWN_MSG: Shutting down NameNode at user-Latitude-E5410/127.0.1.1
    ************************************************************/
    user@user-Latitude-E5410:~$ hadoop version
    Hadoop 2.6.0
    Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
    Compiled by jenkins on 2014-11-13T21:10Z
    Compiled with protoc 2.5.0
    From source with checksum 18e43357c8f927c0695f1e9522859d6a
    This command was run using /usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar

    Replies
    1. In step V you need to grant permissions to the hadoop user ("not root").
      Also, you should run the "hadoop namenode -format" command from the hadoop user's console, not as the root user.

      * Hadoop user: you can create a new user for the hadoop services, or use your local user as the hadoop user.
      * In my case, I used the local Ubuntu user "qursaan"

    2. Hi Mahdi,
      I have repeated step V and everything is working fine now.
      Thanks man, really, great job on your side

  14. Hello Mahmoud,

    Even after using the permission command outlined in step-V, I was getting the following error

    bash: /usr/local/hadoop/bin/hadoop: Permission denied

    In addition, I used the following commands to resolve it.
    sudo chmod -R 777 /usr/local/hadoop/
    sudo chmod -R 777 /usr/local/hadoop_store

    Please can you add these commands to the original document so that others who face this issue can move ahead.

    Thanks,
    Pawan

  15. Great tutorial .. everything worked fine ... Thanks for Sharing :)

    - Swap

  16. Hi Mahdi,

    I have installed Hadoop but seem to have an issue when I run the start script commands. The error is as follows:

    kunal@ubuntu:~$ su -l hduser
    Password:
    hduser@ubuntu:~$ hadoop version
    Hadoop 2.6.0
    Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
    Compiled by jenkins on 2014-11-13T21:10Z
    Compiled with protoc 2.5.0
    From source with checksum 18e43357c8f927c0695f1e9522859d6a
    This command was run using /usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar
    hduser@ubuntu:~$ sudo pico /usr/local/hadoop/etc/hadoop/hdfs-site.xml
    [sudo] password for hduser:
    hduser@ubuntu:~$ sudo pico /usr/local/hadoop/etc/hadoop/mapred-site.xml
    hduser@ubuntu:~$ sudo pico /usr/local/hadoop/etc/hadoop/yarn-site.xml
    hduser@ubuntu:~$ sudo pico /usr/local/hadoop/etc/hadoop/core-site.xml
    hduser@ubuntu:~$ sudo pico /usr/local/hadoop/etc/hadoop/hadoop-env.sh
    hduser@ubuntu:~$ start-dfs.sh
    15/03/19 03:14:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Starting namenodes on [localhost]
    localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-ubuntu.out
    localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-ubuntu.out
    Starting secondary namenodes [0.0.0.0]
    0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-ubuntu.out
    15/03/19 03:16:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    hduser@ubuntu:~$ start-yarn.sh
    starting yarn daemons
    starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-ubuntu.out
    localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-ubuntu.out
    hduser@ubuntu:~$ sudo jps
    18633 -- process information unavailable
    18906 Jps
    18466 -- process information unavailable
    18315 -- process information unavailable
    18750 -- process information unavailable


    Please help me out of this. Thanks

    Replies
    1. Also, I am unable to view the NameNode UI; when I ran the namenode format I got the following output:

      15/03/19 03:35:12 INFO namenode.NameNode: STARTUP_MSG:
      /************************************************************
      STARTUP_MSG: Starting NameNode
      STARTUP_MSG: host = ubuntu/127.0.1.1
      STARTUP_MSG: args = [–format]
      STARTUP_MSG: version = 2.6.0
      STARTUP_MSG: classpath = /usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:
      STARTUP_MSG: build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
      STARTUP_MSG: java = 1.7.0_75
      ************************************************************/
      15/03/19 03:35:12 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
      15/03/19 03:35:12 INFO namenode.NameNode: createNameNode [–format]
      Usage: java NameNode [-backup] |
      [-checkpoint] |
      [-format [-clusterid cid ] [-force] [-nonInteractive] ] |
      [-upgrade [-clusterid cid] [-renameReserved] ] |
      [-upgradeOnly [-clusterid cid] [-renameReserved] ] |
      [-rollback] |
      [-rollingUpgrade ] |
      [-finalize] |
      [-importCheckpoint] |
      [-initializeSharedEdits] |
      [-bootstrapStandby] |
      [-recover [ -force] ] |
      [-metadataVersion ] ]

      15/03/19 03:35:12 INFO namenode.NameNode: SHUTDOWN_MSG:
      /************************************************************
      SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
      ************************************************************/

    2. Guess it's the same error Ruchi encountered in the first comment. Please help. Thanks

    3. Please change the folder permissions and format the node,
      then restart the terminal and try again

  17. cannot open ResourceManager @- http://localhost:8088/
    NameNode @- http://localhost:50070/

  18. I have successfully installed Hadoop 2.7.0 on Ubuntu 14.04 by watching your video. Please explain the installation process for Mahout.
    contact:ashish.kumar.ece12@itbhu.ac.in

  19. Hi,

    I have installed Hadoop on ubuntu. When I hit jps, the answer is:
    6995 ResourceManager
    7119 NodeManager
    6849 SecondaryNameNode
    6680 DataNode
    8493 Jps
    and the ResourceManager works over HTTP but the NameNode doesn't. Could you help me ASAP? Thanks a lot.

    Replies
    1. Try to do the following in a new terminal:
      1. Stop all services "stop-all.sh"
      2. Format the node "step VI"
      3. Restart the services "start-all.sh"

  20. I did them all, but it didn't work. Anything else ??

    Replies
    1. If you didn't get an error message, then please check the configuration files

    2. I am also having the same issue: which config file do we have to check?

  21. I checked the files; could you please look at https://gist.github.com/yaseminn/6cb97bb1f8cd5359b925

    Replies
    1. OK, you must format the node using the hadoop user, not the root user

  22. How can I format the node using the hadoop user? I am new to this.

    Replies
    1. Did you create a new user, or use a local user account like me?
      If you didn't create a new account, just run the format directly from the terminal, before logging in as root

    2. No, I didn't create a new user.

  23. root@s-desktop:~# start-dfs.sh
    15/09/19 19:40:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Starting namenodes on [localhost]
    root@localhost's password:
    Hi.. I always get this error.. it asks for the password for localhost, which I don't know.

    Replies
    1. In step II, the second part,
      Create and Setup SSH Certificates (Setup passphraseless ssh):
      do this step using the Hadoop user or your local user, not the root user

  24. You should do more Hadoop tutorials; this is the best one I have found

  25. This comment has been removed by a blog administrator.

  26. nobody know how to install hadoop.
    check whole internet and most of all installation process all failed.
    eve people who made hadoop i dont think they even can install it withour error. they even cant publish full installation guide. just disgusting. now i know why windows and its programme become popular.

    Replies
    1. Please check your grammar first.

      I have found quite a few good installation guides on the internet, and by far this one is the easiest.

  27. Hi, I am new to Hadoop... I have installed Hadoop and started all the nodes, and the nodes are running perfectly... but when I copy a file to HDFS I get the error "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable"... Can anybody please help me with this issue?

    Thanks
    Srinu

  28. I want to install Hadoop on Windows. Please help me out.
    Ty

    Replies
    1. I am currently working on Sandbox HDP 1.3,
      but as I am referring to the Hadoop: The Definitive Guide book, it is getting difficult for me to understand the concepts while working in the sandbox. So I wanted to install Hadoop without a virtual machine. Please suggest something.

    2. You can follow this:
      https://wiki.apache.org/hadoop/Hadoop2OnWindows
      If you have a problem, please inform me

      Delete
  29. This comment has been removed by the author.

  30. I have an error like this from hadoop version:
    /usr/local/hadoop/bin/hadoop: line 166: /home/syamsul/JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64/bin/java: No such file or directory. Can you help me, please?

    Replies
    1. Recheck step IV;
      don't include "/bin/java" in the path

  31. Thank you very much, sir. I have successfully installed Hadoop. Your guidance proved right for me.

    1,00,000 times thank you.

  32. Everything works perfectly on the first go (only one change: I set up SSH as hduser, not root, otherwise it asks me for a password every time).

    There is one problem: everything starts except the namenode :(

    Replies
    1. Do the following cycle to start the namenode:
      1. Stop all services
      2. Format the Hadoop filesystem
      3. Manually delete all temporary node files
      4. Restart the services

  33. This comment has been removed by the author.

    Replies
    1. What's wrong? The file path or permissions?

    2. I have a permission problem. I used a different Hadoop file (hadoop-2.6.0-src.tar.gz), which doesn't have the same path to the hadoop-env file.
      My path is as follows ( nano /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/conf/hadoop-env.sh ). I will repeat the work with hadoop-2.6.0.tar.gz


      Thank you for this tutorial. Jazaka Allah khayran, inshallah

    3. Installed fine, my problem is fixed, thanks again

  34. Please, can you send me the Word file
    to my mail: mostafaabat4e@gmail.com

  35. This comment has been removed by a blog administrator.

  36. After a fully successful installation, while using the start-all.sh command I am getting this error:
    localhost: ssh: connect to host localhost port 22: Connection timed out
