How to Install Hadoop Step by Step
Environment
Ubuntu 14.04, Hadoop 2.6.0, OpenJDK 1.7
I. Install the Java JDK
Log in as root:
sudo -s
Install the Java runtime:
sudo apt-get install default-jre
Install the Java JDK (OpenJDK 1.7 or newer):
sudo apt-get update
sudo apt-get install default-jdk
Check the Java version:
java -version
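As an extra sanity check (not part of the original steps), confirm the compiler came with the JDK as well:
javac -version
Both java and javac should report a 1.7 (or newer) OpenJDK build.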
II. Install Secure Shell
Install SSH and rsync:
sudo apt-get install ssh
sudo apt-get install rsync
Create and set up SSH certificates (passphraseless SSH)
To enable password-less login, generate a new SSH key with an empty passphrase. Do this as the Hadoop user:
ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
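It is worth confirming that passphraseless login actually works before moving on; this check is not part of the original steps, but if it still prompts for a password, Hadoop's start scripts will also prompt later:
ssh localhost
exit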
III. Fetch and Install Hadoop
Fetch Hadoop (stable version):
wget http://apache.tradebit.com/pub/hadoop/common/current/hadoop-2.6.0.tar.gz
Extract the file:
tar xfz hadoop-2.6.0.tar.gz
Move it to its running directory under /usr/local:
mv hadoop-2.6.0 /usr/local/hadoop
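As an optional check that the move succeeded, list the bin directory; it should contain the hadoop and hdfs commands used in the steps below:
ls /usr/local/hadoop/bin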
IV. Edit Hadoop Configuration Files
Get the correct Java path:
update-alternatives --config java
On this setup it reports /usr/lib/jvm/java-7-openjdk-i386/jre/bin/java.
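JAVA_HOME must point at the JDK directory itself, not at the java binary, so drop the trailing /jre/bin/java from the path above. A shortcut for this (assuming /usr/bin/java is the usual update-alternatives symlink):
readlink -f /usr/bin/java | sed 's:/jre/bin/java::'
On this setup it prints /usr/lib/jvm/java-7-openjdk-i386, the value used in the files below.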
Modify the following files:
~/.bashrc
/usr/local/hadoop/etc/hadoop/hadoop-env.sh
/usr/local/hadoop/etc/hadoop/core-site.xml
/usr/local/hadoop/etc/hadoop/yarn-site.xml
/usr/local/hadoop/etc/hadoop/mapred-site.xml
/usr/local/hadoop/etc/hadoop/hdfs-site.xml
(1) ~/.bashrc
nano ~/.bashrc
Add the following to the end of the file. Change JAVA_HOME to the correct path you found above, and HADOOP_INSTALL to your Hadoop folder:
#HADOOP VARIABLES START
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
#HADOOP VARIABLES END
Save and close the file, then reload it with:
source ~/.bashrc
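Before going further, confirm the new variables took effect; the hadoop command should now be on your PATH:
hadoop version
It should print the version banner (Hadoop 2.6.0 here). The comments below suggest this same check whenever something goes wrong later.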
(2) /usr/local/hadoop/etc/hadoop/hadoop-env.sh
nano /usr/local/hadoop/etc/hadoop/hadoop-env.sh
Update JAVA_HOME:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386
Save and close the file.
(3) /usr/local/hadoop/etc/hadoop/core-site.xml
nano /usr/local/hadoop/etc/hadoop/core-site.xml
Add the following inside the <configuration> tag:
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
Save and close the file.
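A side note, not in the original steps: in Hadoop 2.x the fs.default.name key is deprecated in favor of fs.defaultFS. Both still work in 2.6.0, but the newer key avoids a deprecation warning in the logs:
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>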
(4) /usr/local/hadoop/etc/hadoop/yarn-site.xml
nano /usr/local/hadoop/etc/hadoop/yarn-site.xml
Add the following inside the <configuration> tag:
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
Save and close the file.
(5) /usr/local/hadoop/etc/hadoop/mapred-site.xml
Copy the template:
cp /usr/local/hadoop/etc/hadoop/mapred-site.xml.template /usr/local/hadoop/etc/hadoop/mapred-site.xml
Modify the file:
nano /usr/local/hadoop/etc/hadoop/mapred-site.xml
Add the following inside the <configuration> tag (it tells MapReduce jobs to run on YARN instead of the default local mode):
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
Save and close the file.
(6) /usr/local/hadoop/etc/hadoop/hdfs-site.xml
Create two directories for Hadoop storage:
mkdir -p /usr/local/hadoop_store/hdfs/namenode
mkdir -p /usr/local/hadoop_store/hdfs/datanode
Modify the file:
nano /usr/local/hadoop/etc/hadoop/hdfs-site.xml
Add the following inside the <configuration> tag (dfs.replication is 1 because this is a single-node setup):
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:/usr/local/hadoop_store/hdfs/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:/usr/local/hadoop_store/hdfs/datanode</value>
</property>
Save and close the file.
V. Change Folder Permissions
Make the Hadoop user the owner of the folders (replace qursaan:qursaan with your own Hadoop user and group):
sudo chown qursaan:qursaan -R /usr/local/hadoop
sudo chown qursaan:qursaan -R /usr/local/hadoop_store
Also give the folders full permissions:
sudo chmod -R 777 /usr/local/hadoop
sudo chmod -R 777 /usr/local/hadoop_store
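A note on the chmod: 777 is the bluntest option (one commenter below did need it); since the Hadoop user already owns the folders after the chown, tighter permissions are normally enough. An alternative, not part of the original steps:
sudo chmod -R 755 /usr/local/hadoop
sudo chmod -R 755 /usr/local/hadoop_store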
VI. Format the New Hadoop Filesystem
Run this as the Hadoop user (not as root), and note that the flag is a plain hyphen, -format:
hdfs namenode -format
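On success, the format output exits with status 0 and includes a line like this one, quoted from a reader's run in the comments below:
INFO common.Storage: Storage directory /usr/local/hadoop_store/hdfs/namenode has been successfully formatted.
If you instead get a usage listing, re-type the -format flag by hand; a copied en dash in place of the hyphen is a common cause.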
VII. Start Hadoop
Run these as the Hadoop user (not as root):
start-dfs.sh
start-yarn.sh
Or, to run everything at once:
start-all.sh
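To verify that everything is up, list the Java processes with jps; on a healthy single-node setup you should see five daemons besides Jps itself (the jps outputs in the comments below show the same set):
jps
Expected: NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager. The NameNode web UI is then at http://localhost:50070/ and the ResourceManager UI at http://localhost:8088/.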
References
http://www.csrdu.org/nauman/2014/01/23/geting-started-with-hadoop-2-2-0-building/
http://www.yongbok.net/blog/how-to-install-hadoop-2-2-0-pseudo-distributed-mode/
http://tecadmin.net/setup-hadoop-2-4-single-node-cluster-on-linux/
Not able to start start-dfs.sh.
14/12/26 17:56:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
How can I resolve this?
Did you format the namenode without any issues, using the Hadoop user?
Yes I did. I was quite sure that it would run. It's a nice tutorial for beginners.
It was not resolving the classpath. I am sending you that...
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = ruchi-Latitude-E4300/127.0.1.1
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 2.6.0
STARTUP_MSG: classpath = /usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:
14/12/27 20:01:42 INFO common.Storage: Storage directory /usr/local/hadoop_store/hdfs/namenode has been successfully formatted.
14/12/27 20:01:42 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
14/12/27 20:01:42 INFO util.ExitUtil: Exiting with status 0
14/12/27 20:01:42 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ruchi-Latitude-E4300/127.0.1.1
************************************************************/
In the end it was like this... after giving the command: hduser@ruchi-Latitude-E4300:/usr/local/hadoop$ bin/hdfs namenode -format
Sir, please guide me. I downloaded from the stable Apache releases folder, not from common.
Wow, it worked! Finally happy... I uninstalled, downloaded from the common folder, and followed all the steps...
Thanks a lot sir!
Sir, now please suggest how to go about the Hive installation, if you can suggest any good source.
That's good news. For Hive you can go to
https://cwiki.apache.org/confluence/display/Hive/GettingStarted
and soon I will post the final steps for it,
with my best wishes.
Thanks sir! Please provide us the final steps. Your tutorial really works perfectly.
Works perfectly... Thank you so much brother. Can you make a tutorial for integrating Eclipse with Hadoop?
Ok, soon, "Inshaa Allah".
Works perfectly brother, thanks a lot!
~Ajmal
Everything is working fine, but I am getting this; please help me.
14/12/26 17:56:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Please check phase IV (1-2) and restart the terminal.
Make sure the following command works fine: hadoop version
Please check my output:
15/01/03 00:41:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-teamsm-Inspiron-5521.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-teamsm-Inspiron-5521.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-teamsm-Inspiron-5521.out
15/01/03 00:41:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-teamsm-Inspiron-5521.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-teamsm-Inspiron-5521.out
All my nodes started, but this warning comes up regularly..
hduser@teamsm-Inspiron-5521:/usr/local/hadoop/sbin$ jps
28840 ResourceManager
28345 NameNode
29295 Jps
28684 SecondaryNameNode
28971 NodeManager
28498 DataNode
What is your platform?
When I am running Tomcat 7 I am getting this; does anything extra need to be done?
Jan 03, 2015 1:12:50 AM com.sun.jersey.spi.container.ContainerResponse mapMappableContainerException
SEVERE: The exception contained within MappableContainerException could not be mapped to a response, re-throwing to the HTTP container
java.lang.NoClassDefFoundError: com/google/common/cache/CacheBuilder
at org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory.&lt;init&gt;(DomainSocketFactory.java:93)
at org.apache.hadoop.hdfs.ClientContext.&lt;init&gt;(ClientContext.java:111)
at org.apache.hadoop.hdfs.ClientContext.get(ClientContext.java:151)
at org.apache.hadoop.hdfs.DFSClient.&lt;init&gt;(DFSClient.java:690)
at org.apache.hadoop.hdfs.DFSClient.&lt;init&gt;(DFSClient.java:601)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
at com.atsgrid.products.sandrokottos.services.ProfilePictureServices.updateProfilePic1(ProfilePictureServices.java:80)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
How to: Install Hadoop (as Multi-Node) on Ubuntu Step by Step?
A very interesting question; next week I will post the steps.
Deletehi mahdi,
ReplyDeletecant access the job tracker : 50030 and task tracker : 50060.
please suggest,
best regards
Hi Mahdi,
I got an error running step VI, and the error is as follows:
hduser@kunal-Vostro-3546:/home/kunal$ hdfs namenode –format
No command 'hdfs' found, did you mean:
Command 'hdfls' from package 'hdf4-tools' (universe)
Command 'hfs' from package 'hfsutils-tcltk' (universe)
hdfs: command not found
Please help me out on this.
Thanks and regards
If you finished step IV without any issues, try restarting your terminal.
Hi, all the steps completed successfully, but I can't access http://localhost:50070/. http://localhost:8088 works properly. Could you please tell me the reason?
ReplyDeleteHello Ruchi and Mahdi,
ReplyDeleteI installed the hadoop 2.6.0 version with Ubuntu 12.04 LTS 64 bit and java "openjdk-7-jdk". I used the above procedure as mentioned by sir mahdi, but still I am getting the same warning message "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable"
Ruchi, I saw your comment on it as well, but I didn't understand your statement:
"I uninstalled and downloaded from common folder and followed all the steps..."
kindly help me to resolve the issue and the root cause of the warning message.
Thanks a tonne :)
Please identify for me the command that printed out this message.
DeleteHi
DeleteI am also getting same error.
What was he solution to remove it.
O net it shows to build the native library, but here we have not downloaded src, so how to build?
Hi,
I first uninstalled and then installed again from the common folder, not from the stable folder. I followed all the steps... opened 2 windows separately for the root and Hadoop users. The video given with this tutorial by Mahdi sir is ultimate...
In step (1), export HADOOP_OPTS as:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"
Salut! Congrats on this straightforward and very helpful video. I still need help with step V; can you please help... thank you so much!
Below is what I got when I entered the step V command ''hadoop namenode -format'' and then ''hadoop version'':
15/01/30 23:09:06 INFO util.ExitUtil: Exiting with status 1
15/01/30 23:09:06 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at user-Latitude-E5410/127.0.1.1
************************************************************/
user@user-Latitude-E5410:~$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar
In step V you need to give the permissions to the Hadoop user ("not root").
Also, you should run the command "hadoop namenode -format" from the Hadoop user's console, not as the root user.
* Hadoop user: you can create a new user for the Hadoop services, or use your local user as the Hadoop user.
* In my case, I used the local Ubuntu user "qursaan".
Hi Mahdi,
I have repeated step V and everything is working fine now.
Thanks man, really; great job on your side.
Thanks
DeleteHello Mahmmoud,
ReplyDeleteEven after using the permission command outlined in step-V, I was getting the following error
bash: /usr/local/hadoop/bin/hadoop: Permission denied
In addition, I used the following commands to resolve it.
sudo chmod -R 777 /usr/local/hadoop/
sudo chmod -R 777 /usr/local/hadoop_store
Please can you add these commands to the original document so that others who face this issue can move ahead.
Thanks,
Pawan
Ok, thanks.
Great tutorial... everything worked fine... Thanks for sharing :)
- Swap
Hi Mahdi,
I have installed Hadoop but seem to have some issues when I run the start script commands. The output is as follows:
kunal@ubuntu:~$ su -l hduser
Password:
hduser@ubuntu:~$ hadoop version
Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /usr/local/hadoop/share/hadoop/common/hadoop-common-2.6.0.jar
hduser@ubuntu:~$ sudo pico /usr/local/hadoop/etc/hadoop/hdfs-site.xml
[sudo] password for hduser:
hduser@ubuntu:~$ sudo pico /usr/local/hadoop/etc/hadoop/mapred-site.xml
hduser@ubuntu:~$ sudo pico /usr/local/hadoop/etc/hadoop/yarn-site.xml
hduser@ubuntu:~$ sudo pico /usr/local/hadoop/etc/hadoop/core-site.xml
hduser@ubuntu:~$ sudo pico /usr/local/hadoop/etc/hadoop/hadoop-env.sh
hduser@ubuntu:~$ start-dfs.sh
15/03/19 03:14:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-ubuntu.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-ubuntu.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-ubuntu.out
15/03/19 03:16:38 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
hduser@ubuntu:~$ start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-ubuntu.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-ubuntu.out
hduser@ubuntu:~$ sudo jps
18633 -- process information unavailable
18906 Jps
18466 -- process information unavailable
18315 -- process information unavailable
18750 -- process information unavailable
Please help me out with this. Thanks.
Also, I am unable to view the NameNode UI; when I ran the namenode format I got the following output:
15/03/19 03:35:12 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = ubuntu/127.0.1.1
STARTUP_MSG: args = [–format]
STARTUP_MSG: version = 2.6.0
STARTUP_MSG: classpath = /usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:
STARTUP_MSG: build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
STARTUP_MSG: java = 1.7.0_75
************************************************************/
15/03/19 03:35:12 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
15/03/19 03:35:12 INFO namenode.NameNode: createNameNode [–format]
Usage: java NameNode [-backup] |
[-checkpoint] |
[-format [-clusterid cid ] [-force] [-nonInteractive] ] |
[-upgrade [-clusterid cid] [-renameReserved] ] |
[-upgradeOnly [-clusterid cid] [-renameReserved] ] |
[-rollback] |
[-rollingUpgrade ] |
[-finalize] |
[-importCheckpoint] |
[-initializeSharedEdits] |
[-bootstrapStandby] |
[-recover [ -force] ] |
[-metadataVersion ] ]
15/03/19 03:35:12 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
************************************************************/
Guess it's the same error Ruchi encountered in the first comment. Please help. Thanks.
Please change the folder permissions and format the node again.
Then restart the terminal and try again.
Cannot open the ResourceManager at http://localhost:8088/
or the NameNode at http://localhost:50070/
I have successfully installed Hadoop 2.7.0 on Ubuntu 14.04 by watching your video. Please describe the installation process for Mahout.
Contact: ashish.kumar.ece12@itbhu.ac.in
Hi,
I have installed Hadoop on Ubuntu. When I run jps, the output is:
6995 ResourceManager
7119 NodeManager
6849 SecondaryNameNode
6680 DataNode
8493 Jps
The ResourceManager works over HTTP but the NameNode doesn't. Could you help me ASAP? Thanks a lot.
Try the following in a new terminal:
1. Stop all services: stop-all.sh
2. Format the node (step VI)
3. Restart the services: start-all.sh
I did them all, but it didn't work. Anything else?
If you didn't get an error message, then please check the configuration files.
I am also having the same issue: which config files do we have to check?
I checked the files; could you please look at https://gist.github.com/yaseminn/6cb97bb1f8cd5359b925
OK, you must format the node using the Hadoop user, not the root user.
How can I format the node using the Hadoop user? I am new to this.
Did you create a new user, or did you use a local user account as I did?
If you didn't create a new account, just run the format directly from the terminal, before logging in as root.
No, I didn't create a new user.
root@s-desktop:~# start-dfs.sh
15/09/19 19:40:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
root@localhost's password:
Hi.. I am always getting this error: it is asking for the password to localhost, which I don't know.
In step II, the second part
("Create and set up SSH certificates (passphraseless SSH)"),
do that step using the Hadoop user or your local user, not the root user.
You should do more Hadoop tutorials; this is the best one I have found.
ReplyDeleteThis comment has been removed by a blog administrator.
Nobody knows how to install Hadoop.
Check the whole internet; most of the installation processes fail.
Even the people who made Hadoop, I don't think they could install it without errors; they can't even publish a full installation guide. Just disgusting. Now I know why Windows and its programs became popular.
Please check your grammar first.
I have found quite a few good installation guides on the internet, and by far this one is the easiest.
Hi, I am new to Hadoop... I have installed Hadoop and started all the nodes, and the nodes run perfectly... but when I copy a file to HDFS I get the error "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable"... Can anybody please help me with this issue?
Thanks
Srinu
I want to install Hadoop on my Windows machine. Please help me out.
Ty
Use a virtual machine.
I am currently working on the Sandbox HDP 1.3,
but as I am reading the Hadoop Definitive Guide book, it is getting difficult for me to understand the concepts while working in the sandbox, so I wanted to install Hadoop without a virtual machine. Please suggest something.
You can follow this:
https://wiki.apache.org/hadoop/Hadoop2OnWindows
If you have a problem, please inform me.
I have an error like this when I run hadoop version:
/usr/local/hadoop/bin/hadoop: line 166: /home/syamsul/JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64/bin/java: No such file or directory
Can you help me, please?
Recheck step IV;
don't include "/bin/java" in the path.
Thank you very much sir. I have successfully installed Hadoop. Your guidance proved right for me.
1,00,000 times thank you.
You are welcome.
Everything works perfectly on the first go. (Only one change: I installed SSH as hduser, not root, otherwise it asks me for a password every time.)
There is one problem: everything starts except the namenode :(
Do the following cycle to start the namenode:
1. Stop all services
2. Format the Hadoop filesystem
3. Manually delete all temporary node files
4. Restart the services
What's wrong? The file path or the permissions?
I have a permissions problem. I used a different Hadoop archive (hadoop-2.6.0-src.tar.gz), which doesn't have the same path to the hadoop-env file.
My path is as follows (nano /usr/local/hadoop/hadoop-common-project/hadoop-common/src/main/conf/hadoop-env.sh). I will repeat the work with hadoop-2.6.0.tar.gz.
Thank you for this tutorial. Jazaka Allah khayran, inshallah.
Well installed; my problem is fixed. Thanks again.
Deleteplease can you send to me the word file
ReplyDeleteto my mail :mostafaabat4e@gmail.com
After a fully successful installation, when using the start-all.sh command I am getting this error:
localhost: ssh: connect to host localhost port 22: Connection timed out