
Bringing Up Hadoop

Awantik Das edited this page Jan 25, 2019 · 1 revision
  1. Install the Java 8 runtime (JRE/JDK 1.8).
  2. Download Hadoop 2.7 and Spark pre-built for Hadoop 2.7.
  3. Configure `~/.bashrc`:

```
export HADOOP_HOME=/home/awantik/packages/hadoop-2.7.3
export HADOOP_CONF_DIR=/home/awantik/packages/hadoop-2.7.3/etc/hadoop
export HADOOP_MAPRED_HOME=/home/awantik/packages/hadoop-2.7.3
export HADOOP_COMMON_HOME=/home/awantik/packages/hadoop-2.7.3
export HADOOP_HDFS_HOME=/home/awantik/packages/hadoop-2.7.3
export YARN_HOME=/home/awantik/packages/hadoop-2.7.3
export PATH=$PATH:/home/awantik/packages/hadoop-2.7.3/bin

export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export PATH=$JAVA_HOME/bin:$PATH
export SPARK_HOME=/home/awantik/packages/spark-2.4.0-bin-hadoop2.7
export LD_LIBRARY_PATH=/home/awantik/packages/hadoop-2.7.3/lib/native:$LD_LIBRARY_PATH
export PATH=$SPARK_HOME/bin:$PATH
export PATH=~/Downloads/eclipse/:$PATH

export HIVE_HOME=/home/awantik/packages/apache-hive-2.1.0-bin
export PATH=$HIVE_HOME/bin:$PATH
```
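After reloading the shell (`source ~/.bashrc`), it is worth confirming that the configured directories exist. A minimal sketch — `check_dir` is a small helper written for this page, and the variables are assumed to be the ones exported above:

```shell
# check_dir is a helper (not part of Hadoop) that reports whether a
# configured directory actually exists on disk.
check_dir() {
  if [ -d "$1" ]; then
    echo "OK:      $1"
  else
    echo "MISSING: $1"
  fi
}

# Verify the directories exported in ~/.bashrc above:
check_dir "$HADOOP_HOME"
check_dir "$SPARK_HOME"
check_dir "$JAVA_HOME"
```

Any `MISSING` line means the export in `~/.bashrc` points at the wrong path for your machine.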

  4. In the file `etc/hadoop/hadoop-env.sh`, set `export JAVA_HOME=/usr/lib/jvm/java-8-oracle`.
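That edit can also be scripted. `set_java_home` below is a hypothetical helper written for this page (Hadoop ships no such command); pass it the file to edit and the JDK directory:

```shell
# set_java_home appends an export line to a hadoop-env.sh file.
# (Helper written for this page, not a Hadoop tool.)
set_java_home() {
  # $1 = path to hadoop-env.sh, $2 = JDK directory
  echo "export JAVA_HOME=$2" >> "$1"
}

# Usage with the paths from this setup:
# set_java_home /home/awantik/packages/hadoop-2.7.3/etc/hadoop/hadoop-env.sh /usr/lib/jvm/java-8-oracle
```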

  5. Edit the Hadoop configuration files in `hadoop-dir/etc/hadoop`:

     5.a `core-site.xml`

     ```xml
     <configuration>
       <property>
         <name>fs.default.name</name>
         <value>hdfs://localhost:9000</value>
       </property>
     </configuration>
     ```

     5.b `hdfs-site.xml` (the permissions property is `dfs.permissions.enabled` in Hadoop 2.x)

     ```xml
     <configuration>
       <property>
         <name>dfs.replication</name>
         <value>1</value>
       </property>
       <property>
         <name>dfs.permissions.enabled</name>
         <value>false</value>
       </property>
     </configuration>
     ```
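Both single-node config files can be generated with here-documents. `write_hadoop_configs` is a function sketched for this page (not a Hadoop tool), assuming the conf directory used throughout; note the permissions property is `dfs.permissions.enabled` in Hadoop 2.x:

```shell
# write_hadoop_configs generates single-node core-site.xml and
# hdfs-site.xml into the given conf directory.
write_hadoop_configs() {
  conf="$1"   # e.g. /home/awantik/packages/hadoop-2.7.3/etc/hadoop
  cat > "$conf/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF
  cat > "$conf/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
  </property>
</configuration>
EOF
}

# Usage:
# write_hadoop_configs /home/awantik/packages/hadoop-2.7.3/etc/hadoop
```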

http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html#Configuration

  6. Format the namenode before bringing up Hadoop:

     `hdfs namenode -format`

  7. Go to sbin (`/home/awantik/packages/hadoop-2.7.3/sbin`) and run `./start-all.sh`.

  8. Run `jps` to show the Hadoop processes:

     ```
     29025 Jps
     7121 NodeManager
     6146 NameNode
     6771 ResourceManager
     21400 SparkSubmit
     16713 HistoryServer
     6345 DataNode
     6587 SecondaryNameNode
     ```
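The `jps` check can be automated. `check_daemons` is a helper sketched for this page, assuming the daemon names `jps` prints for a single-node cluster:

```shell
# check_daemons reports which expected Hadoop daemons appear in the
# output of `jps` (lines like "6146 NameNode"). Helper written for
# this page, not a Hadoop tool.
check_daemons() {
  # $1 = output of `jps`
  for proc in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
    # grep -w avoids matching NameNode inside SecondaryNameNode.
    if printf '%s\n' "$1" | grep -qw "$proc"; then
      echo "running: $proc"
    else
      echo "missing: $proc"
    fi
  done
}

# Usage:
# check_daemons "$(jps)"
```

If any daemon is reported missing, check the logs under `$HADOOP_HOME/logs`.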

