source link: https://www.wencst.com/archives/801

Hadoop YARN Cluster Installation

Author: wencst | Category: Architecture Design | Published: 2018-12-17 11:17 | Views: 1,777

Steps on all nodes:

Configure IP addresses, /etc/hosts, passwordless SSH login, scp, sudo, firewall shutdown, yum, and NTP time synchronization (omitted here).
Install Java (omitted here).
Basic setup on all nodes:
yum install -y expect
yum install -y telnet
Edit /etc/hosts on every node:
10.8.5.180 hadoop1
10.8.5.181 hadoop2
10.8.5.182 hadoop3
Create the hadoop user:
groupadd hadoop
useradd hadoop -g hadoop
passwd hadoop
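If user creation needs to be scripted on every node, the password can also be set non-interactively. A minimal sketch, assuming the same hadoop123 password that auto-key.sh below uses:
echo 'hadoop:hadoop123' | chpasswd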

Steps on the master node

Create the key for passwordless SSH login
auto-key.sh
#!/bin/bash
# Copy the local SSH public key to every host listed in the nodes file,
# answering the yes/no and password prompts automatically with expect.
PASSWORD=hadoop123
auto_ssh_copy_id() {
    expect -c "set timeout -1;
        spawn ssh-copy-id $1;
        expect {
            *(yes/no)* {send -- yes\r;exp_continue;}
            *assword:* {send -- $2\r;exp_continue;}
            eof {exit 0;}
        }"
}
while read host
do
    auto_ssh_copy_id $host $PASSWORD &
done < nodes
wait
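ssh-copy-id assumes the master already has a key pair; if it does not, one can be generated first (a prerequisite the original does not show):
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa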

exec.sh

#!/bin/bash
# Run the given command on every host in the nodes file, in parallel.
while read host
do
    ssh $host "$1" &
done < nodes
wait
scp.sh
#!/bin/bash
# Copy a local file or directory to the given remote path on every host.
while read host
do
    scp -r "$1" $host:"$2" &
done < nodes
wait
nodes
hadoop1
hadoop2
hadoop3
./auto-key.sh
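Once the keys are distributed, the helper scripts can be sanity-checked; for example, the following should print each node's hostname (a suggested check, not in the original):
./exec.sh "hostname"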
Java installation, driven from the master node:
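The tar command below looks for the JDK archive in the remote user's working directory, so the tarball presumably has to be copied to every node first; a sketch using the scp.sh helper (an assumed step, not shown in the original, with the destination matching wherever the tar command expects to find the archive):
./scp.sh jdk-9.0.4_linux-x64_bin.tar.gz /home/hadoop/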
./exec.sh "tar -zxvf jdk-9.0.4_linux-x64_bin.tar.gz -C /home/hadoop/"
./exec.sh "echo 'export JAVA_HOME=/home/hadoop/jdk-9.0.4' >> /etc/profile"
./exec.sh "echo 'export PATH=\$JAVA_HOME/bin:\$PATH' >> /etc/profile"
./exec.sh "source /etc/profile"
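A quick way to verify that the JDK is in place on every node (a suggested check; the absolute path is used because a non-interactive ssh shell does not read /etc/profile):
./exec.sh "/home/hadoop/jdk-9.0.4/bin/java -version"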
To replace 'abc' with 'xxx' in a file on every node in one go:
./exec.sh "sed -i 's/abc/xxx/g' file"
Hadoop installation on the master node:
# Run the following commands as the hadoop user
su hadoop
tar -zxvf hadoop-3.1.1/hadoop-3.1.1.tar.gz -C /home/hadoop/
# Add the JAVA_HOME setting
vi /home/hadoop/hadoop-3.1.1/etc/hadoop/hadoop-env.sh
export JAVA_HOME=/home/hadoop/jdk-9.0.4
# Edit core-site.xml
vi /home/hadoop/hadoop-3.1.1/etc/hadoop/core-site.xml
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://hadoop1:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/home/hadoop/hadoop-3.1.1/tmp</value>
</property>
</configuration>
# Edit hdfs-site.xml
vi /home/hadoop/hadoop-3.1.1/etc/hadoop/hdfs-site.xml
<configuration>
<property>
<name>dfs.namenode.name.dir</name>
<value>/home/hadoop/hadoop-3.1.1/data/name</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>/home/hadoop/hadoop-3.1.1/data/data</value>
</property>

<property>
<name>dfs.replication</name>
<value>3</value>
</property>

<property>
<name>dfs.secondary.http.address</name>
<value>hadoop1:50090</value>
</property>
</configuration>

# Edit mapred-site.xml
vi /home/hadoop/hadoop-3.1.1/etc/hadoop/mapred-site.xml
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
</configuration>
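On Hadoop 3.x the example job at the end often fails because the containers cannot find the MapReduce classes unless HADOOP_MAPRED_HOME is passed to them. A commonly added set of properties inside the <configuration> element of mapred-site.xml (an assumption, not part of the original configuration):
<property>
<name>yarn.app.mapreduce.am.env</name>
<value>HADOOP_MAPRED_HOME=/home/hadoop/hadoop-3.1.1</value>
</property>
<property>
<name>mapreduce.map.env</name>
<value>HADOOP_MAPRED_HOME=/home/hadoop/hadoop-3.1.1</value>
</property>
<property>
<name>mapreduce.reduce.env</name>
<value>HADOOP_MAPRED_HOME=/home/hadoop/hadoop-3.1.1</value>
</property>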
# Edit yarn-site.xml
vi /home/hadoop/hadoop-3.1.1/etc/hadoop/yarn-site.xml
<configuration>
<property>
<name>yarn.resourcemanager.hostname</name>
<value>hadoop1</value>
</property>

<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
</configuration>

# Edit the workers file (named slaves before Hadoop 3.x)
vi /home/hadoop/hadoop-3.1.1/etc/hadoop/workers
hadoop2
hadoop3
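The original does not show it, but the configured /home/hadoop/hadoop-3.1.1 directory also has to exist on hadoop2 and hadoop3 before the cluster is started. One way, assuming the hadoop user likewise has passwordless SSH to the workers (plain scp here, since the nodes file used by scp.sh also lists hadoop1 itself):
scp -r /home/hadoop/hadoop-3.1.1 hadoop2:/home/hadoop/
scp -r /home/hadoop/hadoop-3.1.1 hadoop3:/home/hadoop/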
# Format HDFS
/home/hadoop/hadoop-3.1.1/bin/hadoop namenode -format
# Start the cluster
/home/hadoop/hadoop-3.1.1/sbin/start-all.sh
# Run a test job
/home/hadoop/hadoop-3.1.1/bin/hadoop jar /home/hadoop/hadoop-3.1.1/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.1.1.jar pi 5 10
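To confirm that all daemons are actually up, a few checks on the master (suggested, not from the original):
jps
/home/hadoop/hadoop-3.1.1/bin/hdfs dfsadmin -report
/home/hadoop/hadoop-3.1.1/bin/yarn node -list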

