1. Copy the cluster's hadoop-current, hive-current, and spark-current directories to the gateway machine (under /usr/lib, matching the PATH entries in step 3).
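For example, a minimal sketch run on the gateway, assuming passwordless SSH to a cluster node; the hostname "master" and the /usr/lib source path are placeholders to adjust for your environment:

    # Pull the runtime directories from a cluster node onto the gateway.
    for d in hadoop-current hive-current spark-current; do
        rsync -a master:/usr/lib/"$d"/ /usr/lib/"$d"/
    done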
2. Copy the cluster's hadoop-conf, hive-conf, and spark-conf directories to the gateway machine (under /etc, matching the *_CONF_DIR settings in step 3).
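The same approach works for the configuration directories, assuming they also sit under /etc on the cluster node (verify the source path on your cluster first):

    # Pull the config directories from a cluster node onto the gateway.
    for d in hadoop-conf hive-conf spark-conf; do
        rsync -a master:/etc/"$d"/ /etc/"$d"/
    done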
3. Copy the cluster's yarn.sh, hdfs.sh, hive.sh, and spark.sh into the gateway machine's /etc/profile.d directory, so that the relevant environment variables are configured whenever a user logs in.
These scripts mainly add the following directories to PATH (a combined sketch of such a script follows these settings):
/usr/lib/hadoop-current/bin
/usr/lib/hadoop-current/sbin
/usr/lib/hive-current/bin
/usr/lib/hive-current/hcatalog/bin
/usr/lib/spark-current/bin
and export the configuration directories:
HADOOP_CONF_DIR=/etc/hadoop-conf
HIVE_CONF_DIR=/etc/hive-conf
SPARK_CONF_DIR=/etc/spark-conf
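Put together, a script implementing these settings could look like the sketch below; the file name hadoop-gateway.sh is arbitrary, and the actual yarn.sh/hdfs.sh/hive.sh/spark.sh copied from the cluster may split the same settings across several files:

    # /etc/profile.d/hadoop-gateway.sh -- sourced at login for every user
    export PATH=$PATH:/usr/lib/hadoop-current/bin:/usr/lib/hadoop-current/sbin
    export PATH=$PATH:/usr/lib/hive-current/bin:/usr/lib/hive-current/hcatalog/bin
    export PATH=$PATH:/usr/lib/spark-current/bin
    export HADOOP_CONF_DIR=/etc/hadoop-conf
    export HIVE_CONF_DIR=/etc/hive-conf
    export SPARK_CONF_DIR=/etc/spark-conf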
Other service components can be configured in the same way.
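To sanity-check the gateway, start a fresh login session and run the standard client entry points shipped with the three components above:

    hadoop version            # hadoop-current on PATH
    hive --version            # hive-current on PATH
    spark-submit --version    # spark-current on PATH
    hdfs dfs -ls /            # should list the cluster's HDFS root, not the local FS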