Usage:
./spark-script.sh your_file.scala first_arg second_arg third_arg
Script:
scala_file=$1
shift 1
arguments=$@
#set +o posix  # enable process substitution when not running on bash
spark-shell --master yarn --deploy-mode client --queue default \
    --driver-memory 2G --executor-memory 4G --num-executors 10 \
    -i <(echo 'val args = "'$arguments'".split("\s+")' ; cat $scala_file)
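The argument-handling lines above can be sketched in isolation. The file name and argument values below are hypothetical stand-ins for what the user would pass on the command line:

```shell
#!/bin/bash
# Simulate the wrapper's invocation (hypothetical file name and arguments):
set -- my_job.scala first_arg second_arg third_arg

scala_file=$1      # first positional parameter is the Scala file
shift 1            # drop it, leaving only the user arguments
arguments=$@       # remaining parameters, joined by spaces

# This is the first line spark-shell reads via -i <(...):
echo 'val args = "'$arguments'".split("\s+")'
# prints: val args = "first_arg second_arg third_arg".split("\s+")
```

The generated line defines a local `val args` inside the spark-shell session, so the Scala file that follows it can read its arguments the same way a standalone `main(args: Array[String])` would.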
Linux shell redirection:
Command < filename > filename2 : run Command with filename as its standard input and filename2 as its standard output.
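A minimal sketch of this redirection form, using sort as the command; in.txt and out.txt are placeholder file names:

```shell
#!/bin/sh
# Create a small input file for the demonstration
printf 'banana\napple\n' > in.txt

# sort reads in.txt as standard input and writes out.txt as standard output
sort < in.txt > out.txt

cat out.txt
# prints:
# apple
# banana
```

The `-i <(...)` construct in the script above is the process-substitution variant of the same idea: instead of a file on disk, spark-shell's `-i` option is handed a file-like path whose contents are the output of the commands inside `<(...)`.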
References:
http://stackoverflow.com/questions/29928999/passing-command-line-arguments-to-spark-shell