Usage:
./spark-script.sh your_file.scala first_arg second_arg third_arg
Script:
```bash
scala_file=$1   # first argument is the Scala script to run
shift 1         # drop it, leaving only the script's own arguments
arguments=$@    # join the remaining arguments into one string
# set +o posix  # to enable process substitution when not running on bash
spark-shell --master yarn --deploy-mode client \
    --queue default \
    --driver-memory 2G --executor-memory 4G \
    --num-executors 10 \
    -i <(echo 'val args = "'$arguments'".split("\\s+")' ; cat $scala_file)
```
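The `-i <(...)` part uses process substitution to prepend a generated `val args = ...` line to the script before spark-shell reads it, so the Scala code can use `args` as if it were a normal argument array. A quick way to see exactly what spark-shell receives is to `cat` the same process substitution (a minimal sketch; `my_script.scala` and the argument values are placeholders):

```bash
scala_file=my_script.scala                    # placeholder file name
arguments="first_arg second_arg third_arg"    # placeholder arguments

# Print what the process substitution feeds to spark-shell:
# the generated preamble line, then the script body.
cat <(echo 'val args = "'$arguments'".split("\\s+")' ; cat $scala_file)
# Output:
#   val args = "first_arg second_arg third_arg".split("\\s+")
#   ...contents of my_script.scala...
```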
Linux shell redirection:

| Command | Meaning |
| --- | --- |
| Command < filename > filename2 | Run Command with filename as standard input and filename2 as standard output |
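For example (a minimal sketch; `unsorted.txt` and `sorted.txt` are placeholder file names):

```bash
# sort reads its standard input from unsorted.txt and writes its
# standard output to sorted.txt
sort < unsorted.txt > sorted.txt
```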
Reference:
http://stackoverflow.com/questions/29928999/passing-command-line-arguments-to-spark-shell