1. Meituan-Dianping Spark tuning notes: https://tech.meituan.com/spark-tuning-basic.html
2. Resolving jar package conflicts: https://www.jianshu.com/p/0fe48bc43a8c
3. A failed Spark job on YARN is retried by default; the parameters spark.yarn.maxAppAttempts and yarn.resourcemanager.am.max-attempts control the number of attempts (see the sketch after the link below).
https://stackoverflow.com/questions/38709280/how-to-limit-the-number-of-retries-on-spark-job-failure
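A minimal sketch of capping the attempt count, assuming a Spark-on-YARN job submitted in client mode; the application name is a placeholder. Note that spark.yarn.maxAppAttempts cannot exceed the cluster-wide yarn.resourcemanager.am.max-attempts configured in yarn-site.xml, and for cluster deploy mode the setting is better passed on the spark-submit command line (e.g. --conf spark.yarn.maxAppAttempts=1) so it takes effect before the application master is launched.

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    // Sketch: limit YARN application attempts to 1 so a failed job is not re-run.
    // The effective limit is the smaller of spark.yarn.maxAppAttempts and
    // yarn.resourcemanager.am.max-attempts (set in yarn-site.xml).
    val conf = new SparkConf()
      .setAppName("retry-limit-example") // placeholder app name
      .set("spark.yarn.maxAppAttempts", "1")

    val spark = SparkSession.builder().config(conf).getOrCreate()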