Submitting a Spark job to YARN remotely from IDEA:
Caused by: java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_ of type scala.collection.Seq in instance of org.apache.spark.rdd.MapPartitionsRDD
1. This is most likely caused by not setting the path of the jar to submit. Remote submission requires the SPARK_HOME environment variable, so submitting from IDEA behaves much like `spark-shell --master yarn`: the code to be executed has to be shipped to the executors on the YARN cluster. If the application jar is not distributed, the executors cannot load your classes and deserialization fails with the ClassCastException above.
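A minimal sketch of one common fix, assuming the project has been packaged into a jar first (the jar path below is a placeholder, not taken from the original note): tell Spark which jars to ship to the cluster via `SparkConf.setJars`, so the executors can load the classes referenced by the RDD lineage.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RemoteSubmitDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("RemoteSubmitDemo")
      .setMaster("yarn")
      // Ship the application jar to the YARN executors. Without this,
      // closures deserialized on the executor reference classes it
      // cannot load, which surfaces as the ClassCastException above.
      // Placeholder path: point it at the jar built from your project.
      .setJars(Seq("/path/to/your-app.jar"))

    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}
```

Equivalently, the jar can be passed on the command line with `spark-submit --master yarn your-app.jar`, which is why the same code works when submitted through `spark-submit` but fails when launched directly from the IDE.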