1. Exception in thread "main" org.apache.spark.SparkException: Cluster deploy mode is not applicable to Spark shells.
Solution:
Spark shells are interactive, so the driver must run on the local machine; YARN cluster deploy mode is not supported for them. Launch the shell in client mode instead:
spark-shell --master yarn --deploy-mode client
2. Caused by: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
Cause: the SparkContext was shut down too early, so later code tried to call methods on an already-stopped sc.
Fix: move the shutdown calls to the very end of the program, after every job has finished:
sqlContext.sparkSession.close()
sc.stop()
Carefully check where the shutdown code sits; placing it at the very end of the program resolves the issue. An easy mistake to overlook.
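The shutdown-ordering point above can be sketched as follows. This is a minimal illustration, not code from the original notes: the object name `Demo`, the app name, and the sample job are all hypothetical, and running it requires Spark on the classpath.

```scala
import org.apache.spark.sql.SparkSession

object Demo {  // hypothetical name, for illustration only
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("stop-ordering-demo")  // hypothetical app name
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // All work that touches sc must happen BEFORE stop().
    val total = sc.parallelize(1 to 10).sum()
    println(s"sum = $total")

    // Stopping here, then calling sc methods afterwards, would raise
    // "Cannot call methods on a stopped SparkContext" - so stop() goes last.
    spark.stop()
  }
}
```

Placing `spark.stop()` (or `sc.stop()`) as the final statement of `main` guarantees no job runs against a stopped context.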