Spark throws: Exception in thread "main" org.apache.spark.SparkException: Master must either be yarn or start with spark, mesos, k8s, or local
Date: 2024-04-25 14:24:33
This error occurs when the master URL of a Spark application is misconfigured. In the referenced code, the SparkSession is created with `master("local[*]")` to run in local mode. The error message `org.apache.spark.SparkException: Master must either be yarn or start with spark, mesos, k8s, or local` means that Spark only accepts a master URL that is exactly `yarn` or that starts with one of the prefixes `spark://`, `mesos://`, `k8s://`, or `local`. The check is case-sensitive and does not accept deprecated forms such as `yarn-client` or `yarn-cluster` in recent Spark versions. To fix the problem, set the `master` parameter to a legal value, for example `yarn`, `local[*]`, or `spark://<host>:7077`, and make sure it is not overridden by a conflicting `--master` flag passed to `spark-submit`.
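The accepted master URLs can be summarized with a small validation sketch. This is a simplified illustration of the rule stated in the error message, not Spark's actual implementation (which lives in Scala inside `org.apache.spark.deploy` and uses stricter per-scheme parsing):

```python
# Simplified sketch of Spark's master-URL rule (NOT Spark's real source):
# the master must be exactly "yarn", or start with "spark://", "mesos://",
# "k8s://", or "local".
VALID_PREFIXES = ("spark://", "mesos://", "k8s://", "local")

def is_valid_master(master: str) -> bool:
    """Return True if `master` would satisfy the rule in the error message."""
    return master == "yarn" or master.startswith(VALID_PREFIXES)

# Accepted:
print(is_valid_master("local[*]"))            # True
print(is_valid_master("yarn"))                # True
print(is_valid_master("spark://host:7077"))   # True

# Rejected (common mistakes):
print(is_valid_master("Local[*]"))            # False: the check is case-sensitive
print(is_valid_master("yarn-client"))         # False: deprecated form, not plain "yarn"
```

If your master string looks valid in code, also check `spark-submit` arguments and `spark-defaults.conf`, since a `--master` value set there takes precedence over the one in the application.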
#### References
- *1* *2* [Exception in thread "main" org.apache.spark.sql.AnalysisException: Cannot write incompatible data to...](https://blog.csdn.net/weixin_46389691/article/details/127869342)
- *3* [Exception in thread "main" org.apache.spark.SparkException: Yarn application has already ended!...](https://blog.csdn.net/weixin_38842096/article/details/109024966)