Integrating Hadoop with Spring Boot
1. First, add the following dependencies to your pom.xml:
```
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.5</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.7</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.7</version>
</dependency>
```
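With these dependencies in place, the application starts from an ordinary Spring Boot entry point. Below is a minimal sketch; the package and class names are placeholders, not part of the original answer:
```
package com.example.hadoopdemo; // hypothetical package name

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class HadoopDemoApplication {

    public static void main(String[] args) {
        // Boots the Spring context; the Hadoop/Spark beans defined later are created here
        SpringApplication.run(HadoopDemoApplication.class, args);
    }
}
```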
2. Add the following configuration to application.properties:
```
# Spark configuration
spark.master=local[*]
spark.app.name=MyApp
# Hadoop configuration
fs.defaultFS=hdfs://localhost:9000
```
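One way to wire these properties into the application is to expose the Hadoop FileSystem and the Spark context as Spring beans. The sketch below is only an illustration, not the original answer's code: the class name HadoopSparkConfig and the bean wiring are assumptions, while fs.defaultFS, spark.master, and spark.app.name refer to the keys defined above.
```
import java.io.IOException;

import org.apache.hadoop.fs.FileSystem;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class HadoopSparkConfig { // hypothetical class name

    @Bean
    public FileSystem fileSystem(@Value("${fs.defaultFS}") String defaultFs) throws IOException {
        // Hadoop's Configuration class clashes with Spring's @Configuration, so it is fully qualified here
        org.apache.hadoop.conf.Configuration hadoopConf = new org.apache.hadoop.conf.Configuration();
        hadoopConf.set("fs.defaultFS", defaultFs);
        return FileSystem.get(hadoopConf);
    }

    @Bean(destroyMethod = "close")
    public JavaSparkContext javaSparkContext(@Value("${spark.master}") String master,
                                             @Value("${spark.app.name}") String appName) {
        SparkConf sparkConf = new SparkConf().setAppName(appName).setMaster(master);
        return new JavaSparkContext(sparkConf);
    }
}
```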
3. Use Spark and Hadoop in your code:
```
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Spark: create a context for local execution
SparkConf sparkConf = new SparkConf().setAppName("MyApp").setMaster("local[*]");
JavaSparkContext sc = new JavaSparkContext(sparkConf);

// Hadoop: obtain a FileSystem handle for the HDFS NameNode
Configuration hadoopConf = new Configuration();
hadoopConf.set("fs.defaultFS", "hdfs://localhost:9000");
FileSystem fs = FileSystem.get(hadoopConf);
```
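As a quick way to verify the setup, the sketch below writes a small file to HDFS through the FileSystem handle and then counts its lines with Spark. This is only an example under the assumptions above: the path /tmp/springboot-hadoop-demo.txt, the class, and the method name are placeholders, and fs and sc are the objects created in step 3.
```
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.spark.api.java.JavaSparkContext;

public class HdfsSparkDemo { // hypothetical class name

    public static long writeAndCount(FileSystem fs, JavaSparkContext sc) throws Exception {
        Path path = new Path("/tmp/springboot-hadoop-demo.txt"); // hypothetical test path
        try (FSDataOutputStream out = fs.create(path, true)) {   // overwrite if the file exists
            out.write("hello hdfs\nhello spark\n".getBytes(StandardCharsets.UTF_8));
        }
        // Read the same file back through Spark and count its lines (expected: 2)
        return sc.textFile("hdfs://localhost:9000/tmp/springboot-hadoop-demo.txt").count();
    }
}
```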
That completes the environment setup for using Spring Boot, Spark, and Hadoop together.