Write a simple file cloud disk system using HDFS and its Java API; provide the Java code
Time: 2023-06-10 21:04:10
Here is a simple Java implementation of an HDFS-based file cloud disk system:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class HdfsCloudDisk {

    public static final String HDFS_URI = "hdfs://localhost:9000";
    public static final String ROOT_DIR = "/cloud_disk/";

    private final FileSystem fileSystem;

    public HdfsCloudDisk() throws IOException {
        Configuration configuration = new Configuration();
        configuration.set("fs.defaultFS", HDFS_URI);
        fileSystem = FileSystem.get(configuration);
        // Create the root directory on first use.
        if (!fileSystem.exists(new Path(ROOT_DIR))) {
            fileSystem.mkdirs(new Path(ROOT_DIR));
        }
    }

    // Copies the given stream into an HDFS file under ROOT_DIR.
    public void upload(String fileName, InputStream inputStream) throws IOException {
        Path filePath = new Path(ROOT_DIR + fileName);
        try (InputStream in = inputStream;
             OutputStream out = fileSystem.create(filePath)) {
            byte[] buffer = new byte[4096];
            int len;
            // read() returns -1 at end of stream, not 0.
            while ((len = in.read(buffer)) != -1) {
                out.write(buffer, 0, len);
            }
        }
    }

    // Opens the HDFS file for reading; the caller is responsible for closing the stream.
    public InputStream download(String fileName) throws IOException {
        return fileSystem.open(new Path(ROOT_DIR + fileName));
    }

    // Deletes the file; the second argument enables recursive deletion for directories.
    public void delete(String fileName) throws IOException {
        fileSystem.delete(new Path(ROOT_DIR + fileName), true);
    }

    public static void main(String[] args) throws IOException {
        HdfsCloudDisk cloudDisk = new HdfsCloudDisk();
        cloudDisk.upload("test.txt", HdfsCloudDisk.class.getResourceAsStream("/test.txt"));
        try (InputStream inputStream = cloudDisk.download("test.txt")) {
            byte[] buffer = new byte[4096];
            int len;
            while ((len = inputStream.read(buffer)) != -1) {
                System.out.print(new String(buffer, 0, len));
            }
        }
        cloudDisk.delete("test.txt");
    }
}
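The buffered stream-copy pattern used by upload and main can be exercised without a running cluster. A minimal sketch using in-memory streams (the class name CopyDemo is illustrative, not part of the system above):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Demonstrates the read-loop used by upload(): copy until read() returns -1.
public class CopyDemo {

    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[1024];
        int len;
        while ((len = in.read(buffer)) != -1) { // -1 signals end of stream
            out.write(buffer, 0, len);
        }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayInputStream in = new ByteArrayInputStream("hello hdfs".getBytes());
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        copy(in, out);
        System.out.println(out.toString()); // prints "hello hdfs"
    }
}
```

Checking against -1 rather than 0 matters: read() may legally return 0 bytes without the stream being exhausted, and only -1 marks end of stream.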
This code uses Hadoop's Java API to connect to the HDFS file system and implements the basic operations of uploading, downloading, and deleting files. On initialization it creates the root directory if it does not exist; upload copies an input stream into a file under that root, download returns an input stream that the caller must close, and the second argument to delete controls whether directories are removed recursively.
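Compiling and running the class requires the Hadoop client libraries on the classpath. Assuming a Maven build, a sketch of the dependency (the version shown is illustrative; match it to your cluster):

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>3.3.6</version>
</dependency>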