Uploading large files quickly with boto3: complete code
Boto3 is the official Python SDK from AWS for working with its services, including S3 (Simple Storage Service). To upload a large file to S3 quickly with boto3, use the multipart upload API (`create_multipart_upload`, `upload_part`, `complete_multipart_upload`), since a single PUT request is limited to 5 GB and multipart uploads can be split into smaller, retryable parts. Here is a simple example:
```python
import boto3
from botocore.exceptions import BotoCoreError, ClientError


def upload_large_file_to_s3(file_path, bucket_name, object_key):
    s3 = boto3.client('s3')

    # Start a multipart upload session
    mpu = s3.create_multipart_upload(Bucket=bucket_name, Key=object_key)
    upload_id = mpu['UploadId']

    part_size = 8 * 1024 * 1024  # 8 MB per part (S3 minimum is 5 MB, except for the last part)
    parts = []  # collects {'PartNumber': ..., 'ETag': ...} for each uploaded part

    try:
        # Read and upload the file in part_size chunks
        with open(file_path, 'rb') as f:
            part_number = 1
            while True:
                data = f.read(part_size)
                if not data:
                    break
                resp = s3.upload_part(
                    Bucket=bucket_name,
                    Key=object_key,
                    PartNumber=part_number,
                    UploadId=upload_id,
                    Body=data,
                )
                # The ETag of every part is required to complete the upload
                parts.append({'PartNumber': part_number, 'ETag': resp['ETag']})
                part_number += 1

        # Finish the multipart upload; S3 assembles the parts into one object
        s3.complete_multipart_upload(
            Bucket=bucket_name,
            Key=object_key,
            UploadId=upload_id,
            MultipartUpload={'Parts': parts},
        )
    except (BotoCoreError, ClientError) as e:
        # Abort on failure so the incomplete parts don't keep accruing storage charges
        s3.abort_multipart_upload(Bucket=bucket_name, Key=object_key, UploadId=upload_id)
        print(f"Multipart upload failed and was aborted: {e}")


# Example usage: upload local file.txt to bucket my-bucket under key large_file
upload_large_file_to_s3('file.txt', 'my-bucket', 'large_file')
```
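In practice you rarely need to drive the part uploads by hand: boto3's managed transfer layer (`upload_file` with a `boto3.s3.transfer.TransferConfig`) switches to multipart upload automatically above a size threshold, uploads parts concurrently, and retries failed parts for you. A minimal sketch, where the threshold, chunk size, and thread count are illustrative values rather than recommendations:

```python
import boto3
from boto3.s3.transfer import TransferConfig

# Multipart upload kicks in for files larger than multipart_threshold;
# parts are uploaded in parallel by up to max_concurrency worker threads.
config = TransferConfig(
    multipart_threshold=8 * 1024 * 1024,  # switch to multipart above 8 MB
    multipart_chunksize=8 * 1024 * 1024,  # 8 MB per part
    max_concurrency=10,                   # parallel part uploads
    use_threads=True,
)

s3 = boto3.client('s3')
s3.upload_file('file.txt', 'my-bucket', 'large_file', Config=config)
```

Unless you need fine-grained control over individual parts (for example, resuming a specific failed part yourself), the managed `upload_file` path is usually the simpler and faster choice.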