-
I'm looking to replace the method below, which does a chunked/multi-part copy of a file from S3 to S3 (multi-region), with smart_open.s3. I have a mix of files: data, txt, pdf, binary, etc. The requirement is not to read/write the file line by line, but in chunks of a given size. How can I achieve chunked reads/writes against S3 using smart_open.s3?

```python
objectKey = s3Client.get_object(Bucket=bucketName, Key=key)
```
-
If you open the read and write destinations in binary mode, you'll be able to do this:

```python
while True:
    buf = fin.read(chunk_size)
    if not buf:
        break
    fout.write(buf)  # write the bytes just read, not the chunk size
```
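For context, here is a minimal end-to-end sketch of that loop with smart_open, assuming s3:// source/destination URIs and a shared boto3 client; the bucket names, key paths, and chunk size are placeholders:

```python
import boto3
from smart_open import open

s3_client = boto3.client('s3')
chunk_size = 64 * 1024 * 1024  # 64 MiB per read; tune to your workload

src_uri = 's3://source-bucket/path/to/object'       # placeholder
dst_uri = 's3://destination-bucket/path/to/object'  # placeholder

# Binary mode ('rb'/'wb') keeps the copy agnostic to file type (txt, pdf, binary, ...)
with open(src_uri, 'rb', transport_params={'client': s3_client}) as fin, \
        open(dst_uri, 'wb', transport_params={'client': s3_client}) as fout:
    while True:
        buf = fin.read(chunk_size)
        if not buf:
            break
        fout.write(buf)
```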
-
The above code worked; here is the working code (truncated in the original post):

```python
transport_params = {'client': s3Client, 'buffer_size': chunk_size, 'multipart_upload': True}
with s3.open(**read_arguments) as read_file:
```
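Since the snippet above is cut off, here is one way the full copy might be assembled. It uses the top-level smart_open.open rather than the s3 submodule's open, and it guesses at the contents of read_arguments/write_arguments, which the reply does not show; whether buffer_size and multipart_upload are honored as transport parameters depends on your smart_open version:

```python
import boto3
from smart_open import open

s3Client = boto3.client('s3')
chunk_size = 100 * 1024 * 1024  # placeholder chunk/part size

# transport_params as in the reply above
transport_params = {'client': s3Client, 'buffer_size': chunk_size, 'multipart_upload': True}

# Hypothetical contents of read_arguments/write_arguments -- not shown in the reply
read_arguments = {
    'uri': 's3://source-bucket/path/to/object',
    'mode': 'rb',
    'transport_params': transport_params,
}
write_arguments = {
    'uri': 's3://destination-bucket/path/to/object',
    'mode': 'wb',
    'transport_params': transport_params,
}

with open(**read_arguments) as read_file, open(**write_arguments) as write_file:
    while True:
        chunk = read_file.read(chunk_size)
        if not chunk:
            break
        write_file.write(chunk)
```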