Use Boto For Gzipping Files Instead Of S3fs
```python
import contextlib
import gzip

import s3fs

AWS_S3 = s3fs.S3FileSystem(anon=False)  # AWS env must be set up correctly
source_file_path = '/tmp/your_file.txt'
s3_file_path = 'my-bu
```
Solution 1:
> I run this in an AWS Lambda function, but it throws an error because it is unable to install the s3fs module.
Additional packages and your own library code (reusable code) should be put in Lambda layers.
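As a rough sketch of how that could look, assuming you have already zipped the dependency under the `python/` directory convention that layers expect (the layer name, zip file name, and runtime list below are illustrative assumptions, not from the original post), the layer can be published with boto3's `publish_layer_version`:

```python
import boto3

# A minimal sketch. Assumes the layer zip was built locally first, e.g.:
#   pip install s3fs -t python/ && zip -r layer.zip python/
lambda_client = boto3.client("lambda")

with open("layer.zip", "rb") as f:  # hypothetical file name
    response = lambda_client.publish_layer_version(
        LayerName="s3fs-deps",             # hypothetical layer name
        Content={"ZipFile": f.read()},
        CompatibleRuntimes=["python3.9"],  # match your function's runtime
    )

# Attach this ARN to your Lambda function's layer configuration
print(response["LayerVersionArn"])
```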
> How can I use boto for this too?
```python
import boto3

s3 = boto3.resource("s3")
bucket = s3.Bucket(bucket_name)  # bucket_name is the name of your target bucket
```
Then either:
If you have your file in memory (a file-like object opened in bytes mode, e.g. io.BytesIO or just open(..., 'rb')):
```python
bucket.upload_fileobj(fileobj, s3_filename)
```
Or, if you have a file on your local filesystem:
```python
bucket.upload_file(filepath, s3_filename)
```
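Putting the two pieces together for the original gzip use case, here is a minimal sketch (the bucket name, paths, and key below are illustrative assumptions, not from the original post): compress the source file into an in-memory buffer with the standard-library gzip module, then hand that buffer to upload_fileobj:

```python
import gzip
import io

import boto3

bucket_name = "my-bucket"            # assumption: your target bucket
source_file_path = "/tmp/your_file.txt"
s3_key = "your_file.txt.gz"          # assumption: destination key in the bucket

s3 = boto3.resource("s3")
bucket = s3.Bucket(bucket_name)

# gzip the file into an in-memory buffer; closing the GzipFile writes the
# trailer into buf but leaves buf itself open
buf = io.BytesIO()
with open(source_file_path, "rb") as src, gzip.GzipFile(fileobj=buf, mode="wb") as gz:
    gz.write(src.read())  # fine for small files; stream in chunks for large ones

buf.seek(0)  # rewind so upload_fileobj reads from the start
bucket.upload_fileobj(buf, s3_key)
```

For large inputs you could stream with shutil.copyfileobj(src, gz) instead of reading the whole file at once; either way, only boto3 and the standard library are used, so nothing like s3fs needs to be installed in the Lambda package.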