Direct upload to Amazon S3 in a Python Flask application.
(Ideally the static assets are processed (minified / compressed) and served from a CDN instead of this app itself.)
Disclaimer: This is not a real service provided by Amazon Web Services.
This repository contains the final project for a cloud computing course.
- Set environment variables for your AWS access key, secret, and bucket name (or place your credentials in the ~/.aws directory).
- Build and run the container:

```
docker build -t aws-annoymizer .
docker run -d --name "annoymizer" -p 5000:5000 \
    -e "AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}" \
    -e "AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}" \
    -e "S3_BUCKET=${S3_BUCKET}" \
    -e "S3_REGION=${S3_REGION}" \
    aws-annoymizer
```

- Visit localhost:5000/upload to try it out.
You can verify that the AWS credentials provided have upload rights as follows:

```
$ docker exec -it annoymizer sh
$ python
```

```python
import boto3
from botocore.client import Config

# Replace 'myBucket' with your own bucket name.
s3 = boto3.client('s3', 'ap-southeast-1', config=Config(s3={'addressing_style': 'path'}))
s3.upload_file('application.py', 'myBucket', 'application.py')
```
You can update the Flask app after modifications by rebuilding the image and re-running the container with the same docker build and docker run commands as above.
This sample app is a direct copy of:
The files in this repository are, unless stated otherwise, released under the Apache License. You are free to redistribute this code with or without modification. The full license text is available here.