I added a simple but effective performance tweak: only upload files that have changed. The deploy function checks each file's modified date before uploading, which saves a bit of bandwidth and time on every deployment.
The change I made to my settings.py was simply to point the MEDIA_URL to the bucket like this:
MEDIA_URL = 'https://s3.amazonaws.com/bucketname/media/'
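With that prefix in place, any file stored under media resolves to a public S3 URL: the full URL is just MEDIA_URL plus the file's relative path. A trivial illustration (the path `uploads/photo.jpg` is a made-up example):

```python
MEDIA_URL = 'https://s3.amazonaws.com/bucketname/media/'

# A media file is referenced by its path relative to MEDIA_URL;
# the public URL is simply the prefix plus that relative path.
relative_path = 'uploads/photo.jpg'  # hypothetical example file
full_url = MEDIA_URL + relative_path
print(full_url)
```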
And here’s the function I added to my fabfile:
from boto.s3.connection import S3Connection
from boto.s3.key import Key
from stat import ST_MTIME
import os
import time

def deploy_media():
    """Deploy the media files to S3, skipping files that haven't changed."""
    conn = S3Connection('ACCESS KEY ID', 'SECRET ACCESS KEY')
    bucket = conn.get_bucket('bucketname')
    # upload files
    for root, dirs, files in os.walk('media'):
        for f in files:
            # skip hidden files and editor swap files
            if f.endswith('.swp') or f.startswith('.'):
                continue
            filename = root + '/' + f
            modify_time = os.stat(filename)[ST_MTIME]
            key = bucket.get_key(filename)
            if key is None:
                key = Key(bucket)
                key.key = filename
            # S3 reports Last-Modified in GMT, so compare the file's
            # mtime in GMT as well, not local time
            if key.last_modified is None or \
                    time.gmtime(modify_time) > time.strptime(
                        key.last_modified, '%a, %d %b %Y %H:%M:%S %Z'):
                print filename
                # binary mode, since media files aren't text
                with open(filename, 'rb') as fid:
                    key.set_contents_from_file(fid)
                key.set_acl('public-read')
                print 'file uploaded'
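The freshness check is the interesting part: S3 reports Last-Modified as an HTTP-style date string in GMT, and strptime turns it into a struct_time, which compares field by field so that > gives chronological order. Here's a standalone sketch of just that comparison, with no S3 involved (the timestamp and the one-hour offset are made up for illustration):

```python
import calendar
import time

# S3 returns Last-Modified as an HTTP-date string in GMT, like this:
s3_last_modified = 'Mon, 01 Jan 2024 12:00:00 GMT'
remote = time.strptime(s3_last_modified, '%a, %d %b %Y %H:%M:%S %Z')

# calendar.timegm converts a GMT struct_time back to an epoch value
remote_epoch = calendar.timegm(remote)

# pretend the local file was modified an hour after the S3 copy;
# in the real fabfile this would come from os.stat(...)
local_mtime = remote_epoch + 3600
local = time.gmtime(local_mtime)

# struct_time compares field by field, so > is chronological order
needs_upload = local > remote
print(needs_upload)
```

Comparing both sides as GMT struct_times keeps the check correct regardless of the server's local timezone.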