1. Sign up:
http://aws.amazon.com/
2. Download the boto library:
Go to Developer/Sample Code & Libraries and search for boto.
3. Extract the archive and run python setup.py install
4. If you want to embed the code directly: copy the boto directory into your project's source.
5. Using boto with Amazon S3 in web2py:
record_create.html:
{{=form.custom.begin}}
<input type="file" name="img" >
{{=form.custom.end}}
In the web2py controller, record.py:
from boto.s3.connection import S3Connection
from boto.s3.key import Key
...
def record_create():
    ...
    conn = S3Connection('aws_access_key', 'secret_key')
    # rs = conn.get_all_buckets()
    bucket = conn.get_bucket('myimg')  # in this case the bucket 'myimg' has already been created
    if 'img' in request.vars:
        img = request.vars['img']
        if img.file:
            # data = img.file.read()
            k = Key(bucket)
            k.key = 'Winter.jpg'
            k.set_contents_from_file(img.file, headers=None, replace=True,
                                     cb=None, num_cb=10, policy=None, md5=None)
OK. That's all
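Instead of hard-coding 'Winter.jpg', the key name can be taken from the uploaded file itself. A small variation on the controller above (same img field, same bucket):

    if 'img' in request.vars:
        img = request.vars['img']
        if img.file:
            k = Key(bucket)
            k.key = img.filename  # use the client-side filename as the S3 key
            k.set_contents_from_file(img.file, replace=True)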
Loading the image directly from Amazon (the bucket must be public):
http://bucket_name.s3.amazonaws.com/image_name?AWSAccessKeyId=aws_acesskey
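If you would rather not make the whole bucket public, boto can mark a single key as public, or hand out a temporary signed URL instead. A minimal sketch, reusing the bucket object from the controller above:

k = Key(bucket)
k.key = 'Winter.jpg'
k.make_public()                        # set the 'public-read' ACL on this one key
url = k.generate_url(expires_in=3600)  # or: a signed URL that is valid for one hour
print url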
Tracking upload progress (%):
....
from boto.s3.connection import Location
bucket = conn.create_bucket(bucket_name,
                            location=Location.DEFAULT)

testfile = "replace this with an actual filename"
print 'Uploading %s to Amazon S3 bucket %s' % \
    (testfile, bucket_name)

import sys
def percent_cb(complete, total):
    sys.stdout.write('.')
    sys.stdout.flush()

from boto.s3.key import Key
k = Key(bucket)
k.key = 'my test file'
k.set_contents_from_filename(testfile, cb=percent_cb, num_cb=10)
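If you want an actual percentage instead of a row of dots, the callback receives the number of bytes transferred and the total size, so a variant like this works (my own naming, not from the original):

def percent_cb(complete, total):
    if total:
        sys.stdout.write('\r%d%%' % (100.0 * complete / total))
        sys.stdout.flush()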
Upload to S3
Here is the code we use to upload the picture files:
def push_picture_to_s3(id):
    import boto
    import logging
    from boto.s3.key import Key
    # keep boto's own logging quiet
    logging.getLogger('boto').setLevel(logging.CRITICAL)
    # settings holds the bucket name and AWS credentials (e.g. a Django settings module)
    bucket_name = settings.BUCKET_NAME
    # connect and open the bucket
    conn = boto.connect_s3(settings.AWS_ACCESS_KEY_ID,
                           settings.AWS_SECRET_ACCESS_KEY)
    bucket = conn.get_bucket(bucket_name)
    fn = '/var/www/data/%s.png' % id
    k = Key(bucket)
    k.key = '%s.png' % id  # the key name is elided in the original listing; this mirrors the filename
    k.set_contents_from_filename(fn)
    ...
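Calling it is then just a matter of passing the id of a picture that already exists on disk (hypothetical value):

push_picture_to_s3('42')  # uploads /var/www/data/42.png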
Download from S3
As you saw, you can access the file using the URL http://s3.amazonaws.com/bucket_name/key, but you can also use the boto library to download the files. I do that to create a daily backup of the bucket's files on my local machine. Here is the script to do that:

import os
import boto
from boto.s3.key import Key

LOCAL_PATH = '/backup/s3/'
AWS_ACCESS_KEY_ID = '...'
AWS_SECRET_ACCESS_KEY = '...'
bucket_name = 'bucket_name'

# connect to the bucket
conn = boto.connect_s3(AWS_ACCESS_KEY_ID,
                       AWS_SECRET_ACCESS_KEY)
bucket = conn.get_bucket(bucket_name)

# go through the keys and fetch anything we do not have locally yet
bucket_list = bucket.list()
for l in bucket_list:
    keyString = str(l.key)
    if not os.path.exists(LOCAL_PATH + keyString):
        l.get_contents_to_filename(LOCAL_PATH + keyString)
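One thing worth guarding against: keys that contain '/' map to subdirectories locally, and get_contents_to_filename fails if the target directory does not exist yet. A hedged addition to the loop body above:

    local_file = os.path.join(LOCAL_PATH, keyString)
    local_dir = os.path.dirname(local_file)
    if local_dir and not os.path.isdir(local_dir):
        os.makedirs(local_dir)  # create /backup/s3/sub/dirs as needed
    if not os.path.exists(local_file):
        l.get_contents_to_filename(local_file)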