
Listing all the files in an AWS S3 bucket, and writing them out to a file, was one of the most recent requirements I had to work on, and the easiest solution I found was the Python boto library.
Below is the code that I used to make this work. Working on this also made me realize the extensive capabilities of boto. Take a look at its documentation; you can do a lot more. Hope this helps someone.
#!/usr/bin/python
# -*- coding: utf-8 -*-
from boto.s3.connection import S3Connection
import requests
# Used to get the temporary credentials
r = requests.get('URL which provides the temporary Credentials')
data = r.json()
# Extracting the required fields
AccessKeyId = data['AccessKeyId']
SecretAccessKey = data['SecretAccessKey']
Token = data['Token']
# Output file for the key names
my_file = open('results.txt', 'w')
# Making a connection to S3 with the temporary credentials
conn = S3Connection(aws_access_key_id=AccessKeyId,
                    aws_secret_access_key=SecretAccessKey,
                    security_token=Token)
# Accessing the required bucket
bucket = conn.get_bucket('your-bucket-name')
# Choosing the right prefix
for key in bucket.list(prefix='foldername1/foldername2/foldername3/'):
    # print key.name.encode('utf-8')
    my_file.write(key.name.encode('utf-8') + '\n')
my_file.close()
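The Key objects that bucket.list() yields also carry metadata such as the object's size and last-modified time, which is part of what makes boto so capable. Here is a minimal sketch of dumping that metadata alongside each name; it assumes the conn object from the script above, and the bucket name and prefix are the same placeholders.
# A minimal sketch, assuming 'conn' from above and placeholder
# bucket/prefix names.
bucket = conn.get_bucket('your-bucket-name')
meta_file = open('results_with_metadata.txt', 'w')
for key in bucket.list(prefix='foldername1/foldername2/foldername3/'):
    # Each Key exposes attributes like size (in bytes) and last_modified
    meta_file.write('%s\t%s\t%s\n' % (key.name.encode('utf-8'),
                                      key.size, key.last_modified))
meta_file.close()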