Question:
I’d like to get the size (in bytes, and # of keys) of an Amazon S3 bucket.
I’m looking for an efficient way to get the size of the bucket.
One possible way (which is NOT efficient) is to get the key list of the bucket and sum up the size of each key. This is inefficient when I have thousands of keys because I have to look up each key’s size.
Is there any efficient solution?
UPDATE:
The following code is not what I’m looking for (because it’s not efficient):
    # conn is an existing boto S3 connection
    bucket = conn.get_bucket("bucket_name")
    total_size = 0
    for key in bucket.list():
        total_size += key.size
Answer:
There doesn’t seem to be a direct call to do that. You can iterate through the keys and sum up their sizes:
    # conn is an existing boto S3 connection
    bucket = conn.get_bucket(self.container)
    size = 0
    for key in bucket.list():
        size += key.size
This should be used only if the bucket has a small number of keys and the calculation is not performed very often.
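If you are on the newer boto3 rather than the boto shown above, the same key-listing approach might look roughly like the sketch below (the bucket name is a placeholder and credentials are assumed to be configured). It still fetches at most 1,000 keys per API call, so the same caveat about large buckets applies.

    import boto3

    # Sketch: sum object sizes and count keys with boto3.
    # "bucket_name" is a placeholder; the paginator walks the whole
    # bucket one page (up to 1,000 keys) at a time.
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")

    total_size = 0
    total_keys = 0
    for page in paginator.paginate(Bucket="bucket_name"):
        for obj in page.get("Contents", []):
            total_size += obj["Size"]
            total_keys += 1

    print("keys:", total_keys, "bytes:", total_size)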
Check this (Not Boto) for a more useful option.
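Independently of that link, if numbers that are up to a day old are acceptable, one commonly used shortcut is the daily storage metrics S3 publishes to CloudWatch (BucketSizeBytes and NumberOfObjects), which avoids listing keys entirely. A minimal boto3 sketch, assuming standard storage and a configured region; "bucket_name" is a placeholder:

    import datetime
    import boto3

    # Sketch: read the daily bucket metrics S3 publishes to CloudWatch.
    # Assumes boto3 credentials/region are configured.
    cloudwatch = boto3.client("cloudwatch")
    now = datetime.datetime.utcnow()

    def latest_metric(metric_name, storage_type):
        # Return the most recent daily datapoint for an S3 storage metric.
        resp = cloudwatch.get_metric_statistics(
            Namespace="AWS/S3",
            MetricName=metric_name,
            Dimensions=[
                {"Name": "BucketName", "Value": "bucket_name"},
                {"Name": "StorageType", "Value": storage_type},
            ],
            StartTime=now - datetime.timedelta(days=2),
            EndTime=now,
            Period=86400,
            Statistics=["Average"],
        )
        points = resp["Datapoints"]
        return max(points, key=lambda p: p["Timestamp"])["Average"] if points else None

    size_bytes = latest_metric("BucketSizeBytes", "StandardStorage")
    num_keys = latest_metric("NumberOfObjects", "AllStorageTypes")
    print("bytes:", size_bytes, "keys:", num_keys)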