I have a few S3 buckets that I want to hand access over to another organisation, as I am handing over an existing hosting client to them.
After looking through the docs, it seems the simplest option, if I no longer want to be responsible for the bucket, is to copy the contents of the existing bucket across to a new bucket controlled by the new organisation, and have any existing apps write files to the new bucket from now on.
The bucket isn't very big:
aws s3 ls --human-readable --recursive --summarize s3://some-client-bucket
# (snip… lots of files listed, all less than 10mb)
# Total Objects: 22764
# Total Size: 2.4 GiB
But the bucket is versioned, and I have daily snapshots of some files going back over the last year which I also want to transfer across.
Is there a straightforward way to do this?
I understand how I might copy the current contents of a bucket to a new one controlled by another organisation, after they grant me access, using something like:
aws s3 sync s3://some-client-bucket s3://new-client-bucket
# (sync copies recursively by default; aws s3 sync doesn't accept a --recursive flag)
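(For what it's worth, I'm assuming the access grant itself would be a bucket policy that the new organisation attaches to their bucket, something roughly like the sketch below, where new-client-bucket is the placeholder name from above and 111122223333 stands in for my account ID:)
# Rough sketch only: 111122223333 is a placeholder for the source account ID.
# Attached by the new organisation, this lets the source account list the
# destination bucket and write objects into it, which is what aws s3 sync needs.
aws s3api put-bucket-policy --bucket new-client-bucket --policy '{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:root" },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::new-client-bucket"
    },
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:root" },
      "Action": ["s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::new-client-bucket/*"
    }
  ]
}'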
However, I don't think aws s3 sync will copy the old versions across too, and I've been relying on S3's per-file versioning rather than timestamping the files myself.
Do I have to jerry-rig some script to:
- download each versioned file
- rename it with a timestamp
- upload it to the new bucket
Or is there some nifty extra feature in S3 to do this automagically for me?
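To make the question concrete, this untested sketch is roughly what I had in mind, using the AWS CLI's s3api commands and the same placeholder bucket names as above:
#!/usr/bin/env bash
# Rough, untested sketch: bucket names are placeholders and error handling is omitted.
# Lists every version of every object in the source bucket, downloads each version,
# then re-uploads it to the destination bucket with its timestamp appended to the key.
src_bucket="some-client-bucket"
dst_bucket="new-client-bucket"

aws s3api list-object-versions --bucket "$src_bucket" \
  --query 'Versions[].[Key,VersionId,LastModified]' --output text |
while IFS=$'\t' read -r key version_id last_modified; do
  tmp_file="$(mktemp)"
  aws s3api get-object --bucket "$src_bucket" --key "$key" \
    --version-id "$version_id" "$tmp_file" > /dev/null
  aws s3 cp "$tmp_file" "s3://${dst_bucket}/${key}.${last_modified}"
  rm -f "$tmp_file"
done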
This one is an old question, but I just happened to come up against the same requirement. Today there is a fairly easy way to copy all files from one versioned bucket to another. I came up with the following PowerShell script to do it: