I would like to establish a new standard for two things:
How long should the logs generated by my applications be kept, and how should they be rotated?
How should the logs be transferred to Amazon S3 as a backup?
I was thinking of using logrotate to rotate and compress my daily files this way:
(unknown)-{year}-{month}-{day}-{r-months}.gz
The r-months variable means remain-months: the number of months the file should remain in S3. Files older than that should be removed.
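As a sketch of what that could look like in a logrotate config (the path and app name are assumptions; note that logrotate's dateformat cannot embed the r-months suffix itself, so that part of the name would need a rename step in a separate script):

```
# /etc/logrotate.d/myapp -- a sketch; paths and the app name are assumptions
/var/log/myapp/*.log {
    daily
    rotate 7                # keep 7 days locally; older copies live in S3
    compress
    dateext
    dateformat -%Y-%m-%d    # produces myapp.log-YYYY-MM-DD.gz after compression
    copytruncate            # avoid breaking apps that keep the log file open
    missingok
    notifempty
}
```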
A friend of mine suggested that I compress the logs daily (in the new format proposed above) and then send those files to our bucket in Amazon S3. Files older than 7 days should then be removed by logrotate (since they are already in S3).
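A minimal sketch of that daily upload-then-prune step using s3cmd (the bucket name, directory, and helper names are assumptions; s3cmd must already be configured via s3cmd --configure):

```shell
#!/bin/sh
# upload_logs: copy every .gz file in a directory to an S3 bucket.
# Arguments: log directory, bucket URI, uploader command (defaults to s3cmd).
upload_logs() {
    dir=$1; bucket=$2; uploader=${3:-s3cmd}
    for f in "$dir"/*.gz; do
        [ -e "$f" ] || continue
        "$uploader" put "$f" "$bucket/$(basename "$f")"
    done
}

# prune_old_logs: delete local .gz files older than the given number of days
# (safe only after they have been uploaded to S3).
prune_old_logs() {
    dir=$1; days=$2
    find "$dir" -name '*.gz' -mtime +"$days" -delete
}
```

This could run from cron (or a logrotate postrotate hook), e.g. upload_logs /var/log/myapp s3://my-log-backup followed by prune_old_logs /var/log/myapp 7.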
Nowadays, our applications use log4j and other libraries to generate logs.
1) Should we disable the log rotation done by our applications and handle it only with logrotate?
2) In your opinion, could this break some applications?
3) Is this new log format a good one?
4) How should I send the files to S3? I'm currently using s3cmd; would you recommend another tool?
Regarding your question #4:
You can mount an S3 bucket as a local partition and work with S3 files as if they were located on your server's filesystem. There are a number of good open-source tools available.
For my part, I would recommend that you take a look at my project: RioFS, a userspace filesystem for mounting Amazon S3 buckets. The project's goals, and its main advantages compared to other similar tools, are simplicity, speed of operations, and bug-free code.
Currently the project is in a “beta” state, but it has been running on several high-load fileservers for quite some time (there, RioFS provides ftp / sftp servers with access to files stored in S3).
We are building a community around our project and are looking for more people to join us, discuss future plans, and help with testing. For our part, we offer quick bug fixes and will listen to your requests for new features.
A quick how-to:
You can mount a bucket using the following command (assuming you have installed RioFS and have exported the AWSACCESSKEYID and AWSSECRETACCESSKEY environment variables):
riofs http://s3.amazonaws.com your_bucket_name /path/to/localfolder
(please refer to the project's description and run
riofs --help
to get help with the command-line arguments)
Please note that the project is still in development, so there may still be a number of bugs. If you find that something doesn't work as expected, please file an issue report on the project's GitHub page.
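Once the bucket is mounted, sending your rotated logs becomes an ordinary file copy. A minimal sketch (the directory names and the helper name are assumptions):

```shell
#!/bin/sh
# copy_rotated: copy compressed logs from a local directory into the
# mount point of the S3 bucket (RioFS makes it look like a normal folder).
copy_rotated() {
    src=$1; dest=$2
    for f in "$src"/*.gz; do
        [ -e "$f" ] || continue
        cp "$f" "$dest/"
    done
}
```

For example: copy_rotated /var/log/myapp /path/to/localfolder.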
Hope this helps, and we look forward to seeing you join our community!