OK, this is a very practical use case from my point of view.
Say I have a simple shell one-liner that logs its output to a file. It could be almost anything, for example tcpdump. Is there any generic and trivial way to make sure that the output file won't exceed a given size?
The reasoning behind this is to protect against filling all available space on the mount point by mistake. If I forget about the script, or it yields GBs of data per hour, then this simple debugging task can lead to a potential system crash.
Now, I am aware of the options built into some of the tools (like the combination of -W/-C in tcpdump). What I need is a very generic failsafe.
Long story short - when I run a script like:
% this -is --my=very|awesome|script >> /var/tmp/output.log
How can I make sure that output.log will never get bigger than 1 GB?
The script can crash, be killed, or whatever.
The solution I am looking for should be easy and simple, using only tools available in popular distros like Ubuntu/Debian/Fedora; in general, something widely available. A complicated, multiline program is not an option here, regardless of the language/technology.
You can use head for this. It accepts K, M, G and the like as suffixes (bytes are the default). Append 'B' to use the base-10 versions.
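Applied to the pattern from the question, a sketch looks like this (the producing command is a placeholder; with GNU head, -c accepts the size suffixes described above):

```shell
# Cap the log at 1 GiB: head exits after 1G has passed through,
# and the producer is then terminated by SIGPIPE on its next write.
some_long_running_command | head -c 1G >> /var/tmp/output.log
```

Note that output appearing after the cap is reached is discarded, and the producer only dies the next time it writes.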
Set the maximum file size for a user that will only be used to run these scripts.
The file /etc/security/limits applies the "default" entry's values to a user unless there are explicit values for that specific user; user-specific values override the defaults. The file may have a slightly different name depending on your OS. If your log user is named log_maker, then add this line to the file:
log_maker hard fsize 1000000
The number after fsize is the maximum file size in KB.
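The same fsize limit can also be set per invocation with the shell's ulimit builtin, without a dedicated user; a one-shot sketch (the command name is a placeholder, and in bash the -f unit is 1024-byte blocks):

```shell
# Run the command in a subshell so the limit does not stick
# to the interactive shell. 1048576 blocks of 1024 bytes ~= 1 GiB.
# A process that tries to exceed the limit receives SIGXFSZ.
( ulimit -f 1048576; some_long_running_command >> /var/tmp/output.log )
```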
curtail limits the size of a program's output and preserves the last X KB of output with the following command:
run_program | curtail -s 1G myprogram.log
https://github.com/Comcast/Infinite-File-Curtailer
NOTE: I'm the maintainer of the above repo. Just sharing the solution...
You can limit any file with tail.
Writing the tail output directly back to the same file produces an empty file, so you need to go through a temporary file.
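A sketch of that, assuming GNU tail (which accepts the same size suffixes as head) and the log path from the question:

```shell
# Keep only the last 1 GiB of the log. tail must finish reading
# before the original file is replaced; redirecting tail straight
# onto output.log would truncate it before tail could read it.
tail -c 1G /var/tmp/output.log > /var/tmp/output.log.tmp &&
    mv /var/tmp/output.log.tmp /var/tmp/output.log
```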
As Seamus said under Eduardo Ivanec's answer, this question seems to ask how to continuously cap a file that is being appended to, like keeping a rolling copy of output from an external journalctl.
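Only the shebang of the script survives above; a minimal sketch of such a cutter.sh, assuming the tail-plus-temp-file approach from the previous answer and a 5-minute trim interval (both are assumptions):

```shell
#!/bin/sh
# cutter.sh: periodically trim myJournal.log down to its last 1 GiB.
while true
do
    tail -c 1G myJournal.log > myJournal.log.tmp
    # Overwrite in place rather than mv: an appending writer keeps its
    # file descriptor open, and mv would leave it writing to the old,
    # unlinked inode.
    cat myJournal.log.tmp > myJournal.log
    rm -f myJournal.log.tmp
    sleep 300
done
```

Anything appended between the tail and the cat can be lost; for a debugging log that is usually acceptable.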
Run cutter.sh and keep it running in the background:
./cutter.sh &
Then run your script/application appending to that file, e.g.:
journalctl -f >> myJournal.log