I'm trying to take a GitLab backup to another server. Can anyone give me a solution for this? The script should delete the old files if the number of files in the directory exceeds 15.
Disclaimer: I've tested the following commands with filenames that contain spaces, but not with filenames that contain newlines. I suspect that they will not play nicely with filenames containing the newline character, and I would avoid using them if you suspect such filenames may be created.
This approach relies on the ctime of the file, so if files have any of their attributes changed, they will appear to be newer than their creation time. Only you can decide whether relying on ctime is applicable in your situation. If you'd rather use mtime, change -printf "%C+ %p\n" to -printf "%T+ %p\n" in the find commands.

The following command can be issued inside the directory containing your tar files. It assumes that the filenames are all something like something.tar. If the filenames are not of this format, the command will need to be modified: either change -iname '*.tar' to -iname '*.tar.gz' if the files are .tar.gz files, or remove the -iname '*.tar' test entirely if you just want to operate on any files in that directory, regardless of filename format. If this shows you the oldest files, beyond your 15-file limit, then use the following command to delete those files.
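A sketch of the two commands, assembled from the pieces explained below (assuming .tar files and a retention count of 15) — first preview, then delete:

```shell
# preview: list the oldest files beyond the newest 15 (by ctime)
find . -mindepth 1 -maxdepth 1 -type f -iname '*.tar' -printf "%C+ %p\n" \
    | sort -n | cut -d ' ' -f 2- | head -n -15

# delete them, once the preview looks right
find . -mindepth 1 -maxdepth 1 -type f -iname '*.tar' -printf "%C+ %p\n" \
    | sort -n | cut -d ' ' -f 2- | head -n -15 | xargs -I{} rm "{}"
```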
By way of explanation:

find . -mindepth 1 -maxdepth 1 -type f -iname '*.tar' -printf "%C+ %p\n" lists all of the files that end in .tar in . (the current directory) without recursing into subdirectories. For each, it prints the ctime timestamp followed by a space, the file name, and a trailing newline character.

| sort -n sorts the output of find numerically, so files are listed from oldest to newest (by ctime).

| cut -d ' ' -f 2- removes the timestamp that find added, but preserves the order of the files listed by sort.

| head -n -15 trims the bottom (newest) 15 items from the output of cut, leaving only the older files that should be deleted.

| xargs -I{} rm "{}" runs the rm command on each of those files, ensuring the filename is not split on whitespace.

This can be written as a bash script, with the number of files to retain and the directory on which to operate as variables in the script. It's possible to pass in the directory and file retention count as arguments instead, but I won't cover that here.
If you save this script somewhere, e.g. /home/user/trim_old_gits, and ensure that you have given it executable permissions, the script can be run from the command line by entering:

/home/user/trim_old_gits

Or, from within /home/user:

./trim_old_gits

As mentioned in Jacob's marvellous python answer, using the cron utility would be a good way to ensure that this happens on a regular basis, if it's not crucial that the files be deleted immediately, or inotifywait if timing is more sensitive.

Given the fact that both ctime and mtime are no guarantee that you actually delete the oldest files (depending on what happened to the files in between), the script below deletes the files exceeding an arbitrary number inside a given directory. Having said that: going by the ctime of a file, the tiny background script below will delete the oldest files if the number of files exceeds a set number. It is yours to decide if that is a usable option in your situation.

The script
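The original listing is not reproduced here; below is a sketch of what such a keep_latest.py could look like (the trim function name and the 10-second check interval are my own choices):

```python
#!/usr/bin/env python3
# keep_latest.py -- keep only the newest n files (by ctime) in a directory.
# A sketch; names and the polling interval are assumptions, not the
# original answer's listing.
import os
import sys
import time

def trim(directory, n_keep):
    """Delete all but the newest n_keep files (by ctime) in directory."""
    paths = [os.path.join(directory, f) for f in os.listdir(directory)]
    files = [p for p in paths if os.path.isfile(p)]
    # newest first, by ctime
    files.sort(key=lambda p: os.stat(p).st_ctime, reverse=True)
    for old in files[n_keep:]:
        os.remove(old)

if __name__ == "__main__" and len(sys.argv) == 3:
    watched, n = sys.argv[1], int(sys.argv[2])
    # tiny background loop: re-check the directory every 10 seconds
    while True:
        trim(watched, n)
        time.sleep(10)
```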
How to use

Save the script as keep_latest.py. Test-run it from a terminal, with the path to your directory and the number of (latest) files to keep as arguments:

python3 /path/to/keep_latest.py '/path/to/directory' 15

to keep the latest 15 files in '/path/to/directory'. If all works fine, add it to Startup Applications: Dash > Startup Applications > Add. Add the command:

python3 /path/to/keep_latest.py '/path/to/directory' 15
Other options

The script above is one of many options. If either mtime or ctime would suffice, another option would be to use inotifywait and make it do the same as the script above, but only when a file is added to, moved into, or copied into the directory. If time accuracy (immediate removal of the extra files) is not really important, a command run by cron would also be a good option. Whether the inotifywait loop or the script above is more efficient would be a matter of testing and comparing; either way, the resources used would be practically none.
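For the inotifywait option, a sketch (assuming the inotify-tools package is installed; the trim helper, the --watch flag, and the use of mtime here are my own choices):

```shell
#!/bin/bash
# Sketch: keep the newest $keep files (by mtime) in $dir, re-trimming
# whenever a file is created in, or moved into, the directory.
dir='/path/to/directory'   # adjust to suit
keep=15

trim() {   # usage: trim <dir> <keep>
    find "$1" -mindepth 1 -maxdepth 1 -type f -printf "%T+ %p\n" \
        | sort -n | cut -d ' ' -f 2- | head -n -"$2" \
        | xargs -I{} rm "{}"
}

# start the watch loop only when called with --watch, so the trim
# helper can also be run on its own
if [ "$1" = "--watch" ]; then
    inotifywait -m -e create -e moved_to --format '%f' "$dir" \
        | while read -r _; do trim "$dir" "$keep"; done
fi
```

If immediate removal is not needed, the same trim could instead be scheduled via crontab -e, e.g. running a trimming script every few minutes.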