Does anyone know a good way to delete files on a remote server that are older than X days, using just SCP/SFTP?
Sure, I could write a script in Perl or the like, but that feels like overkill.
Any UNIX way?
A one-liner?
Separate utility?
Thanks
P.S. The task is to delete some outdated backup files.
You don't need a script to achieve the intended effect - a one-liner will do if you have shell access to send a command.
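For example, a sketch with find over ssh; user@host and /path/to/backups are placeholders:

```bash
# Delete remote files whose contents were last modified more than 7 days ago.
ssh user@host 'find /path/to/backups -type f -mtime +7 -exec rm {} \;'
```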
-mtime +7 matches files whose contents were last modified more than seven days (7 x 24 hours) ago; with GNU find you can add -daystart to count from midnight of the present day instead.

This question is very old, but I still wanted to add my bash-only solution, as I was just searching for one when I came here. The grep tar in the listing command is only there for my own purposes, to list just tar files; it can of course be adapted. This deletes all tar files in the given directory except the newest seven. It does not consider dates, but if you only make one backup per day it is good enough. A sketch of the approach follows.
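A minimal sketch, assuming key-based sftp access; user@host and /path/to/backups are placeholders:

```bash
#!/usr/bin/env bash
# Keep only the newest 7 tar files on the remote host, delete the rest.
KEEP=7

# ls -t sorts the remote listing by modification time, newest first;
# grep tar keeps only the tar archives (adapt to your file names).
LIST=$(echo "ls -t /path/to/backups" | sftp -q user@host | grep tar)

i=0
while read -r file; do
    i=$((i + 1))
    # Everything past the first $KEEP entries is an old backup.
    if [ "$i" -gt "$KEEP" ]; then
        echo "rm $file" | sftp -q user@host
    fi
done <<< "$LIST"
```

Each rm opens a fresh connection, which is fine for a handful of backups per day.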
If you insist on plain SCP/SFTP, you can list the files, parse the listing with a simple script, and delete the old backup files; see the sketch below.
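One way to do that parsing, assuming (my assumption, not the question's) that the backup names embed an ISO date such as backup-2015-01-31.tar:

```bash
#!/usr/bin/env bash
# Delete remote backups older than 7 days by parsing the date embedded in
# each filename. Uses GNU date; user@host and the path are placeholders.
CUTOFF=$(date -d '7 days ago' +%F)

echo "ls -1 /path/to/backups" | sftp -q user@host |
grep -o 'backup-[0-9-]*\.tar' |
while read -r name; do
    d=${name#backup-}   # strip prefix -> 2015-01-31.tar
    d=${d%.tar}         # strip suffix -> 2015-01-31
    # ISO dates compare correctly as plain strings.
    if [[ $d < $CUTOFF ]]; then
        echo "rm /path/to/backups/$name" | sftp -q user@host
    fi
done
```

With many files it is cheaper to collect the rm commands into one file and run a single sftp session, which is exactly what the -b switch in the next answer does.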
The batch mode "-b" switch should help you out. It makes sftp read its commands from a file: http://linux.die.net/man/1/sftp
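A minimal batch-mode sketch; batch.txt and user@host are placeholders:

```bash
# batch.txt holds one sftp command per line, for example:
#   rm /path/to/backups/backup-2015-01-20.tar
#   rm /path/to/backups/backup-2015-01-21.tar
sftp -b batch.txt user@host
```

Note that batch mode is meant for non-interactive authentication (a key or agent), which is why the next answer reaches for lftp when only a password is available.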
None of the above answers worked for me, especially when you are limited to a password and cannot use a private key with the sftp utility.
I found a good script using lftp (gist).
You need to uncomment
# STORE_DAYS=6
to specify the retention count manually.
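What makes lftp fit here is that, unlike plain sftp, it accepts a password on the command line. A minimal sketch; the host, credentials, path, and file name below are all placeholders, and the gist's script derives which files to remove from STORE_DAYS instead of naming them by hand:

```bash
# lftp speaks SFTP, takes the password via -u (beware: it is then visible
# in the process list) and runs a scripted command list via -e.
lftp -u backupuser,secret \
     -e "cd /path/to/backups; rm backup-2015-01-20.tar; bye" \
     sftp://backup.example.com
```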