How do most system administrators manage the automatic cleanup of hundreds of thousands of old files spread across a wide range of folder locations?
These folders are located across the enterprise on many servers. I'd like to manage their automatic cleanup by describing each location along with the specific rules that govern it.
Such rules might include file age (based on creation, last-change, or last-modified dates), file size, or the naming conventions of folders or filenames.
Ideally, cleanup would be invoked by triggers without manual intervention, such as free disk space falling below an absolute or percentage threshold, or simply on a periodic schedule.
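To make the kind of rules I mean concrete, here is a minimal sketch in Python. It treats "age" as last-modified time and uses a glob pattern for filename matching; the function name, the 90-day cutoff, the `/var/tmp` path, and the 10%-free trigger are all hypothetical placeholders, not a recommendation.

```python
import fnmatch
import os
import shutil
import time

def find_stale_files(root, max_age_days, pattern="*"):
    """Return paths under root whose mtime is older than max_age_days
    and whose basename matches the given glob pattern."""
    cutoff = time.time() - max_age_days * 86400
    stale = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not fnmatch.fnmatch(name, pattern):
                continue
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                stale.append(path)
    return stale

if __name__ == "__main__":
    # Hypothetical trigger: act only when less than 10% of the disk is free.
    total, used, free = shutil.disk_usage("/var/tmp")
    if free / total < 0.10:
        # Dry run: report candidates instead of deleting them.
        for path in find_stale_files("/var/tmp", max_age_days=90, pattern="*.log"):
            print("would delete:", path)
```

A dry-run report like this, reviewed by a person before anything is deleted, is the safest way to exercise such rules.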
I don't. None of the criteria you mentioned (file age by creation, last-change, or last-modified date; file size; folder or filename naming conventions) are adequate for evaluating whether a file is "valuable." For example, a script keyed to date or last-modified time could delete Marketing's important promotional video but leave intact the iTunes library someone thought it would be clever to hide nearby. For similar reasons, you can't simply delete all MP3 files, because Marketing might be creating MP3s for legitimate promotional purposes.
The only way to judge whether a file is worth retaining is for a human being to make that determination, and in the case of user files I'm not the best judge. The user is.
Push that task back on the people who are the "experts" on those files: the people who created them.