The issue is that any modification to the directory locks up Explorer indefinitely, though Samba access to other directories still works. I've tried moving files locally and over Samba.
Even enumerating the directory to get the list of files locks up the computer indefinitely.
I tried using Python's win32file.FindFilesIterator to iterate the files, but that also hangs.
My idea was to move each file into one of several directories one level above the problem directory, chosen by its timestamp, so that we'd have at most a thousand or so files in each directory. But since I can't even enumerate the files, that's been a non-starter.
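For what it's worth, one way to attempt the bucketing without ever materializing the full listing is Python's os.scandir, which streams directory entries lazily. This is only a sketch under the assumption that streaming enumeration behaves better than a full listing here; the paths and the per-day bucketing scheme are illustrative:

```python
import os
import shutil
import time

def bucket_by_mtime(src_dir, dest_root):
    """Move each file in src_dir into a per-day folder under dest_root,
    named after its modification date. os.scandir yields entries one at
    a time, so the multi-million-entry listing is never held in memory."""
    moved = 0
    with os.scandir(src_dir) as entries:
        for entry in entries:
            if not entry.is_file(follow_symlinks=False):
                continue
            day = time.strftime("%Y-%m-%d",
                                time.localtime(entry.stat().st_mtime))
            dest_dir = os.path.join(dest_root, day)
            os.makedirs(dest_dir, exist_ok=True)
            shutil.move(entry.path, os.path.join(dest_dir, entry.name))
            moved += 1
    return moved
```

Moving entries out of a directory while iterating it is generally tolerated by the underlying FindNextFile/readdir semantics, but if it misbehaves, running the function repeatedly until it returns 0 is a safe fallback.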
If I have to give up and just nuke the directory I'm willing to do that, but a standard delete also hangs indefinitely.
I have already set these two parameters to speed things up, but they did not help:
R:\>fsutil behavior query disablelastaccess
disablelastaccess = 1
R:\>fsutil behavior query disable8dot3
disable8dot3 = 1
These are all sequentially named images, which would have run into the known 8.3 slowdown: generating short names for many similarly named files in one directory gets very expensive. From what I understand, those short names remain stored in the file system even after disable8dot3 is set, so they may still be contributing to the problem.
Any ideas?
I don't know if it will work, or whether it's a practical approach for you, but what about putting the hard disk into a Linux machine and trying it from Linux?
(There are bootable live CDs for download, so you don't even have to physically move the hard disk.)
Two things are working against you here.
What's happening is that Explorer is accessing each and every one of those files for the information that will, or could, be displayed in the Explorer interface. Additionally, if you have on-access antivirus scanning in place (and if not, why not?), each file will be scanned, even if only to determine whether it is one of the types that should be scanned. All of this takes a long time when dealing with so many files.
Short term solution - Use the command line or some form of scripting instead.
Long term solution - Develop a sensible scheme for ensuring you never have more than a few thousand files in any one folder.
Depending on how the files were added and how badly the folder is fragmented the quickest way might even be to move off the files you want to keep and nuke the partition!
Try using PowerShell's Move-Item cmdlet.
HTH
There's always cmd.exe and its del command. You will need to reboot for the FSUTIL changes to take effect, just in case you haven't.
The general order of complexity for NTFS operations (add, delete, search) with 8.3 names disabled is O(log N), so pretty much anything you do is going to take the same length of time once things have gotten into this state.
If you have any AV active on the system then turn it off while you fix this.
Out of curiosity, do you get better performance if you copy files out in chunks using something like Robocopy? I suspect it won't make much of a difference, but it should take about half the time a delete takes (there's only a search of the directory involved, not a search followed by a delete of the entry). You'll still have the problem of deleting the originals eventually, though.
Use robocopy with the /mov option:
http://technet.microsoft.com/en-us/library/cc733145.aspx
For example, I had a folder with 1.2 million email files in it, which would hang Explorer. The files were named starting with the date, so I created folders for each month of the year and then used robocopy, filtering on the date in the filenames.
robocopy e:\sent-old\ e:\sentmail\2013-4 201304*.* /mov
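To cover a whole year without typing twelve commands by hand, a short script could generate them. This is a sketch that mirrors the example paths above (robocopy's argument order is source, destination, file filter) and assumes the same YYYYMM filename prefix:

```python
import subprocess

def monthly_robocopy_cmds(src, dest_root, year):
    """Build one robocopy /mov invocation per month, filtering on the
    YYYYMM prefix in the filenames (folder names are not zero-padded,
    matching the example above)."""
    cmds = []
    for month in range(1, 13):
        dest = f"{dest_root}\\{year}-{month}"
        cmds.append(["robocopy", src, dest, f"{year}{month:02d}*.*", "/mov"])
    return cmds

# On the affected machine you would then run each command, e.g.:
# for cmd in monthly_robocopy_cmds(r"e:\sent-old", r"e:\sentmail", 2013):
#     subprocess.run(cmd)
```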