
NTFS: poor performance for folders with a large number of entries



I noticed this issue when calling NtfsFileSystem.GetFiles on a folder containing 900k+ files.

The culprit is the FilterEntries method, which uses List&lt;T&gt;.RemoveAt(index). Internally this calls Array.Copy to shift every element after the removed one. When many entries are removed (depending on the NtfsOptions in effect), the total work grows quadratically and the operation becomes very slow.
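To illustrate the cost difference, here is a minimal sketch in Java (not the DiscUtils code; the method names `filterByRemoveAt` and `filterInPlace` are hypothetical). Removing matching entries one index at a time shifts the tail of the list on every call, while a single-pass compaction touches each element once:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

public class FilterDemo {
    // Quadratic: each remove(int) shifts all later elements
    // (the equivalent of Array.Copy in .NET's List<T>.RemoveAt).
    static void filterByRemoveAt(List<Integer> list, Predicate<Integer> keep) {
        for (int i = list.size() - 1; i >= 0; i--) {
            if (!keep.test(list.get(i))) {
                list.remove(i); // O(n) shift per removal -> O(n^2) overall
            }
        }
    }

    // Linear: compact the surviving entries in place with a single pass.
    static void filterInPlace(List<Integer> list, Predicate<Integer> keep) {
        int write = 0;
        for (int read = 0; read < list.size(); read++) {
            Integer value = list.get(read);
            if (keep.test(value)) {
                list.set(write++, value);
            }
        }
        // Drop the leftover tail in one operation.
        list.subList(write, list.size()).clear();
    }

    public static void main(String[] args) {
        List<Integer> a = new ArrayList<>();
        for (int i = 0; i < 10; i++) a.add(i);
        List<Integer> b = new ArrayList<>(a);

        filterByRemoveAt(a, v -> v % 2 == 0);
        filterInPlace(b, v -> v % 2 == 0);

        System.out.println(a); // [0, 2, 4, 6, 8]
        System.out.println(b); // [0, 2, 4, 6, 8]
    }
}
```

Both produce the same result; only the in-place compaction stays linear as the entry count grows into the hundreds of thousands.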

I attached a patch to fix this.



dotnet4fun wrote Jan 13, 2015 at 6:26 PM

The attached patch only slightly improves the situation. I was under the impression it fixed the issue, but I was wrong. I'll continue to investigate.

dotnet4fun wrote Jan 16, 2015 at 3:18 PM

After spending some time investigating, I can conclude this is by design. There is not much that can be done to improve raw performance when a folder contains this many files. However, the user experience can be improved by giving the caller control over the iteration of the folder contents, instead of blocking until the iteration completes internally. Please consider the attached patch, which adds two new methods to IFileSystem:

IEnumerable&lt;string&gt; EnumerateDirectories(string path, string searchPattern, SearchOption searchOption)
IEnumerable&lt;string&gt; EnumerateFiles(string path, string searchPattern, SearchOption searchOption)

The patch implements them for the NTFS file system only.
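The proposed methods are C# and return IEnumerable&lt;string&gt;, but the lazy-vs-eager contrast they rely on can be sketched in Java (the names `getFiles` and `enumerateFiles` below are illustrative, not the DiscUtils API). With an eager method the caller waits for the whole folder scan; with a lazy iterator the caller receives entries as they are matched and can stop early:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;
import java.util.function.Predicate;

public class LazyEnumDemo {
    // Eager: the caller blocks until every entry has been collected.
    static List<String> getFiles(List<String> entries, Predicate<String> match) {
        List<String> result = new ArrayList<>();
        for (String e : entries) {
            if (match.test(e)) result.add(e);
        }
        return result;
    }

    // Lazy: entries are produced one at a time as the caller iterates,
    // mirroring the shape of the proposed EnumerateFiles method.
    static Iterable<String> enumerateFiles(List<String> entries, Predicate<String> match) {
        return () -> new Iterator<String>() {
            private final Iterator<String> source = entries.iterator();
            private String pending; // next matching entry, or null if not found yet

            @Override public boolean hasNext() {
                while (pending == null && source.hasNext()) {
                    String candidate = source.next();
                    if (match.test(candidate)) pending = candidate;
                }
                return pending != null;
            }

            @Override public String next() {
                if (!hasNext()) throw new NoSuchElementException();
                String result = pending;
                pending = null;
                return result;
            }
        };
    }

    public static void main(String[] args) {
        List<String> dir = List.of("a.txt", "b.log", "c.txt");
        // The caller can stop after the first match; the remaining
        // entries are never filtered, unlike the eager version.
        for (String f : enumerateFiles(dir, s -> s.endsWith(".txt"))) {
            System.out.println(f); // prints "a.txt"
            break;
        }
    }
}
```

The total work for a full traversal is the same, but the lazy form keeps the application responsive on 900k-entry folders: results stream in as they are found, and early termination costs nothing.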