avogelba opened this issue 10 years ago
Original comment by Oliver (Bitbucket: assarbad, GitHub: assarbad):
The problem with huge file systems is that the whole tree gets kept in memory. That's the issue. I've worked on a (non-public) prototype using SQLite3 as both an on-disk and in-memory database, which would solve two problems at once: this one, about file systems having too many entities to keep in memory at once, and storing scan results.
I haven't come up with the ideal DB schema so far, I guess. But if someone wants to give me a hand, I'm all for it.
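For anyone wanting to pitch in, here is one possible starting point. This is only a sketch of a plausible schema, not the prototype's actual design; the table and column names are made up:

```cpp
// Sketch only: one conceivable schema for spilling scan results to SQLite3.
// Names (entry, parent_id, ...) are illustrative, not from the prototype.
#include <sqlite3.h>

static const char* kSchema =
    "CREATE TABLE IF NOT EXISTS entry ("
    "  id        INTEGER PRIMARY KEY,"          // implicit rowid
    "  parent_id INTEGER REFERENCES entry(id)," // NULL for the root
    "  name      TEXT    NOT NULL,"
    "  size      INTEGER NOT NULL,"             // bytes
    "  is_dir    INTEGER NOT NULL,"             // 0 = file, 1 = directory
    "  mtime     INTEGER"                       // last-write time
    ");"
    "CREATE INDEX IF NOT EXISTS idx_entry_parent ON entry(parent_id);";

int OpenScanDb(const char* path, sqlite3** db) {
    // path = ":memory:" keeps everything in RAM; a file path spills to disk,
    // which is what lifts the "too many entities to keep in memory" limit.
    int rc = sqlite3_open(path, db);
    if (rc != SQLITE_OK) return rc;
    return sqlite3_exec(*db, kSchema, nullptr, nullptr, nullptr);
}
```

The same code path would then serve both use cases mentioned above: point it at a file to persist scan results, or at ":memory:" for the classic in-memory behavior.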
Original comment by Michael Oliver (Bitbucket: moliverhalon, GitHub: Unknown):
There is a great program called Clarity Now that will handle this. It is expensive, but you can get a free trial to demo it. It's super fast as well.
Original comment by Ben Small (Bitbucket: thesmall, GitHub: thesmall):
Thanks for the flurry of updates. I have actually tried to enumerate the data using the WinDirStat program, but did in fact encounter an out-of-memory exception (and excessive freezing, which prevented me from knowing how much progress had been made).
Glad to know the issue is being worked on ;) Looking forward to a proper 64-bit release.
I wish I knew how many files were on the SAN, but that's exactly the problem; the share is so big that we have no insight into it. The only metric I have is that it's 12TB of data deduped down from 26TB.
Original comment by Alexander Riccio (Bitbucket: alexander_riccio, GitHub: Unknown):
If I remember correctly, the pain-in-the-ass issue blocking 64-bit compilation was in the manifest, which needed a '*' instead of something like "x86".
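For context, the usual culprit is the common-controls dependentAssembly in the embedded manifest. This is a generic reconstruction, not copied from the WinDirStat repository:

```xml
<!-- Generic example, not the actual WinDirStat manifest. Hard-coding
     "x86" here breaks x64 builds; "*" matches any architecture. -->
<dependency>
  <dependentAssembly>
    <assemblyIdentity type="win32"
                      name="Microsoft.Windows.Common-Controls"
                      version="6.0.0.0"
                      processorArchitecture="*"
                      publicKeyToken="6595b64144ccf1df"
                      language="*" />
  </dependentAssembly>
</dependency>
```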
Original comment by Alexander Riccio (Bitbucket: alexander_riccio, GitHub: Unknown):
"The limits to windirstat actually have to do with the number of files being scanned rather than the size of the files. I was able to scan a 120TB volume but it crashed around 60+TB. "
Huh? It's not unsigned integer overflow (a ULONGLONG maxes out at 18,446,744,073,709,551,615). Is it just ((sizeof(CItem) * number of files) + the overhead of storing a billion pointers) > the 32-bit address space? That'd make sense.
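For a rough sense of scale, a back-of-the-envelope sketch; the 150-byte CItem size is a guess, not the real sizeof:

```cpp
// Illustrative arithmetic for the 32-bit address-space theory above.
#include <cstdio>

int main() {
    const unsigned long long userSpace = 2ULL << 30;          // ~2 GiB usable by a 32-bit process
    const unsigned long long perItem   = 150 + sizeof(void*); // guessed CItem size plus one tree pointer
    std::printf("rough ceiling: ~%llu million items\n",
                userSpace / perItem / 1000000ULL);            // prints ~13 million items
}
```

That would line up with crashing on file counts in the tens of millions, long before any 64-bit size counter overflows.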
Original comment by Alexander Riccio (Bitbucket: alexander_riccio, GitHub: Unknown):
Friendly open source competition at its best 😊
Ben: Yes, I'm like 97% done, but I've just been a bit slow about pushing out binaries. There was a (stupid) bug in the "latest" build: I didn't change an ASSERT(IsWindows8OrGreater) to an if(IsWindows8OrGreater) before calling some Windows 8-specific APIs immediately at startup, so it ran fine on my dev machine but crashed instantly pretty much everywhere else. I'm guessing you're running Windows 7/Vista?
Otherwise, I'm always improving error messages.
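For anyone curious, the bug boils down to this pattern (UseSomeWindows8Api is a hypothetical stand-in, not the actual call):

```cpp
// Minimal repro of the described bug, not the actual altWinDirStat code.
#include <windows.h>
#include <VersionHelpers.h>

void UseSomeWindows8Api() { /* hypothetical Windows 8-only call */ }

void StartupInit() {
    // Buggy: ASSERT compiles away in release builds, so the Windows 8-only
    // call still ran unconditionally on Windows 7/Vista and crashed at startup.
    //   ASSERT(IsWindows8OrGreater());
    //   UseSomeWindows8Api();

    // Fixed: a real runtime guard that survives into release builds.
    if (IsWindows8OrGreater()) {
        UseSomeWindows8Api();
    }
}
```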
Original comment by Michael Oliver (Bitbucket: moliverhalon, GitHub: Unknown):
Hey Ben. The limits to WinDirStat actually have to do with the number of files being scanned rather than the size of the files. I was able to scan a 120TB volume, but it crashed around the 60TB mark. I cannot remember how many files that was. You might try giving it a shot with the original WinDirStat if you have not already.
Original comment by Ben Small (Bitbucket: thesmall, GitHub: thesmall):
Hi Alexander, I followed the link to the altWinDirStat page, which has several different versions and builds of a 64-bit WinDirStat, but none of the downloads execute successfully. They mostly just crash, though some throw an error and then crash. I have not been able to run any of the builds, even against my own C: drive. I understand that these builds are prereleases, but given that none of them run, they feel more like early betas. I mean no offense, and I am more than happy to assist with bug reports and Event Logs, but I am really in a bind and badly need a program that can enumerate 12TB worth of data without crashing. Do you have any idea when altWinDirStat plans to release a version 1.0?
Thanks, Ben
edit: I'm on Windows 7 Enterprise SP1 x64, and am attempting to run the tool against both SANs and SSDs.
Original comment by Alexander Riccio (Bitbucket: alexander_riccio, GitHub: Unknown):
Try my 64 bit build, and lemme know what you get.
Original comment by Chris Quirk (Bitbucket: 9story, GitHub: Unknown):
I just ran into this issue after only 45TB on a 60TB volume (there are many, many folders compared to most filesystems). Same behavior observed, and similar memory usage, indicating exhaustion. Great tool otherwise.
Originally reported by: Michael Oliver (Bitbucket: moliverhalon, GitHub: Unknown)
A Microsoft Visual C++ Runtime Library error crashes WinDirStat when scanning large volumes. It looks to be memory-related.
Able to replicate it by scanning a large 120TB volume. It crashes out around 60TB, with RAM usage at 1.9GB (see attached png).