nblog opened this issue 1 year ago (status: Open)
When my ReFS volume is 6GB, the first time the Container Table is parsed in `setContainerTableEntries`, the entries are no longer stored in the Container Table structure itself; they have been moved into a new Node.
6GB Volume
5GB Volume
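A minimal sketch of the fallback this behavior seems to require. All names here (`parse_container_rows`, `ROW_IS_NODE_REF`, the flag value) are hypothetical illustrations, not the project's actual API: the idea is simply that once the table outgrows its root structure, a row may hold a reference to a child Node instead of the entry data, and the parser has to descend.

```python
# Assumed flag value, for illustration only.
ROW_IS_NODE_REF = 0x01


def parse_container_rows(rows, read_node):
    """Yield container entries, following node references when present.

    rows      -- iterable of (flags, payload) pairs (hypothetical layout)
    read_node -- callback that loads the rows of a referenced child node
    """
    for flags, payload in rows:
        if flags & ROW_IS_NODE_REF:
            # Payload points at a child node; recurse into its rows
            # instead of treating it as an entry.
            yield from parse_container_rows(read_node(payload), read_node)
        else:
            yield payload
```

With a layout like this, the 5GB and 6GB volumes would take different paths through the same loop, which would explain why only the larger one breaks a parser that assumes embedded entries.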
Hey! Thanks for raising the issue, and sorry for taking so long to reply.
Hmmm, interesting discovery. I've also tried with volumes that are 10GB and everything worked fine.
My primary suspicion is that the Node and Checkpoint classes are poorly written. For instance, the `convertToLCN()` function should not live there; it feels more like a general helper than a class-specific method.
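As a sketch of that refactor, the conversion could live as a free module-level function. The arithmetic below (container index times clusters per container, plus the in-container offset) is an assumption for illustration; the real mapping depends on the container size recorded in the volume, not on these hypothetical parameters:

```python
def clusters_per_container(container_size_bytes, cluster_size_bytes):
    """Number of clusters that fit in one container (assumed layout)."""
    return container_size_bytes // cluster_size_bytes


def convert_to_lcn(container_index, cluster_offset, clusters_per_cont):
    """Map a (container, in-container offset) pair to an absolute LCN.

    Sketch only: assumes containers are laid out contiguously, so the
    absolute logical cluster number is a simple linear combination.
    """
    return container_index * clusters_per_cont + cluster_offset
```

Keeping it standalone would let both Node and Checkpoint call it without duplicating the logic, and would make it trivial to unit-test in isolation.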
Also, for this part I remember I had to do some really odd stuff to get it to work, and I'd bet money it could also be the problem.
The thing is that there is a lot of code I have to rewrite, and I'd also like to implement tests, but sadly I have no idea how to host the "test material" on GitHub; by test material I mean an actual ReFS image.
I think the internal class should hold a reader; because the parameter is limited to raw bytes, there is no way to keep reading the rest of the data. So I can't begin to fix the problem above.
My analysis environment is described below: Windows 10 Enterprise LTSC 2021 x64, which ships with ReFS 3.4.
I don't think that the Windows version matters too much, although I will take a look and come back to you with more info.
@1Doomdie1
I'm sorry, I didn't read your message carefully. Our environments are different, and that's what is causing the problem!
Our allocation unit sizes are not the same; when I also select 64K, it works fine.
My output message:
Ah yes, I haven't implemented a stable version for volumes with a cluster size smaller than 64KB. I'll have to work on that. It totally slipped my mind, sorry.
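A tiny sketch of why a hardcoded 64K cluster size breaks on other volumes: any byte offset derived from a logical cluster number must use the cluster size actually recorded on the volume, not a constant. The function name here is hypothetical:

```python
def cluster_to_byte_offset(lcn, cluster_size):
    """Translate a logical cluster number into a byte offset on disk.

    cluster_size must come from the volume's own metadata; the same LCN
    lands at very different byte offsets on 64 KiB vs 4 KiB volumes.
    """
    return lcn * cluster_size
```

So a parser that always multiplies by 64 KiB reads the right bytes on a 64K volume and complete garbage on a 4K one, which matches the behavior reported above.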
When my volume is larger than 5GB, parsing raises a container table exception. Since I don't know much about the ReFS filesystem, I can only guess: is it because multiple Ministores are nested inside the container table?