This is probably caused by this test hardcoding the maximum number of leaf bytes:
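Roughly, the check has the shape of the following sketch (not the exact test code; it assumes a `Rope` value and uses the hardcoded 64-bit constant quoted later in this thread):

```rust
use ropey::Rope;

// Rough sketch of the failing check, not the exact test code: the test bakes
// in the 64-bit value of the internal leaf-size constant.
fn check_capacity_slack(rope: &Rope) {
    // Hardcoded 64-bit value; on 32-bit targets the internal constant works
    // out differently, so this bound is wrong there.
    let max_leaf_bytes = 1024 - 33;
    assert!((rope.capacity() - rope.len_bytes()) <= max_leaf_bytes);
}

fn main() {
    check_capacity_slack(&Rope::from_str("hello world"));
}
```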
However, the actual number is computed from type sizes which are architecture-dependent:
https://github.com/cessen/ropey/blob/387ea30b3520b7bb20c2e04c50d9e4025e81daaf/src/tree/mod.rs#L69
The problem is that we can't access the internal constant in the integration test here. I'm not sure what the best way to fix that would be. We could replicate the calculation in the test, but then the test would fail if we change the internal constant in the future (although that is already the case, so it may be OK).
We can either duplicate code or make the hardcoded variable pointer-size dependent, for example with `#[cfg(target_pointer_width = "64")]` / `#[cfg(target_pointer_width = "32")]` (sketched below). The second approach would have the benefit of keeping the tests as they are, without having to move internal consts around.
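A minimal sketch of that second approach; the 64-bit value is the one from this thread, while the 32-bit value is only a placeholder, since the real number depends on internal type sizes:

```rust
// cfg-gated expected values for the test constant. Only the 64-bit number
// (1024 - 33) comes from this thread; the 32-bit number is a placeholder.
#[cfg(target_pointer_width = "64")]
const MAX_LEAF_BYTES: usize = 1024 - 33;

#[cfg(target_pointer_width = "32")]
const MAX_LEAF_BYTES: usize = 1024 - 17; // placeholder, not the real 32-bit value

fn main() {
    println!("expected max leaf bytes on this target: {MAX_LEAF_BYTES}");
}
```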
Yeah, I thought about that too. I think that probably makes more sense for a test. I was worried about other architecture-dependent size differences, but this is probably not a concern in practice. This is not C, so aside from pointer size most types should have the same size everywhere (some crates could technically have `cfg(arch)`-based type definitions, but neither smallvec nor Arc do, AFAIK).
How was `max_leaf_bytes` (1024 - 33) calculated? I guess 1024 is the `TARGET_TOTAL_SIZE`, but what is 33?
33 is supposed to be the combination of alignment offset, Arc overhead, and smallvec overhead, as seen here:
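To make the dependence on pointer size concrete, here is an illustrative computation; the individual terms are stand-ins and do not reproduce ropey's exact 33, but each one scales with `usize`, which is why the total differs on 32-bit:

```rust
use std::mem::size_of;
use std::sync::atomic::AtomicUsize;

// Illustrative only, not ropey's actual constants: every overhead term is
// built from usize-sized fields, so subtracting them from 1024 gives a
// different leaf capacity on 32-bit targets than on 64-bit ones.
fn main() {
    let target_total_size: usize = 1024;
    let arc_counters = size_of::<AtomicUsize>() * 2; // strong + weak refcounts stored by Arc
    let smallvec_header = size_of::<usize>();        // rough stand-in for smallvec's bookkeeping
    let discriminant_and_padding = 1;                // enum tag / alignment slop, illustrative

    let max_leaf_bytes =
        target_total_size - arc_counters - smallvec_header - discriminant_and_padding;
    // 64-bit: 1024 - 16 - 8 - 1 = 999; 32-bit: 1024 - 8 - 4 - 1 = 1011.
    println!("max leaf bytes on this target: {max_leaf_bytes}");
}
```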
I put some effort into making the actual amount of memory taken up by each node be a power of two, which ends up being kind of obnoxious for various reasons, and has caused other issues like this before. In retrospect, I'm now suspecting it probably doesn't make much (if any) difference in practice. So at some point we should probably just remove all of that over-complicated logic, and replace it with something much simpler.
To resolve this immediate issue, however, I think it might make sense to actually expose the relevant constants publicly, but mark them to be hidden from documentation, along with adding comments warning that they aren't part of the public API. I've done similar things to expose other internals for testing. For example:
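Something along these lines; the name and value below are hypothetical, not the actual ropey code or the example referred to above:

```rust
/// NOT part of the public API. Exposed only so the integration tests can use
/// the same value as the library internals; it may change in any release.
#[doc(hidden)]
pub const MAX_LEAF_BYTES: usize = 1024 - 33; // hypothetical name and value

fn main() {
    // An integration test could then reference the crate's constant instead of
    // hardcoding a number (hypothetical path: `ropey::MAX_LEAF_BYTES`).
    assert!(MAX_LEAF_BYTES <= 1024);
}
```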
Anyway, I'll take care of this one. @blinxen Once all of these issues you're finding are properly fixed up and everything is working on your end, I'll cut a patch release. Thank you for all of your time on this!
Perfect, thanks for looking into this!
I think it might make sense to actually expose the relevant constants publicly, but mark them to be hidden from documentation, along with adding comments warning that they aren't part of the public API
:+1:
Let's leave this open until the fix is actually landed and confirmed to work.
I just pushed c185ec562eca1670bfdbc0b7bca7f23bbfa42c81, which should in theory fix this. @blinxen Let me know if it actually does on your end.
Works for me!! Thanks for the quick fix!!
Here are the relevant builds:
Awesome! Is everything working now, then? Or should I wait a bit to see if you run into anything else before cutting a release?
There is not much to wait for. I packaged ropey for Fedora because it is a dependency of helix. So when I said "When running tests on i686 architecture...", I meant "when building the i686 arch package for Fedora, the tests fail".
You can go ahead and make a release if you want :D.
Ah, okay. I was assuming you'd want to package an official release, though, rather than a random commit on master. Am I mistaken about that?
I was assuming you'd want to package an official release, though, rather than a random commit on master
That's what I did. I packaged version 1.6.0, but with this test skipped. It would be great if you could make a release so I don't have to skip the test, but it is not really urgent.
Just published Ropey 1.6.1!
When running tests on the i686 architecture (32-bit), the assertion `(rope.capacity() - rope.len_bytes()) <= max_leaf_bytes` fails. The test in question is https://github.com/cessen/ropey/blob/master/tests/proptest_tests.rs#L134. Here is the full log: https://kojipkgs.fedoraproject.org//work/tasks/7930/107477930/build.log
Does anyone (probably the maintainers) know what might cause this?