TomArrow opened this issue 1 month ago
Hmm, judging by the debug outputs, the calculated hash is identical to test_hashes[1], but it's being compared against test_hashes[0].
Looks like the failure has to do with processing of 8-bit characters. Maybe the compiler or VM has issues with handling of signed vs. unsigned char, in particular with sign/zero extension. Or maybe the hash prefix check ("2b" vs. other flavors) or the tests get miscompiled.
I suggest you test with your own input password containing an 8-bit character, and see whether you get the right hash or not.
One specific guess is that the cast on this line might be non-working:
tmp[0] |= (unsigned char)*ptr; /* correct */
You can try writing it differently, e.g.:
tmp[0] |= *(unsigned char *)ptr; /* correct */
or:
tmp[0] |= (BF_word)(unsigned char)*ptr; /* correct */
Seems like
tmp[0] |= *(unsigned char *)ptr; /* correct */
did the trick.
tmp[0] |= (BF_word)(unsigned char)*ptr; /* correct */
on the other hand didn't work, unless I made some mistake.
I'll do more tests tomorrow but thanks a lot already!
Happy to hear we found a workaround @TomArrow!
This sounds like an LCC bug. Is there an upstream for LCC where you could check whether the bug is still present, and report it if needed? You could also update LCC in, or report to, other projects embedding LCC, such as the SDK you're using.
Looks like LCC hasn't actually been updated in 10 years, but issues still seem to get responses, so I created an issue there. This is all very much out of my depth, but maybe your findings could help describe the problem, so I linked this issue as well.
I've tested some more strings today including ones with some values between ~161 and 255. Seems to be stable with your fix. Thanks again!
Actually now I've been told it may not be quite the same LCC version. But maybe it's still relevant, so I'll keep that issue open.
Thanks. I suggest you edit the LCC issue's title to "(unsigned char) cast from char ignored".
Done.
I understand this may be too obscure a question for you to take time to look into, but I am trying to compile this code as a Quake 3 VM, which uses an old LCC version.
I did a bit of debug output, which gives me this:
(the line numbers won't match, as I added the debug outputs)
The bcrypt settings I'm using are:
"$2b$06$85Jfn/5QaxpFhXqxyIufg4"
When I compile as a normal DLL with MSVC (the Quake 3 engine allows both DLL and Quake 3 VM as an option for modules), everything works fine. Note: The DLL in which it works fine is compiled with MSVC in Debug mode, 64 bit, v142 compiler, BF_ASM and BF_SCALE both 0.
Now the curious (or maybe not?) thing is that the hashing/derivation of the "test2" password actually seems to produce the correct result, and the hash is identical to what I get with the DLL. It's just the self-test that fails and subsequently prevents the correct result from being returned, and I'm not sure why. I read the comment mentioning some obscure GCC compiler bug, but I'm not sure it would apply here.
This is the source code for the LCC compiler used, from my understanding, if it helps: https://github.com/TomArrow/mvsdk/blob/everything/tools/lcc/lcc.c
This is the code that I use to call everything:
I am including crypt_blowfish.h. I don't really need the gensalt functionality in my use case (I just use one salt for the entire program), so I only included crypt_blowfish.h and crypt_blowfish.c. I made sure that BF_ASM and BF_SCALE are both 0.
Not sure if it matters, but I know that this QVM system is limited to 32-bit integers (and pointers), and it doesn't deal well with signed shorts, though the latter just produces a compiler error, so I don't think that's related.
Maybe I just forgot to initialize some global variables or something, no idea. Any help or pointers would be appreciated if you can spare the time.