moria / crypto-js

Automatically exported from code.google.com/p/crypto-js

Hashing "binary" strings fails with some specific input values #98

Closed by GoogleCodeExporter 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?

Compare the computation of the following HMAC-SHA1 values. In each case we 
compute the expected result with openssl, then compare it to the value 
returned by CryptoJS running under Firefox 23:

$ echo -ne "\x01\x01\x01\x01" | openssl sha1 -hmac $(echo -en 
"\x01\x01\x01\x01")
(stdin)= 18a34d69bdecd1ae5ee2b961730383fe47e4b6c4

> CryptoJS.HmacSHA1( "\x01\x01\x01\x01", "\x01\x01\x01\x01" ).toString()
"18a34d69bdecd1ae5ee2b961730383fe47e4b6c4"

=> OK

$ echo -ne "\x02\x01\x01\x01" | openssl sha1 -hmac $(echo -en 
"\x01\x01\x01\x01")
(stdin)= e9a1f1e8f3d47eb24f9b93c656ba3aaa8386d47b

> CryptoJS.HmacSHA1( "\x01\x01\x01\x01", "\x01\x01\x01\x01" ).toString()
"e9a1f1e8f3d47eb24f9b93c656ba3aaa8386d47b"

=> OK

$ echo -ne "\x04\x01\x01\x01" | openssl sha1 -hmac $(echo -en 
"\x01\x01\x01\x01")
(stdin)= 9c9ebc2102936f6934634777db0f686f54fed0b0

> CryptoJS.HmacSHA1( "\x04\x01\x01\x01", "\x01\x01\x01\x01" ).toString()
"9c9ebc2102936f6934634777db0f686f54fed0b0"

=> OK

[..]

$ echo -ne "\x40\x01\x01\x01" | openssl sha1 -hmac $(echo -en 
"\x01\x01\x01\x01")
(stdin)= 2ca00497b0cd85b5a96a8d828dab402b462d8417

> CryptoJS.HmacSHA1( "\x04\x01\x01\x01", "\x01\x01\x01\x01" ).toString()
"2ca00497b0cd85b5a96a8d828dab402b462d8417"

=> OK

$ echo -ne "\x80\x01\x01\x01" | openssl sha1 -hmac $(echo -en 
"\x01\x01\x01\x01")
(stdin)= 9fc8f16415708535bff5a74f0ae44e95c57be286

> CryptoJS.HmacSHA1( "\x04\x01\x01\x01", "\x01\x01\x01\x01" ).toString()
"70f8cb0773819f13c442e64efa01583e7d5e611d"

=> FAILURE!

What is the expected output? What do you see instead?

Expected result: the same hash as openssl for all input values.

Through more extensive testing, it appears that as soon as the most significant 
bit is set in any byte of the input payload, the hashed value is incorrect. 
This feels like a signed vs. unsigned integer issue (see the sketch after this 
report).

What version of the product are you using? On what operating system?
CryptoJS 3.1
Firefox 23

Please provide any additional information below.

Original issue reported on code.google.com by Florian....@gmail.com on 27 Sep 2013 at 3:19
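
One quick way to see what actually gets hashed (a minimal sketch, assuming the 
same CryptoJS 3.1 console session as in the report) is to print the word array 
produced by CryptoJS's default UTF-8 conversion of each string:

> CryptoJS.enc.Utf8.parse( "\x04\x01\x01\x01" ).toString()
"04010101"

> CryptoJS.enc.Utf8.parse( "\x80\x01\x01\x01" ).toString()
"c280010101"

With the high bit set, the converted payload is five bytes instead of the 
intended four.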

GoogleCodeExporter commented 9 years ago
There were some minor errors in my copy/paste of the example commands above: 
for each example, the payload in the openssl command line should obviously 
match the payload used in CryptoJS.
Original comment by Florian....@gmail.com on 27 Sep 2013 at 3:25

GoogleCodeExporter commented 9 years ago
The issue is that JavaScript strings are *not* binary data. They are UTF-16 
characters. You need to decide how you want to convert those characters to 
bytes. For example:

var binaryDataRepresentedAsWordArrayObject = CryptoJS.enc.Latin1.parse("\x80\x01\x01\x01");

If you haven't converted the string to bytes, and you just pass in the string 
itself, then CryptoJS will use UTF-8 by default. That's why you get different 
results when the most significant bit is set, because the character \x80 is 
represented by 2 bytes in UTF-8.

Original comment by Jeff.Mott.OR on 27 Sep 2013 at 5:47
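
Putting the two pieces together, a minimal sketch of the failing example with 
an explicit Latin1 conversion (assuming the same CryptoJS 3.1 build as above; 
the digest shown is the openssl value expected in the report, not an 
independently re-verified result):

> CryptoJS.HmacSHA1( CryptoJS.enc.Latin1.parse("\x80\x01\x01\x01"), CryptoJS.enc.Latin1.parse("\x01\x01\x01\x01") ).toString()
"9fc8f16415708535bff5a74f0ae44e95c57be286"

The key here contains only low bytes, so its default UTF-8 conversion happens 
to coincide with Latin1, but parsing it explicitly as well avoids relying on 
that.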

GoogleCodeExporter commented 9 years ago
Hi Jeff,

Thank you for your quick reply.

For what it's worth, I think the front-page documentation is a bit lacking on 
the subject of input parsing.

Since all the examples use plain strings, I think this kind of subtle issue 
deserves at least a heads-up in the "Cipher Input" section.

Thank you again,

Florian

Original comment by Florian....@gmail.com on 27 Sep 2013 at 6:04

GoogleCodeExporter commented 9 years ago
For now it's described in the hasher input section 
(https://code.google.com/p/crypto-js/#The_Hasher_Input), though perhaps I'll 
try to expound more on the topic.

Original comment by Jeff.Mott.OR on 27 Sep 2013 at 6:08