JJTech0130 closed this issue 1 year ago.
I tried printing out what you were adding to the hash (with the raw bytes function), and I got this hexdump:
I'm not sure what all the special values that aren't printable characters are? This left me more confused than before.
Ah, I see, it's the lengths. It's starting to become a little clearer...
By key sorting, you're referring to the attribute keys?
attrKeys = list(node.keys())  # copy the keys so they can be sorted in place
# Attributes need to be sorted
attrKeys.sort()
# TODO Implement UTF-8 bytewise sorting:
# "Attributes are sorted first by their namespaces and
# then by their names; sorting is done bytewise on UTF-8
# representations."
for attribute in attrKeys:
    ...
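If the default sort isn't already bytewise, the TODO could be addressed along these lines. This is only a sketch: the (namespace, name) tuple shape is my assumption, not the plugin's actual data model.

```python
# Hypothetical attribute sort: order (namespace, name) pairs bytewise on
# their UTF-8 encodings, per the spec text quoted in the TODO above.
attrs = [("", "zeta"), ("http://ns.adobe.com/adept", "alpha"), ("", "beta")]
attrs_sorted = sorted(
    attrs,
    key=lambda a: (a[0].encode("utf-8"), a[1].encode("utf-8")),
)
# Attributes with no namespace sort before any namespaced attribute.
```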
I don't have any code to sort the actual attributes in the XML itself, so I don't think that's necessary. Adobe probably sorts the XML themselves before comparing it to the hashed and signed value.
The bytes that aren't text are either the string length, or the element type:
ASN_NONE = 0
ASN_NS_TAG = 1 # aka "BEGIN_ELEMENT"
ASN_CHILD = 2 # aka "END_ATTRIBUTES"
ASN_END_TAG = 3 # aka "END_ELEMENT"
ASN_TEXT = 4 # aka "TEXT_NODE"
ASN_ATTRIBUTE = 5 # aka "ATTRIBUTE"
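Putting the two observations together (type bytes plus string lengths), the serialization that gets hashed can be sketched like this. The 2-byte big-endian length prefix and the exact byte layout here are guesses inferred from this thread, not a confirmed spec:

```python
import hashlib

ASN_NS_TAG = 1    # "BEGIN_ELEMENT"
ASN_CHILD = 2     # "END_ATTRIBUTES"
ASN_END_TAG = 3   # "END_ELEMENT"
ASN_TEXT = 4      # "TEXT_NODE"

def asn_string(s):
    # Guessed framing: 2-byte big-endian length, then the UTF-8 bytes.
    data = s.encode("utf-8")
    return len(data).to_bytes(2, "big") + data

# What a single <adept:nonce>EZLPlo7XpgPCqERP</adept:nonce> element might
# contribute to the running SHA-1:
buf = bytes([ASN_NS_TAG]) + asn_string("http://ns.adobe.com/adept") + asn_string("nonce")
buf += bytes([ASN_CHILD])
buf += bytes([ASN_TEXT]) + asn_string("EZLPlo7XpgPCqERP")
buf += bytes([ASN_END_TAG])
digest = hashlib.sha1(buf).digest()
```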
I think I understand it now, but I don’t get how the elements are sorted. They must be sorted, right? Otherwise the hashes would not match.
Specifically, I'm confused on how child elements are sorted.
In your code, it appears that they're sorted like this: fingerprint, deviceType, clientOS, ... targetDevice? Which is basically not sorted at all. So how do you determine the order they should be in?
I implemented basic alphabetical sorting, but it doesn't seem to match the output of your plugin:
(output of strings, passed through grep -v ns.adobe.com, then hand-formatted)
activate(requestType: Initial):
    clientLocale(): en
    clientOS(): Windows 8
    clientVersion(): 2.0.1.78765
    deviceType(): standalone
    expiration(): 2022-07-12T13:42:45Z
    fingerprint(): iJYuMUeN9R8vr2wJT762Wo+ayxo=
    nonce(): EZLPlo7XpgPCqERP
    targetDevice():
        clientLocale(): en
        clientOS(): Windows 8
        clientVersion(): 2.0.1.78765
        deviceType(): standalone
        fingerprint(): iJYuMUeN9R8vr2wJT762Wo+ayxo=
        productName(): ADOBE Digitial Editions
        softwareVersion(): 9.3.58046
    user(): -urn:uuid:9ff48d98-40d5-46e3-a50c-ebe57a5aa8c7
Here is the implementation I have so far: https://github.com/JJTech0130/kodobe/blob/master/adobe/util/asn1.lua
I'm thinking now that they're just in the order that you added them to the XML, and that the hash just has to match the XML? I'll have to do some minor rewriting/wrapping, as the XML library doesn't preserve the order I added them in, so I'll have to fix that.
The comment about things being sorted only applies to attributes, not to tags / elements. These indeed just have to be hashed in the same order they appear in the XML.
If they just have to be the same, but not any particular order, I might be able to get away with just slightly modifying the XML builder to output the elements alphabetically. The way the tables are structured now, all order metadata is lost and they are simply in a random order, so I can't just use the order I added them in.
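That workaround might look something like this. It's only a sketch: `fields` and the string templating are stand-ins for whatever the real builder uses, not actual names from the project.

```python
# Emit child elements in alphabetical order while building the XML, so the
# hashed order and the document order agree even though the underlying
# table has lost insertion order. `fields` is a hypothetical name.
fields = {"nonce": "EZLPlo7XpgPCqERP", "clientOS": "Windows 8", "clientLocale": "en"}
children = "".join(
    f"<adept:{name}>{value}</adept:{name}>"
    for name, value in sorted(fields.items())
)
```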
So, I think I got it all working, but when I send it to Adobe I get:
<error xmlns="http://ns.adobe.com/adept" data="E_AUTH_USER_AUTH http://adeactivate.adobe.com/adept/Activate urn:uuid:80194bf2-87a1-4098-9df2-a408b43e46d4"/>
Do you know what causes this error?
Nope. Didn't even know that error existed and it's not in my error list.
If you think your signing code might be wrong, maybe take the XML you've sent and run it through my Python code and see if that generates the same hash.
Well, I just checked, and they do generate the same hash! I guess it must be my pkcs12 signing code that's buggy? That or the way the request is constructed...
Here's the XML, just in case you can spot any glaring errors:
<?xml version="1.0"?>
<adept:activate requestType="initial" xmlns:adept="http://ns.adobe.com/adept">
<adept:clientLocale>en</adept:clientLocale>
<adept:clientOS>Windows 8</adept:clientOS>
<adept:clientVersion>2.0.1.78765</adept:clientVersion>
<adept:deviceType>standalone</adept:deviceType>
<adept:expiration>2022-07-12T17:53:36Z</adept:expiration>
<adept:fingerprint>Agx6h6Y2cWEwI2RkAJe47ZvQu2g=</adept:fingerprint>
<adept:nonce>o6Csc7ZmOyn19saV</adept:nonce>
<adept:signature>AK61NX2z4U0Si9wpCrIEy7CzVYnWnnNe5Wk4JkqEG/QGuDespS2yXQ+LZrBHN50Cd7T8MK0jx9xCxFiurXkHvKOkp0RcnAieUJngneeygpZP0snv8OswstgkkhcPzZ2vFmzQ+0Dwu5McFM8CeHHqVZ8ZzgwjYJxa7sXcC7B6DBFY</adept:signature>
<adept:targetDevice>
<adept:clientLocale>en</adept:clientLocale>
<adept:clientOS>Windows 8</adept:clientOS>
<adept:clientVersion>2.0.1.78765</adept:clientVersion>
<adept:deviceType>standalone</adept:deviceType>
<adept:fingerprint>Agx6h6Y2cWEwI2RkAJe47ZvQu2g=</adept:fingerprint>
<adept:productName>ADOBE Digitial Editions</adept:productName>
<adept:softwareVersion>9.3.58046</adept:softwareVersion>
</adept:targetDevice>
<adept:user>urn:uuid:ec034af5-f793-408b-b5cc-11207a149557</adept:user>
</adept:activate>
I just checked my logs from when I started developing this plugin, and I did indeed run into E_AUTH_USER_AUTH too, after I finished implementing the hashing code.
In my case this issue was caused by buggy signing code - this really needs to be a raw signature of just the hash, without any additional hashing or algorithm identifiers. Usual signing code (I tried OpenSSL's RSA_private_encrypt and Python's PKCS1_v1_5.sign()) doesn't do that, which means I had to use a special, low-level RSA Python library and later rewrote the complete native RSA algorithm, which you can see in the customRSA.py file.
For your testing, you can see a payload and its correct signature inside the function test_sign_node_new() in tests/main.py. When encrypting that payload (34 52 e3 ...) with the mock key that's included there ("MIICdAI..."), your code needs to generate this exact signature ("RO/JmWr..."). If it doesn't, then you're probably not performing raw RSA.
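The "raw RSA" being described here is just modular exponentiation with the private exponent - no DigestInfo, no extra hashing. A sketch with tiny textbook numbers (not the mock key from tests/main.py, which is far too large to inline here):

```python
def raw_rsa_sign(padded: bytes, d: int, n: int) -> bytes:
    # Interpret the (already padded) digest as an integer and apply the
    # private exponent directly: s = m^d mod n.
    m = int.from_bytes(padded, "big")
    s = pow(m, d, n)
    return s.to_bytes((n.bit_length() + 7) // 8, "big")

# Toy key (p=61, q=53): n=3233, e=17, d=2753.
sig = raw_rsa_sign(b"\x02", 2753, 3233)
# Verifying with the public exponent recovers the original message.
recovered = pow(int.from_bytes(sig, "big"), 17, 3233)
```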
Is your test key a PEM encoded private key? Or is it PKCS12? Does it have a password? I'm getting parsing errors trying to use it.
Also, unrelated, but I noticed you're emulating Windows 8 and ADE 2.0.1, which I believe is not something a legit ADE will ever send. As far as I know, ADE 2.0.X and lower only register as up to Windows Vista, only ADE 3+ use Windows 8 as OS.
Ah, nvm, I got it to work. It's just a raw key, was going through the extra PKCS12 step by accident lol
Looks like the signing is the issue. I'm getting PiW/monqj0mzTvyRbGORQ4UaK58WdusYTRbQxOb9V4yMpOD4JEbGvEcmGrxavi7NP7RtJg2q8IJrF13B5imWChNPDk2ozkDeWZrA4Vr9myHkbHyGZIclHEkNlmWF4ImnJwKthaxYLoQa6idTxhpOUE0rASm6H1cxIUbKkbft/3c= as the sig...
Yeah, I took a look at your code and you're using OpenSSL's evp_pkey:sign() for the actual signature. That's going to be the exact same issue I had with OpenSSL - if the signature function you're using asks you for an algorithm (like "sha1" in your case), then it's going to be the wrong encryption / signing code. Raw RSA doesn't need to know the signing algorithm.
You might need to do what I did and basically translate customRSA.py into Lua.
I'm trying to use sign_raw now, but I'm getting an error:
pkey:asymmetric_routine EVP_PKEY_sign: rsa/rsa_none.c:23:error:0406B07A:rsa routines:RSA_padding_add_none:data too small for key size
Here's my code
-- calculate SHA1 hash
local sha1 = digest.new("SHA1")
sha1:update(data)
local hash = sha1:final()
print("HASH: " .. util.base64.encode(hash))
-- sign hash with no padding
local sig, err = key:sign_raw(hash, pkey.PADDINGS.RSA_NO_PADDING)
-- catch errors & return
if err ~= nil then error(err) end
return util.base64.encode(sig)
Hm, looking at some docs, sign_raw might actually do what you need. Though the message (= the hash) will need to be padded for encryption, as RSA only works on messages that are the same length as the key (that's why you're getting the error).
I don't know what other padding modes your library supports, but the one you need is implemented in pad_message in customRSA.py. Maybe you can just try all the available ones. Or, if necessary, pre-pad the message yourself using my code. Unfortunately, I have no idea if the padding algo in my code has a particular name, and I can't find definitions for all the different algorithms.
Well, I tried some of the methods in the list it said it supported, but apparently the version of OpenSSL I'm using doesn't support them? Because it's returning errors for some of them. I tried all the ones that worked, and none returned the correct sig...
According to this dude at Stackexchange it should be PKCS1 v1.5 signature padding. At least the graphic looks identical to the byte representation of my padding. Which is interesting, because that padding didn't work for me when I tried using it. If that doesn't work for whatever reason, you'll have to try to implement it yourself. Shouldn't be too hard, it's just like 5 lines of Python code.
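For reference, PKCS#1 v1.5 signature (type 1) padding really is only a few lines. A sketch below borrows the pad_message name from customRSA.py, but the implementation here is my own reconstruction of the standard layout, not copied from that file:

```python
def pad_message(message: bytes, key_size_bytes: int) -> bytes:
    # PKCS#1 v1.5 type-1 padding: 00 01 FF..FF 00 <message>,
    # with enough FF bytes to fill out to the key size.
    pad_len = key_size_bytes - len(message) - 3
    if pad_len < 8:
        raise ValueError("message too long for key size")
    return b"\x00\x01" + b"\xff" * pad_len + b"\x00" + message

# e.g. a 20-byte SHA-1 digest padded for a 1024-bit (128-byte) key:
padded = pad_message(bytes(20), 128)
```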
Yeah, perhaps I'll implement the padding myself. It has PKCS1 padding, but it's not coming out the same...
Well, I implemented it, and now I'm getting ZJ4G6KdxOcPq8hJrQ2QAP8sAIAwijC+XqfRkzPfb3nGvx/AnODScUUSdtNECS34OrDVihtTMfw2qVl0iWOxnpT1bZDD1BB04i81K0qJvf/EQbUlBaRmMGW//EnFWyIQYRdKACcLWC9UGddCL28r4UtDXZwkd9a6CpcDnWVn5rX4= ...
I pushed my code in case you want to see it...
Here's the hexdump of the padding:
00000000 00 01 ff ff ff ff ff ff ff ff ff ff ff ff ff ff |..??????????????|
00000010 ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff ff |????????????????|
*
00000060 ff ff ff ff ff ff ff ff ff ff ff 40 6c 4c 22 72 |???????????@lL"r|
00000070 5b d7 2f 80 3b bb 21 92 27 28 d8 cb d5 60 9d 00 |[?/.;?!.'(???`..|
00000080
Okay, I see two mistakes in that padding: A) You implemented "00 01 PADDING DATA 00", but it needs to be "00 01 PADDING 00 DATA". B) Why is the data 40 6c 4c ... and so on? Assuming you're using the data from my test code it should be 34 52 e3 ... and so on.
1) Just noticed and fixed that, thanks! 2) 'Cuz it's the SHA1 hash? Was it not supposed to be? 🤦♂️
Ok, so the regular OpenSSL PKCS1 padding does work then! I was just SHA1 hashing it first, 'cuz I didn't realize I wasn't supposed to lol
No, the payload_bytes array in the test code already is the SHA1 hash of the XML. See test_hash_node(); that's the test code that takes an XML as input and returns that 34 52 e3 ... SHA1 hash.
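The distinction being drawn here - hash once, then pad and sign - can be sketched with placeholder data (not the real ASN serialization):

```python
import hashlib

asn_blob = b"placeholder ASN serialization of the XML"
digest = hashlib.sha1(asn_blob).digest()  # 20 bytes -- this IS the payload to sign

# Right:            pad_and_sign(digest)
# Wrong (the bug):  pad_and_sign(sha1(digest)) -- hashing a second time
double_hashed = hashlib.sha1(digest).digest()
```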
So, are you now getting the correct signature with my test key and test payload, or is there still an issue?
Do you still get the E_AUTH_USER_AUTH error with the new code?
I get the correct sig with the test key + payload, but same error
Interesting, so there must be something else that's wrong ...
OK, so I took a break for a bit, and tried checking the hashes again, and they were different?! But then I had them output the ASN data and it was the same... so my SHA1 hashing function is broken?!
Uhhhhh... I may have made a bit of a mistake: I was hashing the wrong variable lol
IT WORKS!!! I have successfully activated it with Adobe!
Forgot to close the issue. It works now! I'll open another issue if I run into more problems later...
Sorry to bother you again, but I'm a little confused. In order to verify that the activation request came from the user you claim to be, you sign the SHA1 of the request with your auth key, correct? The only thing I don't get is the key sorting: do you need to sort the XML you send to Adobe, or only sort before you sign it (and still send the unordered version to Adobe, who re-sorts it on their end)? The reason I ask is that when I'm signing the keys, I could sort them pretty easily, as they're parsed and I can move them around. The problem is that the XML builder I'm using doesn't support sorting, so to sort them in XML form I would have to modify it significantly.