Open iamtxena opened 3 years ago
I hit a similar issue reading PKCS#12 files.
Bag values (the Subject and Issuer CommonName, for example) are decoded as latin1 and are easily re-encodable as UTF-8.
However, the cert's friendlyName is already UTF-8 encoded but gets decoded incorrectly, e.g. "Èeská Republika" instead of "Česká republika".
I had the same problem trying to make PKCS#7 encryption.
My solution was, after loading the certificate, to loop through the issuer fields, read each value as raw bytes and write it back as UTF-8, like this:
certificate.issuer.attributes.forEach(attr => {
  // forge stored the raw bytes one per character; reinterpret them as UTF-8.
  // (In Node, encoding a string with 'ascii' is equivalent to 'latin1',
  // so the simpler form below behaves the same.)
  attr.value = Buffer.from(attr.value, 'latin1').toString('utf8');
});
The problem with the certificateFromPem() function is that it reads the base64 body with the obsolete forge.util.decode64, which yields a byte-per-character binary string and never re-decodes it as UTF-8; see:
var msg = {
  type: type,
  procType: null,
  contentDomain: null,
  dekInfo: null,
  headers: [],
  body: forge.util.decode64(match[3])
};
I hope it helps.
Hello, I am trying to generate a CMS signature. My code works when using an X.509 certificate with no special characters; however, with one that contains accents in the issuer attributes, verification fails because the issuer name does not match.
This is because the resulting issuer text is not properly encoded.
I attach the failing certificate (certificate.txt; just rename it to .pem).
When reading this certificate with openssl, you can see the CN contains the two UTF-8 bytes \C3\A0, which represent the à character.
When reading the PEM certificate with this library ("node-forge": "^0.10.0") and printing the issuer, I get it wrong, whereas node crypto (v15.10) prints the expected, correctly encoded string.
To pass validation, I have to use the issuer parsed by node crypto.
Is there a way to force this library to decode the given string correctly as UTF-8?