Closed: TheCardinalSystem closed this 3 months ago

I am trying to decode a JWT so I can verify it, but I keep getting a verification exception with the message "Invalid input: not within alphabet". Here is how to reproduce:
I verified the token's integrity here.
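(The reproduction snippet itself is not shown above; for readers, the decode-and-verify pattern being discussed looks roughly like the sketch below. The token, issuer, and HS256 secret are placeholders, not values from this report.)

```cpp
#include <jwt-cpp/jwt.h>
#include <iostream>

int main() {
    // Placeholder token; in the failing case it arrives over the network instead.
    const std::string token = jwt::create()
                                  .set_issuer("example")
                                  .sign(jwt::algorithm::hs256{"secret"});

    try {
        auto decoded = jwt::decode(token);
        jwt::verify()
            .allow_algorithm(jwt::algorithm::hs256{"secret"})
            .with_issuer("example")
            .verify(decoded);
        std::cout << "token verified\n";
    } catch (const std::exception& e) {
        // A malformed token string surfaces here, e.g. "Invalid input: not within alphabet".
        std::cerr << "failed: " << e.what() << '\n';
    }
}
```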
Invalid input: not within alphabet
This is an error thrown by the base64 decoding. Make sure you are passing a valid token.
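To illustrate the failure mode (a rough sketch, not taken from this issue): jwt::decode splits the token on the dots and base64url-decodes each segment, so any byte outside that alphabet raises this error.

```cpp
#include <jwt-cpp/jwt.h>
#include <iostream>

int main() {
    // Placeholder token created locally just for the demonstration.
    const std::string good = jwt::create().sign(jwt::algorithm::hs256{"secret"});

    // A trailing newline (or NUL, or space) is not part of the base64url
    // alphabet, so decoding the signature segment fails.
    const std::string bad = good + "\n";

    try {
        auto decoded = jwt::decode(bad);
        (void)decoded;
    } catch (const std::exception& e) {
        // Expected to report something like "Invalid input: not within alphabet".
        std::cerr << e.what() << '\n';
    }
}
```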
I was not able to reproduce the problem with the code example you provided. You'll need to provide more information 🤔 You might want to double-check how you are receiving and extracting the token.
https://github.com/prince-chrismc/jwt-cpp/actions/runs/8731222077/job/23956285054?pr=40#step:9:135
Interesting. It works with a string literal, but the same token fails when it's provided via IP packet. It looks like there's nothing wrong with your code, so I will investigate my own code further. Sorry for the pointless issue 😛
but the same token fails when it's provided via IP packet.
Make sure you don't have extra bytes at the front or back of the string. The base64 decoding doesn't run strlen; it relies on the length of the string you give it. If you have a null byte at the end, it will give that error, but the byte won't show up in e.g. a terminal.
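A quick way to check for that (a sketch; the buffer and length names are made up for the example) is to compare the std::string's size with what strlen would see and to print the last byte numerically:

```cpp
#include <cstring>
#include <iostream>
#include <string>

// Hypothetical: 'buf' is the raw receive buffer, 'len' is what the socket read returned.
void inspect_token(const char* buf, std::size_t len) {
    // Constructing with an explicit length copies every byte, including a
    // trailing '\0' or '\n' that a terminal would not display.
    std::string token(buf, len);

    std::cout << "size()=" << token.size()
              << " strlen()=" << std::strlen(token.c_str()) << '\n';

    if (!token.empty()) {
        // 0 (NUL) or 10 ('\n') here usually explains the "not within alphabet" error.
        std::cout << "last byte = "
                  << static_cast<int>(static_cast<unsigned char>(token.back())) << '\n';
    }
}
```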
It was an issue converting from a C string to an std::string. Thanks for the tip!
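For anyone hitting the same thing, the usual shape of the fix (names are illustrative, not from this thread) is to build the std::string from the bytes actually received and trim any stray terminator or line ending:

```cpp
#include <string>

// Hypothetical: 'buf' is the raw receive buffer, 'bytes_received' is what recv() returned.
std::string extract_token(const char* buf, std::size_t bytes_received) {
    // Copy only the bytes that were actually received, not the whole buffer.
    std::string token(buf, bytes_received);

    // Drop any terminator or newline that rode along with the payload.
    while (!token.empty() &&
           (token.back() == '\0' || token.back() == '\r' || token.back() == '\n'))
        token.pop_back();

    return token;
}
```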
Glad we could help!