infinisil opened this issue 3 years ago (status: Open)
For posterity, here's the patch:
diff --git a/src/Data/HexString.hs b/src/Data/HexString.hs
index 8b1c276..3caebcf 100644
--- a/src/Data/HexString.hs
+++ b/src/Data/HexString.hs
@@ -12,7 +12,7 @@ import Data.Aeson
import Data.Word (Word8)
import qualified Data.ByteString as BS
-import qualified Data.ByteString.Base16 as BS16 (decode, encode)
+import qualified Data.ByteString.Base16 as BS16 (decodeLenient, encode)
import qualified Data.ByteString.Lazy as BSL
import qualified Data.Text as T
@@ -52,7 +52,7 @@ fromBinary = hexString . BS16.encode . BSL.toStrict . B.encode
-- | Converts a 'HexString' to a 'B.Binary' value
toBinary :: B.Binary a => HexString -> a
-toBinary (HexString bs) = B.decode . BSL.fromStrict . fst . BS16.decode $ bs
+toBinary (HexString bs) = B.decode . BSL.fromStrict . BS16.decodeLenient $ bs
-- | Reads a 'BS.ByteString' as raw bytes and converts to hex representation. We
-- cannot use the instance Binary of 'BS.ByteString' because it provides
@@ -62,7 +62,7 @@ fromBytes = hexString . BS16.encode
-- | Access to the raw bytes in a 'BS.ByteString' format.
toBytes :: HexString -> BS.ByteString
-toBytes (HexString bs) = (fst . BS16.decode) bs
+toBytes (HexString bs) = BS16.decodeLenient bs
-- | Access to a 'T.Text' representation of the 'HexString'
toText :: HexString -> T.Text
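The behavioural difference the patch relies on can be sketched as follows (a sketch assuming base16-bytestring >= 1.0; the literals are illustrative):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Data.ByteString.Base16 as BS16

main :: IO ()
main = do
  -- encode is unchanged across versions of base16-bytestring
  print (BS16.encode "\x01\x02")
  -- decodeLenient never fails: non-hex characters are simply skipped,
  -- matching the old (fst . decode) behaviour on valid input
  print (BS16.decodeLenient "0102")
  print (BS16.decodeLenient "01zz02")
```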
Might as well make HexString a newtype.
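A minimal sketch of that suggestion (hypothetical; this is not the library's actual declaration):

```haskell
import qualified Data.ByteString as BS

-- A newtype wrapper is erased at compile time, so it has no runtime
-- overhead, unlike a single-constructor data declaration.
newtype HexString = HexString BS.ByteString
  deriving (Show, Eq, Ord)

main :: IO ()
main = print (HexString (BS.pack [0xde, 0xad]))
```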
This works for me. I was building passveil and had to change the versions in passveil.cabal (due to a zillion version-related dependency issues), but then it failed to build hexstring. This solved it for me. I am using GHC 9.2.1, with base-4.16.0.0 and hexstring-0.11.1.
I've made a public repo with @infinisil's patch. If you use Stack, you can get it by adding the following to your stack.yaml:

extra-deps:
  - git: https://github.com/reach-sh/haskell-hexstring.git
    commit: 085c16fb21b9f856a435a3faab980e7e0b319341
This library depends on base16-bytestring, which in version 1.0 changed the type of the decode function from ByteString -> (ByteString, ByteString) to ByteString -> Either String ByteString, so decoding can now fail explicitly. The replacement for the previous behavior of decode (which never failed) seems to be decodeLenient, for which I have a fix prepared here. I suggest adding a version constraint of base16-bytestring < 1.0 to all current versions, and releasing a new version that includes a fix for this, with a constraint of base16-bytestring >= 1.0.
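The API change described above can be illustrated as follows (a sketch assuming base16-bytestring >= 1.0; decodeOrEmpty is a hypothetical helper, not part of the library):

```haskell
{-# LANGUAGE OverloadedStrings #-}

import qualified Data.ByteString as BS
import qualified Data.ByteString.Base16 as BS16

-- Hypothetical helper: recover a "never fails" interface by mapping
-- decode failures to an empty ByteString.
decodeOrEmpty :: BS.ByteString -> BS.ByteString
decodeOrEmpty = either (const BS.empty) id . BS16.decode

main :: IO ()
main = do
  print (BS16.decode "cafe")        -- Right on valid hex input
  print (BS16.decode "not hex!")    -- Left with an error message
  print (decodeOrEmpty "not hex!")  -- empty ByteString
```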