Closed by kleinreact, 1 year ago.
Isn't this just a consequence of the fact that you should never really write a custom `BitPack` instance by hand, or rather, if you do write one by hand, it must match what Clash generates internally? In general you should only derive them or use the TH annotations to generate custom ones. Does the issue go away if you do:
```haskell
import Clash.Annotations.BitRepresentation

data T = A | B deriving (Generic)

{-# ANN module (DataReprAnn
                  $(liftQ [t|T|])
                  1
                  [ ConstrRepr 'A 0b1 0b1 []
                  , ConstrRepr 'B 0b1 0b0 []
                  ]) #-}
```
If so, this is not really specific to the ILA but a general Clash thing.
Adding a `DataReprAnn` fixes the problem as well, but this means I have to keep the `BitPack` instance and the `DataReprAnn` in sync. I prefer explicitly calling `pack` instead.
It's just that I need the `BitPack` instance for that larger structure, and I cannot use a `DataReprAnn` on that record anyway. Hence I keep the `BitPack` instance (to ensure consistency with respect to the encoded data).

> If so, this is not really specific to the ILA but a general Clash thing
It should be a solvable problem for the ILA blackbox at least, because there we can automatically add the `pack` conversion when passing the arguments to the ILA. Or we explicitly note in the documentation that `BitPack` is not used, although it's especially useful for debugging purposes.
You can also derive a `BitPack` instance using `deriveBitPack [t| T |]`. This takes the `DataReprAnn` into account.
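Putting the two pieces together, the flow recommended in this thread would look roughly like the module below. This is a sketch only (it needs the Clash toolchain and `clash-prelude` to typecheck, and the extension pragmas are my addition): first fix the representation with a `DataReprAnn`, then derive a conforming `BitPack` from it.

```haskell
{-# LANGUAGE BinaryLiterals  #-}
{-# LANGUAGE DeriveGeneric   #-}
{-# LANGUAGE TemplateHaskell #-}
module T where

import Clash.Prelude
import Clash.Annotations.BitRepresentation
import Clash.Annotations.BitRepresentation.Deriving (deriveBitPack)

data T = A | B deriving (Generic)

-- Custom one-bit encoding: A ↦ 1, B ↦ 0
{-# ANN module (DataReprAnn
                  $(liftQ [t|T|])
                  1
                  [ ConstrRepr 'A 0b1 0b1 []
                  , ConstrRepr 'B 0b1 0b0 []
                  ]) #-}

-- Derive a BitPack instance that conforms to the annotation above,
-- instead of writing one by hand.
deriveBitPack [t| T |]
```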
I would really not go the route of having non-conforming `BitPack` instances. There are probably more things that will break in that case.
It would be nice if we documented this better, but indeed, I also think you can't just create a `BitPack` instance to influence encoding. You just get a schism between Haskell and HDL, and inconsistencies in HDL depending on whether the type itself is used or it is passed through `pack`. Several functions in our Prelude do that. If I take the code from the first post and add

```haskell
topEntity :: T
topEntity = A
```

this generates the following Verilog:

```verilog
assign result = 1'd0;
```
If you want to influence the encoding of a type, you'll need `DataReprAnn` and friends, and only after that derive the `BitPack` instance if you need one, with `deriveBitPack`. Watch out for bug #2401 though. In general, custom data representations are an area that hasn't seen a lot of use and might have issues.
So the problem is that you're approaching it in reverse. `BitPack` needs to tell the Haskell simulation what the HDL encoding is. It allows no creativity. The correct approach goes the other way: first influence the HDL encoding, and then supply a `BitPack` that conforms to it.
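That conformance requirement can be illustrated without Clash at all. Below is a plain-Haskell sketch, not Clash code: `packT` and `unpackT` are hypothetical stand-ins for `pack` and `unpack`, and `Int` stands in for `BitVector 1`. The encoder models the custom `A` ↦ 1, `B` ↦ 0 representation from this thread, and the round-trip property is what must hold for the Haskell simulation to agree with the HDL.

```haskell
data T = A | B deriving (Show, Eq)

-- Stand-in for pack :: T -> BitVector 1, using the custom
-- representation discussed in this thread: A ↦ 1, B ↦ 0.
packT :: T -> Int
packT A = 1
packT B = 0

-- Stand-in for unpack; it must invert packT exactly, or the
-- simulation and the generated HDL disagree on what the bits mean.
unpackT :: Int -> T
unpackT 1 = A
unpackT _ = B

main :: IO ()
main = do
  print [unpackT (packT c) == c | c <- [A, B]]  -- round-trip holds
  print (map packT [A, B])                      -- the custom encoding
```

Running `main` prints `[True,True]` and `[1,0]`.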
Thanks for the feedback. I never considered the purpose of `BitPack` being that strict. Maybe that's a good point to add to the documentation for `Clash.Class.BitPack`.
I think PR #2575 perhaps clarifies this well enough.
The ILA blackboxes of `clash-cores` use a polyvariadic interface, hence any value can be dumped into an ILA. Now consider some data type `T = A | B` with a custom `BitPack` instance that intentionally deviates from the default produced by deriving `BitPack`: the custom instance assigns `A` ↦ 1 and `B` ↦ 0, while the derived one would assign `A` ↦ 0 and `B` ↦ 1.

Now, if we pass a value `x :: T` directly into the ILA, then the ILA still uses the implicit encoding as given by the derived `BitPack` instance, i.e., it assigns `A` ↦ 0 and `B` ↦ 1. Only if we pass `pack x :: BitVector (BitSize T)` instead does it assign `A` ↦ 1 and `B` ↦ 0.

This can be very confusing, especially if the custom `BitPack` instance only slightly differs from the default, making a wrongly produced ILA dump value hard to catch. As an ILA user, I would just assume that the ILA automatically uses the custom `BitPack` instance, if it exists.

The same probably also applies to VIO blackboxes, but I haven't explicitly tested it.
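The hand-written instance itself got lost in the extraction above. Based on the described behavior (`A` ↦ 1, `B` ↦ 0 on a single bit), it would have looked roughly like the following. This is a reconstruction, not the original code, and it assumes `clash-prelude` plus the usual Clash language extensions:

```haskell
{-# LANGUAGE DataKinds     #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE TypeFamilies  #-}
import Clash.Prelude

data T = A | B deriving (Generic, Show)

-- Deviates from the derived instance on purpose:
-- derived would give A ↦ 0, B ↦ 1; this one gives A ↦ 1, B ↦ 0.
instance BitPack T where
  type BitSize T = 1
  pack A = 1
  pack B = 0
  unpack 1 = A
  unpack _ = B
```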