Other information
Note the discrepancy with the following Interval literals: decimal values are encoded using the .I arguments, while BigInt and Int values are interpreted according to the .I arguments. I think this is likely to create confusion, though the cat may already be out of the bag on this one.
What is the current behavior?
When creating Interval literals, the behavior for BigInt, Long, and Int values differs from that for Double and BigDecimal values.
In the former case, the bits of the number are interpreted according to the parameters of the cap-I (.I) conversion, while in the latter case the resulting value is the number supplied.
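A minimal sketch of the discrepancy, assuming the chisel3 3.3-era experimental Interval literal syntax (`value.I(binaryPoint)`); the module name is hypothetical and the exact import paths may differ:

```scala
import chisel3._
import chisel3.experimental._

class IntervalLiteralDemo extends RawModule {
  // Double literal: the value 1.5 is ENCODED using the .I arguments,
  // so with 2 fractional bits the underlying bits become 6 (1.5 * 2^2)
  // and the literal represents 1.5.
  val fromDouble = 1.5.I(2.BP)

  // Int literal: the bits 6 are INTERPRETED according to the .I arguments,
  // so with 2 fractional bits the literal represents 6 / 2^2 = 1.5,
  // not 6.0 as one might expect by analogy with the Double case.
  val fromInt = 6.I(2.BP)
}
```

Under this reading, `6.I(2.BP)` and `1.5.I(2.BP)` denote the same value, which is the confusion the issue describes.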
What is the expected behavior?
It seems to me that the behavior should be the same as for Double and BigDecimal: the supplied number should be the value the literal represents.
Please tell us about your environment:
Chisel3 3.3
Type of issue: bug report | documentation
Impact: Possible API modification