Open tromer opened 8 years ago
There are two things that are machine-dependent:
The first, easy-to-fix issue is lengths: it may be that 32 * (number of 32-bit limbs) != 64 * (number of 64-bit limbs), so one needs to trim the upper bits appropriately. Endianness raises a similar concern.
The second is the Montgomery form, which uses an auxiliary modulus that is word-size dependent. It is very plausible that there is a relatively cheap conversion (especially since we probably only care about 32- and 64-bit architectures), but we haven't looked into it.
Currently the binary serialization format (BINARY_OUTPUT) is not compatible across machines with different word sizes. This is documented, but should be fixed.
The format may be endianness-dependent as well.
One place that causes the machine-dependence is the bigint serialization (`operator<<(std::ostream &out, const bigint<n> &b)` and `operator>>(std::istream &in, bigint<n> &b)` in `src/algebra/fields/bigint.tcc`), which just copies the internal array of `mp_limb_t` limbs. This is easily fixed. Are there additional places?