Closed weucode closed 2 years ago
This is not a change of order, but a result of qubit de-allocation. Qubits are de-allocated in reverse order and then pushed onto a free-list. That's why, when re-allocating `q` in a loop, the order is reversed on iterations with an odd index. You can print `Message($"q:{q}");` before applying `BigEndian` to see that.
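To make the free-list effect concrete, here is a minimal sketch (the namespace and operation names are made up for illustration) that allocates and releases a register of the same size in a loop, printing the qubit ids each time:

```qsharp
namespace FreeListDemo {
    open Microsoft.Quantum.Intrinsic;

    @EntryPoint()
    operation CheckOrder() : Unit {
        for i in 1..4 {
            // Each `use` block allocates from the free-list. On release,
            // the qubits are pushed back in reverse order, so consecutive
            // iterations may print the ids in alternating order.
            use q = Qubit[3];
            Message($"iteration {i}: q = {q}");
        }
    }
}
```

Comparing the printed ids across iterations shows the reversal described above, without any endianness types involved.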
Thanks for your reply! It helps me a lot to understand qubit allocation. I also have a question about running the following program: why is the order the same for `q1` and `q2`? Does it mean I am using the wrong way to display the different encodings?
namespace NISLNameSpace {
    open Microsoft.Quantum.Intrinsic;
    open Microsoft.Quantum.Arithmetic;

    @EntryPoint()
    operation main() : Unit {
        use q = Qubit[3];
        Message($"q : {q}");
        mutable q1 = BigEndian(q);
        Message($"q1: {q1} {q1!}");
        mutable q2 = LittleEndian(q);
        Message($"q2: {q2} {q2!}");
        ResetAll(q);
    }
}
The output is:
q : [q:0,q:1,q:2]
q1: BigEndian([q:0,q:1,q:2]) [q:0,q:1,q:2]
q2: LittleEndian([q:0,q:1,q:2]) [q:0,q:1,q:2]
`LittleEndian` and `BigEndian` are not functions that change the array; you can better think of them as tags in this case. We create a `LittleEndian` value that wraps a qubit array and indicates that we interpret it in little-endian order, i.e., `q[0]` is the least-significant bit, whereas `BigEndian` indicates that `q[0]` is the most-significant bit. The algorithms need to ensure that this order is respected, and the type checker in the compiler will report an error when you pass a `LittleEndian` value to an operation that expects `BigEndian`, even if they wrap the same qubit array.
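As a small illustration of that type check (the `IncrementLE` operation below is hypothetical, not part of the library), the tag types let the compiler catch mismatched interpretations at compile time:

```qsharp
namespace EndianDemo {
    open Microsoft.Quantum.Intrinsic;
    open Microsoft.Quantum.Arithmetic;

    // Hypothetical operation whose signature documents that it treats
    // register![0] as the least-significant bit.
    operation IncrementLE(register : LittleEndian) : Unit {
        // ... an implementation would respect little-endian order here ...
    }

    operation Demo() : Unit {
        use q = Qubit[3];
        let le = LittleEndian(q);
        let be = BigEndian(q);
        IncrementLE(le);     // OK: types match
        // IncrementLE(be);  // compile-time error: BigEndian is not
        //                   // LittleEndian, even though both wrap `q`.
        ResetAll(q);
    }
}
```

The wrapped array itself is identical in both cases, which is why the printed output for `q1` and `q2` looks the same; only the declared interpretation differs.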
Thanks a lot, I have fully understood these two types.
Describe the bug
The order of the qubit register changed after applying `BigEndian` or `LittleEndian` to it. The output of the following program is shown below. Is this the expected output when encoding a qubit register repeatedly?
To Reproduce
System information
operating system : Ubuntu 22.04 LTS
dotnet version : 6.0.400
QDK : 0.25.228311