microsoft / QuantumLibraries

Q# libraries for the Quantum Development Kit
https://docs.microsoft.com/quantum
MIT License

LE and BE will modify the order of qubit register #638

Closed: weucode closed this issue 2 years ago

weucode commented 2 years ago

Describe the bug

The order of the qubit register changes after applying BigEndian or LittleEndian to it. The output of the following program is shown below. Is this the expected output when encoding a qubit register repeatedly?

q:[q:0,q:1,q:2]   q1:BigEndian([q:0,q:1,q:2])
q:[q:2,q:1,q:0]   q1:BigEndian([q:2,q:1,q:0])
q:[q:0,q:1,q:2]   q1:BigEndian([q:0,q:1,q:2])
q:[q:2,q:1,q:0]   q1:BigEndian([q:2,q:1,q:0])

To Reproduce

namespace NISLNameSpace {
    open Microsoft.Quantum.Intrinsic;
    open Microsoft.Quantum.Arithmetic;

    @EntryPoint()
    operation main() : Unit {
        for index in 0 .. 3{
            use q = Qubit[3];
            mutable q1 = BigEndian(q);
            Message($"q:{q}   q1:{q1}");
            ResetAll(q);
        }
    }
}

System information

operating system: Ubuntu 22.04 LTS
dotnet version: 6.0.400
QDK: 0.25.228311

msoeken commented 2 years ago

This is not a change of order caused by BigEndian, but a result of qubit de-allocation. Qubits are de-allocated in reverse order and pushed onto a free-list, so when q is re-allocated in loop iterations with an odd index, the register comes back in reversed order. You can print Message($"q:{q}"); before applying BigEndian to see that.
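For reference, a minimal sketch of that diagnostic, based on the reproduction above with the suggested Message call added before the BigEndian wrapper is applied (the comments describe the expected behavior under the free-list explanation):

namespace NISLNameSpace {
    open Microsoft.Quantum.Intrinsic;
    open Microsoft.Quantum.Arithmetic;

    @EntryPoint()
    operation main() : Unit {
        for index in 0 .. 3 {
            use q = Qubit[3];
            // Print the raw register before wrapping it: any reversal already
            // shows up here, because it comes from the free-list order on
            // re-allocation, not from the BigEndian wrapper.
            Message($"raw q: {q}");
            let q1 = BigEndian(q);
            Message($"q: {q}   q1: {q1}");
            ResetAll(q);
        }
    }
}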

weucode commented 2 years ago

Thanks for your reply! It helps me a lot in understanding qubit allocation. In addition, I have a question about the following program: why is the order the same for q1 and q2? Does it mean I am using the wrong way to display the two encodings?

namespace NISLNameSpace {
    open Microsoft.Quantum.Intrinsic;
    open Microsoft.Quantum.Arithmetic;

    @EntryPoint()
    operation main() : Unit {
        use q = Qubit[3];
        Message($"q : {q}");
        mutable q1 = BigEndian(q);
        Message($"q1: {q1} {q1!}");
        mutable q2 = LittleEndian(q);
        Message($"q2: {q2} {q2!}");
        ResetAll(q);
    }
}

The output is:

q : [q:0,q:1,q:2]
q1: BigEndian([q:0,q:1,q:2]) [q:0,q:1,q:2]
q2: LittleEndian([q:0,q:1,q:2]) [q:0,q:1,q:2]

msoeken commented 2 years ago

LittleEndian and BigEndian are not functions that change the array. You can better think of them as tags in this case. We create a type LittleEndian which wraps a qubit array and indicates that we interpret it in little-endian order, i.e., q[0] is the least-significant bit, whereas BigEndian indicates that q[0] is the most-significant bit. Algorithms need to ensure that this order is respected, and the type checker in the compiler will report an error if you pass a LittleEndian value to an operation that expects BigEndian, even if both wrap the same qubit array.
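To make the tag idea concrete, here is a minimal sketch. The ExpectsLittleEndian operation is hypothetical, and BigEndianAsLittleEndian is assumed to be the conversion function from Microsoft.Quantum.Arithmetic in the QDK version noted above:

namespace NISLNameSpace {
    open Microsoft.Quantum.Intrinsic;
    open Microsoft.Quantum.Arithmetic;

    // Hypothetical operation that declares it expects a little-endian register.
    operation ExpectsLittleEndian(register : LittleEndian) : Unit {
        // Unwrapping with ! yields the underlying Qubit[] unchanged.
        Message($"little-endian register: {register!}");
    }

    @EntryPoint()
    operation Demo() : Unit {
        use q = Qubit[3];
        let le = LittleEndian(q);
        let be = BigEndian(q);
        ExpectsLittleEndian(le);                          // fine: types match
        // ExpectsLittleEndian(be);                       // compile error: BigEndian is not LittleEndian
        // Convert views explicitly; this reverses the wrapped array so that
        // the least-significant qubit ends up at index 0.
        ExpectsLittleEndian(BigEndianAsLittleEndian(be));
        ResetAll(q);
    }
}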

weucode commented 2 years ago

Thanks a lot, now I fully understand these two types.