qiskit-community / qiskit-machine-learning

Quantum Machine Learning
https://qiskit-community.github.io/qiskit-machine-learning/
Apache License 2.0
657 stars 322 forks

Enhancement of PyTorch connector #787

Open edoaltamura opened 6 months ago

edoaltamura commented 6 months ago

What should we add?

Given the demand for robust coupling with PyTorch, we propose to enhance the module https://github.com/qiskit-community/qiskit-machine-learning/blob/60f2c2e1b6469b0ba878d2026c662cb5f98af50f/qiskit_machine_learning/connectors/torch_connector.py#L56 in two phases.

Phase 1 will involve no or only very minor changes to the user-facing API. The changes will focus on refactoring the backend connector, bug fixes, and compatibility robustness.

Phase 2 will focus on upgrading the interoperability with PyTorch and may involve changes to the API and UX. These changes are likely to be introduced incrementally after the version 0.8 roll-out, leading up to a stable 1.0.


Specifically, the proposed changes include:

- Refactoring _TorchNNFunction and other definitions in torch_connector.py
- Refactoring the code around https://github.com/qiskit-community/qiskit-machine-learning/blob/60f2c2e1b6469b0ba878d2026c662cb5f98af50f/qiskit_machine_learning/connectors/torch_connector.py#L215 into modular units based on _optionals.HAS_SPARSE.require_now("SparseArray")
- Adding version compatibility between Qiskit Machine Learning and PyTorch, with relevant deprecation warnings and supported features
- Improving the test coverage
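The modular gating on the optional sparse dependency could follow a lazy-dependency-guard pattern. A minimal, self-contained sketch of that pattern is below; note that LazyDependency, HAS_SPARSE, and as_dense are illustrative names standing in for the actual _optionals machinery, not the real API.

```python
import importlib.util


class LazyDependency:
    """Illustrative stand-in for a lazy optional-dependency tester."""

    def __init__(self, module_name: str, feature: str) -> None:
        self._module_name = module_name
        self._feature = feature

    def __bool__(self) -> bool:
        # True when the optional module can be imported, without importing it
        return importlib.util.find_spec(self._module_name) is not None

    def require_now(self, name: str) -> None:
        # Fail loudly at the point of use, mirroring require_now semantics
        if not self:
            raise ImportError(
                f"{self._feature} ('{self._module_name}') is required for {name}"
            )


HAS_SPARSE = LazyDependency("sparse", "sparse-array support")


def as_dense(result):
    """Route all sparse-to-dense conversion through the guard in one place."""
    if HAS_SPARSE:
        import sparse  # safe: guarded by the dependency check above

        if isinstance(result, sparse.SparseArray):
            return result.todense()
    return result
```

Keeping the guard in a single helper means the sparse import never leaks into modules that only need the dense path.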


edoaltamura commented 3 months ago

This problem could be broken down into two steps.

Refactoring _TorchNNFunction as a standalone function

This is currently nested within TorchConnector. To improve modularity, we can move it outside and make it a separate function to use within TorchConnector:

def torch_nn_function_forward(
    input_data: Tensor,
    weights: Tensor,
    neural_network: NeuralNetwork,
    sparse: bool
) -> Tensor:
    """Forward pass computation."""
    if input_data.shape[-1] != neural_network.num_inputs:
        raise QiskitMachineLearningError(
            f"Invalid input dimension! Received {input_data.shape} and "
            + f"expected input compatible to {neural_network.num_inputs}"
        )

    result = neural_network.forward(
        input_data.detach().cpu().numpy(), weights.detach().cpu().numpy()
    )
    if sparse:
        if not neural_network.sparse:
            raise RuntimeError(
                "TorchConnector configured as sparse, the network must be sparse as well"
            )

        # Handle sparse output
        # (Implementation of sparse handling here; it must assign result_tensor)
    else:
        if neural_network.sparse:
            # The network returned a sparse array: densify it before
            # building the dense tensor
            result = result.todense()
        result_tensor = torch.as_tensor(result, dtype=torch.float)

    if len(input_data.shape) == 1:
        result_tensor = result_tensor[0]

    result_tensor = result_tensor.to(input_data.device)
    return result_tensor

def torch_nn_function_backward(
    ctx: Any,
    grad_output: Tensor
) -> Tuple:
    """Backward pass computation."""
    input_data, weights = ctx.saved_tensors
    neural_network = ctx.neural_network

    if input_data.shape[-1] != neural_network.num_inputs:
        raise QiskitMachineLearningError(
            f"Invalid input dimension! Received {input_data.shape} and "
            + f" expected input compatible to {neural_network.num_inputs}"
        )

    # (Implementation of backward pass here)

    return input_grad, weights_grad, None, None

Restructuring TorchConnector

Now, TorchConnector will use these functions for its forward and backward passes. So we can write the rest as:

class TorchConnector(Module):
    def __init__(
        self,
        neural_network: NeuralNetwork,
        initial_weights: np.ndarray | Tensor | None = None,
        sparse: bool | None = None,
    ):
        super().__init__()
        self._neural_network = neural_network
        if sparse is None:
            sparse = self._neural_network.sparse

        self._sparse = sparse

        if self._sparse and not self._neural_network.sparse:
            raise QiskitMachineLearningError(
                "TorchConnector configured as sparse, the network must be sparse as well"
            )

        weight_param = torch.nn.Parameter(torch.zeros(neural_network.num_weights))
        self.register_parameter("weight", weight_param)
        self._weights = weight_param

        if initial_weights is None:
            self._weights.data.uniform_(-1, 1)
        else:
            self._weights.data = torch.tensor(initial_weights, dtype=torch.float)

    @property
    def neural_network(self) -> NeuralNetwork:
        return self._neural_network

    @property
    def weight(self) -> Tensor:
        return self._weights

    @property
    def sparse(self) -> bool | None:
        return self._sparse

    def forward(self, input_data: Tensor | None = None) -> Tensor:
        input_ = input_data if input_data is not None else torch.zeros(0)
        # For gradients to reach the weights, this call would ultimately be
        # routed through a torch.autograd.Function (via its .apply method)
        # that wraps the forward/backward helpers above.
        return torch_nn_function_forward(
            input_, self._weights, self._neural_network, self._sparse
        )

    def backward(self, ctx: Any, grad_output: Tensor) -> Tuple:
        return torch_nn_function_backward(ctx, grad_output)
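The weight-registration logic in __init__ can be exercised on its own; the sketch below strips out the quantum network entirely to show just the torch plumbing. MiniConnector is a hypothetical stand-in, not part of the proposed API.

```python
import torch
from torch.nn import Module, Parameter


class MiniConnector(Module):
    """Stripped-down illustration of the weight-registration pattern above."""

    def __init__(self, num_weights: int, initial_weights=None) -> None:
        super().__init__()
        weight_param = Parameter(torch.zeros(num_weights))
        # register_parameter makes the weights visible to optimizers,
        # parameters(), and state_dict()
        self.register_parameter("weight", weight_param)
        if initial_weights is None:
            # Same default as the connector: uniform init in [-1, 1]
            weight_param.data.uniform_(-1, 1)
        else:
            weight_param.data = torch.as_tensor(initial_weights, dtype=torch.float)


m = MiniConnector(3, initial_weights=[0.1, 0.2, 0.3])
```

Because the parameter is registered, torch.optim.SGD(m.parameters(), lr=0.1) would pick up the weights automatically, which is the point of routing them through register_parameter rather than storing a bare tensor.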

Other points to note when updating