JiyouSeo opened this issue 1 month ago
The camera object (defined here) is essentially a wrapper around a single data torch.Tensor
(see here) with convenience class methods and constructors. You could replace all instances of this camera with the tensor itself and turn the class methods into regular functions. This will work for a given camera model, e.g. pinhole. ONNX supports only static compute graphs, so you will need to compile a different graph for each camera model.
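For illustration, a minimal sketch of such a tensor-only export wrapper; the import path, the Pinhole constructor, the LMOptimizer call signature, and the name of the underlying data tensor are assumptions here, not the verified GeoCalib API:

```python
import torch
import torch.nn as nn

from geocalib.camera import Pinhole  # assumed import path


class PinholeExportWrapper(nn.Module):
    """Expose only plain tensors at the ONNX boundary for the pinhole model."""

    def __init__(self, optimizer: nn.Module):
        super().__init__()
        self.optimizer = optimizer  # an already-constructed LMOptimizer instance

    def forward(self, features: torch.Tensor, camera_params: torch.Tensor):
        # Rebuild the camera object from its underlying data tensor inside the
        # traced graph, so the exported inputs and outputs are tensors only.
        camera = Pinhole(camera_params)                 # assumed constructor
        camera_opt = self.optimizer(features, camera)   # assumed call signature
        return camera_opt._data                         # assumed name of the data tensor


# Placeholder example tensors; shapes only illustrate the idea.
features = torch.randn(1, 256, 64, 64)
camera_params = torch.randn(1, 6)

wrapper = PinholeExportWrapper(lm_optimizer)  # lm_optimizer: existing LMOptimizer
torch.onnx.export(
    wrapper,
    (features, camera_params),                # tensors only at the ONNX boundary
    "geocalib_pinhole.onnx",
    input_names=["features", "camera_params"],
    output_names=["camera_params_opt"],
    opset_version=17,
)
```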
I am trying to export a custom GeoCalib model to ONNX. The model uses LMOptimizer, which takes an instance of the custom class Pinhole (which inherits from BaseCamera) as input. When I attempt to export the model, I encounter the following error:
RuntimeError: Only tuples, lists and Variables are supported as JIT inputs/outputs. Dictionaries and strings are also accepted, but their usage is not recommended. Here, received an input of unsupported type: Pinhole
It appears that ONNX export does not support custom class inputs directly. Since the Pinhole class is crucial to my model, I am looking for guidance on how to modify the export process to handle this input type.
Steps to Reproduce:
1. Create a custom class Pinhole inheriting from BaseCamera.
2. Pass an instance of Pinhole as an input to LMOptimizer.
3. Attempt to export the model using torch.onnx.export (minimal sketch below).

Expected Behavior: The model should be exported to ONNX format without runtime errors.

Observed Behavior: The runtime error above occurs because of the unsupported Pinhole input type.
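A minimal sketch of the failing call; the placeholder tensors, the Pinhole constructor, and the LMOptimizer signature below stand in for my actual setup:

```python
import torch

from geocalib.camera import Pinhole  # assumed import path

# Placeholder inputs; shapes only illustrate the setup.
features = torch.randn(1, 256, 64, 64)
camera_params = torch.randn(1, 6)

camera = Pinhole(camera_params)  # custom class instance (assumed constructor)

torch.onnx.export(
    lm_optimizer,         # the LMOptimizer module
    (features, camera),   # Pinhole object inside the input tuple
    "geocalib.onnx",
)
# -> RuntimeError: Only tuples, lists and Variables are supported as JIT
#    inputs/outputs. ... received an input of unsupported type: Pinhole
```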
Environment:
PyTorch version: 2.4.1
Python version: 3.9.20
Additional Context:
The Pinhole class is essential for my camera model, and changing its structure significantly is not ideal. Are there any best practices or workarounds to handle custom class inputs during ONNX export? Any advice on how to convert or wrap this class to make it compatible with ONNX export would be greatly appreciated.