dotnet / machinelearning


Onnx with single float output throws exception during MLContext predict #7225

Open xqiu opened 2 months ago

xqiu commented 2 months ago

Describe the bug

An ONNX model with a single float (scalar) output throws an exception when predicting through MLContext. I can run the same model directly with ONNX Runtime's InferenceSession without any problem.
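
For comparison, here is roughly what the working InferenceSession path looks like (a reconstructed sketch, not copied verbatim from the repro project; it assumes the same "input"/"output" names and 1x1x40x40 shape as the classes below):

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using Microsoft.ML.OnnxRuntime;
    using Microsoft.ML.OnnxRuntime.Tensors;

    public static void PredictWithInferenceSession(string modelPath)
    {
        using var session = new InferenceSession(modelPath);

        // Same 1x1x40x40 input as the MLContext path, filled with 0.1f
        var tensor = new DenseTensor<float>(new[] { 1, 1, 40, 40 });
        tensor.Buffer.Span.Fill(0.1f);

        var inputs = new List<NamedOnnxValue>
        {
            NamedOnnxValue.CreateFromTensor("input", tensor)
        };

        // This runs without error and yields the single float output
        using var results = session.Run(inputs);
        float prediction = results.First().AsEnumerable<float>().First();
        Console.WriteLine($"InferenceSession output: {prediction}");
    }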

To Reproduce

Steps to reproduce the behavior:

  1. Clone https://github.com/xqiu/ml_onnx_output_error
  2. Run the program to observe the exception

Code is:

    public class ModelInput
    {
        [ColumnName("input")]
        [VectorType(1, 1, 40, 40)]  // Adjust dimensions to match your model input
        public float[] Features { get; set; } = new float[1 * 1 * 40 * 40];  // Flattened 4D array for ONNX input
    }

    public class ModelOutput
    {
        [ColumnName("output")]
        public float Prediction { get; set; }
    }

    public static void PredictWithMLContextSingle(string modelPath)
    {
        Console.WriteLine("\r\nRun PredictWithMLContextSingle:");

        var mlContext = new MLContext();

        // Define the ONNX pipeline, mapping the model's "input" and "output" columns
        var pipeline = mlContext.Transforms.ApplyOnnxModel(
            modelFile: modelPath,
            outputColumnNames: new[] { "output" },
            inputColumnNames: new[] { "input" });

        // Prepare input data (replace this with your actual input data)
        var input = new ModelInput();
        for (int i = 0; i < input.Features.Length; i++)
        {
            input.Features[i] = 0.1f; // Example value, replace with your actual data
        }

        // Create a data view from a single input item
        var dataView = mlContext.Data.LoadFromEnumerable(new[] { input });

        // Fit the pipeline
        var mlModel = pipeline.Fit(dataView);

        // Create the prediction engine from the fitted model
        var predictionEngine = mlContext.Model.CreatePredictionEngine<ModelInput, ModelOutput>(mlModel);

        // Perform prediction -- this is where the exception is thrown
        var output = predictionEngine.Predict(input);

        Console.WriteLine("Using MLContext Prediction result:");
        Console.WriteLine($"Output: {output.Prediction}");
    }
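
As a possible workaround (an untested assumption on my part), reading the output as a length-1 vector instead of a scalar might avoid the mapping failure, if OnnxTransformer exposes the ONNX output as a vector column (ModelOutputVector is a hypothetical name):

    public class ModelOutputVector
    {
        [ColumnName("output")]
        [VectorType(1)]  // assumption: the transformer surfaces the scalar as a length-1 vector
        public float[] Prediction { get; set; } = new float[1];
    }

    // Hypothetical usage:
    // var engine = mlContext.Model.CreatePredictionEngine<ModelInput, ModelOutputVector>(mlModel);
    // float value = engine.Predict(input).Prediction[0];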

Expected behavior

Prediction works with MLContext.

Screenshots, Code, Sample Projects

Sample code: https://github.com/xqiu/ml_onnx_output_error
