When I input a dataset which contains large values (10000 and larger), AI-Feynman outputs a function whose graph seems to have no relation to the input values.
I then divided the dataset by its highest value (50000), so that all the data fell in the range 0 to 1, and input the new dataset.
Surprisingly, AI-Feynman then produced a function whose graph approximates the input data points very well.
Do you have any idea what the problem is?
It might be because the neural network has a hard time fitting a function whose values span such a wide range. In general, normalizing the data helps in machine learning problems.
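For reference, here is a minimal sketch of that normalization step in Python. The file names are hypothetical, and the `run_aifeynman` call and its arguments follow the package README, so they may differ in your installed version:

```python
import numpy as np
import aifeynman

# Load the raw dataset: columns are the input variables,
# the last column is the target value.
data = np.loadtxt("my_data.txt")

# Rescale the target column to roughly [0, 1] by dividing by its
# largest magnitude (here ~50000), so the neural network only has
# to fit values in a comfortable range.
scale = np.abs(data[:, -1]).max()
data[:, -1] = data[:, -1] / scale

np.savetxt("my_data_scaled.txt", data)

# Run symbolic regression on the scaled data; the arguments
# (brute-force time, operator file, polyfit degree, NN epochs)
# are taken from the README and may need adjusting.
aifeynman.run_aifeynman("./", "my_data_scaled.txt", 30, "14ops.txt",
                        polyfit_deg=3, NN_epochs=400)

# Any formula f(x) recovered on the scaled data corresponds to
# scale * f(x) on the original data, so multiply the recovered
# expression by `scale` to undo the normalization.
```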