miguelriemoliveira / OptimizationUtils

A set of utilities for using the python scipy optimizer functions
GNU General Public License v3.0

Create a cmf based on polynomial regression #61

Open DanielCoelho112 opened 2 years ago

DanielCoelho112 commented 2 years ago

The goal is to obtain something like the attached image, where the blue dots are from a Joint Image Histogram (JIH).

miguelriemoliveira commented 2 years ago

Hi @DanielCoelho112 ,

yes, it looks good. You could try a polynomial with more degrees, and a case that needs a more difficult shape (going up and down several times), just to see if it would work.

miguelriemoliveira commented 2 years ago

Another suggestion for @DanielCoelho112 and @lucasrdalcol is to look into the model of a polynomial and see if it can be tweaked to take into account some properties we know the cmf should have.

One example of this is: how can I have a polynomial model of a curve that only grows, assuming (as we should) that the cmf is monotonically increasing? https://en.wikipedia.org/wiki/Monotonic_function
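
One possibility (just a rough, untested sketch) is to parameterize the polynomial so that it is increasing by construction: if the intensities are normalized to [0, 1] and the coefficients of the non-constant terms are squared, the derivative can never be negative. The data below is made up, just for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

# Made-up JIH-like samples, intensities normalized to [0, 1] (illustration only).
x = np.linspace(0, 1, 50)
y = np.clip(x + 0.1 * np.sin(3 * np.pi * x), 0, 1)

degree = 5

def model(params, x):
    # p(x) = c0 + sum_k (b_k ** 2) * x**k
    # Squared coefficients make p'(x) >= 0 for x >= 0, so p is monotonically increasing.
    c0, b = params[0], params[1:]
    powers = np.vstack([x ** k for k in range(1, degree + 1)])
    return c0 + (b ** 2) @ powers

result = least_squares(lambda p: model(p, x) - y, x0=np.full(degree + 1, 0.5))
print(result.x)
```

That way no explicit constraint would be needed in the optimizer.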

DanielCoelho112 commented 2 years ago

Hi @miguelriemoliveira. Seems like a good idea, we'll look into it.

lucasrdalcol commented 2 years ago

Hi @miguelriemoliveira,

thanks for the suggestions. We'll look into it.

DanielCoelho112 commented 2 years ago

Here are the results using a third-degree polynomial: image

miguelriemoliveira commented 2 years ago

They look nice. My question is: why a third degree? I mean, I don't think there is any disadvantage in giving the polynomial a lot of degrees of freedom. The optimization is very fast, so why not try a degree of 30?

miguelriemoliveira commented 2 years ago

Better still, why not produce a picture with 10 approaches with polynomials of different degrees?
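
Something along these lines could do it (untested sketch, with made-up data standing in for the JIH):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import least_squares

# Made-up data standing in for the JIH samples.
x = np.linspace(0, 1, 100)
y = np.clip(x + 0.1 * np.sin(5 * x), 0, 1)

degrees = range(3, 13)  # 10 different polynomial degrees
fig, axes = plt.subplots(2, 5, figsize=(20, 8), sharex=True, sharey=True)

for ax, degree in zip(axes.ravel(), degrees):
    # Fit the degree-n polynomial by optimizing its coefficients (squared error).
    result = least_squares(lambda c: np.polyval(c, x) - y, x0=np.zeros(degree + 1))
    ax.plot(x, y, '.', markersize=3, label='JIH samples')
    ax.plot(x, np.polyval(result.x, x), '-', label='fit')
    ax.set_title(f'degree {degree}')
    ax.legend()

plt.show()
```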

DanielCoelho112 commented 2 years ago

Hi @miguelriemoliveira,

> Better still, why not produce a picture with 10 approaches with polynomials of different degrees?

In the beginning, we had to build a different script for each polynomial regression optimization, so building 10 approaches would require a lot of changes... But now we were able to build an n-degree polynomial regression, so that, given the degree n, the algorithm will optimize the n parameters.

Based on this, we obtained these results:

no_mif

For n > 3 the results are not great... Maybe a local minimum?

We also tried to force the function to be monotonically increasing (f(x) >= f(x-1)); here are the results:

mif

Really poor results... I think that the way we are forcing the function to be monotonically increasing is not the right one, but let's talk tomorrow.
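
One way to express the f(x) >= f(x-1) condition inside the least-squares objective is a soft penalty on negative differences between consecutive predictions, something along these lines (simplified sketch, not necessarily what we did):

```python
import numpy as np
from scipy.optimize import least_squares

def residuals_with_monotonic_penalty(coeffs, x, y, weight=100.0):
    # Data residuals plus a penalty for every decrease between consecutive samples
    # (x is assumed to be sorted in increasing order).
    pred = np.polyval(coeffs, x)
    data_res = pred - y
    decreases = np.minimum(np.diff(pred), 0.0)  # negative wherever f(x_i) < f(x_{i-1})
    return np.concatenate([data_res, weight * decreases])

# Example with placeholder data.
x = np.linspace(0, 1, 50)
y = np.clip(x + 0.1 * np.sin(3 * np.pi * x), 0, 1)
result = least_squares(residuals_with_monotonic_penalty, x0=np.zeros(7), args=(x, y))
```

Maybe the issue is how the penalty weight interacts with the data term; let's discuss it.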

DanielCoelho112 commented 2 years ago

Hi @miguelriemoliveira, @tiagomfmadeira, and @lucasrdalcol I tried to find a bug in our implementation and couldn't find any... I implemented a polynomial regression using numpy and the results are quite different from the ones using the optimization... (the results using the optimization depend on the initial guess, while the results using numpy are always the same)

As the titles indicate, the left images are from numpy and the right images are from the optimization.

p4 p6

I thought that maybe the difference was the objective function, but numpy also uses the squared error...
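
Conceptually the comparison is this (sketch with placeholder data): the unconstrained polynomial fit is a linear least-squares problem, so np.polyfit solves it in closed form, and the iterative optimization should, up to numerical conditioning, end up in the same place regardless of the initial guess.

```python
import numpy as np
from scipy.optimize import least_squares

# Placeholder data (not the real JIH samples).
x = np.linspace(0, 1, 100)
y = np.clip(x + 0.1 * np.sin(4 * np.pi * x), 0, 1)
degree = 4

# Closed-form least squares: deterministic, no initial guess involved.
coeffs_numpy = np.polyfit(x, y, degree)

# Iterative optimization of the same squared-error objective.
coeffs_opt = least_squares(lambda c: np.polyval(c, x) - y, x0=np.zeros(degree + 1)).x

print("numpy polyfit: ", coeffs_numpy)
print("least_squares: ", coeffs_opt)
```

If the two disagree a lot, that would point to the optimization getting stuck, not to a different objective.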

miguelriemoliveira commented 2 years ago

I would like to revisit this; I think there is some bug... sorry, I had no time at all...

miguelriemoliveira commented 2 years ago

Hi @DanielCoelho112 and @lucasrdalcol ,

I was testing to see why the optimization of the cmf with the polynomial was not working, and I think it is because the sensitivities of the parameters are very different. @DanielCoelho112 mentioned this when discussing the scaling of the parameters.

I addressed that by creating an x_scale vector and changing the diff_step. Check it out:

https://github.com/miguelriemoliveira/OptimizationUtils/blob/a98fcdf12ca6ecca8b81e639eda55dfb17b17aa5/test/workshop2021/polynomial_regression/n_degree_polynomial_regression_cmf_mike.py#L198-L202
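
In case the permalink is not convenient, the gist is roughly this (simplified sketch with placeholder data; the actual values in the script may differ):

```python
import numpy as np
from scipy.optimize import least_squares

# Placeholder JIH-like data on intensities 0..255.
x = np.linspace(0, 255, 100)
y = 255.0 * np.sqrt(x / 255.0)

degree = 6

def residuals(coeffs):
    return np.polyval(coeffs, x) - y

# Higher-order coefficients are far more sensitive (they multiply x**k with x up to 255),
# so give each parameter its own characteristic scale and use a coarser relative step
# for the finite-difference Jacobian.
x_scale = np.array([255.0 ** (-k) for k in range(degree, -1, -1)])  # highest degree first
result = least_squares(residuals, np.zeros(degree + 1), x_scale=x_scale, diff_step=1e-3)
print(result.x)
```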

This is working, I think, but I am not sure this is a good approach, especially after seeing the SVMs.

The big advantage here is that we have full control of the objective function and parameters, which would allow us to create custom solutions. For example, if we want to include the estimation of vignetting in the optimization problem, here we could, whereas with the other solutions I am not so sure... perhaps we could as well...

DanielCoelho112 commented 2 years ago

Hi @miguelriemoliveira, I'm glad you found the cause of one problem, but I think you created another one... I ran the code using the command: `./n_degree_polynomial_regression_cmf_mike.py -deg 6 -inp 'JIH_(6, 67).npy'`

The polynomial variation between steps was much smoother; however, in my case, the optimization never stopped... I waited one minute and nothing happened.

> The big advantage here is that we have full control of the objective function and parameters, which would allow us to create custom solutions. For example, if we want to include the estimation of vignetting in the optimization problem, here we could, whereas with the other solutions I am not so sure... perhaps we could as well...

I don't think this will be a problem; we have full control over all the solutions we've tried so far... But maybe you are seeing something ahead that I'm not.

lucasrdalcol commented 2 years ago

Hi @miguelriemoliveira and @DanielCoelho112, I've tested the polynomial regression from degree 3 to degree 11, and also degree 20. I think it's working better, but from degree 9 upwards it has the same problem: the polynomial regression stays at the straight line of the initial model. Using degree 8 we have the same problem as before, but it converges after some time. Take a look at the results for degrees 8 and 9 below:

@DanielCoelho112, I ran the test for degree 6, and it converged after roughly 37 seconds. Take a look at the result as well: Screenshot from 2021-12-03 17-42-14

After testing this, my conclusion is that the x_scale did a good job: the optimization and the polynomial regression model are better for low degrees, although at higher degrees we still have problems. Even so, I still think this is not a very good approach for the color correction, because we can only achieve more or less good models with lower degrees, and I think we need more degrees of freedom and more flexibility.

> The big advantage here is that we have full control of the objective function and parameters, which would allow us to create custom solutions. For example, if we want to include the estimation of vignetting in the optimization problem, here we could, whereas with the other solutions I am not so sure... perhaps we could as well...

> I don't think this will be a problem; we have full control over all the solutions we've tried so far... But maybe you are seeing something ahead that I'm not.

Maybe we can think of another approach that can be formulated as an optimization problem, so we can take advantage of this tool for our goal of Color Correction.

miguelriemoliveira commented 2 years ago

Hi @lucasrdalcol ,

thanks for testing. I agree, this does not look promising.