alan-turing-institute / autoemulate

emulate simulations easily

Model names #196

Closed · mastoffel closed this 4 months ago

mastoffel commented 4 months ago

resolves #175 and #190

github-actions[bot] commented 4 months ago

Coverage report

Click to see where and how coverage changed

| File | Lines missing |
|---|---|
| **autoemulate** | |
| compare.py | 243 |
| cross_validate.py | 57 |
| hyperparam_searching.py | |
| model_processing.py | |
| printing.py | |
| save.py | 56 |
| utils.py | 91, 97 |
| **autoemulate/emulators** | |
| gaussian_process.py | |
| gaussian_process_mogp.py | 71-83, 87, 90 |
| gradient_boosting.py | |
| neural_net_sk.py | |
| neural_net_torch.py | |
| polynomials.py | |
| random_forest.py | |
| rbf.py | |
| support_vector_machines.py | |
| xgboost.py | |
| **autoemulate/emulators/neural_networks** | |
| rbf.py | |
| **tests** | |
| test_cross_validate.py | |
| test_emulators.py | |
| test_model_processing.py | |
| test_printing.py | |
| test_save.py | |
| test_torch.py | |
| test_utils.py | |
| **Project Total** | |

The report is truncated to 25 files out of 29. To see the full report, please visit the workflow summary page.

This report was generated by python-coverage-comment-action

codecov-commenter commented 4 months ago

Codecov Report

Attention: Patch coverage is 94.77612% with 7 lines in your changes missing coverage. Please review.

Project coverage is 90.31%. Comparing base (69932f1) to head (8fedada).

| File | Patch % | Lines |
|---|---|---|
| autoemulate/compare.py | 85.71% | 2 Missing :warning: |
| autoemulate/plotting.py | 33.33% | 2 Missing :warning: |
| autoemulate/hyperparam_searching.py | 0.00% | 1 Missing :warning: |
| autoemulate/save.py | 87.50% | 1 Missing :warning: |
| autoemulate/utils.py | 87.50% | 1 Missing :warning: |
Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main     #196      +/-   ##
==========================================
+ Coverage   88.09%   90.31%   +2.21%
==========================================
  Files          44       44
  Lines        2083     2085       +2
==========================================
+ Hits         1835     1883      +48
+ Misses        248      202      -46
```

:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.

kallewesterling commented 4 months ago

LGTM.

Have one suggestion for `NeuralNetTorch` model names.

Also, do you think it would be useful to have a getter function / attribute for all emulators that returns the model name? e.g.

```python
>>> model = NeuralNetTorch(module='mlp')
>>> model.model_name
'NNMLP'
>>> model = RandomForest()
>>> model.model_name
'RandomForest'
```

This would be easy to implement by accessing `self.__class__.__name__`, e.g. with a simple property like this:

```python
@property
def model_name(self):
    return self.__class__.__name__
```

...at least I think...?
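
As a rough sketch of how that default could be shared across all emulators (the class names below are illustrative stand-ins, not the actual autoemulate classes), the property could live on a common base class that every emulator inherits:

```python
# Illustrative sketch only: Emulator and RandomForest stand in for
# autoemulate's sklearn-style emulator classes.
class Emulator:
    @property
    def model_name(self):
        # Default behaviour: report the class name, e.g. "RandomForest".
        return self.__class__.__name__


class RandomForest(Emulator):
    pass


print(RandomForest().model_name)  # RandomForest
```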

bryanlimy commented 4 months ago

> LGTM. Have one suggestion for `NeuralNetTorch` model names. Also, do you think it would be useful to have a getter function / attribute for all emulators that returns the model name? e.g.
>
> ```python
> >>> model = NeuralNetTorch(module='mlp')
> >>> model.model_name
> 'NNMLP'
> >>> model = RandomForest()
> >>> model.model_name
> 'RandomForest'
> ```
>
> This would be easy to implement by accessing `self.__class__.__name__`, e.g. with a simple property like this:
>
> ```python
> @property
> def model_name(self):
>     return self.__class__.__name__
> ```
>
> ...at least I think...?

`NeuralNetTorch(module='mlp')` and `NeuralNetTorch(module='rbf')` would have the same name in this case. We could use prefix + module, e.g. `model_name = f"NN{self.module.upper()}"`, for all `NeuralNetTorch` architectures, which would give `NNMLP`, `NNRBF`, etc.
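
A rough sketch of that prefix-plus-module naming (again with an illustrative stand-in class, not the real autoemulate implementation) could look like this:

```python
# Illustrative sketch only: NeuralNetTorch stands in for the real
# autoemulate class, and only the naming logic is shown.
class NeuralNetTorch:
    def __init__(self, module="mlp"):
        self.module = module

    @property
    def model_name(self):
        # Prefix + module, so the 'mlp' and 'rbf' variants get distinct
        # names (self.__class__.__name__ alone would be the same for both).
        return f"NN{self.module.upper()}"


print(NeuralNetTorch(module="mlp").model_name)  # NNMLP
print(NeuralNetTorch(module="rbf").model_name)  # NNRBF
```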

kallewesterling commented 4 months ago

Sounds like a good solution to me!

mastoffel commented 4 months ago

Ok, implemented both of @bryanlimy's suggestions.

mastoffel commented 4 months ago

It would be good if you could quickly cast an eye over this too, @bryanlimy! Lots of changes here, but I think it's mostly fine.

bryanlimy commented 4 months ago

LGTM!