IIMunchII opened 2 years ago
I agree that implementing efficient loss functions should be the goal.
I'd also say it would be useful if the loss functions were compatible with both neural network frameworks (e.g., PyTorch) and tree-based models (e.g., XGBoost), since the latter are state of the art for tabular data and the former for most other domains.
So far @KasperGroesLudvigsen has raised the need for a general-purpose loss function package whose loss functions can be accessed and used from frameworks like PyTorch. I believe such a library should focus on implementing efficient variants of loss functions taken directly from scientific articles. The process would then be straightforward: look up an article describing a loss function or statistical method and provide a good implementation of it, usually in NumPy, though other efficient backends could work as well.
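To make the idea concrete, here is a minimal sketch of what such a NumPy-based loss could look like. It implements the pseudo-Huber loss (a smooth approximation of the Huber loss) and returns the loss together with its gradient and Hessian with respect to the predictions, which is exactly the pair of arrays XGBoost expects from a custom objective. The function name, signature, and `delta` parameter are hypothetical illustrations of a possible package API, not an existing implementation.

```python
import numpy as np

def pseudo_huber(pred, target, delta=1.0):
    """Pseudo-Huber loss with gradient and Hessian.

    Hypothetical API sketch: returns per-sample loss, gradient,
    and Hessian w.r.t. `pred`, so the same function can be used
    for evaluation or plugged into XGBoost as a custom objective.
    """
    r = pred - target                       # residuals
    scaled = (r / delta) ** 2
    root = np.sqrt(1.0 + scaled)
    loss = delta ** 2 * (root - 1.0)        # smooth near 0, linear for large |r|
    grad = r / root                         # d loss / d pred
    hess = (1.0 + scaled) ** -1.5           # d^2 loss / d pred^2
    return loss, grad, hess
```

An XGBoost custom objective would then just be a thin wrapper, e.g. `lambda pred, dtrain: pseudo_huber(pred, dtrain.get_label())[1:]`, while PyTorch users could port the same formula to tensor operations and rely on autograd instead of the analytic gradient.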
Documentation and references to the underlying scientific articles are, in my opinion, vital for the trustworthiness of such a package.