TheAlgorithms / Rust

All Algorithms implemented in Rust
MIT License

Suggestion: Adding Loss functions of Machine Learning in maths folder #559

Closed · GreatRSingh closed this issue 1 month ago

GreatRSingh commented 1 year ago

I would like to suggest adding loss functions to this repo.

The loss function estimates how well a particular algorithm models the provided data.
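For illustration only, here is a rough sketch of what one such function (MSE) could look like in Rust; the function name and signature below are hypothetical and not part of the repository's existing code:

```rust
/// Mean Squared Error: the average of the squared differences between
/// predictions and targets. Assumes both slices have the same, non-zero length.
pub fn mean_squared_error(predictions: &[f64], targets: &[f64]) -> f64 {
    assert_eq!(predictions.len(), targets.len());
    let sum: f64 = predictions
        .iter()
        .zip(targets)
        .map(|(p, t)| (p - t).powi(2))
        .sum();
    sum / predictions.len() as f64
}
```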

Add Loss Functions to: machine_learning/loss_functions

Task List:

Navaneeth-Sharma commented 1 year ago

Hi @GreatRSingh, thanks for the suggestion. Just to give an update, some of them are already implemented in the maths section; please check that. I have taken this up to add as many functions as possible.

GreatRSingh commented 1 year ago

@Navaneeth-Sharma Can you put up a list of loss functions that you are going to implement and which you have already implemented?

Navaneeth-Sharma commented 1 year ago

Sorry @GreatRSingh, I think I misread loss functions as activation functions. Loss functions aren't implemented, but I have plans to do that, especially the basic ones like Cross Entropy, MSE, and RMSE. You can take up some of those if you'd like to contribute.

GreatRSingh commented 1 year ago

@Navaneeth-Sharma ok I will do that.

Navaneeth-Sharma commented 1 year ago

Hi, just a heads-up so that we don't implement the same losses: I will try to take up these loss functions.

GreatRSingh commented 1 year ago

@Navaneeth-Sharma No, none of them are taken yet.

I will take up MSE, NLL, MAE, MarginRanking, KLDivergence.
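For anyone picking these up, a minimal sketch of how MAE might look (illustrative name and signature only, assuming f64 slices of equal length):

```rust
/// Mean Absolute Error: the average of the absolute differences between
/// predictions and targets.
pub fn mean_absolute_error(predictions: &[f64], targets: &[f64]) -> f64 {
    assert_eq!(predictions.len(), targets.len());
    let sum: f64 = predictions
        .iter()
        .zip(targets)
        .map(|(p, t)| (p - t).abs())
        .sum();
    sum / predictions.len() as f64
}
```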

GreatRSingh commented 1 year ago

@Navaneeth-Sharma @siriak I have added a list of loss functions to the description. Keep suggesting any others if needed.

github-actions[bot] commented 1 year ago

This issue has been automatically marked as abandoned because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

GreatRSingh commented 1 year ago

Working on adding Hinge Loss
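For reference, a hedged sketch of the standard hinge loss definition (not the code of any open PR; labels are assumed to be encoded as -1.0 or 1.0):

```rust
/// Hinge loss for binary classification: the mean of max(0, 1 - y * y_hat),
/// where the true labels y are encoded as -1.0 or 1.0.
pub fn hinge_loss(labels: &[f64], predictions: &[f64]) -> f64 {
    assert_eq!(labels.len(), predictions.len());
    let sum: f64 = labels
        .iter()
        .zip(predictions)
        .map(|(y, y_hat)| (1.0 - y * y_hat).max(0.0))
        .sum();
    sum / labels.len() as f64
}
```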

github-actions[bot] commented 11 months ago

This issue has been automatically marked as abandoned because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

mobley-trent commented 11 months ago

Are all these functions taken?

GreatRSingh commented 11 months ago

No, you can start working on them.


avats-dev commented 10 months ago

Opened a PR (#656) to add KL divergence loss as a subtask of this issue. Kindly review.
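For readers unfamiliar with it, KL divergence D(P || Q) is the sum over i of p_i * ln(p_i / q_i). A minimal sketch of that definition (not the code from #656; it assumes both slices are probability distributions of equal length with q_i > 0 wherever p_i > 0):

```rust
/// Kullback-Leibler divergence D(P || Q) = sum over i of p_i * ln(p_i / q_i).
/// Terms with p_i == 0 contribute nothing and are skipped.
pub fn kl_divergence(p: &[f64], q: &[f64]) -> f64 {
    assert_eq!(p.len(), q.len());
    p.iter()
        .zip(q)
        .filter(|&(pi, _)| *pi > 0.0)
        .map(|(pi, qi)| pi * (pi / qi).ln())
        .sum()
}
```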

sozelfist commented 8 months ago

I have opened a PR (#697) that implements the Huber loss function mentioned in this issue :rocket:. Please take a look at the PR :hugs:.
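For context, Huber loss is quadratic for small residuals and linear for large ones. A hedged sketch of that definition (not the code from #697; delta is the switch-over threshold):

```rust
/// Huber loss: 0.5 * e^2 for residuals with |e| <= delta,
/// and delta * (|e| - 0.5 * delta) otherwise, averaged over all samples.
pub fn huber_loss(predictions: &[f64], targets: &[f64], delta: f64) -> f64 {
    assert_eq!(predictions.len(), targets.len());
    let sum: f64 = predictions
        .iter()
        .zip(targets)
        .map(|(p, t)| {
            let e = (p - t).abs();
            if e <= delta {
                0.5 * e * e
            } else {
                delta * (e - 0.5 * delta)
            }
        })
        .sum();
    sum / predictions.len() as f64
}
```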

jkauerl commented 6 months ago

Are there any functions that can still be worked on?

siriak commented 6 months ago

I think NLL and Marginal Ranking from the list are still not implemented
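For whoever takes these up, margin ranking loss is typically defined as the mean of max(0, -y * (x1 - x2) + margin). A rough, illustrative sketch (names and signature are not from the repo):

```rust
/// Margin ranking loss: the mean of max(0, -y * (x1 - x2) + margin),
/// where y is 1.0 if x1 should rank higher than x2 and -1.0 otherwise.
pub fn margin_ranking_loss(x1: &[f64], x2: &[f64], labels: &[f64], margin: f64) -> f64 {
    assert_eq!(x1.len(), x2.len());
    assert_eq!(x1.len(), labels.len());
    let sum: f64 = x1
        .iter()
        .zip(x2)
        .zip(labels)
        .map(|((a, b), y)| (-y * (a - b) + margin).max(0.0))
        .sum();
    sum / x1.len() as f64
}
```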

jkauerl commented 6 months ago

Finished implementing both, but I have a question regarding the PR. Should I open 2 separate PRs or 1 PR containing both algorithms?

siriak commented 6 months ago

2 separate PRs please

sozelfist commented 6 months ago

When opening a PR, please make sure that it is on a separate branch, like feat/ml/loss/nll or feat/ml/loss/marginal_ranking, instead of the master branch of your fork. This helps keep the git history clean and avoids accidents when merging code from the original repository into your fork. Please commit changes with meaningful messages that reflect the changes to the source code, rather than making many redundant commits for a single change.

realstealthninja commented 1 month ago

I would like to suggest adding loss functions to this repo.

The loss function estimates how well a particular algorithm models the provided data.

Add Loss Functions to: machine_learning/loss_functions

Task List:

* [x] Cross-Entropy
* [x] Hinge loss
* [x] Huber loss
* [x] MSE
* [ ] NLL
* [x] MAE
* [ ] Marginal Ranking
* [x] KL Divergence

We should probably cross off MR & NLL since those have been implemented in #742 and #734 respectively, effectively closing this issue?

siriak commented 1 month ago

Well done, thank you everybody!