nnstreamer / nntrainer

NNtrainer is a software framework for training neural network models on devices.
Apache License 2.0

[Wait for #2580] [ Mixed Precision ] Enable Mixed Precision #2581

Open jijoongmoon opened 1 month ago

jijoongmoon commented 1 month ago

In this PR

This PR enables Mixed Precision Training. For now, only FP16-FP32 is considered. Additional test cases will be added.

- Add `getSortedLayerIdx` to set the graph order for forwarding.
- Change `clip_weights` to `lazy_apply_weights` to cover both cases.
- Add `forwarding_op` to re-run forwarding from the layer whose gradient contains NaN.
- Add a while loop to re-run backwarding after resetting the loss scale.
- Add `setLossScale` in `RunLayerContext`.
- Add a gradient check when mixed precision is enabled.

Self evaluation:

  1. Build test: [X]Passed [ ]Failed [ ]Skipped
  2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: jijoong.moon jijoong.moon@samsung.com

taos-ci commented 1 month ago

:memo: TAOS-CI Version: 1.5.20200925. Thank you for submitting PR #2581. Please follow the 1 commit/1 PR (one commit per PR) policy to get comments from reviewers quickly. Your PR must pass all verification processes of cibot before the review by reviewers starts. If you are a new member joining this project, please read the manuals in the documentation folder and the wiki page. To monitor the progress status of your PR in more detail, visit http://ci.nnstreamer.ai/.