ganler / ResearchReading

Reading notes on general systems research material (not limited to papers).

PACT'17 | End-to-end Deep Learning of Optimization Heuristics #66

ganler closed this issue 2 years ago

ganler commented 2 years ago

DeepTune

https://chriscummins.cc/pub/2017-pact.pdf

Paper Summary

DeepTune is earlier work by the author of ProGraML. It proposes automatic feature extraction for compiler optimization heuristics: instead of hand-crafted features, it learns feature vectors directly from program source code. The pipeline starts with source normalization, where the code is rewritten so that superficial syntactic differences (e.g., identifier names, comments, formatting) do not affect the learned representation. The normalized token sequence is embedded and fed into an LSTM, whose output embedding is concatenated with auxiliary inputs (if present); the auxiliary inputs are passed through a batch-normalization layer to stabilize their scale. Finally, dense layers map the concatenated vector to a prediction in [0, 1].

DeepTune targets two downstream tasks: 1) device mapping and 2) thread coarsening. For device mapping, it is compared against a static mapping and the hand-crafted-feature system of Grewe et al.; DeepTune generally outperforms both baselines across devices (3.43x and 1.42x, respectively), though on a few benchmarks the simpler baselines still work better. For thread coarsening, it outperforms the baseline by 16%.
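To make the pipeline concrete, here is a minimal PyTorch sketch of a DeepTune-style model following the description above. All sizes (vocabulary, embedding, LSTM, dense layers) and the choice of two auxiliary features are illustrative assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn


class DeepTuneLike(nn.Module):
    """Token embedding -> LSTM -> concat auxiliary inputs -> dense head."""

    def __init__(self, vocab_size=128, embed_dim=64, lstm_dim=64, aux_dim=2):
        super().__init__()
        # Normalized source tokens are mapped to learned embeddings.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # The LSTM summarizes the token sequence into a single code vector.
        self.lstm = nn.LSTM(embed_dim, lstm_dim, num_layers=2, batch_first=True)
        # Auxiliary inputs are rescaled with batch normalization.
        self.aux_bn = nn.BatchNorm1d(aux_dim)
        # Dense layers map the concatenated vector to a score in [0, 1].
        self.head = nn.Sequential(
            nn.Linear(lstm_dim + aux_dim, 32),
            nn.ReLU(),
            nn.Linear(32, 1),
            nn.Sigmoid(),
        )

    def forward(self, token_ids, aux):
        x = self.embedding(token_ids)        # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(x)           # h_n: (num_layers, batch, lstm_dim)
        code_vec = h_n[-1]                   # last layer's final hidden state
        fused = torch.cat([code_vec, self.aux_bn(aux)], dim=1)
        return self.head(fused)              # prediction in [0, 1]


# Example: score a batch of 4 tokenized kernels, each with 2 auxiliary
# features (e.g. work-group size and data size for device mapping).
model = DeepTuneLike()
tokens = torch.randint(0, 128, (4, 50))      # 50 normalized tokens per kernel
aux = torch.rand(4, 2)
print(model(tokens, aux).shape)              # torch.Size([4, 1])
```

For device mapping the [0, 1] output can be thresholded into a CPU/GPU decision, while thread coarsening would instead use a small multi-class head over the candidate coarsening factors; both are sketched here only at the level the summary above describes.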

Strengths

Weaknesses