google / uncertainty-baselines

High-quality implementations of standard and SOTA methods on a variety of tasks.
Apache License 2.0

Question about SNGP BERT flags #1252

Closed xuefei-wang closed 1 year ago

xuefei-wang commented 1 year ago

Hi,

I noticed that for the BERT-SNGP model, all of the following are implemented, but the flags for the first two are set to False:

  • 'spec_norm_att'
  • 'spec_norm_ffn'
  • 'spec_norm_plr'

Is there a particular reason for that?

I also find it difficult to reconcile the implementation here with the official TensorFlow tutorial (https://www.tensorflow.org/text/tutorials/uncertainty_quantification_with_sngp_bert). Which one would be the better reference?

Thanks!!

jereliu commented 1 year ago

Hi Xuefei!

If memory serves, they are set to False because preliminary experiments found that they hurt generalization on larger BERT models.

In general, please follow the official tutorial when working with smaller models, and consider the settings used here for larger models loaded from pretrained checkpoints (e.g., BERT-Base or larger).

Thanks! Jeremiah
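For context on what these flags control: each `spec_norm_*` flag toggles spectral normalization on a different sub-module (attention, feed-forward, and the penultimate layer), which constrains the layer's largest singular value and thus its Lipschitz constant. As a rough, hedged illustration of the underlying operation (this is a NumPy sketch of power iteration, not the actual uncertainty-baselines implementation, and the function names `spectral_norm` / `apply_spectral_norm` are made up for this example):

```python
import numpy as np

def spectral_norm(w, n_iters=100, seed=0):
    """Estimate the largest singular value of w via power iteration."""
    rng = np.random.default_rng(seed)
    u = rng.normal(size=w.shape[0])
    v = w.T @ u  # initialize v so it is always defined
    for _ in range(n_iters):
        v = w.T @ u
        v /= np.linalg.norm(v)
        u = w @ v
        u /= np.linalg.norm(u)
    # With u and v normalized, u^T w v converges to the top singular value.
    return float(u @ w @ v)

def apply_spectral_norm(w, c=0.95):
    """Rescale w so its spectral norm is at most c (a Lipschitz bound)."""
    sigma = spectral_norm(w)
    return w * (c / sigma) if sigma > c else w
```

Disabling a flag simply means the corresponding weight matrices are used unscaled, trading the Lipschitz guarantee for unconstrained capacity, which is apparently what helped on larger BERT models.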


xuefei-wang commented 1 year ago

Hi Jeremiah, that's very helpful, thanks!