Prerequisites

1. The entire URL of the file you are using

models/official/nlp/optimization.py
Lines 68 to 107 in 7ecbac3

2. Describe the feature you request

Similar to beta_1, which is already a tunable parameter for creating optimizers, we could add beta_2, epsilon, weight_decay_rate, and exclude_from_weight_decay as tunable parameters by passing them as arguments to create_optimizer.
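A minimal sketch of what the extended factory could look like, assuming the module keeps its existing WarmUp and AdamWeightDecay classes; the defaults shown here only mirror the values the current function fixes internally and are a suggestion, not a final API:

```python
import tensorflow as tf

from official.nlp.optimization import AdamWeightDecay, WarmUp


def create_optimizer(init_lr,
                     num_train_steps,
                     num_warmup_steps,
                     end_lr=0.0,
                     beta_1=0.9,
                     beta_2=0.999,
                     epsilon=1e-6,
                     weight_decay_rate=0.01,
                     exclude_from_weight_decay=('LayerNorm', 'layer_norm',
                                                'bias')):
  """Sketch: exposes the AdamW hyperparameters that are currently fixed."""
  # Linear decay of the learning rate down to end_lr over training.
  lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
      initial_learning_rate=init_lr,
      decay_steps=num_train_steps,
      end_learning_rate=end_lr)
  # Optional linear warmup wrapped around the decay schedule.
  if num_warmup_steps:
    lr_schedule = WarmUp(
        initial_learning_rate=init_lr,
        decay_schedule_fn=lr_schedule,
        warmup_steps=num_warmup_steps)
  # These keyword arguments already exist on AdamWeightDecay; the change
  # is only to thread them through from this factory function.
  return AdamWeightDecay(
      learning_rate=lr_schedule,
      weight_decay_rate=weight_decay_rate,
      beta_1=beta_1,
      beta_2=beta_2,
      epsilon=epsilon,
      exclude_from_weight_decay=list(exclude_from_weight_decay))
```

Keeping the current values as the defaults would leave existing callers unaffected.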
3. Additional context
I was recently trying to fine-tune a Hugging Face RoBERTa model, and while doing so I wanted to add a scheduler as well as AdamW with custom parameters, which is how I came across these methods.
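For reference, a minimal sketch of the manual setup this currently requires, bypassing create_optimizer and wiring the schedule and optimizer directly (this assumes the WarmUp and AdamWeightDecay classes in this module and uses illustrative hyperparameter values):

```python
import tensorflow as tf

from official.nlp import optimization  # defines WarmUp and AdamWeightDecay

# Illustrative values only.
init_lr, num_train_steps, num_warmup_steps = 3e-5, 10000, 1000

# Linear decay to zero, with linear warmup wrapped around it.
decay_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=init_lr,
    decay_steps=num_train_steps,
    end_learning_rate=0.0)
lr_schedule = optimization.WarmUp(
    initial_learning_rate=init_lr,
    decay_schedule_fn=decay_schedule,
    warmup_steps=num_warmup_steps)

# Everything below is what create_optimizer currently fixes internally.
optimizer = optimization.AdamWeightDecay(
    learning_rate=lr_schedule,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.98,  # e.g. the beta_2 used for RoBERTa pretraining
    epsilon=1e-6,
    exclude_from_weight_decay=['LayerNorm', 'layer_norm', 'bias'])
```

Passing these through create_optimizer would remove the need to duplicate the schedule wiring just to change one or two hyperparameters.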
4. Are you willing to contribute it? (Yes or No)
Yes
Nice!! I was following this guide for fine-tuning TF BERT, where the official nlp optimizer is used to set up the optimizer. So, if this configurable optimization is ready, where can I find some examples of it? Or can we change the above-mentioned fine-tuning example to use this configurable optimization? Let me know what you think.
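Once the extra arguments are exposed, that guide would presumably only need the new keyword arguments added to its existing create_optimizer call; a hypothetical sketch (the beta_2, epsilon, weight_decay_rate, and exclude_from_weight_decay arguments are the proposed additions and are not in the released function yet, and the step counts are placeholders):

```python
from official.nlp import optimization

# Placeholders for whatever the guide computes from the dataset.
num_train_steps, warmup_steps = 10000, 1000

# Hypothetical call: only the last four arguments are new; the rest mirrors
# the usual create_optimizer call.
optimizer = optimization.create_optimizer(
    init_lr=2e-5,
    num_train_steps=num_train_steps,
    num_warmup_steps=warmup_steps,
    beta_2=0.999,
    epsilon=1e-6,
    weight_decay_rate=0.01,
    exclude_from_weight_decay=['LayerNorm', 'layer_norm', 'bias'])
```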