This is where I put things I find useful that speed up my work with machine learning. Ever looked through your old projects to reuse those cool functions you created before? Well, this repo is designed to be a Python library of functions I created in my previous projects that can be reused. I also share some notebook tutorials and Python code snippets.
In your notebook:
https://colab.research.google.com/github/gmihaila/ml_things/blob/master/notebooks/pytorch/pretrain_transformers_pytorch.ipynb#scrollTo=UCLtm5BiXona
the `get_model_config()` method loads the config from `model_path`. How can I change it to a custom config? For example:
```python
from transformers import BertConfig

config = BertConfig(
    vocab_size=10000,
    hidden_size=256,
    num_hidden_layers=6,
    num_attention_heads=4,
    intermediate_size=3072,
    hidden_act="gelu",
    hidden_dropout_prob=0.1,
    attention_probs_dropout_prob=0.1,
    max_position_embeddings=512,
    type_vocab_size=2,
    pad_token_id=0,
    position_embedding_type="absolute",
    # note: `truncation` is a tokenizer argument, not a BertConfig field
)
```
I want to change `num_hidden_layers` to train a lighter BERT model.
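One way to do this (a sketch, assuming the Hugging Face `transformers` library; the variable names are illustrative and not taken from the notebook) is to skip loading the config from `model_path` and instead build a custom `BertConfig` directly, then instantiate a fresh, randomly initialized model from it:

```python
# Sketch: using a custom config instead of loading one from model_path.
# Assumes the Hugging Face `transformers` library is installed.
from transformers import BertConfig, BertForMaskedLM

# Build the config directly rather than via .from_pretrained(model_path).
config = BertConfig(
    vocab_size=10000,
    hidden_size=256,
    num_hidden_layers=6,   # fewer layers for a lighter model
    num_attention_heads=4,
    intermediate_size=3072,
)

# Create a randomly initialized model from the custom config; pretraining
# then starts from scratch with this smaller architecture.
model = BertForMaskedLM(config)
```

Note that a model built this way starts from random weights, so loading pretrained weights from `model_path` no longer applies; the custom architecture must be pretrained from scratch.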