foundation-model-stack / fms-hf-tuning

🚀 Collection of tuning recipes with HuggingFace SFTTrainer and PyTorch FSDP.
Apache License 2.0

feat: use native python logger instead of transformers logger #194

Open kmehant opened 2 weeks ago

kmehant commented 2 weeks ago

Move away from using the logger module from transformers: it was strictly meant to be used (https://github.com/huggingface/transformers/blob/1c73d85b86e8afa549eb1209b3e62afdaa664cad/src/transformers/utils/logging.py#L151) when writing a submodule within the transformers package. Use the native Python logger instead.

Also, setting levels on this logger does not seem to work and is quite buggy: it only works when the logger is fetched with the name transformers. Likewise, setting verbosity does not set the level of a logger created using get_logger(). We should instead use the native Python logger, and for transformers-specific logging we should simply set its verbosity level (https://huggingface.co/docs/transformers/en/main_classes/logging#transformers.utils.logging.set_verbosity_error) rather than use it as a logger in our code.
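A minimal sketch of the proposed split, using only the standard library (the logger name "fms_hf_tuning" is illustrative; the transformers verbosity call from the linked docs is shown commented out since it only configures transformers' own output):

```python
import logging

# Native Python logger for this package's own code; a module would
# typically use logging.getLogger(__name__) instead of a fixed name.
logger = logging.getLogger("fms_hf_tuning")
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(name)s %(levelname)s %(message)s"))
logger.addHandler(handler)

# Setting a level on a native logger works as expected.
logger.setLevel(logging.WARNING)
logger.info("suppressed")     # below WARNING, not emitted
logger.warning("emitted")     # at WARNING, emitted

# For transformers' internal logging, only set its verbosity;
# do not use its logger in our own code:
# import transformers
# transformers.utils.logging.set_verbosity_error()
```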