Open zelenooki87 opened 1 week ago
@zelenooki87 Are your results the same as or better than SeeSR's? Mine are much worse for some reason: https://github.com/cswry/OSEDiff/issues/9
Hi, thank you for the wonderful work. On Windows, this error occurs during code execution:
python test_osediff.py -i input -o output --osediff_path preset/models/osediff.pkl --pretrained_model_name_or_path preset/models/stable-diffusion-2-1-base/ --ram_ft_path preset/models/DAPE.pth --ram_path preset/models/ram_swin_large_14m.pth
Traceback (most recent call last):
  File "C:\Users\Miki\OSEDiff\test_osediff.py", line 68, in <module>
    DAPE = ram(pretrained=args.ram_path,
  File "C:\Users\Miki\OSEDiff\ram\models\ram_lora.py", line 329, in ram
    model = RAMLora(**kwargs)
  File "C:\Users\Miki\OSEDiff\ram\models\ram_lora.py", line 109, in __init__
    self.tokenizer = init_tokenizer()
  File "C:\Users\Miki\OSEDiff\ram\models\utils.py", line 132, in init_tokenizer
    tokenizer = BertTokenizer.from_pretrained('/home/notebook/data/group/LowLevelLLM/LLM/bert-base-uncased', local_files_only=True)
  File "C:\Users\Miki\anaconda3\envs\osediff\lib\site-packages\transformers\tokenization_utils_base.py", line 1770, in from_pretrained
    resolved_vocab_files[file_id] = cached_file(
  File "C:\Users\Miki\anaconda3\envs\osediff\lib\site-packages\transformers\utils\hub.py", line 409, in cached_file
    resolved_file = hf_hub_download(
  File "C:\Users\Miki\anaconda3\envs\osediff\lib\site-packages\huggingface_hub\utils\_validators.py", line 106, in _inner_fn
    validate_repo_id(arg_value)
  File "C:\Users\Miki\anaconda3\envs\osediff\lib\site-packages\huggingface_hub\utils\_validators.py", line 154, in validate_repo_id
    raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/home/notebook/data/group/LowLevelLLM/LLM/bert-base-uncased'. Use `repo_type` argument if needed.
As a temporary solution, I use utils.py from your previous repo: https://github.com/cswry/SeeSR/blob/main/ram/models/utils.py
And everything works. :)
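The last frames of the traceback show the root cause: utils.py passes a hardcoded server path to BertTokenizer.from_pretrained, and huggingface_hub rejects anything that is not shaped like a Hub repo id. A rough stdlib-only approximation of that shape check (the exact rule lives in huggingface_hub.utils.validate_repo_id, not in this regex):

```python
import re

# Rough approximation of the repo-id shape huggingface_hub accepts:
# 'repo_name' or 'namespace/repo_name', no leading slash.
REPO_ID_RE = re.compile(r"^[\w.\-]+(?:/[\w.\-]+)?$")

def looks_like_repo_id(value: str) -> bool:
    return bool(REPO_ID_RE.fullmatch(value))

# The hardcoded server path from utils.py fails the check...
print(looks_like_repo_id('/home/notebook/data/group/LowLevelLLM/LLM/bert-base-uncased'))  # False
# ...while a plain Hub id passes, which is why swapping the argument fixes the error.
print(looks_like_repo_id('bert-base-uncased'))  # True
```

So any path that begins with a slash (or a Windows drive letter plus backslashes) will raise HFValidationError before the tokenizer files are even looked up.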
Hello, I also encountered the same error. May I ask where to download the pretrained model that the tokenizer needs to load in utils.py? Did you download it from this page?
Comment out line 132 of \ram\models\utils.py and use the code on line 131 instead.
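For reference, a minimal sketch of that change. The real init_tokenizer in ram/models/utils.py does more (e.g. registering special tokens), which this sketch omits; the parameter name and the lazy import are only here to keep the snippet self-contained:

```python
def init_tokenizer(repo_or_path: str = 'bert-base-uncased'):
    # Lazy import so the sketch stands alone; in utils.py the import is at module level.
    from transformers import BertTokenizer
    # Resolve from the Hugging Face Hub (or a local folder / the HF cache) instead of
    # the hardcoded /home/notebook/... server path on line 132 of utils.py.
    return BertTokenizer.from_pretrained(repo_or_path)
```

A locally downloaded copy also works: pass its folder path (for example a hypothetical preset/models/bert-base-uncased) as the argument, which keeps offline runs possible.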
Thanks, it works after I downloaded the model.