carefree0910 / carefree-creator

AI magics meet Infinite draw board.
https://creator.nolibox.com/guest
MIT License

cannot import name 'TPair' from 'cflearn.api.cv.diffusion' #32

Closed MiYoHu closed 1 year ago

MiYoHu commented 1 year ago

I encountered the following error when running with Docker:

ImportError: cannot import name 'TPair' from 'cflearn.api.cv.diffusion' (/usr/local/miniconda3/lib/python3.8/site-packages/cflearn/api/cv/diffusion.py)

I could not find the definition anywhere in the cflearn project.

carefree0910 commented 1 year ago

Ah, my bad, this is because the latest carefree-learn package has not been published to PyPI yet. I'll get it done today :D

carefree0910 commented 1 year ago

The dependencies have been updated now! The target versions are:

p.s. Since the logs in your question contain Chinese, I'm guessing you might be using the Tsinghua mirror. If so, you may need to wait a little longer, because it takes the Tsinghua mirror some time to sync with the official PyPI 🤣
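If you'd rather not wait for the mirror, you could also point pip at the official index directly when upgrading; a minimal example (assuming a standard pip setup inside your env / container) would be:

pip install -U carefree-learn -i https://pypi.org/simple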

MiYoHu commented 1 year ago

Thank you for your timely answer, but when I upgraded the relevant dependencies, a new error appeared 😟, and the stack information is as follows:

File "/usr/local/miniconda3/envs/cfenv/lib/python3.10/site-packages/cflearn/models/cv/diffusion/cond_models/clip.py", line 143, in __init__
    self.tokenizer = ITokenizer.make(tokenizer, {})
File "/usr/local/miniconda3/envs/cfenv/lib/python3.10/site-packages/cflearn/models/nlp/tokenizers/schema.py", line 32, in make
    return super().make(name, config)
File "/usr/local/miniconda3/envs/cfenv/lib/python3.10/site-packages/cftool/misc.py", line 555, in make
    return base(*config)  # type: ignore
File "/usr/local/miniconda3/envs/cfenv/lib/python3.10/site-packages/cflearn/models/nlp/tokenizers/clip.py", line 25, in __init__
    self.tokenizer = self.base.from_pretrained(self.tag)
File "/usr/local/miniconda3/envs/cfenv/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1804, in from_pretrained
    return cls._from_pretrained(
File "/usr/local/miniconda3/envs/cfenv/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1959, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
File "/usr/local/miniconda3/envs/cfenv/lib/python3.10/site-packages/transformers/models/clip/tokenization_clip.py", line 322, in __init__
    with open(vocab_file, encoding="utf-8") as vocab_handle:
TypeError: expected str, bytes or os.PathLike object, not NoneType

The tag variable in the ICLIPTokenizer class does not seem to be defined, which causes an exception to be thrown when from_pretrained is called.

carefree0910 commented 1 year ago

Hmmm, this looks like a weird bug, and mostly like a bug in transformers (version mismatch, etc.), because it seems transformers somehow passes a None to itself... 😔

My transformers version is 4.26.1, what about yours?
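You can check it quickly with, e.g.:

python -c "import transformers; print(transformers.__version__)"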

And, if possible, could you please add:

print("> tag", self.tag)
print("> base", self.base)

at "/usr/local/miniconda3/envs/cfenv/lib/python3.10/site-packages/cflearn/models/nlp/tokenizers/clip.py", line 25? So I can somehow 'unittest' it if it's a transformers bug!

The tag variable is defined in its subclasses (e.g., the CLIPTokenizer below). After all, if the tag variable were not defined, an exception would be thrown much earlier. 😉
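For illustration, the pattern roughly looks like this (a simplified, hypothetical sketch, not the actual cflearn source):

from transformers import CLIPTokenizer as HFCLIPTokenizer

class ICLIPTokenizer:
    # the parent class only references `tag` / `base`; concrete subclasses fill them in
    tag: str
    base = HFCLIPTokenizer

    def __init__(self):
        # if `tag` were truly undefined, this line would raise an AttributeError
        # long before transformers ever saw a `None` vocab_file
        self.tokenizer = self.base.from_pretrained(self.tag)

class CLIPTokenizer(ICLIPTokenizer):
    tag = "openai/clip-vit-large-patch14"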

MiYoHu commented 1 year ago

Of course, the transformers version I use is the same as yours. After adding the debugging code, the console prints out the following results:

init APIs.SD
> tag openai/clip-vit-large-patch14
> base <class 'transformers.models.clip.tokenization_clip.CLIPTokenizer'>

This error occurs after printing 😔

carefree0910 commented 1 year ago

The log is as expected, but it makes things even weirder... you see, what I've done is:

self.tokenizer = self.base.from_pretrained(self.tag)

which according to your log, should be equivalent to:

self.tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")

which is just pure transformers usage. I tested it in my local env and it's OK; could you please run the above statement locally (e.g., via ipython or a standalone script) and see if it works in your env?
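For example, a minimal standalone check (assuming only transformers is needed) would be:

from transformers import CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
print(tokenizer)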

MiYoHu commented 1 year ago

Thank you very much for your help! The project successfully ran on my computer😊, and that strange error, which should not have appeared, has disappeared! Surprisingly, I didn't make any changes to the code or dependencies😮! Thankfully, everything is back to normal now!

carefree0910 commented 1 year ago

Haha that's cool! Hope you enjoy this project!! 🤩