Closed: SkyFallen196 closed this issue 1 month ago
I am trying to deploy a project in Google Colab using a T4 GPU:
pip install transformers==4.38.2 pyarrow==14.0.1 requests==2.31.0 git+https://github.com/facebookresearch/nougat
Try installing an even earlier version of transformers with pip install transformers==4.25.1. It worked for me in Google Colab.
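The version pin matters because nougat's custom decoder predates keyword arguments that newer transformers releases pass during generation. A small stdlib-only sketch (a hypothetical helper, not part of nougat or transformers) for checking whether an installed version is at or below the pin suggested above:

```python
def parse_version(v: str) -> tuple[int, ...]:
    # "4.38.2" -> (4, 38, 2); ignores non-numeric parts such as "rc1"
    return tuple(int(p) for p in v.split(".") if p.isdigit())

def at_or_below_pin(installed: str, pin: str = "4.25.1") -> bool:
    # The thread reports transformers==4.25.1 working with nougat in Colab.
    return parse_version(installed) <= parse_version(pin)

print(at_or_below_pin("4.25.1"))  # True: matches the working pin
print(at_or_below_pin("4.38.2"))  # False: the version from the original command
```

Note this simple tuple comparison is only a sketch; for real version handling, the packaging library's version parser is the usual choice.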
pip install transformers==4.38.2
It's working. Thanks!
Hello! Does anyone know how to fix these errors?
downloading nougat checkpoint version 0.1.0-small to path /root/.cache/torch/hub/nougat-0.1.0-small
config.json: 100% 557/557 [00:00<00:00, 2.16MB/s]
pytorch_model.bin: 100% 956M/956M [00:02<00:00, 380MB/s]
special_tokens_map.json: 100% 96.0/96.0 [00:00<00:00, 518kB/s]
tokenizer.json: 100% 2.04M/2.04M [00:00<00:00, 173MB/s]
tokenizer_config.json: 100% 106/106 [00:00<00:00, 587kB/s]
INFO:root:Output directory does not exist. Creating output directory.
/content/myenv/lib/python3.10/site-packages/torch/functional.py:512: UserWarning: torch.meshgrid: in an upcoming release, it will be required to pass the indexing argument. (Triggered internally at ../aten/src/ATen/native/TensorShape.cpp:3587.)
  return _VF.meshgrid(tensors, **kwargs)  # type: ignore[attr-defined]
  0% 0/2 [00:01<?, ?it/s]
Traceback (most recent call last):
  File "/content/myenv/bin/nougat", line 8, in <module>
    sys.exit(main())
  File "/content/myenv/lib/python3.10/site-packages/predict.py", line 167, in main
    model_output = model.inference(
  File "/content/myenv/lib/python3.10/site-packages/nougat/model.py", line 592, in inference
    decoder_output = self.decoder.model.generate(
  File "/content/myenv/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/content/myenv/lib/python3.10/site-packages/transformers/generation/utils.py", line 1914, in generate
    result = self._sample(
  File "/content/myenv/lib/python3.10/site-packages/transformers/generation/utils.py", line 2648, in _sample
    model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
TypeError: BARTDecoder.prepare_inputs_for_inference() got an unexpected keyword argument 'cache_position'
-> Cannot close object, library is destroyed. This may cause a memory leak!
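The traceback comes from a signature mismatch: newer transformers releases forward an extra cache_position keyword from generate() into the model's input-preparation hook, and nougat's BARTDecoder.prepare_inputs_for_inference does not accept it. A minimal sketch of the mechanism (simplified standalone functions, not nougat's actual code):

```python
# Fixed signature, like nougat's hook: no **kwargs, so an unexpected
# keyword from a newer transformers release raises TypeError.
def prepare_inputs_for_inference(input_ids, past_key_values=None):
    return {"input_ids": input_ids, "past_key_values": past_key_values}

try:
    # Newer transformers versions pass cache_position here.
    prepare_inputs_for_inference([1, 2, 3], cache_position=0)
except TypeError as e:
    print(e)  # ... got an unexpected keyword argument 'cache_position'

# A tolerant signature with **kwargs absorbs unknown keywords; pinning
# transformers to an older release avoids the extra keyword entirely.
def prepare_inputs_tolerant(input_ids, past_key_values=None, **kwargs):
    return {"input_ids": input_ids, "past_key_values": past_key_values}

prepare_inputs_tolerant([1, 2, 3], cache_position=0)  # no error
```

This is why downgrading transformers (as suggested above) resolves the error: older releases never pass cache_position into the hook.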