Closed bdeva1975 closed 1 week ago
Hi @bdeva1975! This is likely a bug in a recent transformers version. Just to check: are you using the version indicated in the book? Could you show me the output that you get from `pip show transformers`, please?
You can try a different model (for example, liminerity/Phigments12) or a newer transformers version (`pip install -U transformers`).
@bdeva1975 is this resolved, please?
```python
from transformers import pipeline
import torch

generate_text = pipeline(
    model="liminerity/Phigments12",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
```
The above code worked and generated the following response:
```
[{'generated_text': "In this chapter, we'll discuss first steps with generative AI in Python. We'll cover the"}]
```
But when I run the following code:
```python
from langchain import PromptTemplate, LLMChain

template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["question"])
llm_chain = LLMChain(prompt=prompt, llm=generate_text)

question = "What is electroencephalography?"
print(llm_chain.run(question))
```
I get the following error:
```
ValidationError                           Traceback (most recent call last)
Cell In[10], line 6
      3 template = """Question: {question}
      4 Answer: Let's think step by step."""
      5 prompt = PromptTemplate(template=template, input_variables=["question"])
----> 6 llm_chain = LLMChain(prompt=prompt, llm=generate_text)
      8 question = "What is electroencephalography?"
      9 print(llm_chain.run(question))

File ~\Anaconda3\envs\langchain_ai\lib\site-packages\langchain_core\load\serializable.py:120, in Serializable.__init__(self, **kwargs)
    119 def __init__(self, **kwargs: Any) -> None:
--> 120     super().__init__(**kwargs)
    121     self._lc_kwargs = kwargs

File ~\Anaconda3\envs\langchain_ai\lib\site-packages\pydantic\v1\main.py:341, in BaseModel.__init__(__pydantic_self__, **data)
    339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
    340 if validation_error:
--> 341     raise validation_error
    342 try:
    343     object_setattr(__pydantic_self__, '__dict__', values)

ValidationError: 2 validation errors for LLMChain
llm
  instance of Runnable expected (type=type_error.arbitrary_type; expected_arbitrary_type=Runnable)
llm
  instance of Runnable expected (type=type_error.arbitrary_type; expected_arbitrary_type=Runnable)
```
@bdeva1975, hi! Is this with your setup of LC version 0.1? The code on this branch/in the book is relevant to LC version 0.0.284. Have you tried `pip install -r requirements.txt`, or even something like this, just copy-pasting from the requirements.txt file:
```
pip install 'langchain[docarray]==0.0.284' 'langchain_experimental==0.0.25' 'sentence-transformers>=2.2.2'
```
It's really difficult to help you if you work with versions very different from what's in the book/on GitHub.
Best, Ben
Similar error from http://localhost:8888/notebooks/langchain_transformers.ipynb
```
TypeError: transformers.generation.utils.GenerationMixin.generate() got multiple values for keyword argument 'pad_token_id'
```
My machine: Docker container, MacBook M2, Ubuntu UTM.
ubuntu@ubuntu:~$ uname -a
Linux ubuntu 5.15.0-86-generic #96-Ubuntu SMP Wed Sep 20 08:29:36 UTC 2023 aarch64 aarch64 aarch64 GNU/Linux
ubuntu@ubuntu:~$ docker ps -a
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
455fb048cfbd ben_langchain_ai "jupyter notebook --…" 3 minutes ago Up 3 minutes 0.0.0.0:8888->8888/tcp, :::8888->8888/tcp ben
BTW, I had to pin a lower gpt4all version in requirements.txt due to:
```
# ERROR: Could not find a version that satisfies the requirement gpt4all>=1.0.6 (from versions: 0.1.5, 0.1.6, 0.1.7)
# ERROR: No matching distribution found for gpt4all>=1.0.6
#gpt4all>=1.0.6
gpt4all>=0.1.7
```
pip list below:
(base) root@455fb048cfbd:/home# pip list
Package Version
---------------------------------------- ---------------
accelerate 0.29.2
aiofiles 23.2.1
aiohttp 3.9.4
aiohttp-cors 0.7.0
aiosignal 1.3.1
altair 5.3.0
annotated-types 0.6.0
anyio 4.3.0
apify_client 1.6.4
apify_shared 1.1.1
archspec 0.2.1
argon2-cffi 23.1.0
argon2-cffi-bindings 21.2.0
arrow 1.3.0
arxiv 2.1.0
asgiref 3.8.1
asttokens 2.4.1
async-lru 2.0.4
attrs 23.2.0
Babel 2.14.0
backoff 2.2.1
backports.tarfile 1.0.0
bcrypt 4.1.2
beautifulsoup4 4.12.3
bleach 6.1.0
blinker 1.7.0
boltons 23.0.0
Brotli 1.0.9
build 1.2.1
cachetools 5.3.3
certifi 2024.2.2
cffi 1.16.0
chardet 5.2.0
charset-normalizer 3.3.2
chroma-hnswlib 0.7.3
chromadb 0.4.24
click 8.1.7
coloredlogs 15.0.1
colorful 0.5.6
comm 0.2.2
conda 24.1.2
conda-content-trust 0.2.0
conda-libmamba-solver 24.1.0
conda-package-handling 2.2.0
conda_package_streaming 0.9.0
contourpy 1.2.1
cryptography 42.0.2
curl_cffi 0.6.2
cycler 0.12.1
dataclasses-json 0.5.14
dataclasses-json-speakeasy 0.5.11
debugpy 1.8.1
decorator 5.1.1
defusedxml 0.7.1
Deprecated 1.2.14
distlib 0.3.8
distro 1.8.0
docarray 0.32.1
docstring_parser 0.16
duckduckgo_search 5.3.0
emoji 2.11.0
executing 2.0.1
faiss-cpu 1.8.0
fastapi 0.110.1
fastjsonschema 2.19.1
feedparser 6.0.10
ffmpy 0.3.2
filelock 3.13.4
filetype 1.2.0
flatbuffers 24.3.25
fonttools 4.51.0
fqdn 1.5.1
frozenlist 1.4.1
fsspec 2024.3.1
gitdb 4.0.11
GitPython 3.1.43
google-ai-generativelanguage 0.6.1
google-api-core 2.18.0
google-api-python-client 2.125.0
google-auth 2.29.0
google-auth-httplib2 0.2.0
google-cloud-aiplatform 1.47.0
google-cloud-bigquery 3.20.1
google-cloud-core 2.4.1
google-cloud-resource-manager 1.12.3
google-cloud-storage 2.16.0
google-crc32c 1.5.0
google-generativeai 0.5.0
google-resumable-media 2.7.0
googleapis-common-protos 1.63.0
gpt4all 0.1.7
gradio 4.26.0
gradio_client 0.15.1
greenlet 3.0.3
grpc-google-iam-v1 0.13.0
grpcio 1.62.1
grpcio-status 1.62.1
h11 0.14.0
hnswlib 0.8.0
httpcore 1.0.5
httplib2 0.22.0
httptools 0.6.1
httpx 0.27.0
huggingface-hub 0.22.2
humanfriendly 10.0
idna 3.4
importlib-metadata 7.0.0
importlib_resources 6.4.0
iniconfig 2.0.0
ipykernel 6.29.4
ipython 8.23.0
ipywidgets 8.1.2
isoduration 20.11.0
jaraco.context 5.3.0
jedi 0.19.1
Jinja2 3.1.3
joblib 1.4.0
json5 0.9.25
jsonpatch 1.32
jsonpath-python 1.0.6
jsonpointer 2.1
jsonschema 4.21.1
jsonschema-specifications 2023.12.1
jupyter 1.0.0
jupyter_client 8.6.1
jupyter-console 6.6.3
jupyter_core 5.7.2
jupyter-events 0.10.0
jupyter-lsp 2.2.5
jupyter_server 2.14.0
jupyter_server_terminals 0.5.3
jupyterlab 4.1.6
jupyterlab_pygments 0.3.0
jupyterlab_server 2.26.0
jupyterlab_widgets 3.0.10
kiwisolver 1.4.5
kubernetes 29.0.0
lanarky 0.7.16
langchain 0.0.284
langchain-decorators 0.5.6
langchain-experimental 0.0.25
langdetect 1.0.9
langsmith 0.0.65
libmambapy 1.5.6
lxml 5.2.1
markdown-it-py 3.0.0
MarkupSafe 2.1.5
marshmallow 3.21.1
matplotlib 3.8.4
matplotlib-inline 0.1.6
mdurl 0.1.2
menuinst 2.0.2
mistune 3.0.2
mmh3 4.1.0
monotonic 1.6
more-itertools 10.2.0
mpmath 1.3.0
msgpack 1.0.8
multidict 6.0.5
mypy-extensions 1.0.0
nbclient 0.10.0
nbconvert 7.16.3
nbformat 5.10.4
nest-asyncio 1.6.0
networkx 3.3
nltk 3.8.1
notebook 7.1.2
notebook_shim 0.2.4
numexpr 2.10.0
numpy 1.26.4
oauthlib 3.2.2
onnxruntime 1.17.3
openai 0.28.1
opencensus 0.11.4
opencensus-context 0.1.3
opentelemetry-api 1.24.0
opentelemetry-exporter-otlp-proto-common 1.24.0
opentelemetry-exporter-otlp-proto-grpc 1.24.0
opentelemetry-instrumentation 0.45b0
opentelemetry-instrumentation-asgi 0.45b0
opentelemetry-instrumentation-fastapi 0.45b0
opentelemetry-proto 1.24.0
opentelemetry-sdk 1.24.0
opentelemetry-semantic-conventions 0.45b0
opentelemetry-util-http 0.45b0
orjson 3.10.0
overrides 7.7.0
packaging 23.1
pandas 2.2.2
pandoc 1.1.0
pandocfilters 1.5.1
parso 0.8.4
pexpect 4.9.0
pillow 10.3.0
pip 24.0
platformdirs 3.10.0
pluggy 1.0.0
plumbum 1.8.2
ply 3.11
posthog 3.5.0
prometheus_client 0.20.0
prompt-toolkit 3.0.43
promptwatch 0.4.4
proto-plus 1.23.0
protobuf 4.25.3
psutil 5.9.8
ptyprocess 0.7.0
pulsar-client 3.5.0
pure-eval 0.2.2
py-spy 0.3.14
pyarrow 15.0.2
pyasn1 0.6.0
pyasn1_modules 0.4.0
pycosat 0.6.6
pycparser 2.21
pydantic 2.7.0
pydantic_core 2.18.1
pydeck 0.8.1b0
pydub 0.25.1
Pygments 2.17.2
pyparsing 3.1.2
pypdf 4.2.0
PyPika 0.48.9
pyproject_hooks 1.0.0
PySocks 1.7.1
pytest 7.3.1
python-dateutil 2.9.0.post0
python-dotenv 1.0.1
python-iso639 2024.2.7
python-json-logger 2.0.7
python-magic 0.4.27
python-multipart 0.0.9
pytz 2024.1
PyYAML 6.0.1
pyzmq 25.1.2
qtconsole 5.5.1
QtPy 2.4.1
rapidfuzz 3.8.1
ray 2.10.0
referencing 0.34.0
regex 2023.12.25
replicate 0.25.1
requests 2.31.0
requests-oauthlib 2.0.0
rfc3339-validator 0.1.4
rfc3986-validator 0.1.1
rich 13.7.1
rpds-py 0.18.0
rsa 4.9
ruamel.yaml 0.17.21
ruff 0.3.7
safetensors 0.4.2
scikit-learn 1.4.2
scipy 1.13.0
semantic-version 2.10.0
Send2Trash 1.8.3
sentence-transformers 2.2.2
sentencepiece 0.2.0
setuptools 68.2.2
sgmllib3k 1.0.0
shapely 2.0.3
shellingham 1.5.4
six 1.16.0
smart-open 7.0.4
smmap 5.0.1
sniffio 1.3.1
soupsieve 2.5
SQLAlchemy 2.0.29
stack-data 0.6.3
starlette 0.37.2
streamlit 1.33.0
sympy 1.12
tabulate 0.9.0
tenacity 8.2.3
terminado 0.18.1
threadpoolctl 3.4.0
tiktoken 0.6.0
tinycss2 1.2.1
tokenizers 0.15.2
toml 0.10.2
tomlkit 0.12.0
toolz 0.12.1
torch 2.2.2
torchvision 0.17.2
tornado 6.4
tqdm 4.65.0
traitlets 5.14.2
transformers 4.39.3
truststore 0.8.0
typer 0.12.3
types-python-dateutil 2.9.0.20240316
types-requests 2.31.0.20240406
typing_extensions 4.11.0
typing-inspect 0.9.0
tzdata 2024.1
unstructured 0.13.2
unstructured-client 0.18.0
uri-template 1.3.0
uritemplate 4.1.1
urllib3 2.1.0
uvicorn 0.29.0
uvloop 0.19.0
virtualenv 20.25.1
watchdog 4.0.0
watchfiles 0.21.0
wcwidth 0.2.13
webcolors 1.13
webencodings 0.5.1
websocket-client 1.7.0
websockets 11.0.3
wheel 0.41.2
widgetsnbextension 4.0.10
wikipedia 1.4.0
wolframalpha 5.0.0
wrapt 1.16.0
xmltodict 0.13.0
yarl 1.9.4
zipp 3.18.1
zstandard 0.19.0
(base) root@455fb048cfbd:/home#
See if you can repro, thanks in advance
The code snippet is running fine on google colab.
@SebLKMa this looks like a bug in transformers with a particular version: https://github.com/explosion/spacy-llm/issues/463. If you install a different version you should be fine.
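Before reinstalling, a quick sanity check is to compare the installed transformers version with the one from the pip list above (4.39.3), which is the version this error was reported on. A minimal sketch (assumes only that transformers and packaging are installed, as in the pip list):

```python
# Print the installed transformers version and compare it against the
# version from the failing environment reported in this thread.
from packaging.version import Version
import transformers

installed = Version(transformers.__version__)
reported_bad = Version("4.39.3")  # version from the pip list above

print(f"installed: {installed}")
if installed == reported_bad:
    print("Same version as the failing setup; try a different release.")
```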
I am working on Windows 11. After executing the code:
```python
from transformers import pipeline
import torch

generate_text = pipeline(
    model="aisquared/dlite-v1-355m",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
    framework="pt",
)
generate_text("In this chapter, we'll discuss first steps with generative AI in Python.")
```
I am getting the error:
```
TypeError: transformers.generation.utils.GenerationMixin.generate() got multiple values for keyword argument 'pad_token_id'
```
Please help.