Closed: bruno-growthsales closed this issue 2 weeks ago
To resolve the Pydantic deprecation warnings when running Langflow 1.0.15 on macOS Sonoma 14.6.1 with Python 3.12, you can filter them out by adding the following lines to your code:
import warnings
from pydantic import PydanticDeprecatedSince20
warnings.filterwarnings("ignore", category=PydanticDeprecatedSince20)
This filter is already present in Langflow's main.py file:
# Ignore Pydantic deprecation warnings from Langchain
warnings.filterwarnings("ignore", category=PydanticDeprecatedSince20)
Ensure this filter is installed before any Pydantic-related code runs. It suppresses the deprecation warnings so your application runs without displaying them [1][2].
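Since `PydanticDeprecatedSince20` subclasses `DeprecationWarning`, the suppression can be verified with a stdlib-only sketch. The `FakePydanticDeprecatedSince20` class below is a stand-in so the example runs without pydantic installed:

```python
import warnings

# Stand-in for pydantic's PydanticDeprecatedSince20, which subclasses
# DeprecationWarning; used here so the example does not require pydantic.
class FakePydanticDeprecatedSince20(DeprecationWarning):
    pass

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # filterwarnings() prepends its filter, so it takes precedence
    # over "always" for matching categories.
    warnings.filterwarnings("ignore", category=FakePydanticDeprecatedSince20)
    warnings.warn("deprecated config", FakePydanticDeprecatedSince20)  # suppressed
    warnings.warn("something else", UserWarning)                       # recorded

print([str(w.message) for w in caught])  # ['something else']
```

The key point is ordering: the ignore filter must be in place before the warning-emitting import or call, which is why Langflow installs it at the top of main.py.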
Additionally, Langflow's pyproject.toml specifies the compatible Python versions and dependencies. According to that file, Langflow supports Python >=3.10,<3.12 and uses Pydantic ^2.5.0. Since you are using Python 3.12, you might encounter compatibility issues. Downgrading to a supported version (e.g., Python 3.11) could help resolve these issues [3].
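The version constraint can be checked programmatically; a small sketch (the `supported` helper is hypothetical, not part of Langflow):

```python
import sys

# Langflow's pyproject.toml, per this thread, pins Python to >=3.10,<3.12.
def supported(version_info=sys.version_info):
    major_minor = (version_info[0], version_info[1])
    return (3, 10) <= major_minor < (3, 12)

print(supported((3, 11, 9)))  # True: inside the declared range
print(supported((3, 12, 5)))  # False: the interpreter used in this issue
```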
Same problem on my MacBook (macOS Sonoma 14.6.1 + Python 3.12.5 + Langflow 1.0.15 or 1.0.14), but it worked on another MacBook (macOS Sonoma 14.6 + Python 3.12.5 + Langflow 1.0.14).
Hi @bruno-growthsales and @yorkew-east8,
I'm trying to reproduce the issue on my machine by installing Langflow 1.0.15 in a virtual environment, but it seems to work fine for me.
Regarding the Pydantic warnings, these are just deprecation warnings about features that will be removed in the future and shouldn't affect the execution of the flows.
Therefore, I believe the issue might be related to something else. Could you provide more details or share a flow where you're experiencing this problem?
Hi @italojohnny, you can see the installation and execution in this video (under 5 minutes):
https://www.loom.com/share/26d31337a55b47e5bd1a2d078d574535?sid=3c226e40-66d9-4136-9822-3c86a0a64d4c
Hi @bruno-growthsales,
Despite the slowness, which is extremely annoying, there are no errors shown in your video, so I can't deduce the problem...
Would you mind running the same test using the command `langflow run --log-level debug`?
Hello @italojohnny, here is the problem following the same process as in the video.
(lf) brunobs@Brunos-MacBook-Pro VSCODE % langflow run
Starting Langflow v1.0.16...
/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/pydantic/_internal/_config.py:291: PydanticDeprecatedSince20: Support for class-based `config` is deprecated, use ConfigDict instead. Deprecated in Pydantic V2.0 to be removed in V3.0. See Pydantic V2 Migration Guide at https://errors.pydantic.dev/2.8/migration/
warnings.warn(DEPRECATION_MESSAGE, DeprecationWarning)
/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/litellm/utils.py:17: DeprecationWarning: 'imghdr' is deprecated and slated for removal in Python 3.13
import imghdr
/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/litellm/utils.py:115: DeprecationWarning: open_text is deprecated. Use files() instead. Refer to https://importlib-resources.readthedocs.io/en/latest/using.html#migrating-from-legacy for migration advice.
with resources.open_text("litellm.llms.tokenizers", "anthropic_tokenizer.json") as f:
╭───────────────────────────────────────────────────────────────────╮
│ Welcome to ⛓ Langflow │
│ │
│ │
│ Collaborate, and contribute at our GitHub Repo 🌟 │
│ │
│ We collect anonymous usage data to improve Langflow. │
│ You can opt-out by setting DO_NOT_TRACK=true in your environment. │
│ │
│ Access http://127.0.0.1:7861 │
╰───────────────────────────────────────────────────────────────────╯
/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/nest_asyncio.py:126: RuntimeWarning: coroutine 'TracingService._end_and_reset' was never awaited
handle = ready.popleft()
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/pydantic/main.py:1059: PydanticDeprecatedSince20: The `__fields__` attribute is deprecated, use `model_fields` instead. Deprecated in Pydantic V2.0 to be removed in V3.0. See Pydantic V2 Migration Guide at https://errors.pydantic.dev/2.8/migration/
warnings.warn(
/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/nest_asyncio.py:126: RuntimeWarning: coroutine 'TracingService._end_and_reset' was never awaited
handle = ready.popleft()
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_core/_api/beta_decorator.py:87: LangChainBetaWarning: The function `load` is in beta. It is actively being worked on, so the API may change.
warn_beta(
[08/22/24 14:31:19] ERROR 2024-08-22 14:31:19 - ERROR - base - 'ascii' codec can't encode characters in base.py:703
position 1013-1081: ordinal not in range(128)
Traceback (most recent call last):
File "/Users/brunobs/Documents/VSCODE/lf/bin/langflow", line 8, in <module>
sys.exit(main())
| | -> <function main at 0x163226840>
| -> <built-in function exit>
-> <module 'sys' (built-in)>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/__main_
_.py", line 593, in main
app()
-> <typer.main.Typer object at 0x143ddbf20>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/typer/main.py",
line 310, in __call__
return get_command(self)(*args, **kwargs)
| | | -> {}
| | -> ()
| -> <typer.main.Typer object at 0x143ddbf20>
-> <function get_command at 0x102a9c0e0>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/click/core.py",
line 1157, in __call__
return self.main(*args, **kwargs)
| | | -> {}
| | -> ()
| -> <function TyperGroup.main at 0x102a8a980>
-> <TyperGroup >
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/typer/core.py",
line 723, in main
return _main(
-> <function _main at 0x102a899e0>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/typer/core.py",
line 193, in _main
rv = self.invoke(ctx)
| | -> <click.core.Context object at 0x1447faa20>
| -> <function MultiCommand.invoke at 0x100ea0040>
-> <TyperGroup >
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/click/core.py",
line 1688, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
| | | | -> <click.core.Context object
at 0x1631f4d10>
| | | -> <function Command.invoke at
0x100e8f9c0>
| | -> <TyperCommand run>
| -> <click.core.Context object at 0x1631f4d10>
-> <function MultiCommand.invoke.<locals>._process_result at
0x16328c540>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/click/core.py",
line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
| | | | | -> {'host': '127.0.0.1', 'workers': 1,
'timeout': 300, 'port': 7860, 'components_path':
PosixPath('/Users/brunobs/Documents/VSCO...
| | | | -> <click.core.Context object at
0x1631f4d10>
| | | -> <function run at 0x163226660>
| | -> <TyperCommand run>
| -> <function Context.invoke at 0x100e8e340>
-> <click.core.Context object at 0x1631f4d10>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/click/core.py",
line 783, in invoke
return __callback(*args, **kwargs)
| -> {'host': '127.0.0.1', 'workers': 1, 'timeout':
300, 'port': 7860, 'components_path': PosixPath('/Users/brunobs/Documents/VSCO...
-> ()
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/typer/main.py",
line 693, in wrapper
return callback(**use_params)
| -> {'host': '127.0.0.1', 'workers': 1, 'timeout': 300,
'port': 7860, 'components_path': PosixPath('/Users/brunobs/Documents/VSCO...
-> <function run at 0x163227f60>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/__main_
_.py", line 182, in run
process = run_on_mac_or_linux(host, port, log_level, options, app)
| | | | | ->
<fastapi.applications.FastAPI object at 0x1632916d0>
| | | | -> {'bind':
'127.0.0.1:7861', 'workers': 1, 'timeout': 300, 'worker_class':
'langflow.server.LangflowUvicornWorker', 'logger_cla...
| | | -> 'critical'
| | -> 7861
| -> '127.0.0.1'
-> <function run_on_mac_or_linux at 0x163227e20>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/__main_
_.py", line 210, in run_on_mac_or_linux
webapp_process.start()
| -> <function BaseProcess.start at 0x102ac0720>
-> <Process name='Process-1' parent=18063 started>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/multiprocess/pro
cess.py", line 121, in start
self._popen = self._Popen(self)
| | | | -> <Process name='Process-1' parent=18063 started>
| | | -> <staticmethod(<function Process._Popen at
0x102b93a60>)>
| | -> <Process name='Process-1' parent=18063 started>
| -> None
-> <Process name='Process-1' parent=18063 started>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/multiprocess/con
text.py", line 224, in _Popen
return _default_context.get_context().Process._Popen(process_obj)
| | -> <Process
name='Process-1' parent=18063 started>
| -> <function DefaultContext.get_context at
0x102b93c40>
-> <multiprocess.context.DefaultContext object at 0x102ab9e20>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/multiprocess/con
text.py", line 282, in _Popen
return Popen(process_obj)
| -> <Process name='Process-1' parent=18063 started>
-> <class 'multiprocess.popen_fork.Popen'>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/multiprocess/pop
en_fork.py", line 19, in __init__
self._launch(process_obj)
| | -> <Process name='Process-1' parent=18063 started>
| -> <function Popen._launch at 0x163585800>
-> <multiprocess.popen_fork.Popen object at 0x1631d6360>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/multiprocess/pop
en_fork.py", line 71, in _launch
code = process_obj._bootstrap(parent_sentinel=child_r)
| | -> 8
| -> <function BaseProcess._bootstrap at 0x102ac1120>
-> <Process name='Process-1' parent=18063 started>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/multiprocess/pro
cess.py", line 314, in _bootstrap
self.run()
| -> <function BaseProcess.run at 0x102ac0680>
-> <Process name='Process-1' parent=18063 started>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/multiprocess/pro
cess.py", line 108, in run
self._target(*self._args, **self._kwargs)
| | | | | -> {}
| | | | -> <Process name='Process-1' parent=18063
started>
| | | -> ('127.0.0.1', 7861, 'critical', {'bind':
'127.0.0.1:7861', 'workers': 1, 'timeout': 300, 'worker_class':
'langflow.server.Lan...
| | -> <Process name='Process-1' parent=18063 started>
| -> <function run_langflow at 0x163226de0>
-> <Process name='Process-1' parent=18063 started>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/__main_
_.py", line 432, in run_langflow
LangflowApplication(app, options).run()
| | -> {'bind': '127.0.0.1:7861', 'workers': 1,
'timeout': 300, 'worker_class': 'langflow.server.LangflowUvicornWorker',
'logger_cla...
| -> <fastapi.applications.FastAPI object at 0x1632916d0>
-> <class 'langflow.server.LangflowApplication'>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/gunicorn/app/bas
e.py", line 72, in run
Arbiter(self).run()
| -> <langflow.server.LangflowApplication object at 0x102ab9820>
-> <class 'gunicorn.arbiter.Arbiter'>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/gunicorn/arbiter
.py", line 202, in run
self.manage_workers()
| -> <function Arbiter.manage_workers at 0x1635b3b00>
-> <gunicorn.arbiter.Arbiter object at 0x1635aa330>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/gunicorn/arbiter
.py", line 571, in manage_workers
self.spawn_workers()
| -> <function Arbiter.spawn_workers at 0x1635b3c40>
-> <gunicorn.arbiter.Arbiter object at 0x1635aa330>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/gunicorn/arbiter
.py", line 642, in spawn_workers
self.spawn_worker()
| -> <function Arbiter.spawn_worker at 0x1635b3ba0>
-> <gunicorn.arbiter.Arbiter object at 0x1635aa330>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/gunicorn/arbiter
.py", line 609, in spawn_worker
worker.init_process()
| -> <function UvicornWorker.init_process at 0x1638551c0>
-> <langflow.server.LangflowUvicornWorker object at 0x1635a9c10>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/uvicorn/workers.
py", line 75, in init_process
super().init_process()
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/gunicorn/workers
/base.py", line 142, in init_process
self.run()
| -> <function UvicornWorker.run at 0x1638916c0>
-> <langflow.server.LangflowUvicornWorker object at 0x1635a9c10>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/uvicorn/workers.
py", line 107, in run
return asyncio.run(self._serve())
| | | -> <function LangflowUvicornWorker._serve at
0x1635cd620>
| | -> <langflow.server.LangflowUvicornWorker object at
0x1635a9c10>
| -> <function _patch_asyncio.<locals>.run at 0x163893240>
-> <module 'asyncio' from
'/opt/homebrew/Cellar/python@3.12/3.12.5/Frameworks/Python.framework/Versions/3.1
2/lib/python3.12/asyn...
File
"/opt/homebrew/Cellar/python@3.12/3.12.5/Frameworks/Python.framework/Versions/3.1
2/lib/python3.12/asyncio/runners.py", line 194, in run
return runner.run(main)
| | -> <coroutine object LangflowUvicornWorker._serve at
0x1638793c0>
| -> <function Runner.run at 0x102d9e480>
-> <asyncio.runners.Runner object at 0x163836f00>
File
"/opt/homebrew/Cellar/python@3.12/3.12.5/Frameworks/Python.framework/Versions/3.1
2/lib/python3.12/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
| | | -> <Task pending name='Task-1'
coro=<LangflowUvicornWorker._serve() running at
/Users/brunobs/Documents/VSCODE/lf/lib/python3.12...
| | -> <function _patch_loop.<locals>.run_until_complete at
0x16392d300>
| -> <_UnixSelectorEventLoop running=True closed=False debug=False>
-> <asyncio.runners.Runner object at 0x163836f00>
File
"/opt/homebrew/Cellar/python@3.12/3.12.5/Frameworks/Python.framework/Versions/3.1
2/lib/python3.12/asyncio/base_events.py", line 674, in run_until_complete
self.run_forever()
| -> <function _patch_loop.<locals>.run_forever at 0x16392d260>
-> <_UnixSelectorEventLoop running=True closed=False debug=False>
File
"/opt/homebrew/Cellar/python@3.12/3.12.5/Frameworks/Python.framework/Versions/3.1
2/lib/python3.12/asyncio/base_events.py", line 641, in run_forever
self._run_once()
| -> <function _patch_loop.<locals>._run_once at 0x16392d3a0>
-> <_UnixSelectorEventLoop running=True closed=False debug=False>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/nest_asyncio.py"
, line 133, in _run_once
handle._run()
| -> <function Handle._run at 0x102c6d1c0>
-> <Handle Task.__step()>
File
"/opt/homebrew/Cellar/python@3.12/3.12.5/Frameworks/Python.framework/Versions/3.1
2/lib/python3.12/asyncio/events.py", line 88, in _run
self._context.run(self._callback, *self._args)
| | | | | -> <member '_args' of 'Handle'
objects>
| | | | -> <Handle Task.__step()>
| | | -> <member '_callback' of 'Handle' objects>
| | -> <Handle Task.__step()>
| -> <member '_context' of 'Handle' objects>
-> <Handle Task.__step()>
File
"/opt/homebrew/Cellar/python@3.12/3.12.5/Frameworks/Python.framework/Versions/3.1
2/lib/python3.12/asyncio/tasks.py", line 303, in __step
self.__step_run_and_handle_result(exc)
| -> None
-> <Task pending name='Task-452' coro=<build_flow.<locals>._build_vertex()
running at /Users/brunobs/Documents/VSCODE/lf/lib/pyt...
File
"/opt/homebrew/Cellar/python@3.12/3.12.5/Frameworks/Python.framework/Versions/3.1
2/lib/python3.12/asyncio/tasks.py", line 314, in __step_run_and_handle_result
result = coro.send(None)
| -> <method 'send' of 'coroutine' objects>
-> <coroutine object build_flow.<locals>._build_vertex at
0x17fcb1640>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/api/v1/
chat.py", line 218, in _build_vertex
vertex_build_result = await graph.build_vertex(
| -> <function Graph.build_vertex at
0x1445c6e80>
-> Graph Representation:
----------------------
Vertices (4):
ChatInput-E8tgr, Prompt-QJVR4,
ChatOutput-7vSsH, OpenAIModel-XNc...
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/graph/g
raph/base.py", line 1276, in build_vertex
await vertex.build(
| -> <function Vertex.build at 0x1445be8e0>
-> Vertex(display_name=OpenAI, id=OpenAIModel-XNcrJ, data={'type':
'OpenAIModel', 'node': {'template': {'_type': 'Component', 'a...
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/graph/v
ertex/base.py", line 792, in build
await step(user_id=user_id, **kwargs)
| | -> {'fallback_to_env_vars': False}
| -> UUID('cf2cd67f-46c4-4092-8788-987fd0a14512')
-> <bound method Vertex._build of Vertex(display_name=OpenAI,
id=OpenAIModel-XNcrJ, data={'type': 'OpenAIModel', 'node': {'templ...
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/graph/v
ertex/base.py", line 475, in _build
await self._build_results(custom_component, custom_params,
fallback_to_env_vars)
| | | | -> False
| | | -> {'input_value':
Message(text_key='text', data={'template': 'Answer the user as if you were a
pirate.\n\nUser: {user_input}\n\...
| | -> <langflow.utils.validate.OpenAIModelComponent
object at 0x17d9d96d0>
| -> <function Vertex._build_results at 0x1445be520>
-> Vertex(display_name=OpenAI, id=OpenAIModel-XNcrJ, data={'type':
'OpenAIModel', 'node': {'template': {'_type': 'Component', 'a...
> File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/graph/v
ertex/base.py", line 690, in _build_results
result = await initialize.loading.get_instance_results(
| | -> <function get_instance_results at
0x143815d00>
| -> <module 'langflow.interface.initialize.loading'
from '/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflo...
-> <module 'langflow.interface.initialize' from
'/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/interf.
..
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/interfa
ce/initialize/loading.py", line 60, in get_instance_results
return await build_component(params=custom_params,
custom_component=custom_component)
| | ->
<langflow.utils.validate.OpenAIModelComponent object at 0x17d9d96d0>
| -> {'input_value':
Message(text_key='text', data={'template': 'Answer the user as if you were a
pirate.\n\nUser: {user_input}\n\...
-> <function build_component at 0x1445bc900>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/interfa
ce/initialize/loading.py", line 147, in build_component
build_results, artifacts = await custom_component.build_results()
| -> <function
Component.build_results at 0x1445bc360>
->
<langflow.utils.validate.OpenAIModelComponent object at 0x17d9d96d0>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/custom/
custom_component/component.py", line 570, in build_results
return await self._build_with_tracing()
| -> <function Component._build_with_tracing at 0x1445bc220>
-> <langflow.utils.validate.OpenAIModelComponent object at
0x17d9d96d0>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/custom/
custom_component/component.py", line 560, in _build_with_tracing
_results, _artifacts = await self._build_results()
| -> <function Component._build_results at
0x1445bc400>
-> <langflow.utils.validate.OpenAIModelComponent
object at 0x17d9d96d0>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/custom/
custom_component/component.py", line 592, in _build_results
result = method()
-> <bound method LCModelComponent.text_response of
<langflow.utils.validate.OpenAIModelComponent object at 0x17d9d96d0>>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/base/mo
dels/model.py", line 57, in text_response
result = self.get_chat_result(output, stream, input_value, system_message)
| | | | | -> ''
| | | | -> Message(text_key='text',
data={'template': 'Answer the user as if you were a pirate.\n\nUser:
{user_input}\n\nAnswer: ', 'var...
| | | -> False
| | ->
ChatOpenAI(client=<openai.resources.chat.completions.Completions object at
0x17d9b7b90>, async_client=<openai.resources.chat....
| -> <function LCModelComponent.get_chat_result at 0x164d420c0>
-> <langflow.utils.validate.OpenAIModelComponent object at
0x17d9d96d0>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/base/mo
dels/model.py", line 190, in get_chat_result
raise e
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/base/mo
dels/model.py", line 176, in get_chat_result
message = runnable.invoke(inputs) # type: ignore
| | -> {}
| -> <function RunnableBindingBase.invoke at 0x104fc8400>
-> RunnableBinding(bound=ChatPromptTemplate(input_variables=[],
messages=[HumanMessage(content='Answer the user as if you were a...
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_core/r
unnables/base.py", line 5094, in invoke
return self.bound.invoke(
| | -> <function RunnableSequence.invoke at 0x104fa3e20>
| -> ChatPromptTemplate(input_variables=[],
messages=[HumanMessage(content='Answer the user as if you were a pirate.\n\nUser:
Oi!\...
-> RunnableBinding(bound=ChatPromptTemplate(input_variables=[],
messages=[HumanMessage(content='Answer the user as if you were a...
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_core/r
unnables/base.py", line 2878, in invoke
input = context.run(step.invoke, input, config)
| | | | | -> {'tags': [], 'metadata':
{'project_name': 'Langflow'}, 'callbacks':
<langchain_core.callbacks.manager.CallbackManager object ...
| | | | ->
ChatPromptValue(messages=[HumanMessage(content='Answer the user as if you were a
pirate.\n\nUser: Oi!\n\nAnswer: ')])
| | | -> <function BaseChatModel.invoke at 0x1050eb9c0>
| | ->
ChatOpenAI(client=<openai.resources.chat.completions.Completions object at
0x17d9b7b90>, async_client=<openai.resources.chat....
| -> <method 'run' of '_contextvars.Context' objects>
-> <_contextvars.Context object at 0x17da6e2c0>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_core/l
anguage_models/chat_models.py", line 276, in invoke
self.generate_prompt(
| -> <function BaseChatModel.generate_prompt at 0x105110040>
-> ChatOpenAI(client=<openai.resources.chat.completions.Completions object at
0x17d9b7b90>, async_client=<openai.resources.chat....
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_core/l
anguage_models/chat_models.py", line 776, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks,
**kwargs)
| | | | | ->
{'tags': [], 'metadata': {'project_name': 'Langflow'}, 'run_name': None,
'run_id': None}
| | | | ->
<langchain_core.callbacks.manager.CallbackManager object at 0x17d9b7a10>
| | | -> None
| | -> [[HumanMessage(content='Answer the user as if you
were a pirate.\n\nUser: Oi!\n\nAnswer: ')]]
| -> <function BaseChatModel.generate at 0x1050ebec0>
-> ChatOpenAI(client=<openai.resources.chat.completions.Completions
object at 0x17d9b7b90>, async_client=<openai.resources.chat....
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_core/l
anguage_models/chat_models.py", line 633, in generate
raise e
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_core/l
anguage_models/chat_models.py", line 623, in generate
self._generate_with_cache(
| -> <function BaseChatModel._generate_with_cache at 0x105110180>
-> ChatOpenAI(client=<openai.resources.chat.completions.Completions object at
0x17d9b7b90>, async_client=<openai.resources.chat....
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_core/l
anguage_models/chat_models.py", line 845, in _generate_with_cache
result = self._generate(
| -> <function BaseChatOpenAI._generate at 0x165d11d00>
-> ChatOpenAI(client=<openai.resources.chat.completions.Completions
object at 0x17d9b7b90>, async_client=<openai.resources.chat....
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_openai
/chat_models/base.py", line 635, in _generate
response = self.client.create(**payload)
| | | -> {'messages': [{'content': 'Answer the user
as if you were a pirate.\n\nUser: Oi!\n\nAnswer: ', 'role': 'user'}], 'model':
'gp...
| | -> <function Completions.create at 0x165a289a0>
| -> <openai.resources.chat.completions.Completions object at
0x17d9b7b90>
->
ChatOpenAI(client=<openai.resources.chat.completions.Completions object at
0x17d9b7b90>, async_client=<openai.resources.chat....
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_utils/_u
tils.py", line 274, in wrapper
return func(*args, **kwargs)
| | -> {'messages': [{'content': 'Answer the user as if you
were a pirate.\n\nUser: Oi!\n\nAnswer: ', 'role': 'user'}], 'model': 'gp...
| -> (<openai.resources.chat.completions.Completions object at
0x17d9b7b90>,)
-> <function Completions.create at 0x165a28860>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/resources
/chat/completions.py", line 668, in create
return self._post(
| -> <bound method SyncAPIClient.post of <openai.OpenAI object at
0x17d9c9f40>>
-> <openai.resources.chat.completions.Completions object at
0x17d9b7b90>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_base_cli
ent.py", line 1260, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream,
stream_cls=stream_cls))
| | | | | | |
-> openai.Stream
| | | | | | -> False
| | | | | ->
FinalRequestOptions(method='post', url='/chat/completions', params={},
headers=NOT_GIVEN, max_retries=NOT_GIVEN, timeout=NOT_...
| | | | -> <class
'openai.types.chat.chat_completion.ChatCompletion'>
| | | -> <function SyncAPIClient.request at
0x164fb4040>
| | -> <openai.OpenAI object at 0x17d9c9f40>
| -> ~ResponseT
-> <function cast at 0x100d00360>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_base_cli
ent.py", line 937, in request
return self._request(
| -> <function SyncAPIClient._request at 0x164fb40e0>
-> <openai.OpenAI object at 0x17d9c9f40>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_base_cli
ent.py", line 963, in _request
request = self._build_request(options)
| | -> FinalRequestOptions(method='post',
url='/chat/completions', params={}, headers=NOT_GIVEN, max_retries=NOT_GIVEN,
timeout=NOT_...
| -> <function BaseClient._build_request at 0x164fa6b60>
-> <openai.OpenAI object at 0x17d9c9f40>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_base_cli
ent.py", line 459, in _build_request
headers = self._build_headers(options)
| | -> FinalRequestOptions(method='post',
url='/chat/completions', params={}, headers=NOT_GIVEN, max_retries=NOT_GIVEN,
timeout=NOT_...
| -> <function BaseClient._build_headers at 0x164fa6980>
-> <openai.OpenAI object at 0x17d9c9f40>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_base_cli
ent.py", line 417, in _build_headers
headers = httpx.Headers(headers_dict)
| | -> {'Accept': 'application/json', 'Content-Type':
'application/json', 'User-Agent': 'OpenAI/Python 1.42.0', 'X-Stainless-Lang': ...
| -> <class 'httpx.Headers'>
-> <module 'httpx' from
'/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/httpx/__init__.p
y'>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/httpx/_models.py
", line 76, in __init__
normalize_header_value(v, encoding),
| -> None
-> <function normalize_header_value at 0x101a17560>
File
"/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/httpx/_utils.py"
, line 53, in normalize_header_value
return value.encode(encoding or "ascii")
| | -> None
| -> <method 'encode' of 'str' objects>
-> 'Bearer (lf) brunobs@Brunos-MacBook-Pro VSCODE % langflow run
Starting Langflow v1.0.16... /Users/brunobs/Documents/VSCODE/lf...
UnicodeEncodeError: 'ascii' codec can't encode characters in position 1013-1081:
ordinal not in range(128)
╭────────────────────── Traceback (most recent call last) ──────────────────────╮
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/grap │
│ h/vertex/base.py:690 in _build_results │
│ │
│ 687 │ │
│ 688 │ async def _build_results(self, custom_component, custom_params, │
│ fallback_to_env_vars=False): │
│ 689 │ │ try: │
│ ❱ 690 │ │ │ result = await initialize.loading.get_instance_results( │
│ 691 │ │ │ │ custom_component=custom_component, │
│ 692 │ │ │ │ custom_params=custom_params, │
│ 693 │ │ │ │ vertex=self, │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/inte │
│ rface/initialize/loading.py:60 in get_instance_results │
│ │
│ 57 │ │ if base_type == "custom_components": │
│ 58 │ │ │ return await build_custom_component(params=custom_params, │
│ custom_component=custom_component) │
│ 59 │ │ elif base_type == "component": │
│ ❱ 60 │ │ │ return await build_component(params=custom_params, │
│ custom_component=custom_component) │
│ 61 │ │ else: │
│ 62 │ │ │ raise ValueError(f"Base type {base_type} not found.") │
│ 63 │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/inte │
│ rface/initialize/loading.py:147 in build_component │
│ │
│ 144 ): │
│ 145 │ # Now set the params as attributes of the custom_component │
│ 146 │ custom_component.set_attributes(params) │
│ ❱ 147 │ build_results, artifacts = await custom_component.build_results() │
│ 148 │ │
│ 149 │ return custom_component, build_results, artifacts │
│ 150 │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/cust │
│ om/custom_component/component.py:570 in build_results │
│ │
│ 567 │ │
│ 568 │ async def build_results(self): │
│ 569 │ │ if self._tracing_service: │
│ ❱ 570 │ │ │ return await self._build_with_tracing() │
│ 571 │ │ return await self._build_without_tracing() │
│ 572 │ │
│ 573 │ async def _build_results(self): │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/cust │
│ om/custom_component/component.py:560 in _build_with_tracing │
│ │
│ 557 │ │ inputs = self.get_trace_as_inputs() │
│ 558 │ │ metadata = self.get_trace_as_metadata() │
│ 559 │ │ async with self._tracing_service.trace_context(self, self.trace │
│ metadata): │
│ ❱ 560 │ │ │ _results, _artifacts = await self._build_results() │
│ 561 │ │ │ self._tracing_service.set_outputs(self.trace_name, _results │
│ 562 │ │ │
│ 563 │ │ return _results, _artifacts │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/cust │
│ om/custom_component/component.py:592 in _build_results │
│ │
│ 589 │ │ │ │ │ │ _results[output.name] = output.value │
│ 590 │ │ │ │ │ │ result = output.value │
│ 591 │ │ │ │ │ else: │
│ ❱ 592 │ │ │ │ │ │ result = method() │
│ 593 │ │ │ │ │ │ # If the method is asynchronous, we need to awa │
│ 594 │ │ │ │ │ │ if inspect.iscoroutinefunction(method): │
│ 595 │ │ │ │ │ │ │ result = await result │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/base │
│ /models/model.py:57 in text_response │
│ │
│ 54 │ │ stream = self.stream │
│ 55 │ │ system_message = self.system_message │
│ 56 │ │ output = self.build_model() │
│ ❱ 57 │ │ result = self.get_chat_result(output, stream, input_value, syst │
│ 58 │ │ self.status = result │
│ 59 │ │ return result │
│ 60 │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/base │
│ /models/model.py:190 in get_chat_result │
│ │
│ 187 │ │ except Exception as e: │
│ 188 │ │ │ if message := self._get_exception_message(e): │
│ 189 │ │ │ │ raise ValueError(message) from e │
│ ❱ 190 │ │ │ raise e │
│ 191 │ │
│ 192 │ @abstractmethod │
│ 193 │ def build_model(self) -> LanguageModel: # type: ignore[type-var] │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langflow/base │
│ /models/model.py:176 in get_chat_result │
│ │
│ 173 │ │ │ if stream: │
│ 174 │ │ │ │ return runnable.stream(inputs) # type: ignore │
│ 175 │ │ │ else: │
│ ❱ 176 │ │ │ │ message = runnable.invoke(inputs) # type: ignore │
│ 177 │ │ │ │ result = message.content if hasattr(message, "content") │
│ 178 │ │ │ │ if isinstance(message, AIMessage): │
│ 179 │ │ │ │ │ status_message = self.build_status_message(message) │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_cor │
│ e/runnables/base.py:5094 in invoke │
│ │
│ 5091 │ │ config: Optional[RunnableConfig] = None, │
│ 5092 │ │ **kwargs: Optional[Any], │
│ 5093 │ ) -> Output: │
│ ❱ 5094 │ │ return self.bound.invoke( │
│ 5095 │ │ │ input, │
│ 5096 │ │ │ self._merge_configs(config), │
│ 5097 │ │ │ **{**self.kwargs, **kwargs}, │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_cor │
│ e/runnables/base.py:2878 in invoke │
│ │
│ 2875 │ │ │ │ if i == 0: │
│ 2876 │ │ │ │ │ input = context.run(step.invoke, input, config, ** │
│ 2877 │ │ │ │ else: │
│ ❱ 2878 │ │ │ │ │ input = context.run(step.invoke, input, config) │
│ 2879 │ │ # finish the root run │
│ 2880 │ │ except BaseException as e: │
│ 2881 │ │ │ run_manager.on_chain_error(e) │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_cor │
│ e/language_models/chat_models.py:276 in invoke │
│ │
│ 273 │ │ config = ensure_config(config) │
│ 274 │ │ return cast( │
│ 275 │ │ │ ChatGeneration, │
│ ❱ 276 │ │ │ self.generate_prompt( │
│ 277 │ │ │ │ [self._convert_input(input)], │
│ 278 │ │ │ │ stop=stop, │
│ 279 │ │ │ │ callbacks=config.get("callbacks"), │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_cor │
│ e/language_models/chat_models.py:776 in generate_prompt │
│ │
│ 773 │ │ **kwargs: Any, │
│ 774 │ ) -> LLMResult: │
│ 775 │ │ prompt_messages = [p.to_messages() for p in prompts] │
│ ❱ 776 │ │ return self.generate(prompt_messages, stop=stop, callbacks=cal │
│ 777 │ │
│ 778 │ async def agenerate_prompt( │
│ 779 │ │ self, │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_cor │
│ e/language_models/chat_models.py:633 in generate │
│ │
│ 630 │ │ │ except BaseException as e: │
│ 631 │ │ │ │ if run_managers: │
│ 632 │ │ │ │ │ run_managers[i].on_llm_error(e, response=LLMResult │
│ ❱ 633 │ │ │ │ raise e │
│ 634 │ │ flattened_outputs = [ │
│ 635 │ │ │ LLMResult(generations=[res.generations], llm_output=res.ll │
│ ignore[list-item] │
│ 636 │ │ │ for res in results │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_cor │
│ e/language_models/chat_models.py:623 in generate │
│ │
│ 620 │ │ for i, m in enumerate(messages): │
│ 621 │ │ │ try: │
│ 622 │ │ │ │ results.append( │
│ ❱ 623 │ │ │ │ │ self._generate_with_cache( │
│ 624 │ │ │ │ │ │ m, │
│ 625 │ │ │ │ │ │ stop=stop, │
│ 626 │ │ │ │ │ │ run_manager=run_managers[i] if run_managers el │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_cor │
│ e/language_models/chat_models.py:845 in _generate_with_cache │
│ │
│ 842 │ │ │ result = generate_from_stream(iter(chunks)) │
│ 843 │ │ else: │
│ 844 │ │ │ if inspect.signature(self._generate).parameters.get("run_m │
│ ❱ 845 │ │ │ │ result = self._generate( │
│ 846 │ │ │ │ │ messages, stop=stop, run_manager=run_manager, **kw │
│ 847 │ │ │ │ ) │
│ 848 │ │ │ else: │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/langchain_ope │
│ nai/chat_models/base.py:635 in _generate │
│ │
│ 632 │ │ │ response = raw_response.parse() │
│ 633 │ │ │ generation_info = {"headers": dict(raw_response.headers)} │
│ 634 │ │ else: │
│ ❱ 635 │ │ │ response = self.client.create(**payload) │
│ 636 │ │ return self._create_chat_result(response, generation_info) │
│ 637 │ │
│ 638 │ def _get_request_payload( │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_utils │
│ /_utils.py:274 in wrapper │
│ │
│ 271 │ │ │ │ │ else: │
│ 272 │ │ │ │ │ │ msg = f"Missing required argument: {quote(missi │
│ 273 │ │ │ │ raise TypeError(msg) │
│ ❱ 274 │ │ │ return func(*args, **kwargs) │
│ 275 │ │ │
│ 276 │ │ return wrapper # type: ignore │
│ 277 │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/resour │
│ ces/chat/completions.py:668 in create │
│ │
│ 665 │ │ timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN, │
│ 666 │ ) -> ChatCompletion | Stream[ChatCompletionChunk]: │
│ 667 │ │ validate_response_format(response_format) │
│ ❱ 668 │ │ return self._post( │
│ 669 │ │ │ "/chat/completions", │
│ 670 │ │ │ body=maybe_transform( │
│ 671 │ │ │ │ { │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_base_ │
│ client.py:1260 in post │
│ │
│ 1257 │ │ opts = FinalRequestOptions.construct( │
│ 1258 │ │ │ method="post", url=path, json_data=body, files=to_httpx_fi │
│ **options │
│ 1259 │ │ ) │
│ ❱ 1260 │ │ return cast(ResponseT, self.request(cast_to, opts, stream=stre │
│ stream_cls=stream_cls)) │
│ 1261 │ │
│ 1262 │ def patch( │
│ 1263 │ │ self, │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_base_ │
│ client.py:937 in request │
│ │
│ 934 │ │ stream: bool = False, │
│ 935 │ │ stream_cls: type[_StreamT] | None = None, │
│ 936 │ ) -> ResponseT | _StreamT: │
│ ❱ 937 │ │ return self._request( │
│ 938 │ │ │ cast_to=cast_to, │
│ 939 │ │ │ options=options, │
│ 940 │ │ │ stream=stream, │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_base_ │
│ client.py:963 in _request │
│ │
│ 960 │ │ options = self._prepare_options(options) │
│ 961 │ │ │
│ 962 │ │ retries = self._remaining_retries(remaining_retries, options) │
│ ❱ 963 │ │ request = self._build_request(options) │
│ 964 │ │ self._prepare_request(request) │
│ 965 │ │ │
│ 966 │ │ kwargs: HttpxSendArgs = {} │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_base_ │
│ client.py:459 in _build_request │
│ │
│ 456 │ │ │ else: │
│ 457 │ │ │ │ raise RuntimeError(f"Unexpected JSON data type, {type( │
│ cannot merge with `extra_body`") │
│ 458 │ │ │
│ ❱ 459 │ │ headers = self._build_headers(options) │
│ 460 │ │ params = _merge_mappings(self.default_query, options.params) │
│ 461 │ │ content_type = headers.get("Content-Type") │
│ 462 │ │ files = options.files │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/openai/_base_ │
│ client.py:417 in _build_headers │
│ │
│ 414 │ │ self._validate_headers(headers_dict, custom_headers) │
│ 415 │ │ │
│ 416 │ │ # headers are case-insensitive while dictionaries are not. │
│ ❱ 417 │ │ headers = httpx.Headers(headers_dict) │
│ 418 │ │ │
│ 419 │ │ idempotency_header = self._idempotency_header │
│ 420 │ │ if idempotency_header and options.method.lower() != "get" and │
│ not in headers: │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/httpx/_models │
│ .py:76 in __init__ │
│ │
│ 73 │ │ │ │ ( │
│ 74 │ │ │ │ │ normalize_header_key(k, lower=False, encoding=enco │
│ 75 │ │ │ │ │ normalize_header_key(k, lower=True, encoding=encod │
│ ❱ 76 │ │ │ │ │ normalize_header_value(v, encoding), │
│ 77 │ │ │ │ ) │
│ 78 │ │ │ │ for k, v in headers.items() │
│ 79 │ │ │ ] │
│ │
│ /Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/httpx/_utils. │
│ py:53 in normalize_header_value │
│ │
│ 50 │ """ │
│ 51 │ if isinstance(value, bytes): │
│ 52 │ │ return value │
│ ❱ 53 │ return value.encode(encoding or "ascii") │
│ 54 │
│ 55 │
│ 56 def primitive_value_to_str(value: PrimitiveData) -> str: │
╰───────────────────────────────────────────────────────────────────────────────╯
UnicodeEncodeError: 'ascii' codec can't encode characters in position 1013-1081:
ordinal not in range(128)
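One plausible cause of the UnicodeEncodeError above, given that it fires inside httpx's header normalization, is a non-ASCII character somewhere in a request header — for example an API key pasted with an invisible or typographic Unicode character. A minimal sketch of the failure and a way to locate the offending character (the helper name is hypothetical, not part of httpx):

```python
# httpx encodes header values with the ASCII codec, so any non-ASCII
# character in a header (e.g. a pasted API key containing a curly
# quote) raises UnicodeEncodeError, as in the traceback above.
def encode_header_value(value: str) -> bytes:
    # Mirrors the strict ASCII encoding httpx applies to header values.
    return value.encode("ascii")

assert encode_header_value("Bearer sk-plain-ascii") == b"Bearer sk-plain-ascii"

try:
    encode_header_value("Bearer sk-abc\u2019def")  # hidden curly quote
except UnicodeEncodeError as exc:
    # exc.start points at the first character that cannot be encoded
    print(f"non-ASCII char {exc.object[exc.start]!r} at position {exc.start}")
```

Checking the API key environment variable for characters outside the ASCII range is a quick way to confirm or rule this out.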
/Users/brunobs/Documents/VSCODE/lf/lib/python3.12/site-packages/nest_asyncio.py:126: RuntimeWarning: coroutine 'TracingService._end_and_reset' was never awaited
handle = ready.popleft()
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
@italojohnny I updated to 1.0.16 and retried today; the problem still exists: building never ends. (macOS 14.6.1 (M2) + Python 3.12.5 + LangFlow 1.0.16)
btw, I fixed 2 problems to get there:
ValueError: invalid literal for int() with base 10: ''
in "/custom_component/update"
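That ValueError is what Python raises whenever int() is given an empty string; a minimal illustration, with a common defensive parsing pattern (the parse_int helper and its default are illustrative, not the actual fix in endpoints.py):

```python
# Reproducing the error: int() rejects an empty string outright.
try:
    int("")
except ValueError as exc:
    assert "invalid literal for int() with base 10" in str(exc)

# A defensive pattern when a field may arrive empty
# (`parse_int` is a hypothetical helper, not from Langflow):
def parse_int(raw: str, default: int = 0) -> int:
    stripped = raw.strip()
    return int(stripped) if stripped else default

assert parse_int("") == 0
assert parse_int(" 42 ") == 42
```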
./venv/lib/python3.12/site-packages/langflow/api/v1/endpoints.py, Ln 597. Here are the final logs:
$ langflow run --log-level=debug
[08/23/24 11:37:09] DEBUG 2024-08-23 11:37:09 - DEBUG - logger - Logger set up with log level: debug logger.py:208
DEBUG 2024-08-23 11:37:09 - DEBUG - __main__ - Set OBJC_DISABLE_INITIALIZE_FORK_SAFETY to YES to avoid error __main__.py:74
DEBUG 2024-08-23 11:37:09 - DEBUG - manager - Create service ServiceType.SETTINGS_SERVICE manager.py:65
DEBUG 2024-08-23 11:37:09 - DEBUG - base - No database_url provided, trying LANGFLOW_DATABASE_URL env variable base.py:210
DEBUG 2024-08-23 11:37:09 - DEBUG - base - No database_url env variable, using sqlite database base.py:215
DEBUG 2024-08-23 11:37:09 - DEBUG - base - Saving database to langflow directory: /Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/langflow base.py:234
DEBUG 2024-08-23 11:37:09 - DEBUG - base - Database already exists at /Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/langflow/langflow.db, using it base.py:258
DEBUG 2024-08-23 11:37:09 - DEBUG - base - Setting default components path to components_path base.py:298
DEBUG 2024-08-23 11:37:09 - DEBUG - base - Components path: ['/Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/langflow/components'] base.py:303
DEBUG 2024-08-23 11:37:09 - DEBUG - base - Setting user agent to langflow base.py:179
DEBUG 2024-08-23 11:37:09 - DEBUG - auth - No secret key provided, generating a random one auth.py:98
DEBUG 2024-08-23 11:37:09 - DEBUG - auth - Loaded secret key auth.py:102
DEBUG 2024-08-23 11:37:09 - DEBUG - auth - Resetting superuser password to default value auth.py:76
DEBUG 2024-08-23 11:37:09 - DEBUG - util - Adding component path /Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/langflow/components util.py:451
DEBUG 2024-08-23 11:37:09 - DEBUG - base - Updating settings base.py:314
DEBUG 2024-08-23 11:37:09 - DEBUG - base - Updating components_path base.py:320
DEBUG 2024-08-23 11:37:09 - DEBUG - base - components_path: ['/Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/langflow/components'] base.py:342
DEBUG 2024-08-23 11:37:09 - DEBUG - util - Setting auto_saving_interval to True util.py:460
DEBUG 2024-08-23 11:37:09 - DEBUG - base - Updating settings base.py:314
DEBUG 2024-08-23 11:37:09 - DEBUG - base - Updating auto_saving_interval base.py:320
DEBUG 2024-08-23 11:37:09 - DEBUG - base - Updated auto_saving_interval base.py:341
DEBUG 2024-08-23 11:37:09 - DEBUG - base - auto_saving_interval: 1 base.py:342
INFO 2024-08-23 11:37:09 - INFO - main - Setting up app with static files directory None main.py:233
Starting Langflow v1.0.16...
/Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/pydantic/_internal/_config.py:291: PydanticDeprecatedSince20: Support for class-based `config` is deprecated, use ConfigDict instead. Deprecated in Pydantic V2.0 to be removed in V3.0. See Pydantic V2 Migration Guide at https://errors.pydantic.dev/2.8/migration/
warnings.warn(DEPRECATION_MESSAGE, DeprecationWarning)
/Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/litellm/utils.py:17: DeprecationWarning: 'imghdr' is deprecated and slated for removal in Python 3.13
import imghdr
/Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/litellm/utils.py:115: DeprecationWarning: open_text is deprecated. Use files() instead. Refer to https://importlib-resources.readthedocs.io/en/latest/using.html#migrating-from-legacy for migration advice.
with resources.open_text("litellm.llms.tokenizers", "anthropic_tokenizer.json") as f:
╭───────────────────────────────────────────────────────────────────╮
│ Welcome to ⛓ Langflow │
│ │
│ │
│ Collaborate, and contribute at our GitHub Repo 🌟 │
│ │
│ We collect anonymous usage data to improve Langflow. │
│ You can opt-out by setting DO_NOT_TRACK=true in your environment. │
│ │
│ Access http://127.0.0.1:7860 │
╰───────────────────────────────────────────────────────────────────╯
/Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/nest_asyncio.py:126: RuntimeWarning: coroutine 'TracingService._end_and_reset' was never awaited
handle = ready.popleft()
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
/Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/pydantic/main.py:1059: PydanticDeprecatedSince20: The `__fields__` attribute is deprecated, use `model_fields` instead. Deprecated in Pydantic V2.0 to be removed in V3.0. See Pydantic V2 Migration Guide at https://errors.pydantic.dev/2.8/migration/
warnings.warn(
/Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/nest_asyncio.py:126: RuntimeWarning: coroutine 'TracingService._end_and_reset' was never awaited
handle = ready.popleft()
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
/Users/wangyue/workspace/ai_apps/langflow_apps/venv/lib/python3.12/site-packages/nest_asyncio.py:126: RuntimeWarning: coroutine 'TracingService._end_and_reset' was never awaited
handle = ready.popleft()
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
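The repeated RuntimeWarning that the coroutine 'TracingService._end_and_reset' was never awaited means an async function was called without await, so the coroutine object was discarded before it ever ran. A minimal sketch of how that warning arises (the function here is a stand-in, not Langflow's actual TracingService):

```python
import asyncio
import warnings

async def _end_and_reset():
    # Stand-in for the coroutine named in the warning above.
    return "done"

async def main():
    _end_and_reset()        # BUG: coroutine created but never awaited
    await _end_and_reset()  # correct: the coroutine actually runs

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    asyncio.run(main())

# CPython emits "coroutine '_end_and_reset' was never awaited" when the
# unawaited coroutine object is garbage-collected.
assert any("never awaited" in str(w.message) for w in caught)
```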
Hello @bruno-growthsales and @yorkew-east8. I deeply regret not being able to resolve your issue so far... This issue has lost cohesion: it started with problems related to Pydantic messages and has now diverged into different paths for each of you.
@bruno-growthsales, the last log you showed points to the following error: UnicodeEncodeError: 'ascii' codec can't encode characters in position 1013-1081.
@yorkew-east8, the last log you showed points to the following warning: RuntimeWarning: Enable tracemalloc to get the object allocation traceback.
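For that second warning, Python is hinting that starting tracemalloc before the application runs will make such warnings include the traceback of where the offending object was allocated. A short sketch (equivalently, launch Langflow with the PYTHONTRACEMALLOC=1 environment variable set):

```python
# Starting tracemalloc early makes "never awaited" warnings include the
# traceback of where the coroutine object was allocated.
import tracemalloc

tracemalloc.start()
assert tracemalloc.is_tracing()

# Example: inspect the current top allocation sites.
data = [bytes(1000) for _ in range(10)]  # some allocations to observe
snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics("lineno")[:3]:
    print(stat)
```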
I suspect that you may be dealing with some difficult-to-reproduce performance bugs, which we are very interested in discovering and resolving to improve the reliability of our code. Therefore...
To maintain my sanity and keep things a bit more organized for future reference, would you mind if I close this current issue and you both open new individual issues with as much evidence as you can provide? Preferably using version 1.0.16 (if you do not intend to use Postgres).
@yorkew-east8 I solved the issue using Miniconda and the Chrome browser on my macOS M1. Follow these installation steps: https://conda.io/projects/conda/en/latest/user-guide/install/macos.html I hope it helps you!
Bug Description
I tried to solve this issue all day and can't submit my flow because of it.
Pydantic deprecation warnings are shown.
Reproduction
Expected behavior
The flow should run; instead, it just starts and doesn't run any flow!
Who can help?
No response
Operating System
macOS Sonoma Version 14.6.1 (23G93)
Langflow Version
1.0.15
Python Version
3.12
Screenshot
None of the Langflow versions I installed work for me!
Flow File