langflow-ai / langflow

Langflow is a low-code app builder for RAG and multi-agent AI applications. It’s Python-based and agnostic to any model, API, or database.
http://www.langflow.org
MIT License

Ollama Error Building Component #3009

Closed NightRang3r closed 2 months ago

NightRang3r commented 2 months ago

Bug Description

When trying to build the Ollama component I get the following error:

Error Building Component Error building Component Ollama: Unsupported message type: <class 'dict'>

Ollama version: 0.3.0, Langflow version: 1.0.13, Python version: 3.12

Reproduction

  1. Add a new Ollama model component
  2. Set the base URL and select a model
  3. The error should pop up

Detailed Error:

[07/28/24 09:35:39] ERROR    2024-07-28 09:35:39 - ERROR    - base - Unsupported message type: <class 'dict'>                                                                                                                                      base.py:671
                             Traceback (most recent call last):

                               File "/usr/local/bin/langflow", line 8, in <module>
                                 sys.exit(main())
                                 |   |    -> <function main at 0xffff532f0b80>
                                 |   -> <built-in function exit>
                                 -> <module 'sys' (built-in)>
                               File "/usr/local/lib/python3.12/site-packages/langflow/__main__.py", line 581, in main
                                 app()
                                 -> <typer.main.Typer object at 0xffff533af170>
                               File "/usr/local/lib/python3.12/site-packages/typer/main.py", line 309, in __call__
                                 return get_command(self)(*args, **kwargs)
                                        |           |      |       -> {}
                                        |           |      -> ()
                                        |           -> <typer.main.Typer object at 0xffff533af170>
                                        -> <function get_command at 0xffff82060180>
                               File "/usr/local/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
                                 return self.main(*args, **kwargs)
                                        |    |     |       -> {}
                                        |    |     -> ()
                                        |    -> <function TyperGroup.main at 0xffff8204eac0>
                                        -> <TyperGroup >
                               File "/usr/local/lib/python3.12/site-packages/typer/core.py", line 723, in main
                                 return _main(
                                        -> <function _main at 0xffff8204db20>
                               File "/usr/local/lib/python3.12/site-packages/typer/core.py", line 193, in _main
                                 rv = self.invoke(ctx)
                                      |    |      -> <click.core.Context object at 0xffff53774c20>
                                      |    -> <function MultiCommand.invoke at 0xffff83a47ec0>
                                      -> <TyperGroup >
                               File "/usr/local/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
                                 return _process_result(sub_ctx.command.invoke(sub_ctx))
                                        |               |       |       |      -> <click.core.Context object at 0xffff532dbb00>
                                        |               |       |       -> <function Command.invoke at 0xffff83a47880>
                                        |               |       -> <TyperCommand run>
                                        |               -> <click.core.Context object at 0xffff532dbb00>
                                        -> <function MultiCommand.invoke.<locals>._process_result at 0xffff5349a7a0>
                               File "/usr/local/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
                                 return ctx.invoke(self.callback, **ctx.params)
                                        |   |      |    |           |   -> {'host': '0.0.0.0', 'workers': 1, 'timeout': 300, 'port': 7860, 'components_path': PosixPath('/usr/local/lib/python3.12/site-...
                                        |   |      |    |           -> <click.core.Context object at 0xffff532dbb00>
                                        |   |      |    -> <function run at 0xffff5349ab60>
                                        |   |      -> <TyperCommand run>
                                        |   -> <function Context.invoke at 0xffff83a46200>
                                        -> <click.core.Context object at 0xffff532dbb00>
                               File "/usr/local/lib/python3.12/site-packages/click/core.py", line 783, in invoke
                                 return __callback(*args, **kwargs)
                                                    |       -> {'host': '0.0.0.0', 'workers': 1, 'timeout': 300, 'port': 7860, 'components_path': PosixPath('/usr/local/lib/python3.12/site-...
                                                    -> ()
                               File "/usr/local/lib/python3.12/site-packages/typer/main.py", line 692, in wrapper
                                 return callback(**use_params)
                                        |          -> {'host': '0.0.0.0', 'workers': 1, 'timeout': 300, 'port': 7860, 'components_path': PosixPath('/usr/local/lib/python3.12/site-...
                                        -> <function run at 0xffff5349bec0>
                               File "/usr/local/lib/python3.12/site-packages/langflow/__main__.py", line 170, in run
                                 process = run_on_mac_or_linux(host, port, log_level, options, app)
                                           |                   |     |     |          |        -> <fastapi.applications.FastAPI object at 0xffff532da630>
                                           |                   |     |     |          -> {'bind': '0.0.0.0:7860', 'workers': 1, 'timeout': 300, 'worker_class': 'langflow.server.LangflowUvicornWorker', 'logger_class...
                                           |                   |     |     -> 'critical'
                                           |                   |     -> 7860
                                           |                   -> '0.0.0.0'
                                           -> <function run_on_mac_or_linux at 0xffff5349bf60>
                               File "/usr/local/lib/python3.12/site-packages/langflow/__main__.py", line 198, in run_on_mac_or_linux
                                 webapp_process.start()
                                 |              -> <function BaseProcess.start at 0xffff81e84360>
                                 -> <Process name='Process-1' parent=1 started>
                               File "/usr/local/lib/python3.12/site-packages/multiprocess/process.py", line 121, in start
                                 self._popen = self._Popen(self)
                                 |    |        |    |      -> <Process name='Process-1' parent=1 started>
                                 |    |        |    -> <staticmethod(<function Process._Popen at 0xffff81daf600>)>
                                 |    |        -> <Process name='Process-1' parent=1 started>
                                 |    -> None
                                 -> <Process name='Process-1' parent=1 started>
                               File "/usr/local/lib/python3.12/site-packages/multiprocess/context.py", line 224, in _Popen
                                 return _default_context.get_context().Process._Popen(process_obj)
                                        |                |                            -> <Process name='Process-1' parent=1 started>
                                        |                -> <function DefaultContext.get_context at 0xffff81daf7e0>
                                        -> <multiprocess.context.DefaultContext object at 0xffff81e88200>
                               File "/usr/local/lib/python3.12/site-packages/multiprocess/context.py", line 282, in _Popen
                                 return Popen(process_obj)
                                        |     -> <Process name='Process-1' parent=1 started>
                                        -> <class 'multiprocess.popen_fork.Popen'>
                               File "/usr/local/lib/python3.12/site-packages/multiprocess/popen_fork.py", line 19, in __init__
                                 self._launch(process_obj)
                                 |    |       -> <Process name='Process-1' parent=1 started>
                                 |    -> <function Popen._launch at 0xffff528a0ae0>
                                 -> <multiprocess.popen_fork.Popen object at 0xffff529dd880>
                               File "/usr/local/lib/python3.12/site-packages/multiprocess/popen_fork.py", line 71, in _launch
                                 code = process_obj._bootstrap(parent_sentinel=child_r)
                                        |           |                          -> 6
                                        |           -> <function BaseProcess._bootstrap at 0xffff81e84d60>
                                        -> <Process name='Process-1' parent=1 started>
                               File "/usr/local/lib/python3.12/site-packages/multiprocess/process.py", line 314, in _bootstrap
                                 self.run()
                                 |    -> <function BaseProcess.run at 0xffff81e842c0>
                                 -> <Process name='Process-1' parent=1 started>
                               File "/usr/local/lib/python3.12/site-packages/multiprocess/process.py", line 108, in run
                                 self._target(*self._args, **self._kwargs)
                                 |    |        |    |        |    -> {}
                                 |    |        |    |        -> <Process name='Process-1' parent=1 started>
                                 |    |        |    -> ('0.0.0.0', 7860, 'critical', {'bind': '0.0.0.0:7860', 'workers': 1, 'timeout': 300, 'worker_class': 'langflow.server.Langflo...
                                 |    |        -> <Process name='Process-1' parent=1 started>
                                 |    -> <function run_langflow at 0xffff532f07c0>
                                 -> <Process name='Process-1' parent=1 started>
                               File "/usr/local/lib/python3.12/site-packages/langflow/__main__.py", line 420, in run_langflow
                                 LangflowApplication(app, options).run()
                                 |                   |    -> {'bind': '0.0.0.0:7860', 'workers': 1, 'timeout': 300, 'worker_class': 'langflow.server.LangflowUvicornWorker', 'logger_class...
                                 |                   -> <fastapi.applications.FastAPI object at 0xffff532da630>
                                 -> <class 'langflow.server.LangflowApplication'>
                               File "/usr/local/lib/python3.12/site-packages/gunicorn/app/base.py", line 72, in run
                                 Arbiter(self).run()
                                 |       -> <langflow.server.LangflowApplication object at 0xffff82067bf0>
                                 -> <class 'gunicorn.arbiter.Arbiter'>
                               File "/usr/local/lib/python3.12/site-packages/gunicorn/arbiter.py", line 202, in run
                                 self.manage_workers()
                                 |    -> <function Arbiter.manage_workers at 0xffff5291f2e0>
                                 -> <gunicorn.arbiter.Arbiter object at 0xffff528c7950>
                               File "/usr/local/lib/python3.12/site-packages/gunicorn/arbiter.py", line 571, in manage_workers
                                 self.spawn_workers()
                                 |    -> <function Arbiter.spawn_workers at 0xffff5291f420>
                                 -> <gunicorn.arbiter.Arbiter object at 0xffff528c7950>
                               File "/usr/local/lib/python3.12/site-packages/gunicorn/arbiter.py", line 642, in spawn_workers
                                 self.spawn_worker()
                                 |    -> <function Arbiter.spawn_worker at 0xffff5291f380>
                                 -> <gunicorn.arbiter.Arbiter object at 0xffff528c7950>
                               File "/usr/local/lib/python3.12/site-packages/gunicorn/arbiter.py", line 609, in spawn_worker
                                 worker.init_process()
                                 |      -> <function UvicornWorker.init_process at 0xffff7d1ed3a0>
                                 -> <langflow.server.LangflowUvicornWorker object at 0xffff53362540>
                               File "/usr/local/lib/python3.12/site-packages/uvicorn/workers.py", line 75, in init_process
                                 super().init_process()
                               File "/usr/local/lib/python3.12/site-packages/gunicorn/workers/base.py", line 142, in init_process
                                 self.run()
                                 |    -> <function UvicornWorker.run at 0xffff7d21bba0>
                                 -> <langflow.server.LangflowUvicornWorker object at 0xffff53362540>
                               File "/usr/local/lib/python3.12/site-packages/uvicorn/workers.py", line 107, in run
                                 return asyncio.run(self._serve())
                                        |       |   |    -> <function LangflowUvicornWorker._serve at 0xffff52950220>
                                        |       |   -> <langflow.server.LangflowUvicornWorker object at 0xffff53362540>
                                        |       -> <function _patch_asyncio.<locals>.run at 0xffff6a9528e0>
                                        -> <module 'asyncio' from '/usr/local/lib/python3.12/asyncio/__init__.py'>
                               File "/usr/local/lib/python3.12/asyncio/runners.py", line 194, in run
                                 return runner.run(main)
                                        |      |   -> <coroutine object LangflowUvicornWorker._serve at 0xffff7d1d0520>
                                        |      -> <function Runner.run at 0xffff81adfe20>
                                        -> <asyncio.runners.Runner object at 0xffff532d9640>
                               File "/usr/local/lib/python3.12/asyncio/runners.py", line 118, in run
                                 return self._loop.run_until_complete(task)
                                        |    |     |                  -> <Task pending name='Task-1' coro=<LangflowUvicornWorker._serve() running at /usr/local/lib/python3.12/site-packages/langflow/...
                                        |    |     -> <function _patch_loop.<locals>.run_until_complete at 0xffff7d091f80>
                                        |    -> <_UnixSelectorEventLoop running=True closed=False debug=False>
                                        -> <asyncio.runners.Runner object at 0xffff532d9640>
                               File "/usr/local/lib/python3.12/asyncio/base_events.py", line 674, in run_until_complete
                                 self.run_forever()
                                 |    -> <function _patch_loop.<locals>.run_forever at 0xffff7d091ee0>
                                 -> <_UnixSelectorEventLoop running=True closed=False debug=False>
                               File "/usr/local/lib/python3.12/asyncio/base_events.py", line 641, in run_forever
                                 self._run_once()
                                 |    -> <function _patch_loop.<locals>._run_once at 0xffff7d092020>
                                 -> <_UnixSelectorEventLoop running=True closed=False debug=False>
                               File "/usr/local/lib/python3.12/site-packages/nest_asyncio.py", line 133, in _run_once
                                 handle._run()
                                 |      -> <function Handle._run at 0xffff81d5fc40>
                                 -> <Handle Task.__wakeup(<Future finis...ffff43b6a690>>)>
                               File "/usr/local/lib/python3.12/asyncio/events.py", line 88, in _run
                                 self._context.run(self._callback, *self._args)
                                 |    |            |    |           |    -> <member '_args' of 'Handle' objects>
                                 |    |            |    |           -> <Handle Task.__wakeup(<Future finis...ffff43b6a690>>)>
                                 |    |            |    -> <member '_callback' of 'Handle' objects>
                                 |    |            -> <Handle Task.__wakeup(<Future finis...ffff43b6a690>>)>
                                 |    -> <member '_context' of 'Handle' objects>
                                 -> <Handle Task.__wakeup(<Future finis...ffff43b6a690>>)>
                               File "/usr/local/lib/python3.12/asyncio/tasks.py", line 396, in __wakeup
                                 self.__step()
                                 -> <Task pending name='starlette.middleware.base.BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.coro' coro=<BaseHTTPMid...
                               File "/usr/local/lib/python3.12/asyncio/tasks.py", line 303, in __step
                                 self.__step_run_and_handle_result(exc)
                                 |                                 -> None
                                 -> <Task pending name='starlette.middleware.base.BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.coro' coro=<BaseHTTPMid...
                               File "/usr/local/lib/python3.12/asyncio/tasks.py", line 314, in __step_run_and_handle_result
                                 result = coro.send(None)
                                          |    -> <method 'send' of 'coroutine' objects>
                                          -> <coroutine object BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.coro at 0xffff43acf120>
                               File "/usr/local/lib/python3.12/site-packages/starlette/middleware/base.py", line 151, in coro
                                 await self.app(scope, receive_or_disconnect, send_no_error)
                                       |    |   |      |                      -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.send_no_error at 0xffff52558680>
                                       |    |   |      -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0xffff43ab7240>
                                       |    |   -> {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.4'}, 'http_version': '1.1', 'server': ('172.17.0.6', 7860), 'c...
                                       |    -> <starlette.middleware.cors.CORSMiddleware object at 0xffff52a85190>
                                       -> <langflow.main.JavaScriptMIMETypeMiddleware object at 0xffff7d251b20>
                               File "/usr/local/lib/python3.12/site-packages/starlette/middleware/cors.py", line 93, in __call__
                                 await self.simple_response(scope, receive, send, request_headers=headers)
                                       |    |               |      |        |                     -> Headers({'host': '127.0.0.1:7800', 'connection': 'keep-alive', 'content-length': '2', 'sec-ch-ua': '"Not/A)Brand";v="8", "Chr...
                                       |    |               |      |        -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.send_no_error at 0xffff52558680>
                                       |    |               |      -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0xffff43ab7240>
                                       |    |               -> {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.4'}, 'http_version': '1.1', 'server': ('172.17.0.6', 7860), 'c...
                                       |    -> <function CORSMiddleware.simple_response at 0xffff803a0680>
                                       -> <starlette.middleware.cors.CORSMiddleware object at 0xffff52a85190>
                               File "/usr/local/lib/python3.12/site-packages/starlette/middleware/cors.py", line 148, in simple_response
                                 await self.app(scope, receive, send)
                                       |    |   |      |        -> functools.partial(<bound method CORSMiddleware.send of <starlette.middleware.cors.CORSMiddleware object at 0xffff52a85190>>, ...
                                       |    |   |      -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0xffff43ab7240>
                                       |    |   -> {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.4'}, 'http_version': '1.1', 'server': ('172.17.0.6', 7860), 'c...
                                       |    -> <starlette.middleware.exceptions.ExceptionMiddleware object at 0xffff7d251ac0>
                                       -> <starlette.middleware.cors.CORSMiddleware object at 0xffff52a85190>
                               File "/usr/local/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
                                 await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
                                       |                            |    |    |     |      |        -> functools.partial(<bound method CORSMiddleware.send of <starlette.middleware.cors.CORSMiddleware object at 0xffff52a85190>>, ...
                                       |                            |    |    |     |      -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0xffff43ab7240>
                                       |                            |    |    |     -> {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.4'}, 'http_version': '1.1', 'server': ('172.17.0.6', 7860), 'c...
                                       |                            |    |    -> <starlette.requests.Request object at 0xffff41f78350>
                                       |                            |    -> <fastapi.routing.APIRouter object at 0xffff532d9460>
                                       |                            -> <starlette.middleware.exceptions.ExceptionMiddleware object at 0xffff7d251ac0>
                                       -> <function wrap_app_handling_exceptions at 0xffff8030f060>
                               File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
                                 await app(scope, receive, sender)
                                       |   |      |        -> <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0xffff43bbdd00>
                                       |   |      -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0xffff43ab7240>
                                       |   -> {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.4'}, 'http_version': '1.1', 'server': ('172.17.0.6', 7860), 'c...
                                       -> <fastapi.routing.APIRouter object at 0xffff532d9460>
                               File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 756, in __call__
                                 await self.middleware_stack(scope, receive, send)
                                       |    |                |      |        -> <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0xffff43bbdd00>
                                       |    |                |      -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0xffff43ab7240>
                                       |    |                -> {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.4'}, 'http_version': '1.1', 'server': ('172.17.0.6', 7860), 'c...
                                       |    -> <bound method Router.app of <fastapi.routing.APIRouter object at 0xffff532d9460>>
                                       -> <fastapi.routing.APIRouter object at 0xffff532d9460>
                               File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 776, in app
                                 await route.handle(scope, receive, send)
                                       |     |      |      |        -> <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0xffff43bbdd00>
                                       |     |      |      -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0xffff43ab7240>
                                       |     |      -> {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.4'}, 'http_version': '1.1', 'server': ('172.17.0.6', 7860), 'c...
                                       |     -> <function Route.handle at 0xffff8033c4a0>
                                       -> APIRoute(path='/api/v1/build/{flow_id}/vertices/{vertex_id}', name='build_vertex', methods=['POST'])
                               File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 297, in handle
                                 await self.app(scope, receive, send)
                                       |    |   |      |        -> <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0xffff43bbdd00>
                                       |    |   |      -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0xffff43ab7240>
                                       |    |   -> {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.4'}, 'http_version': '1.1', 'server': ('172.17.0.6', 7860), 'c...
                                       |    -> <function request_response.<locals>.app at 0xffff532f2520>
                                       -> APIRoute(path='/api/v1/build/{flow_id}/vertices/{vertex_id}', name='build_vertex', methods=['POST'])
                               File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 77, in app
                                 await wrap_app_handling_exceptions(app, request)(scope, receive, send)
                                       |                            |    |        |      |        -> <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0xffff43bbdd00>
                                       |                            |    |        |      -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0xffff43ab7240>
                                       |                            |    |        -> {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.4'}, 'http_version': '1.1', 'server': ('172.17.0.6', 7860), 'c...
                                       |                            |    -> <starlette.requests.Request object at 0xffff41f79310>
                                       |                            -> <function request_response.<locals>.app.<locals>.app at 0xffff5240db20>
                                       -> <function wrap_app_handling_exceptions at 0xffff8030f060>
                               File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
                                 await app(scope, receive, sender)
                                       |   |      |        -> <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0xffff42060d60>
                                       |   |      -> <function BaseHTTPMiddleware.__call__.<locals>.call_next.<locals>.receive_or_disconnect at 0xffff43ab7240>
                                       |   -> {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.4'}, 'http_version': '1.1', 'server': ('172.17.0.6', 7860), 'c...
                                       -> <function request_response.<locals>.app.<locals>.app at 0xffff5240db20>
                               File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 72, in app
                                 response = await func(request)
                                                  |    -> <starlette.requests.Request object at 0xffff41f79310>
                                                  -> <function get_request_handler.<locals>.app at 0xffff532f2480>
                               File "/usr/local/lib/python3.12/site-packages/fastapi/routing.py", line 278, in app
                                 raw_response = await run_endpoint_function(
                                                      -> <function run_endpoint_function at 0xffff8030e7a0>
                               File "/usr/local/lib/python3.12/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
                                 return await dependant.call(**values)
                                              |         |      -> {'chat_service': <langflow.services.chat.service.ChatService object at 0xffff43ba5520>, 'current_user': User(id=UUID('c4ae8d8...
                                              |         -> <function build_vertex at 0xffff5386c540>
                                              -> <fastapi.dependencies.models.Dependant object at 0xffff53309970>
                               File "/usr/local/lib/python3.12/site-packages/langflow/api/v1/chat.py", line 198, in build_vertex
                                 ) = await graph.build_vertex(
                                           |     -> <function Graph.build_vertex at 0xffff537a14e0>
                                           -> Graph:
                                              Nodes: ['ChatInput-zHRc3', 'Prompt-HJwee', 'ChatOutput-4sJz1', 'OllamaModel-s1bW6']
                                              Connections:
                                              Prompt-HJwee --> Ol...
                               File "/usr/local/lib/python3.12/site-packages/langflow/graph/graph/base.py", line 903, in build_vertex
                                 await vertex.build(
                                       |      -> <function Vertex.build at 0xffff53998b80>
                                       -> Vertex(display_name=Ollama, id=OllamaModel-s1bW6, data={'type': 'OllamaModel', 'node': {'template': {'_type': 'Component', 'b...
                               File "/usr/local/lib/python3.12/site-packages/langflow/graph/vertex/base.py", line 760, in build
                                 await step(user_id=user_id, **kwargs)
                                       |            |          -> {'fallback_to_env_vars': False}
                                       |            -> UUID('c4ae8d8f-b12b-46ec-b6f7-8106fb034710')
                                       -> <bound method Vertex._build of Vertex(display_name=Ollama, id=OllamaModel-s1bW6, data={'type': 'OllamaModel', 'node': {'templ...
                               File "/usr/local/lib/python3.12/site-packages/langflow/graph/vertex/base.py", line 445, in _build
                                 await self._build_results(custom_component, custom_params, fallback_to_env_vars)
                                       |    |              |                 |              -> False
                                       |    |              |                 -> {'input_value': Message(text_key='text', data={'template': 'Answer the user as if you were a pirate.\n\nUser: {user_input}\n\...
                                       |    |              -> <langflow.utils.validate.ChatOllamaComponent object at 0xffff40ec6d50>
                                       |    -> <function Vertex._build_results at 0xffff539987c0>
                                       -> Vertex(display_name=Ollama, id=OllamaModel-s1bW6, data={'type': 'OllamaModel', 'node': {'template': {'_type': 'Component', 'b...
                             > File "/usr/local/lib/python3.12/site-packages/langflow/graph/vertex/base.py", line 658, in _build_results
                                 result = await loading.get_instance_results(
                                                |       -> <function get_instance_results at 0xffff542876a0>
                                                -> <module 'langflow.interface.initialize.loading' from '/usr/local/lib/python3.12/site-packages/langflow/interface/initialize/l...
                               File "/usr/local/lib/python3.12/site-packages/langflow/interface/initialize/loading.py", line 60, in get_instance_results
                                 return await build_component(params=custom_params, custom_component=custom_component)
                                              |                      |                               -> <langflow.utils.validate.ChatOllamaComponent object at 0xffff40ec6d50>
                                              |                      -> {'input_value': Message(text_key='text', data={'template': 'Answer the user as if you were a pirate.\n\nUser: {user_input}\n\...
                                              -> <function build_component at 0xffff5391e0c0>
                               File "/usr/local/lib/python3.12/site-packages/langflow/interface/initialize/loading.py", line 147, in build_component
                                 build_results, artifacts = await custom_component.build_results()
                                                                  |                -> <function Component.build_results at 0xffff5391dbc0>
                                                                  -> <langflow.utils.validate.ChatOllamaComponent object at 0xffff40ec6d50>
                               File "/usr/local/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 140, in build_results
                                 return await self._build_with_tracing()
                                              |    -> <function Component._build_with_tracing at 0xffff5391da80>
                                              -> <langflow.utils.validate.ChatOllamaComponent object at 0xffff40ec6d50>
                               File "/usr/local/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 130, in _build_with_tracing
                                 _results, _artifacts = await self._build_results()
                                                              |    -> <function Component._build_results at 0xffff5391dc60>
                                                              -> <langflow.utils.validate.ChatOllamaComponent object at 0xffff40ec6d50>
                               File "/usr/local/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 158, in _build_results
                                 result = method()
                                          -> <bound method LCModelComponent.text_response of <langflow.utils.validate.ChatOllamaComponent object at 0xffff40ec6d50>>
                               File "/usr/local/lib/python3.12/site-packages/langflow/base/models/model.py", line 57, in text_response
                                 result = self.get_chat_result(output, stream, input_value, system_message)
                                          |    |               |       |       |            -> ''
                                          |    |               |       |       -> Message(text_key='text', data={'template': 'Answer the user as if you were a pirate.\n\nUser: {user_input}\n\nAnswer: ', 'var...
                                          |    |               |       -> False
                                          |    |               -> ChatOllama(metadata={}, base_url='http://host.docker.internal:11434', model='phi3:latest', mirostat=0, temperature=0.2, syste...
                                          |    -> <function LCModelComponent.get_chat_result at 0xffff647ebe20>
                                          -> <langflow.utils.validate.ChatOllamaComponent object at 0xffff40ec6d50>
                               File "/usr/local/lib/python3.12/site-packages/langflow/base/models/model.py", line 152, in get_chat_result
                                 prompt = input_value.load_lc_prompt()
                                          |           -> <function Message.load_lc_prompt at 0xffff62cda160>
                                          -> Message(text_key='text', data={'template': 'Answer the user as if you were a pirate.\n\nUser: {user_input}\n\nAnswer: ', 'var...
                               File "/usr/local/lib/python3.12/site-packages/langflow/schema/message.py", line 174, in load_lc_prompt
                                 loaded_prompt = load(self.prompt)
                                                 |    -> Message(text_key='text', data={'template': 'Answer the user as if you were a pirate.\n\nUser: {user_input}\n\nAnswer: ', 'var...
                                                 -> <function load at 0xffff7fba5d00>
                               File "/usr/local/lib/python3.12/site-packages/langchain_core/_api/beta_decorator.py", line 110, in warning_emitting_wrapper
                                 return wrapped(*args, **kwargs)
                                        |        |       -> {}
                                        |        -> ({'lc': 1, 'type': 'constructor', 'id': ['langchain', 'prompts', 'chat', 'ChatPromptTemplate'], 'kwargs': {'input_variables':...
                                        -> <function load at 0xffff7fba5c60>
                               File "/usr/local/lib/python3.12/site-packages/langchain_core/load/load.py", line 195, in load
                                 return _load(obj)
                                        |     -> {'lc': 1, 'type': 'constructor', 'id': ['langchain', 'prompts', 'chat', 'ChatPromptTemplate'], 'kwargs': {'input_variables': ...
                                        -> <function load.<locals>._load at 0xffff43d07d80>
                               File "/usr/local/lib/python3.12/site-packages/langchain_core/load/load.py", line 190, in _load
                                 return reviver(loaded_obj)
                                        |       -> {'lc': 1, 'type': 'constructor', 'id': ['langchain', 'prompts', 'chat', 'ChatPromptTemplate'], 'kwargs': {'input_variables': ...
                                        -> <langchain_core.load.load.Reviver object at 0xffff41f78830>
                               File "/usr/local/lib/python3.12/site-packages/langchain_core/load/load.py", line 126, in __call__
                                 return cls(**kwargs)
                                        |     -> {'input_variables': [], 'messages': [{'content': 'Answer the user as if you were a pirate.\n\nUser: Hi\n\nAnswer: ', 'additio...
                                        -> <class 'langchain_core.prompts.chat.ChatPromptTemplate'>
                               File "/usr/local/lib/python3.12/site-packages/langchain_core/prompts/chat.py", line 984, in __init__
                                 _convert_to_message(message, template_format) for message in messages
                                 |                            |                               -> [{'content': 'Answer the user as if you were a pirate.\n\nUser: Hi\n\nAnswer: ', 'additional_kwargs': {}, 'response_metadata'...
                                 |                            -> 'f-string'
                                 -> <function _convert_to_message at 0xffff7f156a20>
                               File "/usr/local/lib/python3.12/site-packages/langchain_core/prompts/chat.py", line 1456, in _convert_to_message
                                 raise NotImplementedError(f"Unsupported message type: {type(message)}")
                                                                                             -> {'content': 'Answer the user as if you were a pirate.\n\nUser: Hi\n\nAnswer: ', 'additional_kwargs': {}, 'response_metadata':...

                             NotImplementedError: Unsupported message type: <class 'dict'>
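
For reference, the underlying failure can be reproduced outside Langflow with a few lines of langchain-core. This is a minimal sketch assuming the langchain-core version bundled with this Langflow build (newer releases may handle dict messages differently); the message payload is approximated from the traceback above:

from langchain_core.prompts import ChatPromptTemplate

# A serialized HumanMessage left as a plain dict, roughly what load(self.prompt)
# ends up passing to ChatPromptTemplate in the traceback above.
serialized_message = {
    "content": "Answer the user as if you were a pirate.\n\nUser: Hi\n\nAnswer: ",
    "additional_kwargs": {},
    "response_metadata": {},
    "type": "human",  # assumed; the traceback truncates the dict here
}

try:
    ChatPromptTemplate(messages=[serialized_message])
except NotImplementedError as e:
    print(e)  # Unsupported message type: <class 'dict'>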

Expected behavior

The component should build without errors.

Who can help?

No response

Operating System

Debian Linux 12

Langflow Version

1.0.13

Python Version

None

Screenshot

Screenshot 2024-07-28 at 13 30 25

Flow File

No response

dosubot[bot] commented 2 months ago

Hey, @NightRang3r! I'm here to assist you with any bugs, questions, or contributions. Let's tackle this issue together while we wait for a human maintainer.

To resolve the "Unsupported message type: <class 'dict'>" error when building the Ollama component in Langflow, ensure that the metadata parameter, which is expected to be a dictionary, is correctly handled. The error suggests that a dictionary is being passed where it is not expected.

Here is the relevant part of the code:

llm_params = {
    "base_url": self.base_url,
    "model": self.model,
    "mirostat": mirostat_value,
    "format": self.format,
    "metadata": self.metadata,  # Ensure this is a dictionary
    "tags": self.tags.split(",") if self.tags else None,
    "mirostat_eta": mirostat_eta,
    "mirostat_tau": mirostat_tau,
    "num_ctx": self.num_ctx or None,
    "num_gpu": self.num_gpu or None,
    "num_thread": self.num_thread or None,
    "repeat_last_n": self.repeat_last_n or None,
    "repeat_penalty": self.repeat_penalty or None,
    "temperature": self.temperature or None,
    "stop": self.stop_tokens.split(",") if self.stop_tokens else None,
    "system": self.system,
    "template": self.template,
    "tfs_z": self.tfs_z or None,
    "timeout": self.timeout or None,
    "top_k": self.top_k or None,
    "top_p": self.top_p or None,
    "verbose": self.verbose,
}

# Remove parameters with None values
llm_params = {k: v for k, v in llm_params.items() if v is not None}

try:
    output = ChatOllama(**llm_params)  # Ensure ChatOllama can handle the metadata dictionary
except Exception as e:
    raise ValueError("Could not initialize Ollama LLM.") from e

Ensure that the metadata parameter is correctly formatted as a dictionary before passing it to ChatOllama. If ChatOllama does not support a dictionary for metadata, you may need to modify the ChatOllama class or adjust how metadata is being used.
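
A minimal sketch of that check (hypothetical; self.metadata stands for whatever value the component received, and the actual fix may look different):

# Only forward metadata when it really is a plain dict; otherwise leave it
# out of the ChatOllama kwargs entirely.
if isinstance(self.metadata, dict):
    llm_params["metadata"] = self.metadata
else:
    llm_params.pop("metadata", None)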

Additionally, the metadata dictionary should have the following structure:

"metadata": {
    "display_name": "Metadata",
    "info": "Metadata to add to the run trace.",
    "advanced": True,
}

Make sure that the ChatOllama class is designed to handle this structure appropriately [1][2][3][4].

To continue talking to Dosu, mention @dosu.

2002dick commented 2 months ago

The same problem! Snipaste_2024-07-28_23-39-44

holdengand commented 2 months ago

I am having a similar issue.

I have noticed, though, that it only started happening when adding new Ollama components; copying and pasting ones that were already working in other workflows still works. This started suddenly the other day while I was working on a workflow; I didn't shut anything down, restart, update, or anything.

New Ollama components throw an error, old ones still work fine.

Even if I place a broken version of the Ollama component and replace its code with a copy from one of the working ones, it still shows the same error.

And I'm not sure if it's related or not, but I can't use the Prompt component with the Ollama one either. It throws the error: Error Building Component Error building Component Ollama: Unsupported message type: <class 'dict'>

NightRang3r commented 2 months ago

Edit: Looks like the issue is with the Prompt component; even when it is connected to an OpenAI model component I get the same error message: Unsupported message type: <class 'dict'>

It seems to be (partially) working now. I created a Docker image based on Python 3.12, and the Ollama component builds successfully. However, when I add a prompt and connect the node to the Ollama model, I get the same error.

DockerFile:

# docker build -f DockerFile . -t nightrang3r/langflow:1.0.13
# docker run -d --restart always --add-host=host.docker.internal:host-gateway -p 7860:7860 -it --name langflow nightrang3r/langflow:1.0.13

FROM python:3.12.4-slim
RUN apt-get update && apt-get install -y --no-install-recommends build-essential net-tools git nano wget curl iputils-ping
RUN pip install --upgrade pip
RUN pip install langflow==1.0.13
RUN pip install redis
ENV LANGFLOW_HOST=0.0.0.0
ENV DO_NOT_TRACK=true
EXPOSE 7860
CMD ["langflow", "run"]
Screenshot 2024-07-29 at 11 38 53 Screenshot 2024-07-29 at 11 40 27

Detailed Error:

Error building Component Ollama: 

Unsupported message type: <class 'dict'>

Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/langflow/graph/vertex/base.py", line 658, in _build_results
    result = await loading.get_instance_results(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/interface/initialize/loading.py", line 60, in get_instance_results
    return await build_component(params=custom_params, custom_component=custom_component)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/interface/initialize/loading.py", line 147, in build_component
    build_results, artifacts = await custom_component.build_results()
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 140, in build_results
    return await self._build_with_tracing()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 130, in _build_with_tracing
    _results, _artifacts = await self._build_results()
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/custom/custom_component/component.py", line 158, in _build_results
    result = method()
             ^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/base/models/model.py", line 57, in text_response
    result = self.get_chat_result(output, stream, input_value, system_message)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/base/models/model.py", line 152, in get_chat_result
    prompt = input_value.load_lc_prompt()
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langflow/schema/message.py", line 174, in load_lc_prompt
    loaded_prompt = load(self.prompt)
                    ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/_api/beta_decorator.py", line 110, in warning_emitting_wrapper
    return wrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/load/load.py", line 195, in load
    return _load(obj)
           ^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/load/load.py", line 190, in _load
    return reviver(loaded_obj)
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/load/load.py", line 126, in __call__
    return cls(**kwargs)
           ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/prompts/chat.py", line 984, in __init__
    _convert_to_message(message, template_format) for message in messages
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/langchain_core/prompts/chat.py", line 1456, in _convert_to_message
    raise NotImplementedError(f"Unsupported message type: {type(message)}")
NotImplementedError: Unsupported message type: <class 'dict'>

songjinu commented 2 months ago

It occurs with OpenAI and the Prompt component as well. It worked in 1.0.7.

2002dick commented 2 months ago

But the installation with Python (pip) works normally. Although a lot of errors are reported, at least it can connect to Ollama. It seems to be a problem with the Docker image. Snipaste_2024-07-29_17-50-42

ogabrielluiz commented 2 months ago

Hey all.

This is related to a version mismatch. Please follow #3022
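
One way to confirm which versions actually ended up installed is to run a quick check inside the environment or container that shows the error (a sketch; the distribution names below are the usual ones, adjust as needed):

from importlib.metadata import PackageNotFoundError, version

# Print the packages most likely involved in the version mismatch.
for pkg in ("langflow", "langchain", "langchain-core"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")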

2002dick commented 2 months ago

I think it's a network problem with the langflow container.