holoviz / lumen

Illuminate your data.
https://lumen.holoviz.org
BSD 3-Clause "New" or "Revised" License

DO NOT MERGE FOR TESTING #789

Open ahuang11 opened 5 days ago

ahuang11 commented 5 days ago

Combines all the open PRs into a single PR

codecov[bot] commented 5 days ago

Codecov Report

Attention: Patch coverage is 21.84154% with 365 lines in your changes missing coverage. Please review.

Project coverage is 59.89%. Comparing base (2c83a1b) to head (c46f9da).

| Files with missing lines | Patch % | Lines |
|---|---|---|
| lumen/ai/vector_store.py | 22.84% | 152 Missing :warning: |
| lumen/command/ai.py | 0.00% | 83 Missing :warning: |
| lumen/ai/coordinator.py | 5.71% | 33 Missing :warning: |
| lumen/ai/embeddings.py | 46.93% | 26 Missing :warning: |
| lumen/ai/ui.py | 13.79% | 25 Missing :warning: |
| lumen/ai/actor.py | 26.92% | 19 Missing :warning: |
| lumen/ai/agents.py | 14.28% | 18 Missing :warning: |
| lumen/ai/llm.py | 30.00% | 7 Missing :warning: |
| lumen/ai/utils.py | 84.61% | 2 Missing :warning: |
Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main     #789      +/-   ##
==========================================
- Coverage   60.98%   59.89%   -1.09%
==========================================
  Files         103      104       +1
  Lines       12544    12882     +338
==========================================
+ Hits         7650     7716      +66
- Misses       4894     5166     +272
```


amaloney commented 5 days ago

I get the following error:

```
Traceback (most recent call last):
  File "/Users/rmaloney/Development/open-source/holoviz-dev/lumen/lumen/ai/coordinator.py", line 399, in _execute_graph_node
    await subagent.respond(custom_messages, step_title=title, render_output=render_output, **respond_kwargs)
  File "/Users/rmaloney/Development/open-source/holoviz-dev/lumen/lumen/ai/agents.py", line 230, in respond
    system_prompt = await self._render_main_prompt(messages)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/rmaloney/Development/open-source/holoviz-dev/lumen/lumen/ai/agents.py", line 338, in _render_main_prompt
    pipeline = self._memory["current_pipeline"]
               ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^
  File "/Users/rmaloney/Development/open-source/holoviz-dev/lumen/lumen/config.py", line 64, in __getitem__
    return self._global_context[key]
           ~~~~~~~~~~~~~~~~~~~~^^^^^
KeyError: 'current_pipeline'
```

When I put a breakpoint in the config file and inspect the `self._global_context` object, it is an empty dictionary.
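A minimal sketch of the failure mode, assuming (per the traceback) that the memory object's `__getitem__` simply delegates to an internal dict, so looking up `"current_pipeline"` before any agent has populated it raises `KeyError`. The class name here is hypothetical; only the delegation pattern is taken from the traceback:

```python
class GlobalContext:
    """Hypothetical stand-in for the memory object in lumen/config.py:
    __getitem__ delegates straight to an internal dict (see traceback)."""

    def __init__(self):
        self._global_context = {}

    def __getitem__(self, key):
        # No fallback: a key that was never written raises KeyError,
        # which matches the reported "KeyError: 'current_pipeline'".
        return self._global_context[key]


memory = GlobalContext()
try:
    memory["current_pipeline"]
    failed = False
except KeyError:
    failed = True
```

This reproduces the observation above: if `_global_context` is empty at the time `_render_main_prompt` runs, the bare dict lookup fails. A guarded lookup (e.g. checking `"current_pipeline" in memory._global_context` first, or a `.get`-style accessor with a default) would avoid the crash, though whether the fix belongs in the accessor or in whatever step should have populated the key is a design question for the PR.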