xXAdonesXx / NodeGPT

ComfyUI Extension Nodes for Automated Text Generation.
GNU Affero General Public License v3.0

The autogen.Completion class requires openai<1 and diskcache. #24

Open ZiBrianQian opened 9 months ago

ZiBrianQian commented 9 months ago

Good day. I keep getting this error message every time. Please help me; I can't find anything about it.

Error occurred when executing TextGeneration:

(Deprecated) The autogen.Completion class requires openai<1 and diskcache.

File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\TextGeneration.py", line 32, in execute response = oai.ChatCompletion.create( ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 792, in create raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\TextGeneration.py", line 32, in execute response = oai.ChatCompletion.create( ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 792, in create raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\TextGeneration.py", line 32, in execute response = oai.ChatCompletion.create( ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 792, in create raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File 
"D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\TextGeneration.py", line 32, in execute response = oai.ChatCompletion.create( ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 792, in create raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\TextGeneration.py", line 32, in execute response = oai.ChatCompletion.create( ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 792, in create raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\Chat.py", line 53, in execute autogen.ChatCompletion.start_logging(conversations) File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 1180, in start_logging raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\Chat.py", line 53, in execute autogen.ChatCompletion.start_logging(conversations) File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 1180, in start_logging raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\Chat.py", line 53, in execute autogen.ChatCompletion.start_logging(conversations) File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 1180, in start_logging raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\Chat.py", line 53, in execute autogen.ChatCompletion.start_logging(conversations) File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 1180, in start_logging raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\Chat.py", line 53, in execute autogen.ChatCompletion.start_logging(conversations) File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 1180, in start_logging raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\Chat.py", line 53, in execute autogen.ChatCompletion.start_logging(conversations) File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 1180, in start_logging raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\Chat.py", line 53, in execute autogen.ChatCompletion.start_logging(conversations) File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 1180, in start_logging raise ERROR File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list results.append(getattr(obj, func)(**slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\TextGeneration.py", line 32, in execute response = oai.ChatCompletion.create( ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 792, in create raise ERROR
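
The assertion comes from autogen's legacy Completion/ChatCompletion API, which only works with openai<1. A minimal diagnostic sketch (not part of NodeGPT) for checking which package versions the portable ComfyUI interpreter actually sees; run it with python_embeded\python.exe:

# Save as e.g. check_versions.py (hypothetical name) and run it with
# D:\AI\ComfyUI_windows_portable\python_embeded\python.exe check_versions.py
from importlib.metadata import version, PackageNotFoundError

for pkg in ("openai", "pyautogen", "diskcache"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")

# The legacy autogen.Completion path only supports openai < 1, so an openai 1.x
# version reported here would explain the AssertionError in the traceback above.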

xXAdonesXx commented 9 months ago

Sorry that it doesn't work. It seems that you have an installation of autogen outside of ComfyUI. Normally the installation is in NodeGPT/venv/lib/site-packages, not in D:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen. Maybe your autogen isn't updated? Alternatively, you can use the llama.cpp node I just added (see Workflows) instead of the LMStudio node. Then at least the TextGeneration node should work.
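
One way to confirm which autogen installation is actually being picked up is to print the import locations. A small diagnostic sketch, run with the same Python that launches ComfyUI:

# Show where autogen and openai are imported from, and the openai version.
import autogen
import openai

print("autogen loaded from:", autogen.__file__)
print("openai loaded from:", openai.__file__)
print("openai version:", getattr(openai, "__version__", "unknown"))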

ZiBrianQian commented 9 months ago

Thanks, let me try it. But I installed it using ComfyUI Manager; I'll try installing it manually.

ZiBrianQian commented 9 months ago

I updated everything and used the llama.cpp workflow, but I'm still getting this error (screenshot attached).

xXAdonesXx commented 9 months ago

The error suggests that there may be issues with other nodes. Could you try running the llama-cpp workflow without any modifications? Additionally, it would be helpful if you could verify whether the two nodes shown in the image are operational (where is the green frame when the error appears?). Please send a screenshot of your workflow for further assistance.

I have also updated the nodes for easier installation and fixed some bugs. If you delete the current NodeGPT folder and download it again (via Comfy Manager or git clone), it should automatically install everything for you.

ltdrdata commented 9 months ago

This issue arose with the recent upgrade to pyautogen-0.2.0. It can be resolved by downgrading the package to version 0.1.14 using the command: pip install pyautogen==0.1.14.
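
For the portable Windows build, the downgrade has to land in ComfyUI's own python_embeded environment rather than a system Python. A hedged sketch of one way to do that (the path matches the traceback above; the pin follows the suggestion in this comment):

# Run this with D:\AI\ComfyUI_windows_portable\python_embeded\python.exe so that
# pip installs into the same site-packages that ComfyUI actually imports from.
import subprocess
import sys

subprocess.check_call([sys.executable, "-m", "pip", "install", "pyautogen==0.1.14"])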

yesyj commented 9 months ago

Same problem. It was stuck at the Assistant node first; I upgraded in several ways, and now it's stuck at the Chat node.

Solved (see attached screenshot).

angrysky56 commented 9 months ago

got prompt
<autogen.agentchat.groupchat.GroupChatManager object at 0x00000189408C6D50>
[autogen.oai.completion: 12-12 23:41:50] {1175} WARNING - logging via Completion.start_logging is deprecated in pyautogen v0.2. logging via OpenAIWrapper will be added back in a future release.
ERROR:root:!!! Exception during processing !!!
ERROR:root:Traceback (most recent call last):
  File "F:\comfyNodeGPT\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "F:\comfyNodeGPT\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "F:\comfyNodeGPT\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "F:\comfyNodeGPT\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\Chat.py", line 53, in execute
    autogen.ChatCompletion.start_logging(conversations)
  File "F:\comfyNodeGPT\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 1180, in start_logging
    raise ERROR
AssertionError: (Deprecated) The autogen.Completion class requires openai<1 and diskcache.

Prompt executed in 1.56 seconds

angrysky56 commented 9 months ago

got prompt
model_type EPS
adm 0
Using pytorch attention in VAE
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Using pytorch attention in VAE
missing {'cond_stage_model.clip_l.text_projection', 'cond_stage_model.clip_l.logit_scale'}
left over keys: dict_keys(['alphas_cumprod', 'alphas_cumprod_prev', 'betas', 'log_one_minus_alphas_cumprod', 'model_ema.decay', 'model_ema.num_updates', 'posterior_log_variance_clipped', 'posterior_mean_coef1', 'posterior_mean_coef2', 'posterior_variance', 'sqrt_alphas_cumprod', 'sqrt_one_minus_alphas_cumprod', 'sqrt_recip_alphas_cumprod', 'sqrt_recipm1_alphas_cumprod', 'cond_stage_model.clip_l.transformer.text_model.embeddings.position_ids'])
[autogen.oai.completion: 12-13 00:55:03] {786} WARNING - Completion.create is deprecated in pyautogen v0.2 and openai>=1. The new openai requires initiating a client for inference. Please refer to https://microsoft.github.io/autogen/docs/Use-Cases/enhanced_inference#api-unification
ERROR:root:!!! Exception during processing !!!
ERROR:root:Traceback (most recent call last):
  File "F:\comfyNodeGPT\ComfyUI_windows_portable\ComfyUI\execution.py", line 153, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "F:\comfyNodeGPT\ComfyUI_windows_portable\ComfyUI\execution.py", line 83, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "F:\comfyNodeGPT\ComfyUI_windows_portable\ComfyUI\execution.py", line 76, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "F:\comfyNodeGPT\ComfyUI_windows_portable\ComfyUI\custom_nodes\NodeGPT\TextGeneration.py", line 45, in execute
    response = oai.ChatCompletion.create(
  File "F:\comfyNodeGPT\ComfyUI_windows_portable\python_embeded\Lib\site-packages\autogen\oai\completion.py", line 792, in create
    raise ERROR
AssertionError: (Deprecated) The autogen.Completion class requires openai<1 and diskcache.

Prompt executed in 1.00 seconds

Zaithe commented 9 months ago

I resolved this issue by manually copying the correct packages from C:\Python311\Lib\site-packages to my ComfyUI_windows_portable\python_embeded\Lib\site-packages folder.

ltdrdata commented 9 months ago

I investigated support for pyautogen 0.2.2 and found that it requires openai v1 or above. Additionally, openai v1 involved a significant structural overhaul, and since pyautogen is built on top of openai, its structure has changed accordingly.

The issue lies not with GPT but with the scenario of using pyautogen with a local LLM, even in the simplest use case. After testing, it does not seem to work at all (I got AttributeError: 'NoneType' object has no attribute 'prompt_tokens'). It appears that development is still ongoing, so when using a local LLM it may be necessary to stick with an older version of the package for a while.

To install the compatible versions, you can use the following command:

pip install pyautogen==0.1.14 openai==0.28.1

https://github.com/microsoft/autogen/issues/1002 https://github.com/microsoft/autogen/pull/1008

EDIT: PR 1008 isn't enough; it fails with another error:

ChatCompletion(id=None, choices=None, created=None, model=None, object=None, system_fingerprint=None, usage=None, error='Unexpected endpoint or method. (POST /chat/completions)', cost=0, config_id=0, pass_filter=True)

It appears that the API provided by the LM Studio server is not compatible with autogen or OpenAI.

angrysky56 commented 8 months ago

Maybe this can help figure it out (not a coder, sorry): https://www.youtube.com/watch?v=8RtxvXIx61Y

ltdrdata commented 8 months ago

My mistake was running it without entering any prompt. I confirmed that the following simple code runs without issue against LM Studio with pyautogen 0.2.2 and openai 1.6.1.

from autogen import UserProxyAgent, ConversableAgent, config_list_from_json

def main():
    # Optionally load configs from an OAI_CONFIG_LIST file/env var; not needed when
    # talking to a local LM Studio server, so the hard-coded list below is used instead.
    # config_list = config_list_from_json(env_or_file="OAI_CONFIG_LIST")

    # LM Studio exposes an OpenAI-compatible endpoint (port 1234 by default);
    # the api_key only needs to be a non-empty placeholder for a local server.
    config_list_llama = [
        {
            'base_url': "http://127.0.0.1:1234/v1",
            'api_key': "NULL"
        }
    ]

    # The assistant agent talks to the local LLM; the user proxy stands in for the human.
    assistant = ConversableAgent("agent", llm_config={"config_list": config_list_llama})
    user_proxy = UserProxyAgent("user", code_execution_config=False)
    assistant.initiate_chat(user_proxy, message="How can I help you today?")

if __name__ == "__main__":
    main()

ltdrdata commented 8 months ago

This PR makes it compatible with the latest pyautogen package: https://github.com/xXAdonesXx/NodeGPT/pull/30
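
For reference, pyautogen 0.2 replaces the deprecated autogen.Completion/ChatCompletion calls with OpenAIWrapper. A rough sketch of what a migrated TextGeneration-style call could look like against a local LM Studio endpoint; this is only an illustration of the new API, not the actual contents of PR 30, and the model name and prompt are placeholders:

from autogen import OpenAIWrapper

# Local LM Studio endpoint; "local-model" is a placeholder for whatever model the
# server reports, and the api_key just needs to be non-empty for a local server.
config_list = [{
    "base_url": "http://127.0.0.1:1234/v1",
    "api_key": "NULL",
    "model": "local-model",
}]

client = OpenAIWrapper(config_list=config_list)
response = client.create(messages=[{"role": "user", "content": "Describe a cat."}])
print(client.extract_text_or_completion_object(response))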

0xAlcibiades commented 8 months ago

Still seeing this on latest master.

angrysky56 commented 8 months ago

Maybe CrewAI could work better? https://github.com/joaomdmoura/CrewAI

https://youtu.be/tnejrr-0a94

It would be nice to run it with LM Studio or Ooba via nodes.

0xAlcibiades commented 8 months ago

No, I mean this does exactly what it needs to, just seems like there is a bug/conflict in the dependency tree rn to be fixed.

ltdrdata commented 8 months ago

> No, I mean this does exactly what it needs to, just seems like there is a bug/conflict in the dependency tree rn to be fixed.

Show me the output of pip freeze.

MultiTech-Visions commented 2 months ago

Man it's too bad none of these LLM things seem to work.