OpenInterpreter / 01

The #1 open-source voice interface for desktop, mobile, and ESP32 chips.
https://01.openinterpreter.com/
GNU Affero General Public License v3.0

"Let me know what you'd like to do next" repeats after answering the question and doesnt stop #259

Open contractorwolf opened 5 months ago

contractorwolf commented 5 months ago

Describe the bug
I was previously running the older version of the 01 server and updated everything a few days ago when the latest (0.2.5) was released. After updating, I first tested using my Atom device. It answers the initial question, but I can see in the terminal output that it gets into a loop after the question and just keeps outputting like this until I kill it:

macbook-pro-3:software jameswo% poetry run 01

○                                                                                                                                                            

Starting...                                                                                                                                                  

INFO:     Started server process [42881]
INFO:     Waiting for application startup.

Ready.                                                                                                                                                       

INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:10001 (Press CTRL+C to quit)
INFO:     ('127.0.0.1', 55591) - "WebSocket /" [accepted]
INFO:     connection open

Hold the spacebar to start recording. Press CTRL-C to exit.
 Recording started...
           Recording stopped.
audio/wav /var/folders/xt/5d_33m2d4mxdlx01g3x9g9br0000gn/T/input_20240424223446654411.wav /var/folders/xt/5d_33m2d4mxdlx01g3x9g9br0000gn/T/output_20240424223446656750.wav
>  Audio test.

  Your audio is working fine.                                                                                                                                

  Let me know what you'd like to do next.

  Let me know what you'd like to do next.

  [...the line above repeats indefinitely until the process is killed...]

^CKilled: 9

The output above is from retesting using just the spacebar (to eliminate the device as a possible source of the issue). Not sure what to try next. I rebooted and reinstalled in a new conda environment; same issue.


highb commented 5 months ago

I'm also experiencing this issue with Ubuntu 22.04 and Python 3.10.12.

contractorwolf commented 5 months ago

Well, at least I am not the only one. Have you tried rolling back to a previous version? This sucks for me because I have an awesome idea for a "device case" that I want to demo, but I need a working server first. Let me know if you figure something out @highb; I'll do the same.

highb commented 5 months ago

> Well, at least I am not the only one. Have you tried rolling back to a previous version? This sucks for me because I have an awesome idea for a "device case" that I want to demo, but I need a working server first. Let me know if you figure something out @highb; I'll do the same.

Unfortunately, I haven't had time to tinker with it recently. The Discord has some discussions about similar issues if you want to check them out.

ai-Ev1lC0rP commented 5 months ago

I"m having the same issue, granted... i'm only trying to run locally while i troubleshoot which might be part of my problem. I'm running either command-r (not plus) or Llama70B or Mixtral8x7B-V2.8. Plus using local whisper, plus using piper, plus running ollama on a separate server, plus trying to run the moble app (Which i got it working!) but it's.... in development and i never expected it to just... WORK especially since i'm tyring to just create a shortcut for my side button.

I get similar results, but I think part of that as well is the system prompt that ollama might be passing, due to the fact that I have OpenWebUI running on that same server to be able to craft models.

"It seems we're encountering similar issues, which I initially attributed to a poorly configured system or corrupted memory. Specifically, it appears to reference a Windows 7 computer, suggesting it may be necessary for utilizing the computer module."Then i had to convince it that it just got done... on my mac... (I dont even own a windows 7 computer nor have i even had a reason to have it in ANY context window)

It does seem to make a dramatic impact whether you get the context window right or wrong; if you pass too much, it tends to do odd things no matter the model (I've tried a bunch). If you are trying to showcase it, I'd say keep it simple, but be specific about the simplicity... if that makes any bit of sense. I do like how you can do %save_message and %load_message, however I think I am running into passing too much context again. I'm currently trying to get it to teach itself how to use AIFS/Chroma, which... should just work, but it seems to be an ongoing issue and conversation.

Not here, just between me and the LLM. ha!

contractorwolf commented 5 months ago

I finally tried to start from scratch again. Blew away the install, made a new conda env, and installed Python 3.10. Downloaded everything again, and now it seems to work fine. I wish I had some insight I could provide, but I am currently no longer experiencing this issue.


g3ar-v commented 4 months ago

I found out it was an open-interpreter issue. The force_task_completion breaker messages are not correctly checked in respond.py: the comparison needs to lowercase both the message content and the breaker phrases before matching. The check should look like this:

and not any(
    task_status.lower()
    in interpreter.messages[-1].get("content", "").lower()
    for task_status in force_task_completion_breakers
)