Open tomwarias opened 5 months ago
I tried on both my desktop and my laptop and got the same result
Hi @tomwarias
It looks like you've exceeded your OpenAI quota
raise RateLimitError(
litellm.exceptions.RateLimitError: OpenAIException - Error code: 429 - {'error': {'message': 'You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.', 'type': 'insufficient_quota', 'param': None, 'code': 'insufficient_quota'}}
Are you able to try with a local model to see if that allows you to use OI?
@MikeBirdTech I am facing the same quota error. However, I don't understand why I'm receiving it, since my usage and billing pages show that I haven't used any of my quota.
@abhishekvijayakumar Are you able to make curl requests using the same API key and OpenAI model that you're using with OI?
https://platform.openai.com/docs/api-reference/making-requests
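Before reaching for curl, a quick local sanity check can catch obvious key problems (the 401 later in this thread shows a key of literally `x`, i.e. a placeholder). The prefix check below is a heuristic based on the usual `sk-` key format, not an authoritative validation; only a real request (like the curl example in the docs linked above) can confirm the key is active and has quota:

```python
import os

def check_openai_key(key):
    """Heuristic sanity checks on an OpenAI API key string.

    These are assumptions about the usual key shape; they cannot tell you
    whether the key is active or has remaining quota.
    """
    if not key:
        return "missing"
    if key != key.strip():
        return "surrounding whitespace"  # common copy/paste mistake
    if not key.startswith("sk-"):
        return "unexpected format"  # e.g. a placeholder like 'x'
    return "looks plausible"

print(check_openai_key(os.environ.get("OPENAI_API_KEY", "")))
```

If this prints anything other than "looks plausible", fix the environment variable before debugging OI itself.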
@MikeBirdTech It doesn't work with a local model either; I should have mentioned that.
I have reinstalled Python and Open Interpreter. If I try to use OpenAI it gives me the same error; if I use a local model, it downloads the Phi model and then opens a local web server.
Local mode fix is coming in the next day or two!
> @abhishekvijayakumar Are you able to make curl requests using the same API key and OpenAI model that you're using with OI?
> https://platform.openai.com/docs/api-reference/making-requests
@MikeBirdTech yes, please see screenshot of response results:
Having the same issue, but without the OpenAI quota errors. Just performing any basic tasks leads to the '[IPKernelApp] WARNING | Parent appears to have exited, shutting down.' error.
@filip-van-hoeckel The error `[IPKernelApp] WARNING | Parent appears to have exited, shutting down` typically occurs in the context of Jupyter notebooks or similar interactive Python environments. It indicates that the "parent" process, which is usually the Jupyter server that manages your notebook or interactive Python session, has unexpectedly stopped or is no longer communicating with the "child" process, which is the kernel running your Python code.
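As a small illustration of that parent/child relationship (a generic sketch, not Jupyter's or OI's actual shutdown logic), a child process can identify its parent with `os.getppid()`; watchdogs like the kernel's poll exactly this relationship and shut down when the parent disappears:

```python
import os
import subprocess
import sys

# A child process spawned here sees *this* process as its parent.
# When the parent goes away, a kernel-style watchdog notices and exits,
# which is what the "Parent appears to have exited" warning reports.
child_code = "import os; print(os.getppid())"
result = subprocess.run(
    [sys.executable, "-c", child_code],
    capture_output=True,
    text=True,
    check=True,
)
parent_seen_by_child = int(result.stdout.strip())
print(parent_seen_by_child == os.getpid())  # the child's parent is this process
```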
What is the error you're seeing prior to that?
@MikeBirdTech see https://github.com/OpenInterpreter/open-interpreter/issues/1125#issuecomment-2016929036 for an example.
Describe the bug
C:\Users\NZXT>interpreter
▌ Model set to gpt-4
Open Interpreter will require approval before running code.
Use interpreter -y to bypass this.
Press CTRL-C to exit.
pkg: 0.2.2
OS Version and Architecture: Windows-10-10.0.22000-SP0
CPU Info: Intel64 Family 6 Model 165 Stepping 5, GenuineIntel
RAM Info: 15.92 GB, used: 9.32, free: 6.61
When you execute code, it will be executed on the user's machine. The user has given you full and complete permission to execute any code necessary to complete the task. Execute the code. If you want to send data between programming languages, save the data to a txt or json. You can access the internet. Run any code to achieve the goal, and if at first you don't succeed, try again and again. You can install new packages. When a user refers to a filename, they're likely referring to an existing file in the directory you're currently executing code in. Write messages to the user in Markdown. In general, try to make plans with as few steps as possible. As for actually executing code to carry out that plan, for stateful languages (like python, javascript, shell, but NOT for html which starts from 0 every time) it's critical not to try to do everything in one code block. You should try something, print information about it, then continue from there in tiny, informed steps. You will never get it on the first try, and attempting it in one go will often lead to errors you cant see. You are capable of any task.
THE COMPUTER API
A python computer module is ALREADY IMPORTED, and can be used for many tasks: Do not import the computer module, or any of its sub-modules. They are already imported.
User Info: {{import getpass import os import platform}} Name: {{getpass.getuser()}} CWD: {{os.getcwd()}} SHELL: {{os.environ.get('SHELL')}} OS: {{platform.system()}}
Traceback (most recent call last):
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\llms\openai.py", line 376, in completion
    raise e
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\llms\openai.py", line 294, in completion
    return self.streaming(
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\llms\openai.py", line 476, in streaming
    response = openai_client.chat.completions.create(**data, timeout=timeout)
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\openai\_utils\_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\openai\resources\chat\completions.py", line 663, in create
    return self._post(
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\openai\_base_client.py", line 1200, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\openai\_base_client.py", line 889, in request
    return self._request(
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\openai\_base_client.py", line 980, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\main.py", line 958, in completion
    raise e
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\main.py", line 931, in completion
    response = openai_chat_completions.completion(
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\llms\openai.py", line 382, in completion
    raise OpenAIError(status_code=e.status_code, message=str(e))
litellm.llms.openai.OpenAIError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\core\llm\llm.py", line 235, in fixed_litellm_completions
    yield from litellm.completion(**params)
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 2727, in wrapper
    raise e
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 2628, in wrapper
    result = original_function(*args, **kwargs)
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\main.py", line 2056, in completion
    raise exception_type(
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 8146, in exception_type
    raise e
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 6964, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\Scripts\interpreter.exe\__main__.py", line 7, in <module>
sys.exit(main())
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\terminal_interface\start_terminal_interface.py", line 415, in main
start_terminal_interface(interpreter)
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\terminal_interface\start_terminal_interface.py", line 393, in start_terminal_interface
interpreter.chat()
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\core\core.py", line 154, in chat
for _ in self._streaming_chat(message=message, display=display):
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\core\core.py", line 183, in _streaming_chat
yield from terminal_interface(self, message)
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\terminal_interface\terminal_interface.py", line 136, in terminal_interface
for chunk in interpreter.chat(message, display=False, stream=True):
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\core\core.py", line 222, in _streaming_chat
yield from self._respond_and_store()
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\core\core.py", line 268, in _respond_and_store
for chunk in respond(self):
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\core\respond.py", line 68, in respond
for chunk in interpreter.llm.run(messages_for_llm):
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\core\llm\llm.py", line 205, in run
yield from run_function_calling_llm(self, params)
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\core\llm\run_function_calling_llm.py", line 44, in run_function_calling_llm
for chunk in llm.completions(request_params):
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\core\llm\llm.py", line 238, in fixed_litellm_completions
raise first_error
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\interpreter\core\llm\llm.py", line 219, in fixed_litellm_completions
yield from litellm.completion(**params)
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 2727, in wrapper raise e
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 2628, in wrapper result = original_function(*args, **kwargs)
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\main.py", line 2056, in completion
raise exception_type(
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 8146, in exception_type
raise e
File "C:\Users\NZXT\AppData\Local\Programs\Python\Python310\lib\site-packages\litellm\utils.py", line 6995, in exception_type
raise RateLimitError(
litellm.exceptions.RateLimitError: OpenAIException - Error code: 429 - {'error': {'message': 'You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.', 'type': 'insufficient_quota', 'param': None, 'code': 'insufficient_quota'}}
[IPKernelApp] WARNING | Parent appears to have exited, shutting down.
Reproduce
1. Set the OpenAI API key
2. Start Interpreter
3. Enter a prompt

Expected behavior

I'd expect it to work as it did before. Sorry for not being more helpful.
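The "set the OpenAI API key" step can be done like this (a Unix shell sketch; the key value is a placeholder, not a real key):

```shell
# Export the key for the current session so child processes (such as the
# `interpreter` CLI) inherit it.
# On Windows cmd use:   set OPENAI_API_KEY=...
# In PowerShell use:    $env:OPENAI_API_KEY = "..."
export OPENAI_API_KEY="sk-your-real-key"   # placeholder, replace with your key

# Confirm the variable is visible to child processes:
printenv OPENAI_API_KEY
```

After this, starting `interpreter` in the same shell should pick up the key.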
Expected behavior
It worked fine until I updated to the new version; when I install an older version, it doesn't work even with a local model.
Screenshots
No response
Open Interpreter version
0.2.2
Python version
3.10.4
Operating System name and version
Windows 11
Additional context
No response