smol-ai / developer

the first library to let you embed a developer agent in your own app!
https://twitter.com/SmolModels
MIT License

Syntax error with main.py #82

Closed kufton closed 1 year ago

kufton commented 1 year ago

I'm assuming I've done something wrong here, as it was working 'fine' (it seemed to generate half-complete files and reference non-existent libraries, but it was still something to work with). I updated my prompt, cleared the generated directory, hit the go button, and...

╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /Users/kalebufton/Documents/Code Ethical/self-ai/aidev/bin/modal:10 in <module>                  │
│                                                                                                  │
│    9 │   sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])                        │
│ ❱ 10 │   sys.exit(main())                                                                        │
│   11                                                                                             │
│                                                                                                  │
│ /Users/kalebufton/Documents/Code                                                                 │
│ Ethical/self-ai/aidev/lib/python3.9/site-packages/modal/__main__.py:6 in main                    │
│                                                                                                  │
│    5 def main():                                                                                 │
│ ❱  6 │   entrypoint_cli()                                                                        │
│    7                                                                                             │
│                                                                                                  │
│ /Users/kalebufton/Documents/Code                                                                 │
│ Ethical/self-ai/aidev/lib/python3.9/site-packages/click/core.py:1130 in __call__                 │
│                                                                                                  │
│   1129 │   │   """Alias for :meth:`main`."""                                                     │
│ ❱ 1130 │   │   return self.main(*args, **kwargs)                                                 │
│   1131                                                                                           │
│                                                                                                  │
│ /Users/kalebufton/Documents/Code                                                                 │
│ Ethical/self-ai/aidev/lib/python3.9/site-packages/typer/core.py:778 in main                      │
│                                                                                                  │
│   777 │   ) -> Any:                                                                              │
│ ❱ 778 │   │   return _main(                                                                      │
│   779 │   │   │   self,                                                                          │
│                                                                                                  │
│ /Users/kalebufton/Documents/Code                                                                 │
│ Ethical/self-ai/aidev/lib/python3.9/site-packages/typer/core.py:216 in _main                     │
│                                                                                                  │
│   215 │   │   │   with self.make_context(prog_name, args, **extra) as ctx:                       │
│ ❱ 216 │   │   │   │   rv = self.invoke(ctx)                                                      │
│   217 │   │   │   │   if not standalone_mode:                                                    │
│                                                                                                  │
│ /Users/kalebufton/Documents/Code                                                                 │
│ Ethical/self-ai/aidev/lib/python3.9/site-packages/click/core.py:1657 in invoke                   │
│                                                                                                  │
│   1656 │   │   │   │   with sub_ctx:                                                             │
│ ❱ 1657 │   │   │   │   │   return _process_result(sub_ctx.command.invoke(sub_ctx))               │
│   1658                                                                                           │
│                                                                                                  │
│ /Users/kalebufton/Documents/Code                                                                 │
│ Ethical/self-ai/aidev/lib/python3.9/site-packages/click/core.py:1657 in invoke                   │
│                                                                                                  │
│   1656 │   │   │   │   with sub_ctx:                                                             │
│ ❱ 1657 │   │   │   │   │   return _process_result(sub_ctx.command.invoke(sub_ctx))               │
│   1658                                                                                           │
│                                                                                                  │
│ /Users/kalebufton/Documents/Code                                                                 │
│ Ethical/self-ai/aidev/lib/python3.9/site-packages/click/core.py:1404 in invoke                   │
│                                                                                                  │
│   1403 │   │   if self.callback is not None:                                                     │
│ ❱ 1404 │   │   │   return ctx.invoke(self.callback, **ctx.params)                                │
│   1405                                                                                           │
│                                                                                                  │
│ /Users/kalebufton/Documents/Code                                                                 │
│ Ethical/self-ai/aidev/lib/python3.9/site-packages/click/core.py:760 in invoke                    │
│                                                                                                  │
│    759 │   │   │   with ctx:                                                                     │
│ ❱  760 │   │   │   │   return __callback(*args, **kwargs)                                        │
│    761                                                                                           │
│                                                                                                  │
│ /Users/kalebufton/Documents/Code                                                                 │
│ Ethical/self-ai/aidev/lib/python3.9/site-packages/click/decorators.py:26 in new_func             │
│                                                                                                  │
│    25 │   def new_func(*args, **kwargs):  # type: ignore                                         │
│ ❱  26 │   │   return f(get_current_context(), *args, **kwargs)                                   │
│    27                                                                                            │
│                                                                                                  │
│ /Users/kalebufton/Documents/Code                                                                 │
│ Ethical/self-ai/aidev/lib/python3.9/site-packages/modal/cli/run.py:116 in f                      │
│                                                                                                  │
│   115 │   │   │   else:                                                                          │
│ ❱ 116 │   │   │   │   func(*args, **kwargs)                                                      │
│   117 │   │   │   if app.function_invocations == 0:                                              │
│                                                                                                  │
│ /Users/kalebufton/Documents/Code Ethical/self-ai/developer/main.py:129 in main                   │
│                                                                                                  │
│   128 │   try:                                                                                   │
│ ❱ 129 │   │   list_actual = ast.literal_eval(filepaths_string)                                   │
│   130                                                                                            │
│                                                                                                  │
│ /Users/kalebufton/opt/anaconda3/lib/python3.9/ast.py:62 in literal_eval                          │
│                                                                                                  │
│     61 │   if isinstance(node_or_string, str):                                                   │
│ ❱   62 │   │   node_or_string = parse(node_or_string, mode='eval')                               │
│     63 │   if isinstance(node_or_string, Expression):                                            │
│                                                                                                  │
│ /Users/kalebufton/opt/anaconda3/lib/python3.9/ast.py:50 in parse                                 │
│                                                                                                  │
│     49 │   # Else it should be an int giving the minor version for 3.x.                          │
│ ❱   50 │   return compile(source, filename, mode, flags,                                         │
│     51 │   │   │   │      _feature_version=feature_version)                                      │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
╭──────────────────────────────────────────────────────────────────────────────────────────────────╮
│ - /main.py                                                                                       │
│   ▲                                                                                              │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
SyntaxError: invalid syntax
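For context, the crash happens because `main.py:129` passes the model's reply straight into `ast.literal_eval`, which only accepts a valid Python literal. A minimal sketch of the failure mode (the `bad` reply string is a hypothetical example, not actual model output):

```python
import ast

# A well-formed reply parses cleanly into a Python list of file paths.
good = '["main.py", "utils.py"]'
print(ast.literal_eval(good))  # ['main.py', 'utils.py']

# A truncated or conversational reply (hypothetical example) is not a
# valid Python literal, so literal_eval raises the SyntaxError seen above.
bad = "Sure! Here are the files:\n- /main.py"
try:
    ast.literal_eval(bad)
except SyntaxError:
    print("SyntaxError: invalid syntax")
```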
kufton commented 1 year ago

I did some more digging, and it looks like what is actually happening is that I was hitting a token limit of some form. Here's the actually helpful output (for once, Python's debug output is helpful!):

```
Traceback (most recent call last):
  File "/pkg/modal/_container_entrypoint.py", line 330, in handle_input_exception
    yield
  File "/pkg/modal/_container_entrypoint.py", line 403, in call_function_sync
    res = fun(*args, **kwargs)
  File "/root/debugger.py", line 79, in generate_response
    response = openai.ChatCompletion.create(**params)
  File "/usr/local/lib/python3.9/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 230, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 624, in _interpret_response
    self._interpret_response_line(
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 687, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 35146678 tokens. Please reduce the length of the messages.

Traceback (most recent call last):
  File "/pkg/modal/_container_entrypoint.py", line 330, in handle_input_exception
    yield
  File "/pkg/modal/_container_entrypoint.py", line 403, in call_function_sync
    res = fun(*args, **kwargs)
  File "/root/debugger.py", line 79, in generate_response
    response = openai.ChatCompletion.create(**params)
  File "/usr/local/lib/python3.9/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 230, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 624, in _interpret_response
    self._interpret_response_line(
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 687, in _interpret_response_line
    raise self.handle_error_response(
openai.error.APIError: Internal error {
  "error": {
    "message": "Internal error",
    "type": "internal_error",
    "param": null,
    "code": "internal_error"
  }
}
500 {'error': {'message': 'Internal error', 'type': 'internal_error', 'param': None, 'code': 'internal_error'}}
{'Date': 'Wed, 07 Jun 2023 00:27:46 GMT', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '152', 'Connection': 'keep-alive', 'vary': 'Origin', 'x-ratelimit-limit-requests': '3500', 'x-ratelimit-limit-tokens': '90000', 'x-ratelimit-remaining-requests': '3499', 'x-ratelimit-remaining-tokens': '85903', 'x-ratelimit-reset-requests': '17ms', 'x-ratelimit-reset-tokens': '2.73s', 'x-request-id': '29f4e81e2e54151dc0783da1b02df82d', 'strict-transport-security': 'max-age=15724800; includeSubDomains', 'CF-Cache-Status': 'DYNAMIC', 'Server': 'cloudflare', 'CF-RAY': '7d34c4be6bfc3925-IAD', 'alt-svc': 'h3=":443"; ma=86400'}

Traceback (most recent call last):
  File "/pkg/modal/_container_entrypoint.py", line 330, in handle_input_exception
    yield
  File "/pkg/modal/_container_entrypoint.py", line 403, in call_function_sync
    res = fun(*args, **kwargs)
  File "/root/debugger.py", line 79, in generate_response
    response = openai.ChatCompletion.create(**params)
  File "/usr/local/lib/python3.9/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 230, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 624, in _interpret_response
    self._interpret_response_line(
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 687, in _interpret_response_line
    raise self.handle_error_response(
openai.error.APIError: Internal error {
  "error": {
    "message": "Internal error",
    "type": "internal_error",
    "param": null,
    "code": "internal_error"
  }
}
500 {'error': {'message': 'Internal error', 'type': 'internal_error', 'param': None, 'code': 'internal_error'}}
{'Date': 'Wed, 07 Jun 2023 00:28:50 GMT', 'Content-Type': 'application/json; charset=utf-8', 'Content-Length': '152', 'Connection': 'keep-alive', 'vary': 'Origin', 'x-ratelimit-limit-requests': '3500', 'x-ratelimit-limit-tokens': '90000', 'x-ratelimit-remaining-requests': '3499', 'x-ratelimit-remaining-tokens': '85903', 'x-ratelimit-reset-requests': '17ms', 'x-ratelimit-reset-tokens': '2.73s', 'x-request-id': 'eda7165b6dc46f45ad0d94005952a39f', 'strict-transport-security': 'max-age=15724800; includeSubDomains', 'CF-Cache-Status': 'DYNAMIC', 'Server': 'cloudflare', 'CF-RAY': '7d34c59fab8d3925-IAD', 'alt-svc': 'h3=":443"; ma=86400'}
```
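One way to catch this earlier is to estimate the token count of the messages before sending the request, and fail fast locally instead of letting the API reject an oversized request. A minimal sketch (the `check_context` helper and the ~4-characters-per-token heuristic are my own assumptions, not part of smol-developer):

```python
def rough_tokens(text):
    # Crude heuristic: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def check_context(messages, max_context=4097, count_tokens=rough_tokens):
    # Sum estimated tokens over all message contents and refuse to send
    # a request that would exceed the model's context window.
    total = sum(count_tokens(m["content"]) for m in messages)
    if total > max_context:
        raise ValueError(
            f"messages are ~{total} tokens, over the {max_context}-token limit"
        )
    return total
```

With tiktoken installed you could pass an exact counter instead, e.g. `count_tokens=lambda s: len(tiktoken.encoding_for_model("gpt-4").encode(s))`.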

kufton commented 1 year ago

Alrighty, yes, I patched this by using code from one of the demo videos:

```python
import sys
import os
import modal
import ast
import time  # Import time for sleep function

stub = modal.Stub("smol-developer-v1")
generatedDir = "generated"
openai_image = modal.Image.debian_slim().pip_install("openai", "tiktoken")
openai_model = "gpt-4"  # or 'gpt-3.5-turbo'
openai_model_max_tokens = 2000  # i wonder how to tweak this properly


@stub.function(
    image=openai_image,
    secret=modal.Secret.from_dotenv(),
    retries=modal.Retries(
        max_retries=3,
        backoff_coefficient=2.0,
        initial_delay=1.0,
    ),
    concurrency_limit=5,
    # timeout=120,
)
def generate_response(system_prompt, user_prompt, *args):
    import openai
    import tiktoken

    def reportTokens(prompt):
        encoding = tiktoken.encoding_for_model(openai_model)
        print(
            "\033[37m" + str(len(encoding.encode(prompt))) + " tokens\033[0m"
            + " in prompt: " + "\033[92m" + prompt[:50] + "\033[0m"
        )

    openai.api_key = os.environ["OPENAI_API_KEY"]

    messages = []
    messages.append({"role": "system", "content": system_prompt})
    reportTokens(system_prompt)
    messages.append({"role": "user", "content": user_prompt})
    reportTokens(user_prompt)
    role = "assistant"
    for value in args:
        messages.append({"role": role, "content": value})
        reportTokens(value)
        role = "user" if role == "assistant" else "assistant"

    params = {
        "model": openai_model,
        "messages": messages,
        "max_tokens": openai_model_max_tokens,
        "temperature": 0,
    }

    response = openai.ChatCompletion.create(**params)
    time.sleep(1)  # Add a delay of 1 second between API calls
    reply = response.choices[0]["message"]["content"]
    return reply
```
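Outside Modal, the retry policy that the `modal.Retries(...)` settings express declaratively can be sketched as a plain loop with exponential backoff (the `with_backoff` helper is hypothetical, not part of any library):

```python
import time

def with_backoff(call, max_retries=3, initial_delay=1.0, backoff_coefficient=2.0):
    # Retry `call` on any exception, multiplying the delay each attempt,
    # mirroring the max_retries/initial_delay/backoff_coefficient knobs above.
    delay = initial_delay
    for attempt in range(max_retries + 1):
        try:
            return call()
        except Exception:
            if attempt == max_retries:
                raise  # out of retries: surface the last error
            time.sleep(delay)
            delay *= backoff_coefficient
```

In practice you would wrap only the API call, e.g. `with_backoff(lambda: openai.ChatCompletion.create(**params))`, and ideally catch only retryable errors such as `openai.error.APIError` and `openai.error.RateLimitError` rather than bare `Exception`.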

Now I'm running into a different error which I think has already been logged, so I'll close this out. For reference: from what I can tell it was a rate-limiting issue, and the retry/backoff patch above is what fixed it.