GatorEducator / execexam

:rocket: ExecExam runs executable examinations that assess Python programming skills
https://pypi.org/project/execexam/

feat: Add Exception Handling for LiteLLM #22

Open dyga01 opened 4 days ago

dyga01 commented 4 days ago
  1. Pull Request: Add exception handling for API key and API server exceptions from LiteLLM so that ExecExam does not crash and print a long stack trace.

  2. Aidan Dyga (@dyga01) and Hemani Alaparthi (@hemanialaparthi)

  3. 3

  4. bug, enhancement

  5. This pull request improves how ExecExam handles LiteLLM errors, focusing on API key and server exceptions. By implementing the new exceptions.py file and updating advise.py, we prevent the crashes and long stack traces that can confuse users (a sketch of this approach appears after this list). The goal is to improve stability and the user experience by handling errors gracefully and printing user-friendly messages, making ExecExam more robust and easier to debug in real-world scenarios.

  6. Coverage will be maintained for the exceptions.py file, as we have implemented new tests to ensure the new feature is fully tested (a matching test sketch also follows this list).

  7. The tests have been conducted on macOS. It would be great if Linux and Windows users could test it out as well.

  8. Output varies based on the exception that is raised.

i) For example, for a command where the advice_model is incorrect, the output should be the following:

Exception Type: NotFoundError
Explanation: The requested resource was not found. Please check if your model or endpoint is correct.

If your issue persists, ensure the model you entered is listed below:
- anthropic/claude-3-haiku-20240307
- anthropic/claude-3-opus-20240229
- groq/llama3-8b-8192
- openrouter/meta-llama/llama-3.1-8b-instruct:free
- openrouter/google/gemma-2-9b-it:free

ii) If the advice_server is invalid, the output should be the following:

Exception Type: APIConnectionError
Explanation: There was a connection issue to the server.
NOTE: This error can sometimes be caused by an invalid server URL. Please verify the URL you're using.

If your issue persists, ensure the model you entered is listed below:
- anthropic/claude-3-haiku-20240307
- anthropic/claude-3-opus-20240229
- groq/llama3-8b-8192
- openrouter/meta-llama/llama-3.1-8b-instruct:free
- openrouter/google/gemma-2-9b-it:free

iii) NOTE: If LiteLLM changes its exception handling, the program will still not crash: we implemented a default behavior for non-LiteLLM exceptions that produces the error message and a general-purpose output, but no stack trace.
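
To make the approach in item 5 concrete, below is a minimal sketch of what the new handler and its use in advise.py could look like. The helper names explain_exception and request_advice and the overall wiring are illustrative assumptions, not the exact code in this pull request; only the two LiteLLM exception types, the fallback behavior, and the advice_model and advice_server options come from the description above.

```python
# exceptions.py (sketch): map exceptions raised during a LiteLLM call
# to short, user-friendly explanations instead of a stack trace.
import litellm


def explain_exception(exc: Exception) -> str:
    """Return a friendly explanation for an exception from a LiteLLM call."""
    if isinstance(exc, litellm.exceptions.NotFoundError):
        # The requested model or endpoint does not exist.
        return (
            "The requested resource was not found. Please check "
            "if your model or endpoint is correct."
        )
    if isinstance(exc, litellm.exceptions.APIConnectionError):
        # The client could not reach the advice server.
        return (
            "There was a connection issue to the server.\n"
            "NOTE: This error can sometimes be caused by an invalid "
            "server URL. Please verify the URL you're using."
        )
    # Default behavior for non-LiteLLM exceptions (see item iii):
    # report a general-purpose message, never a stack trace.
    return "An unexpected error occurred while requesting advice."


# advise.py (sketch): wrap the completion call so that users see the
# short "Exception Type / Explanation" report shown in the examples.
def request_advice(advice_model: str, advice_server: str, messages: list) -> None:
    try:
        litellm.completion(
            model=advice_model,      # e.g., "anthropic/claude-3-haiku-20240307"
            messages=messages,
            api_base=advice_server,  # the custom server URL, when one is given
        )
    except Exception as exc:
        print(f"Exception Type: {type(exc).__name__}")
        print(f"Explanation: {explain_exception(exc)}")
```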
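
And for item 6, an equally minimal pytest sketch of the fallback test; the import path and test body are hypothetical, since constructing real LiteLLM exceptions requires provider details that the actual suite may supply through mocks:

```python
# test_exceptions.py (sketch): illustrative pytest for the fallback branch;
# the import path is hypothetical, not the layout of this pull request.
from exceptions import explain_exception


def test_non_litellm_exception_gets_default_message():
    # Non-LiteLLM exceptions fall through to the general-purpose message,
    # so the program reports the failure without printing a stack trace.
    message = explain_exception(ValueError("boom"))
    assert "unexpected error" in message.lower()
```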

gkapfham commented 3 days ago

Hello @dyga01 and @hemanialaparthi, there are two PRs connected to exception handling. Do you have a preference on the order in which they are reviewed and ultimately merged?

dyga01 commented 2 days ago

> Hello @dyga01 and @hemanialaparthi, there are two PRs connected to exception handling. Do you have a preference on the order in which they are reviewed and ultimately merged?

We do not have a preference for the order in which the two PRs are reviewed and merged. We would suggest merging them in chronological order, but we do not foresee any issues either way.