Description: When using the Coding Mentor, the program crashes if it is not given an expected model name or URL.
Expected Behavior: The code should not crash. Instead, it should notify the user of the unexpected input and offer an alternative.
Actual Behavior: The code crashes and prints a long traceback. The tail of the output is:
BadRequestError: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=https://docs.litellm.ai/docs/providers
Pass model as E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/starcoder',..)` Learn more: https://docs.litellm.ai/docs/providers
Proposed Solution: Implement exception handling in the Coding Mentor that validates the model input and provides more useful feedback instead of crashing.
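A minimal sketch of the proposed fix. The helper name, the provider list, and the exact checks are assumptions for illustration; the real fix would wrap wherever the Coding Mentor builds its litellm call. The idea is to catch the bad input (a URL or a model string with no provider prefix) before litellm raises BadRequestError, and return a friendly message instead:

```python
# Hypothetical pre-validation helper; KNOWN_PROVIDERS is an assumed
# subset of the providers listed at https://docs.litellm.ai/docs/providers
KNOWN_PROVIDERS = {"openai", "anthropic", "huggingface", "ollama"}

def validate_model_name(model: str) -> str:
    """Return an error message for invalid model input, or "" if it looks OK."""
    if not model:
        return "No model was provided. Pass a model like 'huggingface/starcoder'."
    if model.startswith(("http://", "https://")):
        # This is the exact failure mode in the report: a URL was passed
        # where litellm expects a '<provider>/<model>' string.
        return ("It looks like you passed a URL instead of a model name. "
                "Pass a model like 'huggingface/starcoder' instead.")
    provider = model.split("/", 1)[0] if "/" in model else ""
    if provider not in KNOWN_PROVIDERS:
        return ("Missing or unrecognized provider prefix. "
                "Use '<provider>/<model>', e.g. 'huggingface/starcoder'.")
    return ""
```

With a check like this in place, the Coding Mentor can print the returned message and prompt the user again rather than letting the call crash with a raw litellm traceback.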