What happened?

When I use the proxy with debugging enabled, the requests are nicely logged, like:
POST Request Sent from LiteLLM:
curl -X POST \
https://api-inference.huggingface.co/models/codellama/CodeLlama-34b-Instruct-hf \
-H 'content-type: application/json' -H 'Authorization: Bearer hf_MqMbDnyPAhQLkz********************' \
-d '{'inputs': '[INST] <<SYS>>\n\n You are a helpful assistent, that only communicates using JSON files.\n The expected output from you has to be: \n { \n "function_call": {\n "name": {function_name},\n "args": [],\n "ai_notes": {explanation}\n }\n }\nProduce JSON OUTPUT ONLY! The following functions are available to you:\n{\'name\': \'get_current_weather\', \'parameters\': {\'type\': \'object\', \'properties\': {\'location\': {\'type\': \'string\', \'description\': \'The city and state, e.g. San Francisco, CA\'}, \'unit\': {\'type\': \'string\', \'enum\': [\'celsius\', \'fahrenheit\']}}, \'required\': [\'location\']}, \'description\': \'Get the current weather in a given location\'}\n\n<</SYS>>\n [/INST]\n\nWhat\'s the weather like in San Francisco, Tokyo, and Paris?\n', 'parameters': {'stream': False, 'stop': ['<</SYS>>'], 'max_new_tokens': 200, 'format': 'json', 'details': True, 'return_full_text': False}, 'stream': False}'
But it looks like the logged fragment is not correct: the body is printed as a Python dict repr rather than JSON (single quotes, False instead of false), and those quotes are not escaped, so the single-quoted -d argument breaks (see the sketch below).
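To illustrate what I mean, here is a minimal Python sketch (the payload is hypothetical and this is not LiteLLM's actual logging code) showing how the body could be serialized so the command survives a copy-paste:

import json
import shlex

# Hypothetical payload, standing in for the request body LiteLLM logs.
payload = {
    "inputs": "[INST] What's the weather like in San Francisco? [/INST]",
    "parameters": {"stream": False, "max_new_tokens": 200},
}

# str(payload) gives a Python repr: single quotes and False instead of false.
# Embedding that inside -d '...' ends the shell string at the first quote.
broken = "curl -X POST <url> -d '" + str(payload) + "'"

# json.dumps produces valid JSON, and shlex.quote escapes it for the shell,
# so the logged command could be pasted and executed as-is.
body = json.dumps(payload)
fixed = "curl -X POST <url> -d " + shlex.quote(body)

print(broken)
print(fixed)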
How to test:
1. Start the proxy with the debug option.
2. Execute some logic that sends a request through the proxy.
3. Copy and paste the logged curl command into a shell.
4. The curl command is not executed because of incorrect syntax (see the example after these steps).
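For example, I start the proxy like this (the model name is taken from the log above; the exact flags may vary by version):

litellm --model huggingface/codellama/CodeLlama-34b-Instruct-hf --debug

When the logged curl command is then pasted, the shell terminates the single-quoted -d argument at the quote right before inputs, mangling the body, and the command fails with a syntax error instead of executing.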
Relevant log output
No response
Twitter / LinkedIn details
No response