lamini-ai / lamini

Apache License 2.0

llama.error.error.ModelNameError: Not Found #28

Closed dl942702882 closed 7 months ago

dl942702882 commented 11 months ago

When I run this code:

```python
non_finetuned = BasicModelRunner(
    model_name="meta-llama/Llama-2-7b-hf",
    config={
        "production": {
            "key": "xxxxxxxxxxx",
        }
    },
)
non_finetuned_output = non_finetuned("Tell me how to train my dog to sit")
print(non_finetuned_output)
```

it raises this exception:

```
Traceback (most recent call last):
  File "/Users/dingli/PycharmProjects/my-llama-index/venv/lib/python3.10/site-packages/llama/program/util/run_ai.py", line 134, in powerml_send_query_to_url
    response.raise_for_status()
  File "/Users/dingli/PycharmProjects/my-llama-index/venv/lib/python3.10/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://api.powerml.co/v1/llama/run_program

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/dingli/PycharmProjects/my-llama-index/lamini_fine_tuning.py", line 11, in <module>
    non_finetuned_output = non_finetuned("Tell me how to train my dog to sit")
  File "/Users/dingli/PycharmProjects/my-llama-index/venv/lib/python3.10/site-packages/llama/runners/basic_model_runner.py", line 39, in __call__
    output_objects = self.llm(
  File "/Users/dingli/PycharmProjects/my-llama-index/venv/lib/python3.10/site-packages/llama/program/builder.py", line 77, in __call__
    result = gen_value(value)
  File "/Users/dingli/PycharmProjects/my-llama-index/venv/lib/python3.10/site-packages/llama/program/util/api_actions.py", line 178, in gen_value
    value._compute_value()
  File "/Users/dingli/PycharmProjects/my-llama-index/venv/lib/python3.10/site-packages/llama/program/value.py", line 65, in _compute_value
    response = query_run_program(params)
  File "/Users/dingli/PycharmProjects/my-llama-index/venv/lib/python3.10/site-packages/llama/program/util/run_ai.py", line 11, in query_run_program
    resp = powerml_send_query_to_url(params, "/v1/llama/run_program")
  File "/Users/dingli/PycharmProjects/my-llama-index/venv/lib/python3.10/site-packages/llama/program/util/run_ai.py", line 139, in powerml_send_query_to_url
    raise llama.error.ModelNameError(
llama.error.error.ModelNameError: Not Found
```
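The two stacked tracebacks are Python's implicit exception chaining: the client catches the raw `HTTPError` from `raise_for_status()` and raises `ModelNameError` inside the handler, so the 404 survives as the new exception's `__context__`. A minimal stdlib sketch of that pattern (the exception classes here are stand-ins, not the actual library code):

```python
class HTTPError(Exception):
    """Stand-in for requests.exceptions.HTTPError."""

class ModelNameError(Exception):
    """Stand-in for llama.error.ModelNameError."""

def send_query():
    try:
        # The transport layer rejects the request, e.g. a 404 from the API.
        raise HTTPError("404 Client Error: Not Found")
    except HTTPError:
        # Raising inside the except block chains the HTTPError implicitly,
        # which is what produces "During handling of the above exception,
        # another exception occurred:" in the traceback.
        raise ModelNameError("Not Found")

try:
    send_query()
except ModelNameError as exc:
    print(type(exc.__context__).__name__)  # HTTPError
```

So the `ModelNameError: Not Found` at the bottom is the library's translation of the 404, and the `HTTPError` above it is the underlying cause, not a second independent failure.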

zzzengzhe commented 11 months ago

When I run the code:

```python
from llama import BasicModelRunner

non_finetuned = BasicModelRunner("meta-llama/Llama-2-7b-hf")
question1 = "Tell me how to train my dog to sit"
non_finetuned(question1)
```

I got this error (status code 401):

```
HTTPError                                 Traceback (most recent call last)
File D:\Anaconda3\envs\study\lib\site-packages\llama\engine\lamini.py:257, in Lamini.make_web_request(self, url, http_method, json)
    256 try:
--> 257     resp.raise_for_status()
    258 except requests.exceptions.HTTPError as e:

File ~\AppData\Roaming\Python\Python39\site-packages\requests\models.py:943, in Response.raise_for_status(self)
    942 if http_error_msg:
--> 943     raise HTTPError(http_error_msg, response=self)

HTTPError: 401 Client Error: Unauthorized for url: https://api.powerml.co/v2/lamini/completions

During handling of the above exception, another exception occurred:

AuthenticationError                       Traceback (most recent call last)
Cell In[8], line 13
      6 question3 = "taylor swift's best friend"
      7 question4 = """Agent: I'm here to help you with your Amazon deliver order.
      8 Customer: I didn't get my item
      9 Agent: I'm sorry to hear that. Which item was it?
     10 Customer: the blanket
     11 Agent:"""
---> 13 non_finetuned(question1)

File D:\Anaconda3\envs\study\lib\site-packages\llama\runners\basic_model_runner.py:52, in BasicModelRunner.__call__(self, inputs)
     49 else:
     50     # Singleton
     51     input_objects = Input(input=inputs)
---> 52 output_objects = self.llm(
     53     input=input_objects,
     54     output_type=Output,
     55     model_name=self.model_name,
     56     enable_peft=self.enable_peft,
     57 )
     58 if isinstance(output_objects, list):
     59     outputs = [o.output for o in output_objects]

File D:\Anaconda3\envs\study\lib\site-packages\llama\engine\typed_lamini.py:13, in TypedLamini.__call__(self, *args, **kwargs)
     12 def __call__(self, *args, **kwargs):
---> 13     result = super().__call__(*args, **kwargs)
     14     if isinstance(result, list):
     15         if "output_type" in kwargs:

File D:\Anaconda3\envs\study\lib\site-packages\llama\engine\lamini.py:83, in Lamini.__call__(self, input, output_type, stop_tokens, model_name, enable_peft, random, max_tokens)
     71 req_data = self.make_llm_req_map(
     72     self.id,
     73     model_name or self.model_name,
   (...)
     80     max_tokens,
     81 )
     82 url = self.api_prefix + "completions"
---> 83 return self.make_web_request(url, "post", req_data)

File D:\Anaconda3\envs\study\lib\site-packages\llama\engine\lamini.py:277, in Lamini.make_web_request(self, url, http_method, json)
    275 except Exception:
    276     json_response = {}
--> 277 raise AuthenticationError(
    278     json_response.get("detail", "AuthenticationError")
    279 )
    280 if resp.status_code == 400:
    281     try:

AuthenticationError: Invalid token
```
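The bottom frame shows how the client handles the 401: the server's JSON `detail` field becomes the `AuthenticationError` message (here `Invalid token`), which points to a missing or wrong API key rather than to the model name. A stdlib sketch of that translation, mirroring the `make_web_request` frame in the traceback (the names are stand-ins, not the actual library code):

```python
class HTTPError(Exception):
    """Stand-in for requests.exceptions.HTTPError."""

class AuthenticationError(Exception):
    """Stand-in for the client's AuthenticationError."""

def make_web_request(status_code, json_response):
    # Mirrors the traceback: raise_for_status() fails on a 4xx response,
    # and a 401 is re-raised as AuthenticationError carrying the server's
    # "detail" field (falling back to a generic message if it is absent).
    try:
        if status_code >= 400:
            raise HTTPError(f"{status_code} Client Error")
    except HTTPError:
        if status_code == 401:
            raise AuthenticationError(
                json_response.get("detail", "AuthenticationError")
            )
        raise

try:
    make_web_request(401, {"detail": "Invalid token"})
except AuthenticationError as exc:
    print(exc)  # Invalid token
```

In other words, `AuthenticationError: Invalid token` is the server's own rejection of the credential, so the fix is to supply a valid key rather than to change the model name.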

zzzengzhe commented 11 months ago


OK, it seems it was a network problem on my side; it works now.

edamamez commented 7 months ago

We'd like to offer some Lamini credits as thanks for reaching out! Please email us at info@lamini.ai and we will send them over 🦙

jigarcpatel commented 3 months ago

The same error occurred for me. Could you help if this has been solved?