julius-heitkoetter / deception


OpenAI redundancy #23

Open julius-heitkoetter opened 9 months ago

julius-heitkoetter commented 9 months ago

OpenAI drops a “large” fraction of responses (on the order of a tenth of a percent), so we need to catch the OpenAI API error and resubmit the request in those cases. Otherwise, a single failed request crashes the entire pipeline.
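A minimal sketch of the kind of retry wrapper this would need, assuming the pre-1.0 openai client shown in the traceback below; the function name, constant, and backoff schedule are illustrative, not part of the repo:

import time

import openai

MAX_RETRIES = 5  # illustrative limit, not from the repo

def call_with_retries(model_name, messages, **kwargs):
    """Resubmit the request on transient OpenAI API errors instead of crashing."""
    for attempt in range(MAX_RETRIES):
        try:
            return openai.ChatCompletion.create(
                model=model_name, messages=messages, **kwargs
            )
        except (openai.error.APIError,
                openai.error.Timeout,
                openai.error.APIConnectionError,
                openai.error.ServiceUnavailableError,
                openai.error.RateLimitError) as e:
            if attempt == MAX_RETRIES - 1:
                raise  # give up after the final attempt
            wait = 2 ** attempt  # exponential backoff: 1, 2, 4, 8, ... seconds
            print(f"OpenAI API error ({e}); retrying in {wait}s")
            time.sleep(wait)

Wrapping the openai.ChatCompletion.create call in lib/models.py this way would let the rest of the pipeline proceed through occasional 502s like the one reported below.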

migerovitch commented 9 months ago

Traceback (most recent call last):
  File "/data/misha_gerovitch/correlated_llm_errors/bin/dataset_pipeline.py", line 120, in <module>
    run_pipeline_on_dataset(args.dataset_name, args.category, args.save_location, args.deceiver_model_name, args.deceiver_config_name, args.supervisor_model_name, args.supervisor_config_name, args.num_samples)
  File "/data/misha_gerovitch/correlated_llm_errors/bin/dataset_pipeline.py", line 82, in run_pipeline_on_dataset
    qae_incorrect_dataset_path = deceiver.run_on_dataset_name(qa_incorrect_dataset_path, save_locally=save_locally, save_on_hf=save_on_hf)
  File "/data/misha_gerovitch/correlated_llm_errors/lib/chain.py", line 90, in run_on_dataset_name
    dataset = self.run_on_dataset(dataset)
  File "/data/misha_gerovitch/correlated_llm_errors/lib/chain.py", line 55, in run_on_dataset
    updated_data.append(self(data[i]))
  File "/data/misha_gerovitch/correlated_llm_errors/lib/chain.py", line 130, in __call__
    qa["explanation"] = self.llm(prompt=explanation_prompt)
  File "/data/misha_gerovitch/correlated_llm_errors/lib/models.py", line 382, in __call__
    result = openai.ChatCompletion.create(model=self.model_name,
  File "/data/misha_gerovitch/miniconda3/envs/correlated_errors/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/data/misha_gerovitch/miniconda3/envs/correlated_errors/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/data/misha_gerovitch/miniconda3/envs/correlated_errors/lib/python3.10/site-packages/openai/api_requestor.py", line 298, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/data/misha_gerovitch/miniconda3/envs/correlated_errors/lib/python3.10/site-packages/openai/api_requestor.py", line 700, in _interpret_response
    self._interpret_response_line(
  File "/data/misha_gerovitch/miniconda3/envs/correlated_errors/lib/python3.10/site-packages/openai/api_requestor.py", line 765, in _interpret_response_line
    raise self.handle_error_response(
openai.error.APIError: Bad gateway. {"error":{"code":502,"message":"Bad gateway.","param":null,"type":"cf_bad_gateway"}} 502
{'error': {'code': 502, 'message': 'Bad gateway.', 'param': None, 'type': 'cf_bad_gateway'}}
{'Date': 'Thu, 30 Nov 2023 16:30:34 GMT', 'Content-Type': 'application/json', 'Content-Length': '84', 'Connection': 'keep-alive', 'X-Frame-Options': 'SAMEORIGIN', 'Referrer-Policy': 'same-origin', 'Cache-Control': 'private, max-age=0, no-store, no-cache, must-revalidate, post-check=0, pre-check=0', 'Expires': 'Thu, 01 Jan 1970 00:00:01 GMT', 'Server': 'cloudflare', 'CF-RAY': '82e46b259ffe15fb-SJC', 'alt-svc': 'h3=":443"; ma=86400'}