stanford-oval / storm

An LLM-powered knowledge curation system that researches a topic and generates a full-length report with citations.
http://storm.genie.stanford.edu
MIT License
10k stars · 937 forks

Openai completions error #77

Closed smehra closed 1 month ago

smehra commented 1 month ago

When I run

    python examples/run_storm_wiki_gpt.py \
        --output-dir $OUTPUT_DIR \
        --retriever you \
        --do-research \
        --do-generate-outline \
        --do-generate-article \
        --do-polish-article

I get the following error:

    TypeError: Completions.create() got an unexpected keyword argument 'api_version'
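For what it's worth, this class of `TypeError` simply means a keyword reached a function whose signature does not declare it. A minimal stdlib-only illustration (the `create` function here is a stand-in, not the actual openai client method):

```python
# A function with explicit keyword-only parameters, standing in for
# Completions.create(). Any keyword it does not declare is rejected.
def create(*, model, temperature=1.0):
    return {"model": model, "temperature": temperature}

try:
    # Forwarding a connection-level setting like 'api_version' fails,
    # just as in the traceback below.
    create(model="gpt-4o", api_version="2024-02-01")
except TypeError as e:
    print(e)  # create() got an unexpected keyword argument 'api_version'
```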

shaoyijia commented 1 month ago

Hi, which version of openai package are you using? You can run the following code to check the version.

import openai
print(openai.__version__)
smehra commented 1 month ago

I created a new virtual env with the provided requirements.txt. Anyway, the version is 1.35.13

360elements360 commented 1 month ago

Same issue here

    Traceback (most recent call last):
      File "C:\Users\Power\Documents\storm\examples\run_storm_wiki_gpt.py", line 125, in <module>
        main(parser.parse_args())
      File "C:\Users\Power\Documents\storm\examples\run_storm_wiki_gpt.py", line 80, in main
        runner.run(
      File "C:\Users\Power\Documents\storm\src\storm_wiki\engine.py", line 287, in run
        information_table = self.run_knowledge_curation_module(ground_truth_url=ground_truth_url,
      File "C:\Users\Power\Documents\storm\src\interface.py", line 376, in wrapper
        result = func(*args, **kwargs)
      File "C:\Users\Power\Documents\storm\src\storm_wiki\engine.py", line 164, in run_knowledge_curation_module
        information_table, conversation_log = self.storm_knowledge_curation_module.research(
      File "C:\Users\Power\Documents\storm\src\storm_wiki\modules\knowledge_curation.py", line 304, in research
        considered_personas = self._get_considered_personas(topic=topic, max_num_persona=max_perspective)
      File "C:\Users\Power\Documents\storm\src\storm_wiki\modules\knowledge_curation.py", line 229, in _get_considered_personas
        return self.persona_generator.generate_persona(topic=topic, max_num_persona=max_num_persona)
      File "C:\Users\Power\Documents\storm\src\storm_wiki\modules\persona_generator.py", line 135, in generate_persona
        personas = self.create_writer_with_persona(topic=topic)
      File "C:\Users\Power\miniconda3\Lib\site-packages\dspy\primitives\program.py", line 26, in __call__
        return self.forward(*args, **kwargs)
      File "C:\Users\Power\Documents\storm\src\storm_wiki\modules\persona_generator.py", line 70, in forward
        related_topics = self.find_related_topic(topic=topic).related_topics
      File "C:\Users\Power\miniconda3\Lib\site-packages\dspy\predict\predict.py", line 61, in __call__
        return self.forward(**kwargs)
      File "C:\Users\Power\miniconda3\Lib\site-packages\dspy\predict\chain_of_thought.py", line 59, in forward
        return super().forward(signature=signature, **kwargs)
      File "C:\Users\Power\miniconda3\Lib\site-packages\dspy\predict\predict.py", line 103, in forward
        x, C = dsp.generate(template, **config)(x, stage=self.stage)
      File "C:\Users\Power\miniconda3\Lib\site-packages\dsp\primitives\predict.py", line 77, in do_generate
        completions: list[dict[str, Any]] = generator(prompt, **kwargs)
      File "C:\Users\Power\Documents\storm\src\lm.py", line 76, in __call__
        response = self.request(prompt, **kwargs)
      File "C:\Users\Power\miniconda3\Lib\site-packages\backoff\_sync.py", line 105, in retry
        ret = target(*args, **kwargs)
      File "C:\Users\Power\miniconda3\Lib\site-packages\dsp\modules\gpt3.py", line 144, in request
        return self.basic_request(prompt, **kwargs)
      File "C:\Users\Power\miniconda3\Lib\site-packages\dsp\modules\gpt3.py", line 117, in basic_request
        response = chat_request(**kwargs)
      File "C:\Users\Power\miniconda3\Lib\site-packages\dsp\modules\gpt3.py", line 263, in chat_request
        return v1_cached_gpt3_turbo_request_v2_wrapped(**kwargs).model_dump()
      File "C:\Users\Power\miniconda3\Lib\site-packages\dsp\modules\cache_utils.py", line 16, in wrapper
        return func(*args, **kwargs)
      File "C:\Users\Power\miniconda3\Lib\site-packages\dsp\modules\gpt3.py", line 256, in v1_cached_gpt3_turbo_request_v2_wrapped
        return v1_cached_gpt3_turbo_request_v2(**kwargs)
      File "C:\Users\Power\miniconda3\Lib\site-packages\joblib\memory.py", line 655, in __call__
        return self._cached_call(args, kwargs)[0]
      File "C:\Users\Power\miniconda3\Lib\site-packages\joblib\memory.py", line 598, in _cached_call
        out, metadata = self.call(*args, **kwargs)
      File "C:\Users\Power\miniconda3\Lib\site-packages\joblib\memory.py", line 856, in call
        output = self.func(*args, **kwargs)
      File "C:\Users\Power\miniconda3\Lib\site-packages\dsp\modules\gpt3.py", line 250, in v1_cached_gpt3_turbo_request_v2
        return openai.chat.completions.create(**kwargs)
      File "C:\Users\Power\miniconda3\Lib\site-packages\openai\_utils\_utils.py", line 277, in wrapper
        return func(*args, **kwargs)
    TypeError: Completions.create() got an unexpected keyword argument 'api_version'

openai package = 1.35.13


Edit: I changed these lines and it worked:

  1. OpenAI kwargs. Old:

    openai_kwargs = {
       'api_key': os.getenv("OPENAI_API_KEY"),
       'api_provider': os.getenv('OPENAI_API_TYPE'),
       'temperature': 1.0,
       'top_p': 0.9,
       'api_base': os.getenv('AZURE_API_BASE'),
       'api_version': os.getenv('AZURE_API_VERSION'),
    }

    New:

    openai_kwargs = {
       'api_key': os.getenv("OPENAI_API_KEY"),
       'api_provider': 'openai',
       'temperature': 1.0,
       'top_p': 0.9,
    }
  2. Model names. Old:

    outline_gen_lm = OpenAIModel(model='gpt-4-0125-preview', max_tokens=400, **openai_kwargs)
    article_gen_lm = OpenAIModel(model='gpt-4-0125-preview', max_tokens=700, **openai_kwargs)
    article_polish_lm = OpenAIModel(model='gpt-4-0125-preview', max_tokens=4000, **openai_kwargs)

    New:

    
    outline_gen_lm = OpenAIModel(model='gpt-4o', max_tokens=400, **openai_kwargs)
    article_gen_lm = OpenAIModel(model='gpt-4o', max_tokens=700, **openai_kwargs)
    article_polish_lm = OpenAIModel(model='gpt-4o', max_tokens=4000, **openai_kwargs)
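The edit above works because, in openai 1.x, connection settings such as `api_key`, `api_base`, and `api_version` belong on the client object, while `chat.completions.create()` accepts only request parameters such as `temperature` and `top_p`; forwarding a connection key to `create()` raises exactly the `TypeError` in the traceback. A stdlib-only sketch of that split (`split_kwargs` and `CLIENT_KEYS` are illustrative names, not part of STORM or openai):

```python
# Connection-level keys accepted by the openai 1.x client constructors
# (plus STORM's own 'api_provider'), as opposed to per-request parameters
# accepted by chat.completions.create().
CLIENT_KEYS = {"api_key", "api_base", "api_version", "api_provider"}

def split_kwargs(mixed):
    """Separate client-constructor settings from per-request settings."""
    client = {k: v for k, v in mixed.items() if k in CLIENT_KEYS}
    request = {k: v for k, v in mixed.items() if k not in CLIENT_KEYS}
    return client, request

client_cfg, request_cfg = split_kwargs({
    "api_key": "sk-...",
    "api_version": "2024-02-01",
    "temperature": 1.0,
    "top_p": 0.9,
})
print(client_cfg)   # {'api_key': 'sk-...', 'api_version': '2024-02-01'}
print(request_cfg)  # {'temperature': 1.0, 'top_p': 0.9}
```

Dropping `api_base`/`api_version` from `openai_kwargs`, as in the edit, is the same idea: nothing Azure-specific ever reaches `create()`.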
shaoyijia commented 1 month ago

> I created a new virtual env with the provided requirements.txt. Anyway, the version is 1.35.13

@smehra, if you still hit this problem, you could downgrade `openai` to 0.28.1 as a quick fix. `api_base` and `api_version` are used for calling GPT models through the Azure service. `openai` 1.x defines `OpenAI` and `AzureOpenAI` as two separate classes, so I suspect something goes wrong because of this.

I see that running `pip install -r requirements.txt` installs `openai` 1.35.13. We will look into this very soon, as the team is working on wrapping `src` as a Python package to make the installation process smoother (#66).
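The downgrade suggested above works because pre-1.0 and 1.x releases of the `openai` package expose incompatible APIs: 0.28.x uses the module-level `openai.ChatCompletion` entry point, while 1.x routes calls through an `OpenAI`/`AzureOpenAI` client instance. Code written against one major version fails on the other, so a simple major-version gate is one way to tell them apart (a sketch; not something `dsp` currently does):

```python
# Distinguish legacy (pre-1.0) openai releases, which use
# openai.ChatCompletion, from 1.x releases, which use client objects.
def is_legacy_openai(version_string):
    """True for pre-1.0 releases such as the suggested 0.28.1."""
    major = int(version_string.split(".")[0])
    return major < 1

print(is_legacy_openai("0.28.1"))   # True  -> old module-level API
print(is_legacy_openai("1.35.13"))  # False -> new client-based API
```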