qiuxia-alone opened this issue 2 years ago
If you add a comma at the end of the "model_type": "openai-gpt" line, it'll fix the issue you've identified (see the corrected fragment below). However, you'll then run into another one: AttributeError: 'OpenAIGPTConfig' object has no attribute 'n_inner'. This is because we don't support OpenAIGPTConfig today.
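For concreteness, the corrected region of config.json would look something like this (the n_embd and n_head lines are placeholders for whatever keys actually follow in your file; the fix is just the trailing comma after the "model_type" entry, since every entry except the last one in a JSON object needs one):

"model_type": "openai-gpt",
"n_embd": 768,
"n_head": 12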
cc @tianleiwu @wangyems
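If you want to experiment before support lands, one unofficial workaround is to attach the missing attribute to the config yourself before export. This is a sketch only: it assumes the conventional GPT feed-forward width of 4 * n_embd, and you'd still have to thread the patched config through convert_to_onnx's model-loading path.

# Sketch of an unofficial workaround: OpenAIGPTConfig lacks n_inner,
# so set it manually. 4 * n_embd is the conventional GPT FFN width
# (an assumption here, not verified against this checkpoint).
from transformers import OpenAIGPTConfig

config = OpenAIGPTConfig.from_pretrained("CDial-GPT_LCCC-large/")
config.n_inner = 4 * config.n_embd  # attribute added at runtime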
@qiuxia-alone, the convert_to_onnx script only supports the gpt2 model (example configuration: https://huggingface.co/gpt2/blob/main/config.json). I think you can modify it a little to export other models in the GPT family (like openai-gpt, megatron-gpt, or gpt-neo); a rough sketch of a direct-export alternative follows.
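For example, a minimal sketch that bypasses convert_to_onnx entirely and exports with torch.onnx.export (my suggestion, not an onnxruntime-supported path, and untested against this particular checkpoint):

import torch
from transformers import OpenAIGPTLMHeadModel

# Load the local checkpoint (path taken from the report below).
model = OpenAIGPTLMHeadModel.from_pretrained("CDial-GPT_LCCC-large/")
model.eval()

# Dummy batch of token ids; actual shapes are made dynamic below.
dummy_input = torch.randint(0, model.config.vocab_size, (1, 8), dtype=torch.long)

torch.onnx.export(
    model,
    (dummy_input,),
    "cdialgpt.onnx",
    input_names=["input_ids"],
    output_names=["logits"],
    dynamic_axes={
        "input_ids": {0: "batch", 1: "sequence"},
        "logits": {0: "batch", 1: "sequence"},
    },
    opset_version=11,
)

Note this plain export skips the past-state inputs and graph optimizations that convert_to_onnx adds for gpt2, so it trades speed for simplicity.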
Describe the bug
I downloaded the whole model from CDial-GPT_LCCC-large, then added
"model_type": "openai-gpt"
to config.json, and hit this error during conversion:
OSError: Couldn't reach server at 'CDial-GPT_LCCC-large/config.json' to download configuration file or configuration file is not a valid JSON file. Please check network or file content here: CDial-GPT_LCCC-large/config.json.
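Note that the OSError message is misleading here: the path is local, and the real problem is that config.json stopped being valid JSON after the manual edit (the missing comma discussed above). A quick offline check, independent of transformers:

import json

# Raises json.JSONDecodeError with the offending line and column
# if the hand-edited file is malformed.
with open("CDial-GPT_LCCC-large/config.json") as f:
    json.load(f)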
Urgency
none
System information
To Reproduce
code:
python -m onnxruntime.transformers.convert_to_onnx -m CDial-GPT_LCCC-large/ --output cdialgpt.onnx -p fp32 --optimize_onnx
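If the conversion succeeds, a quick sanity check is to open the exported model and list the inputs it expects; the gpt2-style export usually takes past-state tensors in addition to input_ids, so inspect before running inference (a small sketch, assuming the onnxruntime Python package is installed):

import onnxruntime

# Print each graph input's name, shape, and element type so you know
# exactly what the exported model expects to be fed.
sess = onnxruntime.InferenceSession("cdialgpt.onnx", providers=["CPUExecutionProvider"])
for inp in sess.get_inputs():
    print(inp.name, inp.shape, inp.type)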