openvinotoolkit / openvino.genai

Run Generative AI models using native OpenVINO C++ API
Apache License 2.0

chatglm3 fails with jsonl input #476

Closed avinashbhat09 closed 2 months ago

avinashbhat09 commented 3 months ago

Context

When I try to run chatglm3, it fails. Command used: image

Error:

    [ ERROR ] An exception occurred
    [ INFO ] Traceback (most recent call last):
      File "C:\Users\Local_Admin\openvino.genai\llm_bench\python\benchmark.py", line 547, in main
        iter_data_list, pretrain_time = CASE_TO_BENCH[model_args['use_case']](model_path, framework, args.device, model_args, args.num_iters)
      File "C:\Users\Local_Admin\openvino.genai\llm_bench\python\benchmark.py", line 210, in run_text_generation_benchmark
        run_text_generation(input_text, num, model, tokenizer, args, iter_data_list, warmup_md5, prompt_idx, bench_hook, model_precision, proc_id)
      File "C:\Users\Local_Admin\openvino.genai\llm_bench\python\benchmark.py", line 107, in run_text_generation
        result = model.generate(input_data, max_new_tokens=int(max_gen_tokens), num_beams=args['num_beams'], use_cache=True, eos_token_id=None)
      File "C:\Users\Local_Admin\openvino.genai\llm_bench\python\python-env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "C:\Users\Local_Admin\openvino.genai\llm_bench\python\python-env\lib\site-packages\optimum\intel\openvino\modeling_decoder.py", line 642, in generate
        result = super().generate(
      File "C:\Users\Local_Admin\openvino.genai\llm_bench\python\python-env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "C:\Users\Local_Admin\openvino.genai\llm_bench\python\python-env\lib\site-packages\transformers\generation\utils.py", line 1576, in generate
        result = self._greedy_search(
      File "C:\Users\Local_Admin\openvino.genai\llm_bench\python\utils\hook_greedy_search.py", line 234, in new_greedy_search
        model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
      File "C:\Users\Local_Admin\openvino.genai\llm_bench\python\utils\ov_model_classes.py", line 311, in prepare_inputs_for_generation
        mask_positions.append(seq.index(tmp_mask_token))
    ValueError: 130000 is not in list
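For context on the final error: the failing line in ov_model_classes.py searches the input token sequence for ChatGLM's mask token id (130000) with list.index, which raises ValueError when that token id is absent from the prompt. A minimal sketch of the failure mode (the names MASK_TOKEN_ID and find_mask_position are illustrative, not taken from ov_model_classes.py):

```python
# Hypothetical illustration of "ValueError: 130000 is not in list".
MASK_TOKEN_ID = 130000  # ChatGLM mask token id seen in the traceback

def find_mask_position(seq):
    # list.index raises ValueError when the value is absent, which is
    # exactly what prepare_inputs_for_generation hits; a guarded lookup
    # avoids the crash.
    try:
        return seq.index(MASK_TOKEN_ID)
    except ValueError:
        return None  # tokenizer never emitted the mask token

print(find_mask_position([64790, 64792, 30910]))   # → None
print(find_mask_position([64790, 130000, 30910]))  # → 1
```

Whether the proper fix is guarding the lookup or updating the model class so the tokenizer output matches what the class expects is exactly what the updated openvino.genai code resolved.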

What needs to be done?

not sure

Example Pull Requests

No response

Resources

Contact points

NA

Ticket

No response

avinashbhat09 commented 3 months ago

Console log: chatGLM.error.txt

avinashbhat09 commented 3 months ago

The issue is seen with simple CLI prompts (-p) as well.

peterchen-intel commented 3 months ago

@avinashbhat09 Is this the same as CVS-136766?

avinashbhat09 commented 3 months ago

@peterchen-intel: Yes, it looks to be the same issue.

peterchen-intel commented 3 months ago

@avinashbhat09 Can you please provide these details:

- openvino.genai commit id
- Hugging Face model commit ID, or how you got the IR model
- conversion command line

avinashbhat09 commented 3 months ago

Current genai commit id: a5b14c7b325a59959a6f189e198822424eb2d576

I got the model from HF: https://huggingface.co/THUDM/chatglm3-6b

Conversion command line:

    python convert.py --model_id THUDM/chatglm3-6b --output_dir chatglm3-6b_ov --precision FP16 -c INT4_SYM

wgzintel commented 3 months ago

@avinashbhat09 With genai commit id 5562025d83 (master) and openvino 4fcfaf24b30 (master), chatglm3-6b runs successfully on both CPU and dGPU on an RPL i9-14900K.

chatglm3_run_on_dGPU.txt
chatglm3_run_on_CPU.txt

Please pip install requirements.txt in genai (commit id: 5562025d83).

peterchen-intel commented 2 months ago

@avinashbhat09 We can't reproduce it. Can you try with an updated openvino.genai?

avinashbhat09 commented 2 months ago

Hi, we can close this ticket. The issue is not seen with the commits below:

genai commit id: https://github.com/openvinotoolkit/openvino.genai/commit/5562025d83ebc2264e47134008bd10d9218d98b2 (master)
openvino: 4fcfaf24b30 (master)