This PR updates the vllm version to 0.6.2 (requiring vllm >= 0.6.0) to address an issue where importing vllm triggers SQLite loading, which leads to an SQLite I/O deadlock when the scripts are run in parallel (see https://github.com/vllm-project/vllm/pull/7831).
Additionally, the updated vllm requires a newer version of the transformers library, which conflicts with the requirements of llm-jp-eval v1.4.1. To resolve this, this PR separates the Python environments (venvs): one for offline evaluation with vllm, and one for llm-jp-eval v1.4.1.
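The environment separation described above can be sketched as two independent venvs, each with its own dependency set. This is a minimal illustration, not the repository's actual setup script: the venv names and the commented-out package pins are assumptions.

```shell
#!/bin/sh
# Hypothetical sketch of the separated environments; names and pins
# are illustrative, not taken from the actual repository.

# Environment 1: offline generation with vllm >= 0.6.0
python3 -m venv venv-vllm
# ./venv-vllm/bin/pip install "vllm>=0.6.0"          # pulls in newer transformers

# Environment 2: evaluation with llm-jp-eval v1.4.1 (older transformers pin)
python3 -m venv venv-llm-jp-eval
# ./venv-llm-jp-eval/bin/pip install "llm-jp-eval==1.4.1"

# Each step of the pipeline then invokes the interpreter of the venv it needs,
# e.g. ./venv-vllm/bin/python for generation, so the two transformers
# requirements never have to coexist in one environment.
echo "created: venv-vllm venv-llm-jp-eval"
```

Keeping the two tool chains in separate venvs avoids pinning transformers to a version that satisfies neither package.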