Open fleabites opened 1 year ago
Hi, I'm not sure, but it seems like it's due to Guidance. The current version of Guidance is not stable (they plan to make a big update soon, so just wait for it). In the meantime, you may try the older version: `pip install guidance==0.0.63`.
What I've done:
What I see:

Server starts, but when I try to run any question I get the following run-time error:
```
(gptq) david@shodan:~/Documents/Programming/Personal/llm/langchain/localLLM_guidance-main$ python app.py
start to install package: redis
successfully installed package: redis
start to install package: redis_om
successfully installed package: redis_om
Loading model ...
/home/david/miniconda3/envs/gptq/lib/python3.9/site-packages/safetensors/torch.py:99: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  with safe_open(filename, framework="pt", device=device) as f:
Done.
Running on local URL:  http://0.0.0.0:7860

To create a public link, set `share=True` in `launch()`.
Traceback (most recent call last):
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/site-packages/gradio/routes.py", line 516, in predict
    output = await route_utils.call_process_api(
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/site-packages/gradio/route_utils.py", line 219, in call_process_api
    output = await app.get_blocks().process_api(
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/site-packages/gradio/blocks.py", line 1437, in process_api
    result = await self.call_function(
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/site-packages/gradio/blocks.py", line 1109, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/site-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/site-packages/gradio/utils.py", line 650, in wrapper
    response = f(*args, **kwargs)
  File "/home/david/Documents/Programming/Personal/llm/langchain/localLLM_guidance-main/app.py", line 23, in greet
    final_answer = custom_agent(name)
  File "/home/david/Documents/Programming/Personal/llm/langchain/localLLM_guidance-main/server/agent.py", line 71, in __call__
    prompt_start = self.guidance(prompt_start_template)
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/site-packages/guidance/__init__.py", line 22, in __call__
    return Program(template, llm=llm, cache_seed=cache_seed, logprobs=logprobs, silent=silent, async_mode=async_mode, stream=stream, caching=caching, await_missing=await_missing, logging=logging, **kwargs)
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/site-packages/guidance/_program.py", line 155, in __init__
    self._execute_complete = asyncio.Event()  # fires when the program is done executing to resolve await
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/asyncio/locks.py", line 177, in __init__
    self._loop = events.get_event_loop()
  File "/home/david/miniconda3/envs/gptq/lib/python3.9/asyncio/events.py", line 642, in get_event_loop
    raise RuntimeError('There is no current event loop in thread %r.'
RuntimeError: There is no current event loop in thread 'AnyIO worker thread'.
```
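If it helps, my reading of the traceback is that guidance's `Program.__init__` constructs an `asyncio.Event()`, and on Python 3.9 that binds to the current event loop at construction time; the AnyIO worker thread Gradio runs the handler in has no loop set. A minimal sketch of just that situation (standard library only, nothing from this repo) would be:

```python
import asyncio
import threading

def worker():
    # On Python 3.9, asyncio.Event() calls events.get_event_loop() in __init__.
    # A freshly spawned thread has no event loop set, so this raises
    # "RuntimeError: There is no current event loop in thread ...",
    # which looks like what happens inside the AnyIO worker thread here.
    try:
        asyncio.Event()
    except RuntimeError as exc:
        print(f"{type(exc).__name__}: {exc}")

threading.Thread(target=worker).start()
```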
os.environ["SERPER_API_KEY"] = '<my SERPER_API_KEY>' MODEL_PATH = '/home/david/ai/text-generation-webui/models/TheBloke_Wizard-Vicuna-7B-Uncensored-GPTQ/' CHECKPOINT_PATH = '/home/david/ai/text-generation-webui/models/TheBloke_Wizard-Vicuna-7B-Uncensored-GPTQ/model.safetensors'
Any clues as to why I'm getting a `RuntimeError: There is no current event loop in thread 'AnyIO worker thread'` error?

Many thanks, Dave