jonppe / funsearch

Simple working implementation of the google-deepmind FunSearch algorithm
Apache License 2.0

How can I use a different LLM? #4

Open Alex4210987 opened 8 months ago

Alex4210987 commented 8 months ago

Due to some financial constraints, I can only use an intermediary interface instead of the original OpenAI one. How can I configure a different model? I followed the instructions from here but failed. Could you kindly offer some help?

Rubiel1 commented 8 months ago

Hello. I tried to use the llm library and installed llm-gpt4all, but unfortunately there is an issue with the package: https://github.com/jonppe/funsearch/issues/1#issuecomment-1905481246

Another possibility is to work with Hugging Face. Here is an LLM leaderboard: https://evalplus.github.io/leaderboard.html (a new one appears every two months).

Alex4210987 commented 8 months ago

Thanks for the message; let me make my question clearer. My extra-openai-models.yaml under the project root funsearch folder looks like:

- model_id: test
  model_name: gpt-3.5-turbo
  api_base: "https://a_website/v1/chat"
  completion: true

but when I run llm -m test 'What is the capital of France?' the output is: Error: 'test' is not a known model. Maybe I've messed up the yaml file?
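Assuming this is the llm CLI by Simon Willison, two things are worth checking (both assumptions about this particular setup): llm reads extra-openai-models.yaml from its own configuration directory (the directory shown by dirname "$(llm logs path)"), not from the project root, and api_base normally points at the /v1 root, since the OpenAI client appends the chat path itself. A sketch of a corrected entry, keeping the placeholder URL from the post:

```yaml
# Hypothetical corrected entry for extra-openai-models.yaml.
# The URL is the placeholder from the original post, not a real endpoint.
- model_id: test
  model_name: gpt-3.5-turbo
  api_base: "https://a_website/v1"   # /v1 root; the client adds /chat/completions
  # "completion: true" marks a completion-style (non-chat) model in llm,
  # so it is usually omitted for a chat endpoint like this one.
```

After moving the file into llm's configuration directory, llm models should list the new model_id.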

Alex4210987 commented 8 months ago

Or maybe there are some open-source LLMs to work around the OpenAI key issue?

Rubiel1 commented 8 months ago

What I suggest is to use a local LLM, like the ones on Hugging Face I shared before.

Alex4210987 commented 8 months ago

> What I suggest is to use a local LLM, like the ones on Hugging Face I shared before.

Thanks for the suggestion, I will try it.

liby-meigui commented 1 month ago

Can you share how you switched to a different model? I want to try llama3. I connected to the local virtual machine via SSH and deployed Meta-Llama-3-70B-Instruct via ollama on the server, but:

(venv) lmg@lmg-VMware-Virtual-Platform:~/funsearch$ funsearch run examples/cap_set_spec.py 11 --model_name Meta-Llama-3-70B-Instruct --sandbox_type ExternalProcessSandbox
Starting sandbox.py...
INFO:root:Writing logs to data/1730532597
Traceback (most recent call last):
  File "/home/lmg/funsearch/venv/lib/python3.12/site-packages/llm/__init__.py", line 148, in get_model
    return aliases[name]


KeyError: 'Meta-Llama-3-70B-Instruct'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/lmg/funsearch/venv/bin/funsearch", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/lmg/funsearch/venv/lib/python3.12/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lmg/funsearch/venv/lib/python3.12/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/home/lmg/funsearch/venv/lib/python3.12/site-packages/click/core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lmg/funsearch/venv/lib/python3.12/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lmg/funsearch/venv/lib/python3.12/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lmg/funsearch/venv/lib/python3.12/site-packages/funsearch/__main__.py", line 97, in run
    model = llm.get_model(model_name)
            ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lmg/funsearch/venv/lib/python3.12/site-packages/llm/__init__.py", line 150, in get_model
    raise UnknownModelError("Unknown model: " + name)
llm.UnknownModelError: 'Unknown model: Meta-Llama-3-70B-Instruct'
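For context, the traceback comes from llm.get_model(), which resolves the --model_name string against whatever models and aliases installed llm plugins have registered; a name nothing has registered raises UnknownModelError. A minimal sketch of that lookup pattern (a hypothetical registry for illustration, not the real llm source):

```python
# Sketch of the lookup behind the traceback above: get_model() checks the
# registered models first, then aliases, and raises for unregistered names.

class UnknownModelError(Exception):
    pass

def get_model(name, models, aliases):
    if name in models:
        return models[name]
    try:
        return models[aliases[name]]
    except KeyError:
        # Mirrors the "Unknown model: ..." message in the traceback.
        raise UnknownModelError("Unknown model: " + name)

# Hypothetical registry: only an OpenAI model has been registered.
models = {"gpt-3.5-turbo": "openai-client"}
aliases = {"chatgpt": "gpt-3.5-turbo"}

print(get_model("chatgpt", models, aliases))  # resolves via the alias

try:
    get_model("Meta-Llama-3-70B-Instruct", models, aliases)
except UnknownModelError as e:
    print(e)  # the Ollama-served model was never registered with llm
```

In practice this usually means the Ollama-served model has to be exposed to llm first, e.g. through a plugin such as llm-ollama, so that the name shows up in the output of llm models before funsearch is run with it.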