Open psteinb opened 8 months ago
So I looked into this further; there must be something in the code that triggers this problem with the `temperature` member variable, because the following snippet works just fine:
```python
import os

import openai


def main():
    openai.api_key = os.environ.get("OPENAI_API_KEY")
    openai.base_url = os.environ.get("OPENAI_API_BASE")
    response = openai.ChatCompletion.create(
        model="Mistral-7B-Instruct-v0.2",
        messages=[{"role": "user", "content": "Tell me what fastapi!"}],
    )
    print(response)


if __name__ == "__main__":
    main()
```
Hi, @psteinb! Thanks for your interest in our library!
Firstly, we do not currently support talking to custom OpenAI-like APIs. It seems like a simple thing to implement, though, so we will try to do that soon.
On the second issue, about the temperature setting: please note that the `parameters` option should be an instance of the `GenerationParameters` class, defined in the `lm_polygraph.utils.generation_parameters` module. So in your case, if you want to set a temperature other than the default value of 1.0, you should do it like this:
```python
import os

from lm_polygraph.estimators import EigValLaplacian
from lm_polygraph.utils.generation_parameters import GenerationParameters
from lm_polygraph.utils.manager import estimate_uncertainty
from lm_polygraph.utils.model import BlackboxModel


def main():
    params = GenerationParameters(temperature=0.7)
    print(f":: black box test, using Mistral-7B-Instruct-v0.2 from {os.environ['OPENAI_API_BASE']}")
    model = BlackboxModel(
        openai_api_key=os.environ["OPENAI_API_KEY"],
        model_path="Mistral-7B-Instruct-v0.2",
        parameters=params,
    )
    print(model.parameters)

    print(":: using estimator EigValLaplacian")
    estimator = EigValLaplacian(verbose=True)
    answer = estimate_uncertainty(
        model, estimator, input_text="When did Albert Einstein die?"
    )
    print(">>", answer)
```
When you pass a dict as the `parameters` option, you replace the default config dataclass with it, and polygraph fails when it tries to read the generation config from it as if it were a dataclass.
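To illustrate the difference (using a simplified stand-in for the real class, not the library's actual definition): attribute lookups that work on a dataclass instance fail on a plain dict:

```python
from dataclasses import dataclass


@dataclass
class GenerationParameters:
    """Simplified stand-in for lm_polygraph's real class (assumption)."""
    temperature: float = 1.0


params_obj = GenerationParameters(temperature=0.7)
params_dict = {"temperature": 0.7}

# The library reads settings via attribute access, which only the
# dataclass supports; on a dict the same lookup raises AttributeError.
print(params_obj.temperature)                      # attribute access works
print(getattr(params_dict, "temperature", None))   # dicts have no such attribute
```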
We will make sure to reflect this in our documentation as we continue to improve it.
Please let me know if you have any other questions.
@cant-access-rediska0123 can you try implementing `base_url` parameter passing to the openai library, along with `api_key`?
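A minimal sketch of one way that wiring could look (the `OpenAIConfig` holder and `config_from_env` helper are hypothetical, not part of the library):

```python
import os
from dataclasses import dataclass


@dataclass
class OpenAIConfig:
    # Hypothetical holder for the two values the openai client needs.
    api_key: str
    base_url: str = "https://api.openai.com/v1"


def config_from_env() -> OpenAIConfig:
    # Fall back to the official endpoint when no custom base is set.
    return OpenAIConfig(
        api_key=os.environ.get("OPENAI_API_KEY", ""),
        base_url=os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1"),
    )
```

The model wrapper could then read both values from one place instead of only forwarding the API key.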
@cant-access-rediska0123 could you check that HF wrapper works?
Hi,
Thanks for providing the community with this library. I believe uncertainty in LLM queries is an important topic. I tried to play around with the library and am a bit stuck. I'd like to use a remote model that is accessible through the `openai` library. For this, I have to provide a custom `OPENAI_API_BASE` and my `OPENAI_API_KEY`. However, the library tells me that it doesn't know how to query the remote model. Here is the code that I drafted given your example:
So I get the following error:
I tried a couple of things, but I am simply unclear on where to supply the temperature.
Best P