lucgagan / completions

Node.js SDK for interacting with the OpenAI Chat API.
https://ray.run/
MIT License

add documentation for single message chat options override #7

Closed: floomby closed this pull request 1 year ago

floomby commented 1 year ago

This adds documentation for per message overrides to the readme and also adds an additional test for two options (logit bias and max tokens).
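For reference, the README example looks roughly like this (a sketch based on the call shape used in the new test quoted later in this thread; the prompt string and the maxTokens value here are illustrative, and the second and third arguments are left as undefined placeholders):

  // Per-message override: the options object passed as the fourth argument
  // applies only to this sendMessage call, not to the whole chat session.
  const response = await chat.sendMessage(
    "Give me a one-word answer: what color is the sky?",
    undefined,
    undefined,
    { maxTokens: 5 }
  );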

I should have added the documentation when I first implemented the functionality, but I didn't...

I had a reliability issue with the 8th subtest (the part where it looks at the content of the forced user-facing message). It's the same type of problem I had earlier with the second subtest.

The input did not match the regular expression /(sorry|cannot)/i. Input:

'Sure, I can check that for you. Please give me a moment.'
lucgagan commented 1 year ago

Thanks!

Can you elaborate on how this test was put together?

  const response = await chat.sendMessage(
    "what is the next token in this sequence: a b c",
    undefined,
    undefined,
    // token 34093 is "boo"
    { maxTokens: 1, logitBias: { "34093": 100 } }
  );

specifically, where is the 34093 coming from?

I thought maybe this was "boo" BPE-encoded (https://ray.run/tools/byte-pair-encoder?dictionary=gpt3&output=bpe), but that does not seem to be the case.

floomby commented 1 year ago

This comes from @dqbd/tiktoken. There are multiple ways to tokenize things, but I don't know why it tokenizes differently from that encoder.
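If you want to check the id yourself, here is a minimal sketch with @dqbd/tiktoken (assuming the cl100k_base encoding that gpt-3.5-turbo uses; the linked BPE tool appears to use the GPT-3 dictionary, which would explain the different ids):

  // Minimal sketch: inspect token ids with @dqbd/tiktoken.
  import { encoding_for_model } from "@dqbd/tiktoken";

  const enc = encoding_for_model("gpt-3.5-turbo");

  // "boo" and " boo" (with a leading space) tokenize to different ids.
  console.log(enc.encode("boo"));
  console.log(enc.encode(" boo"));

  // Decode an id back to text to confirm what it represents.
  console.log(new TextDecoder().decode(enc.decode(new Uint32Array([34093]))));

  enc.free();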

Honestly, I just use my own prompt-golfing, game-ish site when I want to look at tokenization, which is what I did here: https://golf.floomby.us/challenges/64641b5992687d1b8359f22e

github-actions[bot] commented 1 year ago

:tada: This PR is included in version 2.0.0 :tada:

The release is available on:

Your semantic-release bot :package::rocket: