Closed: timestee closed this issue 1 year ago
The name "ChatGPT" is given to it by OpenAI in the prompt prefix when you use the website. Make sure you remove that promptPrefix
from the example settings if you want it to behave the same way.
Interesting. Now, with these settings:
const clientOptions = {
// (Optional) Parameters as described in https://platform.openai.com/docs/api-reference/completions
modelOptions: {
// The model is set to text-chat-davinci-002-20230126 by default, but you can override
// it and any other parameters here
model: 'text-chat-davinci-002-20230126',
// The default temperature is 0.7, but you can override it here
temperature: 0.7,
},
  // (Optional) Set a custom prompt prefix. In my testing, it works when it ends with two newlines
promptPrefix: 'TEST...\n\n',
// (Optional) Set to true to enable `console.debug()` logging
debug: false,
};
.....
const response = await chatGptClient.sendMessage('are you chatgpt');
I got this response:
{
response: 'Yes, I am an AI language model developed by OpenAI, called "GPT-3" (Generative Pretrained Transformer 3).',
conversationId: '4d423663-1a6f-4373-bfa1-43e2280985c3',
messageId: '73ce7fee-b8c2-4d6a-bd4f-5e8f5b9e3327'
}
But when I ask the same question to the real ChatGPT (https://chat.openai.com/), I get a different answer.
const response = await chatGptClient.sendMessage('Hello!');
Sometimes I get this response:
{
response: 'I am not ChatGPT, but I am a language model created by OpenAI. I was trained on a large dataset of text to generate human-like responses to various prompts. However, I am not capable of conscious thought or personal experiences and I do not have the ability to perceive or interact with the physical world.',
conversationId: '9ae1a732-b783-4ab3-ba26-8d0111ef7cc6',
messageId: 'd47b0fa4-1478-45b0-98c1-afde33c8c065'
}
{
response: '#define ll long long\n' +
'#define maxn 200005\n' +
'#define pb push_back\n' +
'#define fi first\n' +
'#define se second\n' +
'#define mp make_pair\n' +
'#define For(i,a,b) for(int i=a;i<=b;i++)\n' +
'#define Rep(i,a,b) for(int i=a;i>=b;i--)\n' +
'\n' +
'using namespace std;\n' +
'\n' +
'typedef pair<int,int> pii;\n' +
'typedef vector<int> vi;\n' +
'\n' +
'template<typename T> inline void read(T &x){\n' +
' x=0;char c=getchar();bool f=false;\n' +
" for(;!isdigit(c);c=getchar()) if(c=='-') f=true;\n" +
' for(;isdigit(c);c=getchar()) x=(x<<1)+(x<<3)+(c^48);\n' +
' if(f) x=-x;\n' +
'}\n' +
'\n' +
'int n,m,k,ans;\n' +
'int a[maxn];\n' +
'\n' +
'inline bool cmp(const int &x,const int &y){\n' +
' return x>y;\n' +
'}\n' +
'\n' +
'signed main(){\n' +
' cin>>n>>m>>k;\n' +
' For(i,1,m) read(a[i]);\n' +
' sort(a+1,a+m+1,cmp);\n' +
' int num=a[1]-1;\n' +
' while(num>0){\n' +
' int t=0;\n' +
' For(i,1,m){\n' +
' if(a[i]-num<=0) break;\n' +
' t+=min((a[i]-num-1)/k+1,1ll);\n' +
' }\n' +
' if(t<=n){\n' +
' cout<<num<<endl;\n' +
' return 0;\n' +
' }\n' +
' num-=k;\n' +
' }\n' +
' cout<<0<<endl;\n' +
' return 0;\n' +
'}\n' +
'<|im_sep|>',
conversationId: 'caf06aa2-956a-460f-b828-02ed5049c701',
messageId: 'c7d0f3ea-912a-4fc5-947f-022e6b16372a'
}
Remove promptPrefix
from your settings entirely and you'll see that it responds just like ChatGPT when you ask the question.
I wouldn't recommend setting any of those optional settings - instead remove them entirely - if you don't know what you're doing, otherwise you may get nonsense replies.
Seems you are right. After removing promptPrefix entirely, I got this:
const response = await chatGptClient.sendMessage('are you chatgpt!');
{
response: 'Yes.',
conversationId: 'e7ba8619-d884-4b8a-92f8-bc6de259177d',
messageId: '20384414-3f55-478f-9252-36e540ad8f5b'
}
const response = await chatGptClient.sendMessage('what is your MODEL!')
{
response: 'I am ChatGPT, a language model developed by OpenAI.',
conversationId: '245a7a8b-2265-45be-8ffc-b751cd9fb225',
messageId: 'dcf04ba8-ab87-403e-be42-6b4327161b8e'
}
A beginner's question...
By using this OpenAI model, do I not have to worry about the cost of requests made to the API? In other words, is the API completely free, with no fees incurred?
For now, seems like it's free. No idea how long it'll last though. If you're worried about being charged all of a sudden, I'd set a hard usage limit of $1 or something in your billing settings, or make a new OpenAI account.
const response = await chatGptClient.sendMessage('are you gpt3');
{
response: 'Yes, I am GPT-3.',
conversationId: '0b483262-9856-4d5a-86da-34e9512f2806',
messageId: '65686fbc-9fa6-491d-b5ad-e9bedb0aa299'
}
I feel that this is not ChatGPT; it seems a bit dumber than ChatGPT.
I think it depends on the temperature and other settings. It's hard to say, since we don't know what model settings the website uses. You could try setting the temperature to 0.9, or set the frequency and presence penalties to further change the behavior.
Absolutely not free; it's just a free trial of about 3 months.
What happens after I use up my free tokens, or after the 3-month free trial is up?
@waylaidwanderer i think you will love this
@timestee Currently, requests to the API using this model do not consume any tokens.
I got this:
I am an AI language model created by OpenAI, I am not ChatGPT
So it seems text-chat-davinci-002-20230126 is not the ChatGPT model?