UKD1211 / WordWhiz_codeX

"Unlock the power of language with `WordWhiz_codeX`, your trusted companion for limitless creativity, problem-solving, and transformative conversations." finding some issues regarding the server component 😓 trying to solve this asap
https://word-whiz-code-x.vercel.app/

question #1

Open ixnel1337 opened 1 year ago

ixnel1337 commented 1 year ago

Is there any way to run this on localhost?

UKD1211 commented 1 year ago

Yes, you can. When you run `npm run dev`, the localhost link is printed in the terminal. Make sure to run the client and the server side simultaneously; that should do it.
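For reference, a typical way to do that with separate client and server folders looks roughly like the commands below. The folder names and scripts are assumptions based on the usual layout of this kind of project, so adjust them to match the actual repo.

```bash
# Terminal 1: start the server (folder name assumed)
cd server
npm install
npm run dev     # the dev script prints the localhost URL it listens on

# Terminal 2: start the client (folder name assumed)
cd client
npm install
npm run dev     # open the printed localhost link in the browser
```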

ixnel1337 commented 1 year ago

[image] Well, I'm getting this:

ixnel1337 commented 1 year ago

Error: Request failed with status code 429
    at createError (C:\Users\Ahmet\Documents\limitlessgpt\server\node_modules\axios\lib\core\createError.js:16:15)
    at settle (C:\Users\Ahmet\Documents\limitlessgpt\server\node_modules\axios\lib\core\settle.js:17:12)
    at IncomingMessage.handleStreamEnd (C:\Users\Ahmet\Documents\limitlessgpt\server\node_modules\axios\lib\adapters\http.js:322:11)
    at IncomingMessage.emit (node:events:526:35)
    at endReadableNT (node:internal/streams/readable:1359:12)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  config: {
    transitional: { silentJSONParsing: true, forcedJSONParsing: true, clarifyTimeoutError: false },
    adapter: [Function: httpAdapter],
    transformRequest: [ [Function: transformRequest] ],
    transformResponse: [ [Function: transformResponse] ],
    timeout: 0,
    xsrfCookieName: 'XSRF-TOKEN',
    xsrfHeaderName: 'X-XSRF-TOKEN',
    maxContentLength: -1,
    maxBodyLength: -1,
    validateStatus: [Function: validateStatus],
    headers: {
      Accept: 'application/json, text/plain, */*',
      'Content-Type': 'application/json',
      'User-Agent': 'OpenAI/NodeJS/3.2.1',
      Authorization: 'Bearer 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',
      'Content-Length': 143
    },
    method: 'post',
    data: '{"model":"text-davinci-003","prompt":"testing this\n","temperature":0,"max_tokens":3000,"top_p":1,"frequency_penalty":0.5,"presence_penalty":0}',
    url: 'https://api.openai.com/v1/completions'
  },
  request: <ref *1> ClientRequest {
    method: 'POST',
    protocol: 'https:',
    host: 'api.openai.com',
    path: '/v1/completions',
    finished: true,
    _contentLength: 143,
    _header: 'POST /v1/completions HTTP/1.1\r\n' +
      'Accept: application/json, text/plain, */*\r\n' +
      'Content-Type: application/json\r\n' +
      'User-Agent: OpenAI/NodeJS/3.2.1\r\n' +
      'Authorization: Bearer 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\r\n' +
      'Content-Length: 143\r\n' +
      'Host: api.openai.com\r\n' +
      'Connection: close\r\n' +
      '\r\n',
    socket: [TLSSocket],
    agent: [Agent],
    res: IncomingMessage {
      httpVersion: '1.1',
      complete: true,
      statusCode: 429,
      statusMessage: 'Too Many Requests',
      responseUrl: 'https://api.openai.com/v1/completions',
      redirects: [],
      req: [Circular *1]
    },
    _redirectable: [Writable],
    ...
  },
  response: {
    status: 429,
    statusText: 'Too Many Requests',
    headers: {
      date: 'Fri, 01 Sep 2023 10:11:47 GMT',
      'content-type': 'application/json; charset=utf-8',
      'content-length': '222',
      connection: 'close',
      vary: 'Origin',
      'x-request-id': '0914e245ec55284372f420da7ff5f321',
      'strict-transport-security': 'max-age=15724800; includeSubDomains',
      'cf-cache-status': 'DYNAMIC',
      server: 'cloudflare',
      'cf-ray': '7ffcbb2b5daf0522-OTP',
      'alt-svc': 'h3=":443"; ma=86400'
    },
    config: [Object],
    request: <ref *1> ClientRequest { ... },
    data: { error: [Object] }
  },
  isAxiosError: true,
  toJSON: [Function: toJSON]
}

UKD1211 commented 1 year ago

I see. There may be some path issues, since I deployed my backend (the server) on Render and my frontend on Vercel. But the main issue you will run into is the API key: I keep mine in the .env file, which is not committed, for security. So you should create an OpenAI account and generate your own API key.
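To make that concrete: create a key in your own OpenAI account, put it in the server's .env file, and have the server read it from the environment instead of hard-coding it. Below is a minimal sketch, assuming a dotenv-based setup and the openai v3 Node client; the request parameters are copied from the error dump above, while the file and variable names are illustrative rather than taken from this repo.

```js
// server/.env (kept out of git):
// OPENAI_API_KEY=sk-your-own-key-here

// server/openai.js: minimal sketch of reading the key from the environment
require('dotenv').config();
const { Configuration, OpenAIApi } = require('openai'); // openai v3.x, matching the OpenAI/NodeJS/3.2.1 user agent seen above

const configuration = new Configuration({
  apiKey: process.env.OPENAI_API_KEY, // never commit or hard-code the key
});
const openai = new OpenAIApi(configuration);

// Same completion parameters as the failing request in the error dump
async function complete(prompt) {
  const response = await openai.createCompletion({
    model: 'text-davinci-003',
    prompt,
    temperature: 0,
    max_tokens: 3000,
    top_p: 1,
    frequency_penalty: 0.5,
    presence_penalty: 0,
  });
  return response.data.choices[0].text;
}

module.exports = { complete };
```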

ixnel1337 commented 1 year ago

Do I need a premium account?

UKD1211 commented 1 year ago

I guess so, since there are also some rate-limit issues with the API keys.
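For context, the 429 in the log above means OpenAI refused the request for rate or quota reasons. With a personal key that usually points to an exhausted free-trial quota or an account without billing set up, rather than a bug in the server code. A small retry with exponential backoff can smooth over transient rate limits; here is a hypothetical sketch built on the complete() helper from the earlier snippet:

```js
// Hypothetical helper: retry the completion a few times when OpenAI answers 429
async function completeWithRetry(prompt, retries = 3) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await complete(prompt); // wrapper around openai.createCompletion, sketched above
    } catch (err) {
      const status = err.response && err.response.status;
      if (status !== 429 || attempt === retries) throw err; // only retry rate-limit errors
      const delayMs = 1000 * 2 ** attempt; // 1s, 2s, 4s, ...
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

Backoff only helps with transient rate limiting, though; if the key's quota is exhausted, the fix is a key from an account with available quota or billing enabled.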