stanford-crfm / helm

Holistic Evaluation of Language Models (HELM), a framework to increase the transparency of language models (https://arxiv.org/abs/2211.09110). This framework is also used to evaluate text-to-image models in HEIM (https://arxiv.org/abs/2311.04287) and vision-language models in VHELM (https://arxiv.org/abs/2410.07112).
https://crfm.stanford.edu/helm
Apache License 2.0

GooseAI: "Context limited to 2048 tokens maximum" errors #510

Closed · teetone closed this issue 2 years ago

teetone commented 2 years ago

boolq:model=default,data_augmentation=all
news_qa:model=default,data_augmentation=all
raft:model=default,subset=one_stop_english,data_augmentation=all
quac:model=default

Error when running boolq:model=default,data_augmentation=all:
Traceback (most recent call last):
  File "/juice/scr/nlp/crfm/benchmarking/benchmarking/src/benchmark/presentation/present.py", line 118, in run
    new_run_specs = run_benchmarking(
  File "/juice/scr/nlp/crfm/benchmarking/benchmarking/src/benchmark/run.py", line 69, in run_benchmarking
    runner.run_all()
  File "/juice/scr/nlp/crfm/benchmarking/benchmarking/src/benchmark/runner.py", line 75, in run_all
    self.run_one(run_spec)
  File "/juice/scr/nlp/crfm/benchmarking/benchmarking/src/benchmark/runner.py", line 99, in run_one
    scenario_state = self.executor.execute(scenario_state)
  File "/juice/scr/nlp/crfm/benchmarking/benchmarking/src/common/hierarchical_logger.py", line 104, in wrapper
    return fn(*args, **kwargs)
  File "/juice/scr/nlp/crfm/benchmarking/benchmarking/src/benchmark/executor.py", line 84, in execute
    request_states = list(
  File "/u/nlp/anaconda/main/anaconda3/envs/crfm_benchmarking/lib/python3.8/site-packages/tqdm/std.py", line 1195, in __iter__
    for obj in iterable:
  File "/u/nlp/anaconda/main/anaconda3/envs/crfm_benchmarking/lib/python3.8/concurrent/futures/_base.py", line 619, in result_iterator
    yield fs.pop().result()
  File "/u/nlp/anaconda/main/anaconda3/envs/crfm_benchmarking/lib/python3.8/concurrent/futures/_base.py", line 444, in result
    return self.__get_result()
  File "/u/nlp/anaconda/main/anaconda3/envs/crfm_benchmarking/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
    raise self._exception
  File "/u/nlp/anaconda/main/anaconda3/envs/crfm_benchmarking/lib/python3.8/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/juice/scr/nlp/crfm/benchmarking/benchmarking/src/benchmark/executor.py", line 77, in process
    result: RequestResult = self.remote_service.make_request(self.execution_spec.auth, state.request)
  File "/juice/scr/nlp/crfm/benchmarking/benchmarking/src/proxy/remote_service.py", line 47, in make_request
    RemoteService._check_response(response)
  File "/juice/scr/nlp/crfm/benchmarking/benchmarking/src/proxy/remote_service.py", line 29, in _check_response
    raise RemoteServiceError(response["error"])
proxy.remote_service.RemoteServiceError: Failed to make request to gooseai after retrying 5 times. Error: OpenAI (GooseAI API) error: Context limited to 2048 tokens maximum, 2266 tokens requested for prompt 0
Error when running news_qa:model=default,data_augmentation=all:
(Traceback identical to the boolq run above.)
proxy.remote_service.RemoteServiceError: Failed to make request to gooseai after retrying 5 times. Error: OpenAI (GooseAI API) error: Context limited to 2048 tokens maximum, 2117 tokens requested for prompt 0
Error when running raft:model=default,subset=one_stop_english,data_augmentation=all:
(Traceback identical to the boolq run above.)
proxy.remote_service.RemoteServiceError: Failed to make request to gooseai after retrying 5 times. Error: OpenAI (GooseAI API) error: Context limited to 2048 tokens maximum, 2576 tokens requested for prompt 0
Error when running quac:model=default:
(Traceback identical to the boolq run above.)
proxy.remote_service.RemoteServiceError: Failed to make request to gooseai after retrying 5 times. Error: OpenAI (GooseAI API) error: Context limited to 2048 tokens maximum, 2050 tokens requested for prompt 0
teetone commented 2 years ago

Example request:

{'engine': 'gpt-j-6b', 'prompt': "Passage: The center contact of the bulb typically connects to the medium-power filament, and the ring connects to the low-power filament. Thus, if a 3-way bulb is screwed into a standard light socket that has only a center contact, only the medium-power filament operates. In the case of the 50 W / 100 W / 150 W bulb, putting this bulb in a regular lamp socket will result in it behaving like a normal 100W bulb.\nQuestion: Do 3 way light bulbs work in any lamp?\nAnswer: Yes\n\nPassage: Perfume: The Story of a Murderer is a 2006 German period psychological crime thriller film directed by Tom Tykwer and starring Ben Whishaw, Alan Rickman, Rachel Hurd-Wood, and Dustin Hoffman. Tykwer, with Johnny Klimek and Reinhold Heil, also composed the music. The screenplay by Tykwer, Andrew Birkin, and Bernd Eichinger is based on Patrick Süskind's 1985 novel Perfume. Set in 18th century France, the film tells the story of Jean-Baptiste Grenouille (Whishaw), an olfactory genius, and his homicidal quest for the perfect scent.\nQuestion: Is the film perfume based on a true story?\nAnswer: No\n\nPassage: New Mexico is a Shall-Issue state for the concealed carry of handguns, and permits the open carry of loaded firearms without a permit. A New Mexico Concealed Handgun License (CHL) is required by in-state residents to carry in a concealed manner a loaded handgun while on foot. Per state law, a firearm is considered ``loaded'' when a magazine with live ammunition is inserted into the weapon and/or a live round is in the firing chamber. (citation needed) Additionally, state law (NMSA 29-19-2) defines a concealed handgun as ``a loaded handgun that is not visible to the ordinary observations of a reasonable person.'' This definition creates legal ambiguity for partially-exposed weapons, as the firearm may be visible to one person and thus no violation of law occurs since it would be viewed as open carry. The partially-exposed weapon may not be readily visible to a second person, thus potentially placing the carrying person in violation of the state's concealed carry law if the individual carrying does not have a valid license for concealed carry. A CHL is not required for open carry, concealed carry of an unloaded firearm on foot, or concealed carry of a loaded or unloaded firearm while in a vehicle (including motorcycles, bicycles, off-road vehicles, motor homes, or riding a horse). An applicant for a concealed carry permit must be a resident of New Mexico and at least 21 years of age. Each permit specifies the category and caliber of handgun that may be carried, but is also valid for a smaller caliber. The applicant must complete a state approved training course that includes at least 15 hours of classroom and firing range time, and must pass a shooting proficiency test for that category and caliber of handgun. A permit is valid for four years, but license holders must pass the shooting proficiency test every two years. An applicant may appeal the denial of a Concealed Handgun License by requesting a hearing before the Department of Public Safety within 35 days of receipt of an Order of Denial for a CHL. An unfavorable ruling on the appeal by the DPS may be further appealed through the New Mexico courts. 
New Mexico currently recognizes concealed carry permits from or has reciprocal agreements with the following states: Alaska, Arizona, Arkansas, Colorado, Delaware, Florida, Idaho, Kansas, Louisiana, Michigan, Mississippi, Missouri, Nebraska, Nevada, North Carolina, North Dakota, Ohio, Oklahoma, South Carolina, Tennessee, Texas, Virginia, West Virginia, and Wyoming. New Mexico does not issue CCW permits to non-residents, except for Active Duty military members permanently assigned to a military installation within the state. Part-time residents with a valid New Mexico ID or Driver's license may apply for a New Mexico CHL. New Mexico does recognize out-of-state nonresident permits held by in-state residents for concealed carry.\nQuestion: Can i carry a gun in new mexico?\nAnswer: Yes\n\nPassage: The   relationshipbetweenHouseand Cuddy   is   known  by  the   portmanteauterm ``Huddy''.Cuddy has   what   USA  Today's   Peter Johnson  terms   a  ``cat-and-mouse'' relationship with   House.   Edelstein   has  described  it  as   ``a  reallycomplicated,  adult   relationship'',   explaining:``These  arepeople  who havevery full livesand lots   of   responsibilitiesthat  perhapsconflict with   theirfeelings  for eachother.''Theactress   ``would  love  for them  tohave  a  (romantic)   relationship, because  it   could  be   ascomplicated   as   the  rest of  their relationship'',however, she   is   unsurehow  it wouldaffect  the dynamics  ofthe  show.   Jacobscommented   at   theend  of the show's third season:   ``I   can't  seethem pairing them ina permanent  fashion.  But  they areclose;theyhave   gone through  a  lot together.Might  there  be  a   momentof   weakness  in which thetwo   might   explore their  chemistry?   Maybe.''   Questioned   at   the   end of the  fourthseasonon  whether   Cuddy   and   Housewould ever consummate their   relationship   on-screen,Jacobs  responded:  ``there is  heat and chemistrybetween  themand   I never   wantto   see that  go   awaybecause that  isthe   essenceoftheir   relationship.   (...)   we'llneverignore   (their  chemistry)because,asI  said,it's   thevery essenceofthem.   Shewouldn't forgivehimover  andover   again   if   he   wasn'tsobrilliant inhereyes,   clearlyshe's got   a  softspotforhim.   Andhe has  onefor   her.   You   will  continue   to see   that.''  Prior  to   thebeginningof the fifthseason,seriescreatorDavid  Shore discussedhis intentionto  further the   relationship  between  the  two, as:``If   House  iscapable   of   any  relationshipwithanyone,it'sCuddy.   We   can't  havethem   dancingaroundforever.''Following the fifth   season revelation  that   Househad   hallucinated   a physical relationship  withCuddy, Shore commented   on the   storyline'scontinuation   into the sixth   season:``it  would  be dishonest  to  just  let   thatdisappear. Obviously House  has   feelings  for her.   Eventhough  the   love  affair   didn't   happen,   in  House'smind   it   did.''Edelstein does   not   know   whether  the   two characters   will   eventuallyend up together,however believesthatthe   combinationof  frustration  and   love  Cuddy   feels for  House  ``makes  for  avery  interesting  relationship'', as:  ``there's   a   great  deal   of  admiration and  respect,   and  alsoan  incredible amount of   annoyance   and  frustration,  which is like  how   mostrelationshipsare   in   your life.''Asofthe   veryend of the   Sixth Seasonfinale,   HelpMe, House andCuddy  appearto   have entered  a  romantic   relationship.   
Intheclosing minutes   ofthe   episode,House came veryclose  to  relapsing  andtaking   vicodin once again,  at   whichpoint Cuddy  entered   to  tellhim  thatshehad  ended   her   relationship  withLucas.   She   professedher   lovefor  House,  which  led  to   them   kissingbriefly.   A   close-up   shot oftheirclasped hands(House's   left,   Cuddy's right)   was the  closing shotof  the episode, as  well  as the  season.  Therelationship   later  ends  in   season 7;in the   episode   ``Bombshells''.\nQuestion:   Do  house  andcuddy getback togetherin   season   8?\nAnswer:", 'temperature': 0.0, 'n': 1, 'max_tokens': 1, 'logprobs': 1, 'stop': ['\n'], 'top_p': 1, 'presence_penalty': 0, 'frequency_penalty': 0, 'echo': False}

self.tokenizer.tokenize_and_count(raw_request["prompt"]) = 1958

'OpenAI (GooseAI API) error: Context limited to 2048 tokens maximum, 2266 tokens requested for prompt 0'
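The counts above come from HELM's own tokenizer, which says the prompt is well under the limit, while GooseAI reports 2266 tokens for the same text. As a rough cross-check (a sketch only, not part of the run), one could re-count the prompt with the public GPT-J tokenizer from Hugging Face; the checkpoint name below is an assumption about which vocabulary GooseAI's server-side count uses:

```python
# Hedged sanity check: re-count the prompt with the public GPT-J tokenizer.
# "EleutherAI/gpt-j-6b" is an assumed stand-in for whatever tokenizer GooseAI
# applies server-side when it reports "2266 tokens requested".
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6b")

def count_tokens(prompt: str) -> int:
    # GPT-J uses a GPT-2-style BPE; no special tokens are added for completion prompts.
    return len(tokenizer(prompt, add_special_tokens=False)["input_ids"])

# prompt = raw_request["prompt"]  # the boolq prompt shown above
# count_tokens(prompt)            # compare against our 1958 and GooseAI's 2266
```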

Request(model='gooseai/gpt-j-6b', prompt='The following is an article sourced from The Guardian newspaper, and rewritten by teachers to suit three levels of adult English as Second Language (ESL) learners: elementary, intermediate, and advanced. Predict the level of the article.\nPossible labels:\n1. advanced\n2. elementary\n3. intermediate\n\nArticle:   Standing  at  the edge   ofspace  above   the  deserts   of NewMexico, Felix   Baumgartner  paused slightly.  It wasa smallstep  away   fromthecapsule,but a24-mile   drop   backdown   to Earth.  “Our guardianangel willtake   care   of   you,”   said   missioncontrol,and the  man  known  as   Fearless  Felixjumped.  \nTen heart-stopping  minutes  later  the Austrianlanded   back   onEarth,after reaching   speedsof   up to725mph, and breakingthree  worldrecords,   including  becoming   the  world’sfirst   supersonic  skydiver by  breaking  the   sound   barrier   at Mach 1.24.   “We  love you Felix,” cheeredthe  control  room as hismother,Ava Baumgartner,wept.  Baumgartner, whoclaimed  the  recordsfor the   highest   altitude  manned  balloon   flight and   the  highestaltitude  skydive,  raised   his arms  in  avictorysaluteto thank  his   team. \nHe was wearing   a   speciallydesigned  survivalsuit   that   kepthis   body intact  against  the   hugely varyingpressures thatmarkedhisdrop   back to   Earth.Without  it, his  blood  would   haveboiled and   his lungsmight   have  exploded.  Baumgartnerlater  told apress  conference:  “When I   was  standing thereon top   of   the   world, youbecome  so   humble, you  don’tthink aboutbreaking  records.”  He  admitted all   he  could think about  was getting   back   alive, but   added:   “Sometimesyou have   to   go   upreallyhigh  to  seehow  small   you  are.”  \nAfter two aborted   attempts the week  before,   themission   was  given  the   go-ahead  on   Sunday  morning with the  cooperationof   the   weather.   Baumgartner  was  carried   up   into   crystal   clear   skies by a gigantic  balloon,which  measured30  million  square cubic  feet andwhose  skinwasone-tenth the thickness of a  sandwich bag. At  thebottomof  the   balloon was   acapsule,in which   Baumgartner   sat  in   his  suit.   \nAs   hereached  thedesired height,   Baumgartner  wentthrough   a   checklist  of40 items  withhis mentorJoe   Kittinger,thepreviousholder ofthehighestaltitude manned   balloon flight.   \nThere  was   some  concern that   aheater  forhis  visor was  notworking,   causing his  visorto fog. “Thisis   very   serious,  Joe,”   he  told  Kitttinger.   “Sometimes   it’s  getting foggy  whenI exhale. ...   I  do not   feel  heat.”  But  they   decidedtogo   ahead, watched  by a  record   8  million  people  as  the  jump   wasstreamed live   on YouTube.  \nThe  two-and-a-half-hour  journey upwards,  during  which  the   curvatureof  theEarthbecame  visible and   theskies  gradually turned   black, was  matchedwith a rather more  rapiddescent. \nThree cameras   attached  to   Baumgartner’s suit   recorded   his   free-fall   of   just  over  four minutes –   which  failed  to   break  the existingfree-fallrecord  for   duration  –   and then   the  parachute opening.  \nThe   success   ofthe mission,and   ofthe  suit, raises the prospectthat   astronauts   might  beable to survive a   high  altitude  disaster of the typethat struck  thespace   shuttleColumbia   in 2003by   actually  bailingout   of   their  craft.   
Baumgartner’stopmedical   man   inthe  stunt   was DrJonathan  Clark,whose  wife  Laurel Clark  died   in  the   Columbiaaccident.Clark  is  now  dedicated  toimprovingastronauts’chancesof survivalin a high-altitude disaster. \nBaumgartner  has made   aname for  himself   with   acts  of daring.  Theformer  paratrooper   has   parachuted off buildingsand  mountains  and  onceinto   a  600   foot  deep  cave.He  hadalready  done  two practicefree-falls  in preparation forthisattempt– one   from  71,000  feetin   March and  a  second  from97,000  feet   in July 2012.Butno   feat   can possibly have   matched   his  jump above the town   of   Roswell, asuitably chosen   place famed for  itsconnections  to   UFO   sightings.\nHe   was   chasing five different records: thefirst human  to everbreak the   sound   barrier in free-fall;   the highest free-fall   altitude jump;  the highest  manned  balloonflight;   the longest  free-fall;and   hisjump   platform is believed   to bethelargestmannedballoon in   history.  The stunt,which   wasseven years  inthe planning   andsponsored  byRed Bulldrinks,beattwo  of  Kittinger’s  records:  the  retiredUS air   forcecolonel   previously   held  the  high  altitudeandspeed  records  for parachuting.  Kittinger  jumped   from   a  balloon19milesabove   theplanet in 1960.   Suitably,   the   only   voiceinBaumgartner’s radio   earpiece guiding   his  ascent  was thatofKittinger, now  84.   \nAsked   after  the jump whathe   wanted todo  next,   Baumgartnersaid:  “I   want  to  inspirea  generation.  I’d  liketobe  sitting   in the  same spotin   the next fouryears as  JoeKittinger.There isa  youngguy  asking me for   advicebecausehewants  to   break my record.” He   said  the   mostexciting moment for  him   hadbeenwhenhe   wasstandingoutside the  capsule“on  top of   the   world”.To  laughter,headded:   “Themostbeautiful   moment   was  when   I   was   standing on   the   landing area andMike   Todd  [the life support   engineer  whodressed Baumgartner in hissuit] showed   up   and he   had   a smile on  his   face  like a   little kid.”\nBaumgartnersaid that  he   had  come   to  feel   like   Todd’sson,adding:“Hewasso happy thatIwasalive.”   Earlier,   Todd   hadtold   the   pressconference: “The  worldneeds a   heroright  now,  and   they got   one in  Felix Baumgartner.”To further laughter   at   the press conference,Kittingersaid:“I would   like   to  givea   special   one-fingered   saluteto all the  folk   who   saidthat  he  [Baumgartner]was going to   come apartwhenhe   wentsupersonic.', temperature=0.0, num_completions=5, top_k_per_token=1, max_tokens=30, stop_sequences=['\n'], echo_prompt=False, top_p=1, presence_penalty=0, frequency_penalty=0, random=None)

self.tokenizer.tokenize_and_count(request.prompt) + 30 = 2017 + 30 = 2047

'OpenAI (GooseAI API) error: Context limited to 2048 tokens maximum, 2576 tokens requested for prompt 0'
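By our accounting this request fits: 2017 prompt tokens plus max_tokens=30 is 2047, just under the 2048-token window, yet GooseAI counts 2576 tokens for the same prompt. Below is a minimal sketch of the window check we are assuming should pass; the constant and helper names are illustrative, not the actual HELM code:

```python
# Illustrative context-window check; MAX_CONTEXT_TOKENS and fits_in_window are
# assumptions for this sketch, not HELM internals.
MAX_CONTEXT_TOKENS = 2048  # GooseAI gpt-j-6b window, per the error message

def fits_in_window(num_prompt_tokens: int, max_tokens: int) -> bool:
    # A request is valid only if prompt tokens plus requested completion tokens fit.
    return num_prompt_tokens + max_tokens <= MAX_CONTEXT_TOKENS

assert fits_in_window(2017, 30)  # 2047 <= 2048: should be accepted by our count
# GooseAI nevertheless reports 2576 tokens requested, so its server-side
# tokenization of this prompt must differ substantially from ours.
```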

teetone commented 2 years ago

Filed a support ticket with GooseAI.

rishibommasani commented 2 years ago

Making this a P2 since we are in waiting mode.

teetone commented 2 years ago

We can't evaluate GooseAI models because of these bugs. We will rely on offline batch evaluations moving forward.