rish-16 / gpt2client

✍🏻 gpt2-client: Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, and 1.5B Transformer Models 🤖 📝
MIT License

UnboundLocalError: local variable 'generated' referenced before assignment #35

Closed wrenashe closed 3 years ago

wrenashe commented 4 years ago

**Describe the bug**
Running the demo "Start generating text!" code on Google Colab raises the Python error below.


```
UnboundLocalError                         Traceback (most recent call last)
<ipython-input> in <module>()
      5
      6 gpt2.generate(interactive=True) # Asks user for prompt
----> 7 gpt2.generate(n_samples=4) # Generates 4 pieces of text
      8 text = gpt2.generate(return_text=True) # Generates text and returns it in an array
      9 gpt2.generate(interactive=True, n_samples=3) # A different prompt each time

/usr/local/lib/python3.6/dist-packages/gpt2_client/gpt2_client.py in generate(self, interactive, n_samples, words, display, return_text)
    152     print (colored('Generating sample...', 'yellow'))
    153
--> 154     while n_samples == 0 or generated < n_samples:
    155       out = sess.run(output)
    156       for i in range(batch_size):

UnboundLocalError: local variable 'generated' referenced before assignment
```

**To Reproduce**
In Google Colab:

1. Install the packages:

```
!pip install gpt2-client
!pip install tensorflow==1.14
```

2. Run:

```python
from gpt2_client import GPT2Client

gpt2 = GPT2Client('345M')  # This could also be `117M`, `345M`, `774M`, or `1558M`
gpt2.load_model()

gpt2.generate(interactive=True)  # Asks user for prompt
gpt2.generate(n_samples=4)  # Generates 4 pieces of text
text = gpt2.generate(return_text=True)  # Generates text and returns it in an array
gpt2.generate(interactive=True, n_samples=3)  # A different prompt each time
```
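For reference, the traceback points at the sampling loop in `gpt2_client.py`, where `generated` is compared against `n_samples` without ever being assigned on this code path. The sketch below mirrors the shape of that loop to show where the counter would need to be initialized; `sample_batch` and `decode` are hypothetical stand-ins for the library's `sess.run(output)` and encoder decoding calls, not its actual API.

```python
# Minimal sketch of the sampling-loop structure (stand-in names, not gpt2-client's API).
# sample_batch() and decode() are hypothetical placeholders for sess.run(output)
# and the token-to-text decoding step in gpt2_client.py.

def generate_samples(n_samples, batch_size, sample_batch, decode):
    generated = 0  # this initialization is what appears to be missing; without it,
                   # the `generated < n_samples` check raises UnboundLocalError
    texts = []
    while n_samples == 0 or generated < n_samples:
        out = sample_batch()               # one batch of sampled token ids
        for i in range(batch_size):
            generated += 1                 # count each produced sample
            texts.append(decode(out[i]))   # convert token ids back into text
    return texts
```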
rish-16 commented 3 years ago

Hey,

So sorry for the delayed response; I haven't been very active here. A fix is on the way, but I haven't had the time to get to it yet. I'll definitely keep you in the loop. Appreciate your patience!