hyunwoongko / openchat

OpenChat: Easy-to-use open-source chatting framework via neural networks
Apache License 2.0
438 stars, 57 forks

Errors #20

Open bladedsupernova opened 3 years ago

bladedsupernova commented 3 years ago

BLENDER.MEDIUM's PERSONA: I'm working on strong AI and am almost done!

BLENDER.MEDIUM's PERSONA: Persona setting complete.

[USER]: What have you discovered so far?
[BLENDER.MEDIUM]: I'm a software engineer. I love working on the next step of human interference.
[USER]: What architecture do you use or build on?
[BLENDER.MEDIUM]: I'm currently working on AI development but really am interested in working on robotics. Do you like engineering?
[USER]: Yes. Why do you like engineering over AI? You said that last time. Is engineering something more? I don't think so. AI is just a better tool to engineer simply.
[BLENDER.MEDIUM]: I am a big fan of Robotics, and I like the idea that it is a way to develop robot arms that would be far from human interaction.
[USER]: How advanced are your robotic arms? I hope they are like from the year 2050.

Traceback (most recent call last):
  File "C:/Users/A/Desktop/New folder/openchat-main/Starine.py", line 2, in <module>
    OpenChat(model="blender.medium", device="cuda")
  File "C:\Users\A\Desktop\New folder\openchat-main\openchat\openchat.py", line 32, in __init__
    self.environment.start(self.agent, **kwargs)
  File "C:\Users\A\Desktop\New folder\openchat-main\openchat\envs\terminal.py", line 79, in start
    bot_message = agent.predict(model_input, **kwargs)["output"]
  File "C:\Python\Python37\lib\site-packages\torch\autograd\grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\A\Desktop\New folder\openchat-main\openchat\base\agents\parlai.py", line 99, in predict
    max_ts=self.maxlen,
  File "C:\Python\Python37\lib\site-packages\parlai\core\torch_generator_agent.py", line 1120, in _generate
    encoder_states = model.encoder(self._encoder_input(batch))
  File "C:\Python\Python37\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "C:\Python\Python37\lib\site-packages\parlai\agents\transformer\modules.py", line 562, in forward
    tensor, mask = self.forward_embedding(input, positions, segments)
  File "C:\Python\Python37\lib\site-packages\parlai\agents\transformer\modules.py", line 481, in forward_embedding
    position_embs = self.position_embeddings(positions).expand_as(tensor)
  File "C:\Python\Python37\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "C:\Python\Python37\lib\site-packages\torch\nn\modules\sparse.py", line 147, in forward
    self.norm_type, self.scale_grad_by_freq, self.sparse)
  File "C:\Python\Python37\lib\site-packages\torch\nn\functional.py", line 1913, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
IndexError: index out of range in self

Process finished with exit code 1


Also, I can't help but add another try I had. My first try: https://www.youtube.com/watch?v=8ftHiVceXxI

hyunwoongko commented 3 years ago

I just fixed that bug. Could you test it in your environment? Please run pip install openchat --upgrade
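
For reference, one way to confirm which version actually got installed afterwards (just a sketch, assuming pip and the interpreter that runs the script share the same environment):

# Version-check sketch: confirm the upgraded openchat version is the one your interpreter sees.
import pkg_resources  # ships with setuptools, available on Python 3.7
print(pkg_resources.get_distribution("openchat").version)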

bladedsupernova commented 3 years ago

Still errors.

C:\Python\Python37\python.exe "C:/Users/A/Desktop/New folder/openchat-main/Starine.py"
C:\Python\Python37\lib\site-packages\requests\__init__.py:91: RequestsDependencyWarning: urllib3 (1.25.11) or chardet (3.0.4) doesn't match a supported version!
  RequestsDependencyWarning)

[OPENCHAT ASCII art banner]

                 ... LOADING ...                        

[SYSTEM]: Let's talk with [BLENDER.MEDIUM].
[SYSTEM]: Enter '.exit', if you want to exit chatting.
[SYSTEM]: Enter '.reset', if you want reset all histories.

[SYSTEM]: Please input [BLENDER.MEDIUM]'s persona.
[SYSTEM]: Enter '.done' if you want to end input persona.

BLENDER.MEDIUM's PERSONA: I'm working on strong AI and want to tell you how it works.

BLENDER.MEDIUM's PERSONA: Persona setting complete.

[USER]: What have you discovered?
[BLENDER.MEDIUM]: I'm really working on the most fundamental theory of computing. How about you? What have you recently learned?
[USER]: I have learnt all the mechanisms of AGI are just pattern finders. I know lots about AGI.
[BLENDER.MEDIUM]: I have to say, I'm pretty new to this whole computer science stuff, I learned a bit more in High School but that was a long time ago.
[USER]: So, tell me what you have discovered?
[BLENDER.MEDIUM]: I have discovered that humans can control what they do and manipulate data using a lot of different tools
[USER]: Keep going. What else?

Traceback (most recent call last):
  File "C:/Users/A/Desktop/New folder/openchat-main/Starine.py", line 2, in <module>
    OpenChat(model="blender.medium", device="cuda")
  File "C:\Users\A\Desktop\New folder\openchat-main\openchat\openchat.py", line 32, in __init__
    self.environment.start(self.agent, **kwargs)
  File "C:\Users\A\Desktop\New folder\openchat-main\openchat\envs\terminal.py", line 79, in start
    bot_message = agent.predict(model_input, **kwargs)["output"]
  File "C:\Python\Python37\lib\site-packages\torch\autograd\grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\A\Desktop\New folder\openchat-main\openchat\base\agents\parlai.py", line 99, in predict
    max_ts=self.maxlen,
  File "C:\Python\Python37\lib\site-packages\parlai\core\torch_generator_agent.py", line 1120, in _generate
    encoder_states = model.encoder(self._encoder_input(batch))
  File "C:\Python\Python37\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "C:\Python\Python37\lib\site-packages\parlai\agents\transformer\modules.py", line 562, in forward
    tensor, mask = self.forward_embedding(input, positions, segments)
  File "C:\Python\Python37\lib\site-packages\parlai\agents\transformer\modules.py", line 481, in forward_embedding
    position_embs = self.position_embeddings(positions).expand_as(tensor)
  File "C:\Python\Python37\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "C:\Python\Python37\lib\site-packages\torch\nn\modules\sparse.py", line 147, in forward
    self.norm_type, self.scale_grad_by_freq, self.sparse)
  File "C:\Python\Python37\lib\site-packages\torch\nn\functional.py", line 1913, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
IndexError: index out of range in self

Process finished with exit code 1

hyunwoongko commented 3 years ago

Please try pip uninstall openchat and then reinstall.

bladedsupernova commented 3 years ago

I get the same error after uninstalling and reinstalling openchat. I used the same folder files, BTW, and didn't restart PyCharm.

hyunwoongko commented 3 years ago

Ok. I'll check more. Thanks :)

hyunwoongko commented 3 years ago

Which GPU do you use? (Please tell me the GPU name and memory capacity.)

bladedsupernova commented 3 years ago

An Nvidia 1060 6GB that I bought about 4 years ago.

hyunwoongko commented 3 years ago

It's because the default maximum length is too long (Blender's maximum length is 128, but I used 256). I fixed the bug. Could you try again?
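
For context, here is a minimal sketch (not openchat's actual code, just an illustration) of why exceeding the position-embedding table produces this exact IndexError: the encoder only has 128 position vectors, so any position index of 128 or above falls out of range.

import torch
import torch.nn as nn

# Illustration only: a position-embedding table with 128 slots, like Blender's encoder.
position_embeddings = nn.Embedding(num_embeddings=128, embedding_dim=16)

ok_positions = torch.arange(120)    # indices 0..119 are within range, so this works
_ = position_embeddings(ok_positions)

bad_positions = torch.arange(256)   # indices 128..255 exceed the 128-slot table
try:
    _ = position_embeddings(bad_positions)
except IndexError as err:
    print(err)                      # prints "index out of range in self"

So keeping the concatenated persona plus history within 128 positions (or lowering the configured length) keeps the lookup in range.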

bladedsupernova commented 3 years ago

I downloaded and extracted it, then put the code below in a .py file inside that folder, ran it (after pip-installing openchat), and got an error:

from openchat import OpenChat

OpenChat(model="blender.medium", device="cuda")

Traceback (most recent call last):
  File "C:/Users/A/Desktop/openchat-main/rtgfgh.py", line 1, in <module>
    from openchat import OpenChat
  File "C:\Users\A\Desktop\openchat-main\openchat\__init__.py", line 1, in <module>
    from openchat.openchat import OpenChat
  File "C:\Users\A\Desktop\openchat-main\openchat\openchat.py", line 1, in <module>
    from openchat.agents.blender import BlenderGenerationAgent
  File "C:\Users\A\Desktop\openchat-main\openchat\agents\__init__.py", line 1, in <module>
    from openchat.base.agents.base import BaseAgent
  File "C:\Users\A\Desktop\openchat-main\openchat\base\__init__.py", line 1, in <module>
    from openchat.base.agents.base import BaseAgent, EncoderLM, DecoderLM, Seq2SeqLM, SingleTurn
  File "C:\Users\A\Desktop\openchat-main\openchat\base\agents\base.py", line 1, in <module>
    import parlai.utils.logging as logging
ModuleNotFoundError: No module named 'parlai'

hyunwoongko commented 3 years ago

Try pip install parlai.

bladedsupernova commented 3 years ago

Same error

bladedsupernova commented 3 years ago

[screenshot attached]

bladedsupernova commented 3 years ago

That open folder is located at C:\Users\A\Desktop\openchat-main (being the last item, of course).

hyunwoongko commented 3 years ago

Did you use the same Python interpreter as pip? You should be able to install parlai; it is not my package.
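
If you want to double-check, here is a small diagnostic sketch (nothing openchat-specific) that prints which interpreter is running and whether parlai is visible to it:

# Diagnostic sketch: verify that the interpreter running the script can actually see parlai.
import sys
import importlib.util

print(sys.executable)                      # the interpreter PyCharm is actually using
print(importlib.util.find_spec("parlai"))  # None means parlai is not installed for THIS interpreter

If find_spec prints None, installing with that same interpreter (python -m pip install parlai) avoids the mismatch.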

bladedsupernova commented 3 years ago

Oh... I am using PyPy.

bladedsupernova commented 3 years ago

OK, it is loading. Let me try it.

bladedsupernova commented 3 years ago

It works :)

BLENDER.MEDIUM's PERSONA: I hate food, and you should hate food too.

BLENDER.MEDIUM's PERSONA: Persona setting complete.

[USER]: Hey there.
[BLENDER.MEDIUM]: I love food, but I do not like to use those. I love me a good hot dog.
[USER]: Do you love food?
[BLENDER.MEDIUM]: I hate it, I don't like food that is edible at all. What do you like to eat?
[USER]: Fries. Why?
[BLENDER.MEDIUM]: That is a good choice, I love meat and potatoes. They have too many ingredients and I hate the taste of them.
[USER]: Why do you say you love but also hate food? Which is it?
[BLENDER.MEDIUM]: I love both but hate to eat it. I just hate the smell of food. I do love meat though!
[USER]: Do you hate fries?
[BLENDER.MEDIUM]: Yes I like fries. I guess I just grew up with them. What about you?
[USER]: I do love them. Do you hate most foods? How many?
[BLENDER.MEDIUM]: I'm not sure how many I hate. What do you like to eat? I love to cook veggies as well.
[USER]: Do you swallow food? Or just stare at it?
[BLENDER.MEDIUM]: I do not. I don't like eating the unknown. I do love fries though.
[USER]:

bladedsupernova commented 3 years ago

Hmm, nope... but I'll try the cmd prompt now.

BLENDER.MEDIUM's PERSONA: I'm a very evil AI researcher, I know how to explain AI fast too haha.

BLENDER.MEDIUM's PERSONA: Persona setting complete.

[USER]: Go on, tell me.
[BLENDER.MEDIUM]: haha I've spent some time in research on that topic. It's about how human beings behave.
[USER]: What's the major mechanisms that run a brain?
[BLENDER.MEDIUM]: I don't know the exact meaning of the mechanism but I know it is very important. I'm an expert on the matter.
[USER]: Then tell me, since you're the expert.
[BLENDER.MEDIUM]: Well, I can help a lot. I am a genius on the computer. I spend most of my time researching this.
[USER]: K go ahead.
[BLENDER.MEDIUM]: You are awesome, and I am glad you are such a help! What else do you do in your spare time?
[USER]: Tell me how AI works.
[BLENDER.MEDIUM]: Oh, I could help you. I'm pretty knowledgeable when it comes to automatics!
[USER]: Explain it bro omg.
[BLENDER.MEDIUM]: I am not sure what you mean bro bro. You're saying you can not talk to me.
[USER]: Can you explain how AI works? If so, TELL ME.
[BLENDER.MEDIUM]: Yeah, I can give you an easy explanation. How do you like your steaks cooked??
[USER]: Medium level...

Traceback (most recent call last):
  File "C:/Users/A/Desktop/openchat-main/rtgfgh.py", line 2, in <module>
    OpenChat(model="blender.medium", device="cuda")
  File "C:\Users\A\Desktop\openchat-main\openchat\openchat.py", line 33, in __init__
    self.environment.start(self.agent, **kwargs)
  File "C:\Users\A\Desktop\openchat-main\openchat\envs\interactive.py", line 100, in start
    bot_message = agent.predict(model_input, **kwargs)["output"]
  File "C:\Python\Python37\lib\site-packages\torch\autograd\grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\A\Desktop\openchat-main\openchat\base\agents\parlai.py", line 104, in predict
    max_ts=self.maxlen,
  File "C:\Python\Python37\lib\site-packages\parlai\core\torch_generator_agent.py", line 1120, in _generate
    encoder_states = model.encoder(self._encoder_input(batch))
  File "C:\Python\Python37\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "C:\Python\Python37\lib\site-packages\parlai\agents\transformer\modules.py", line 562, in forward
    tensor, mask = self.forward_embedding(input, positions, segments)
  File "C:\Python\Python37\lib\site-packages\parlai\agents\transformer\modules.py", line 481, in forward_embedding
    position_embs = self.position_embeddings(positions).expand_as(tensor)
  File "C:\Python\Python37\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "C:\Python\Python37\lib\site-packages\torch\nn\modules\sparse.py", line 147, in forward
    self.norm_type, self.scale_grad_by_freq, self.sparse)
  File "C:\Python\Python37\lib\site-packages\torch\nn\functional.py", line 1913, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
IndexError: index out of range in self

bladedsupernova commented 3 years ago

It seems to work in the cmd prompt... I have gotten lots of replies so far.

bladedsupernova commented 3 years ago

Nope, it errors in the cmd prompt too, soon after, on a second run after exiting.

hyunwoongko commented 3 years ago

We support the option 'max_len'. How about trying that?
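
If the constructor takes it as a keyword argument (I'm assuming the name is literally max_len, as above), capping it at Blender's 128 positions would look something like this sketch:

# Hypothetical usage sketch: cap the length at Blender's 128 positions.
# Assumes `max_len` is accepted by the OpenChat constructor, as mentioned above.
from openchat import OpenChat

OpenChat(model="blender.medium", device="cuda", max_len=128)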