Meatfucker / metatron

A discord.py based machine learning bot. It provides an LLM chatbot via the oobabooga API, a Stable Diffusion image generation bot via the AUTOMATIC1111 API, and a speech generation command via a Bark API.
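For context, the image side talks to AUTOMATIC1111's built-in HTTP API. A minimal sketch of that kind of call is below; the endpoint and payload fields are the standard txt2img ones, while the host/port assume a default local install with the API enabled.

```python
import base64
import requests

A1111_URL = "http://127.0.0.1:7860"  # assumed default AUTOMATIC1111 address

def generate_image(prompt: str) -> bytes:
    """Ask a running AUTOMATIC1111 instance for one image and return raw PNG bytes."""
    payload = {"prompt": prompt, "steps": 20, "width": 512, "height": 512}
    resp = requests.post(f"{A1111_URL}/sdapi/v1/txt2img", json=payload, timeout=300)
    resp.raise_for_status()
    # The API returns base64-encoded images in the "images" list.
    return base64.b64decode(resp.json()["images"][0])
```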

processedreply out of scope line 161 #1

Closed: saphtea closed this issue 10 months ago

saphtea commented 12 months ago

Hey! Running this and found that processedreply is out of scope of the return at line 161. Looking into it to see if I can fix it myself; it returns None when I put it in scope, so I'm just going to read over your code and figure it out. Figured I'd let you know though!
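For anyone hitting the same traceback: the usual shape of this bug is a variable assigned inside a conditional (or try/except) branch and then referenced by a return outside that branch. A hypothetical illustration of the pattern and the obvious fix, not metatron's actual code:

```python
async def get_llm_reply(prompt: str):
    if await api_is_up():                # hypothetical helper
        raw = await call_ooba(prompt)    # hypothetical helper
        processedreply = clean_up(raw)   # only assigned on this branch
    return processedreply  # NameError whenever the branch above is skipped

# One fix: initialise the name before branching so every path can return it.
# Note that if the API call itself fails, this still returns None.
async def get_llm_reply_fixed(prompt: str):
    processedreply = None
    if await api_is_up():
        raw = await call_ooba(prompt)
        processedreply = clean_up(raw)
    return processedreply
```

That matches the symptom above: pulling the name into scope silences the error, but the reply is still None because the underlying API call never succeeds, which the next comment explains.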

saphtea commented 12 months ago

Ahhh, they deprecated the old API in favor of the OpenAI API: https://github.com/oobabooga/text-generation-webui/pull/4539
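For anyone adapting the bot after that change, text-generation-webui now exposes an OpenAI-compatible endpoint instead. A rough sketch of the request, where the host, port, and parameters are assumptions for a default local install started with the API enabled:

```python
import requests

OOBA_URL = "http://127.0.0.1:5000/v1"  # assumed default for ooba's OpenAI-compatible API

def chat(prompt: str) -> str:
    """Send one chat turn to text-generation-webui's OpenAI-compatible endpoint."""
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 250,
    }
    resp = requests.post(f"{OOBA_URL}/chat/completions", json=payload, timeout=300)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```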

Meatfucker commented 12 months ago

Yeah, sadly they did. For now, stay on the older version of ooba. I'm working on a rewrite in the metatron2 repo that no longer has API dependencies, both because ooba's API changes often enough to annoy me and because of some race conditions with the LLM history state that can sometimes break a user's history. Overall it'll be a more stable project that doesn't need the outside APIs.

Currently I have a proper generation queueing system and the LLM stuff fully implemented, besides the multimodal part, which transformers is close to finishing; at that point I'll implement the image part of the LLM as well. I'm wrapping up the speech stuff now and will probably push it later today. Then I'll be implementing the image pipeline.
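A generation queue in a discord.py bot typically boils down to a single asyncio.Queue drained by one worker task, so requests hit the GPU one at a time. A rough sketch of that idea, with hypothetical helper names rather than metatron2's actual code:

```python
import asyncio

generation_queue: asyncio.Queue = asyncio.Queue()

async def queue_worker():
    """Drain requests one at a time so only one generation touches the GPU at once."""
    while True:
        interaction, prompt = await generation_queue.get()
        try:
            result = await run_generation(prompt)  # hypothetical wrapper around the model call
            await interaction.followup.send(result)
        finally:
            generation_queue.task_done()

# Slash commands just enqueue and return immediately:
# await generation_queue.put((interaction, prompt))
```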

If you do rework it to work with the OpenAI API, please make a PR so it can continue to work on newer versions of ooba. If you do, I might be able to reuse it to implement API support in metatron2. The original metatron should continue to function fine as long as you stick to the ooba pull before the breaking API change.

saphtea commented 12 months ago

Gotcha, yeah, this is the ideal setup for me. I run SD.Next with diffusers for PixArt, DeepFloyd, SDXL, etc., and ooba for multimodal LLM stuff with LLaVA. I was doing some jank coding in oobabot before to get the multimodal bits to work lmfao.

I'll make a pull request if I end up getting the API to work. Appreciate the hard work you've done on this so far!

Meatfucker commented 12 months ago

Thanks. The end goal for the metatron2 project is to support running its LLM/speech/image generation either internally or via API, so any work you might do on metatron will benefit both projects.

Meatfucker commented 10 months ago

Transformers has added LLaVA support and I have implemented it in the https://github.com/Meatfucker/metatron2 project. This deprecates the old metatron software and you should migrate to metatron2. It has a similar config, though not an identical one, so read the docs while installing. Overall it's a much better project.
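For reference, the transformers LLaVA support mentioned above is exposed through LlavaForConditionalGeneration. A minimal sketch of loading and querying it, where the checkpoint name, image URL, and prompt format are the usual llava-hf conventions rather than anything metatron2-specific:

```python
import requests
import torch
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"  # assumed public checkpoint
processor = AutoProcessor.from_pretrained(model_id)
model = LlavaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Placeholder image URL; swap in whatever the bot actually receives.
image = Image.open(requests.get("https://example.com/cat.png", stream=True).raw)
prompt = "USER: <image>\nWhat is in this picture? ASSISTANT:"

inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=100)
print(processor.decode(output[0], skip_special_tokens=True))
```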