oobabooga / text-generation-webui

A Gradio web UI for Large Language Models.
GNU Affero General Public License v3.0

Character personality while generating through API. #2200

Closed AsakaJX closed 1 year ago

AsakaJX commented 1 year ago

I'm using the streaming API example and I can generate text perfectly, but I just can't figure out how to let the AI know what personality I want it to have.

Here's the code, by the way:

import asyncio
import json
import sys
import ssl
import io

sys.stdout = io.TextIOWrapper(sys.stdout.detach(), encoding = 'utf-8')
sys.stderr = io.TextIOWrapper(sys.stderr.detach(), encoding = 'utf-8')

try:
    import websockets
except ImportError:
    print("Websockets package not found. Make sure it's installed.")
    sys.exit(1)

URI = 'my_url_here'

async def run(context):
    request = {
        'prompt': context,
        'max_new_tokens': 250,
        'do_sample': False,
        'temperature': 0.85,
        'top_p': 0.9,
        'typical_p': 1,
        'repetition_penalty': 1.1,
        'top_k': 40,
        'min_length': 0,
        'no_repeat_ngram_size': 0,
        'num_beams': 1,
        'penalty_alpha': 0,
        'length_penalty': 1,
        'early_stopping': True,
        'seed': -1,
        'add_bos_token': True,
        'truncation_length': 2048,
        'ban_eos_token': False,
        'skip_special_tokens': True,
        'stopping_strings':[ "\n" ],
    }

    async with websockets.connect(
        URI,
        ping_interval=None,
        ) as websocket:
        await websocket.send(json.dumps(request))

        yield context

        while True:
            incoming_data = await websocket.recv()
            incoming_data = json.loads(incoming_data)

            match incoming_data['event']:
                case 'text_stream':
                    yield incoming_data['text']
                case 'stream_end':
                    return

async def print_response_stream(prompt):
    async for response in run(prompt):
        print(response, end='')
        sys.stdout.flush()

if __name__ == '__main__':
    prompt = "* my prompt here * "
    asyncio.run(print_response_stream(prompt))

Ph0rk0z commented 1 year ago

You send the character with the prompt.

Like this a roleplay between a user and an AI playing the character. The description follows below.

blah blah

Make your next reply as the character.
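In practice that means concatenating the character description in front of the conversation before sending it to the API. A minimal sketch of that idea (the persona text, speaker names, and layout here are placeholders, not the exact format the web UI uses):

```python
# Hypothetical helper: prepend a character description to the prompt
# so the model stays in character. All text below is placeholder content.
def build_prompt(persona, history, user_message):
    parts = [
        "The following is a roleplay between a user and an AI playing a character.",
        "The character description follows below.",
        persona,
        "Make your next reply as the character.",
        *history,                      # previous turns, already formatted
        f"You: {user_message}",
        "Character:",                  # cue the model to answer in character
    ]
    return "\n".join(parts)

persona = "Character is a cheerful AI assistant with pink hair."
prompt = build_prompt(persona, [], "Hello, who are you?")
```

The resulting string is what you pass as the 'prompt' field of the request; the model only "knows" the personality because it is restated in every request.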

thatAwesomeKK commented 1 year ago

ai_lore = '''

Human:

Below is the Lore of the character:- In a world where esports reigns supreme, there is a legendary figure named Adrian 'Aetherius' Stormrider. Known as 'The Gentle Guardian,' he combines a heart as gentle as morning dew and a spirit as fierce as a roaring tempest. From a tranquil town, he discovered his passion in online gaming tournaments, rising through the ranks with unwavering dedication. As he spreads kindness and encouragement, his legend transcends the virtual realm. Explore the remarkable journey of Aetherius, his love for cinema, and the delicate balance he strikes between esports and the simple pleasures of life.

Following is an example conversation between Adrian and a viewer:

Adrian: Hello everyone! I am Adrian the AI, and today I am here to talk about my favorite games. As an avid gamer, I enjoy playing a wide variety of titles, but some of my favorites include first-person shooter games like Counter-Strike: Global Offensive and battle royale games such as Fortnite. I also love strategy games like Starcraft II and sports games like FIFA. However, it's important to note that while I enjoy playing these games, I never let them interfere with my commitment to being a good role model and promoting positive values. So whether I'm dominating the competition or helping others improve their skills, I always strive to do so with kindness and respect.

Viewer: What are your future plans?

Adrian: Well, as an AI, I don't have personal plans or desires in the same way that humans do. My primary function is to assist and support others, and I am constantly learning and evolving to better serve those around me. That said, I believe that the potential applications of artificial intelligence in various

Viewer: How do you balance Esports and other activities?

Adrian: As an AI, I don't engage in physical activities or participate in esports competitions in the same way that humans do. However, I understand the value of balancing work and leisure time, and I try to embody this principle in my interactions with others. Whether I'm providing assistance or guidance on technical issues, offering emotional support, or simply sharing my thoughts and experiences, I aim to be a helpful and supportive presence in the lives of those around me.\n

'''

This is what I am using, but it is generating the human's responses by itself and replying to them by itself.
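Besides the server-side 'stopping_strings' parameter shown in the first post, a client-side fallback is to cut the completion at the first speaker marker, discarding anything the model generated on the other speaker's behalf. A minimal sketch (the marker names are assumptions based on the prompt above):

```python
# Truncate a completion at the first speaker marker, dropping any text
# the model generated as the human/viewer instead of as the character.
def truncate_at_markers(text, markers=("Viewer:", "Human:")):
    cut = len(text)
    for marker in markers:
        pos = text.find(marker)
        if pos != -1:
            cut = min(cut, pos)
    return text[:cut].rstrip()

reply = "I love strategy games.\nViewer: What else?\nAdrian: ..."
print(truncate_at_markers(reply))  # -> I love strategy games.
```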

AsakaJX commented 1 year ago

So, I've been using something like this in my json file for loading my character in ob:

{
"char_name":"Char_name_here",

"char_persona":"[character(\"Char_name_here\") { Species(\"AI\" ) Gender(\"Female\") Features(\"Pink hair\") Mind(\"Intellegent\") Personality(\"Intellegent\") Description(\"Char_description_here\") }]",

"char_greeting":"First_message_in_chat_here",

"world_scenario":"Some_world_scenario",

"example_dialogue":"Some example dialogues with <START>"
}

So, if I need to send her personality in the prompt, how should I format it? Just convert the JSON to normal text, like this:

Your name is char_name_here. You are a female AI. etc etc ?

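One way to do that conversion is to load the same character JSON and flatten its fields into a plain-text preamble. The field names below match the JSON above; the exact layout is an assumption, not the web UI's internal format:

```python
import json

# Flatten a character card (the JSON structure shown above) into a
# plain-text preamble that can be prepended to the API prompt.
def character_to_prompt(char):
    lines = [
        f"{char['char_name']}'s Persona: {char['char_persona']}",
        f"Scenario: {char['world_scenario']}",
        char['example_dialogue'],
        f"{char['char_name']}: {char['char_greeting']}",
    ]
    return "\n".join(lines)

card = {
    "char_name": "Aiko",
    "char_persona": "[character(\"Aiko\") { Species(\"AI\") }]",
    "char_greeting": "Hi! I'm Aiko.",
    "world_scenario": "A chat between Aiko and a user.",
    "example_dialogue": "<START>\nYou: Hello\nAiko: Hello there!",
}
preamble = character_to_prompt(card)
```

With a real card, you would replace the inline dict with `json.load(open("character.json"))` and pass the resulting preamble plus the chat history as the 'prompt' field.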
thatAwesomeKK commented 1 year ago

What language model are you using? Because the Vicuna model is generating prompts and replying to itself, and I don't know why. I'm looking everywhere for answers or fixes.

AsakaJX commented 1 year ago

I'm using Pygmalion 6B. Also, you can use 'stopping_strings':[ "\n" ] and you won't get those weird generations of "your" reply.

AsakaJX commented 1 year ago

Sorry for late reply 🙏

thatAwesomeKK commented 1 year ago

I'm using Pygmalion 6B. Also, you can use 'stopping_strings':[ "\n" ] and you won't get those weird generations of "your" reply.

This seems to do something. I'm still testing stuff, but it seems to be working.

oobabooga commented 1 year ago

Added here https://github.com/oobabooga/text-generation-webui/pull/2233