run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

response_stream.response_gen giving additional spaces when used to show the response in a React frontend #5999

Closed vishalp-simplecrm closed 1 year ago

vishalp-simplecrm commented 1 year ago

Getting Broken Words

The stream inserts unnecessary spaces within words; please resolve this issue. BERT, a single word, is rendered as B ERT, and Bidirectional as Bid irectional.

[screenshot: response rendered with broken words]

logan-markewich commented 1 year ago

Are you using streaming? tbh I've never seen anyone have this issue before. Might be a problem with your frontend code?

vishalp-simplecrm commented 1 year ago

Please refer to the code below, which I am using. Also, when we directly use the completion API instead of LlamaIndex, it works correctly:

```python
response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Write a tagline for an ice cream shop.",
)
```
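For comparison, here is a minimal sketch (not from the thread) of the streaming form of the same legacy `openai.Completion` call, with the model and prompt reused from above. Streamed tokens normally arrive carrying their own leading spaces, so a consumer should concatenate them verbatim rather than joining them with an extra separator:

```python
# Sketch only: streaming variant of the Completion call above
# (legacy openai<1.0 API). Each chunk's text already includes any
# leading space, so print chunks with no separator between them.
response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Write a tagline for an ice cream shop.",
    stream=True,
)
for chunk in response:
    print(chunk["choices"][0]["text"], end="", flush=True)
```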

```python
# Imports assumed by this snippet (the Flask app and the LlamaIndex
# index are created elsewhere in my application).
from flask import Flask, Response, request, stream_with_context

@app.route('/query', methods=['POST'])
def query():
    query_engine = index.as_query_engine(similarity_top_k=3, streaming=True)
    prompt = request.json
    prompt['prompt'] += " Give in html format"

    print('prompt given issss:', prompt)
    response_stream = query_engine.query(prompt['prompt'])
    # response_stream.print_response_stream()

    def generate_response():
        # Forward each streamed token as a server-sent event.
        for text in response_stream.response_gen:
            yield f"data: {text}\n\n"

    return Response(stream_with_context(generate_response()), mimetype='text/event-stream')
```
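A quick way to isolate where the spaces come from is to consume the SSE stream directly, bypassing React. Below is a minimal sketch under stated assumptions: the Flask app above is running on localhost:5000, and the URL, port, and prompt are placeholders. If concatenating the `data:` payloads with no separator yields intact words, the extra spaces are being introduced on the frontend (for example, by joining events with a space); if the words are already broken here, the problem is server-side:

```python
# Minimal sketch: read the /query SSE endpoint without a browser.
# Assumes the Flask app above runs locally; adjust URL, port, and prompt.
import requests

resp = requests.post(
    "http://localhost:5000/query",
    json={"prompt": "What is BERT?"},
    stream=True,
)

chunks = []
for line in resp.iter_lines(decode_unicode=True):
    # Each SSE frame is "data: <token>"; blank lines separate events.
    # Slicing after "data: " preserves a token's own leading space.
    if line.startswith("data:"):
        chunks.append(line[len("data: "):])

# Concatenate with no separator: streamed tokens carry their own spacing.
print("".join(chunks))
```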
dosubot[bot] commented 1 year ago

Hi, @vishalp-simplecrm! I'm Dosu, and I'm here to help the LlamaIndex team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, the issue is that response_stream.response_gen adds extra spaces within words when it is used to display streamed responses in a React frontend. You mentioned that calling the OpenAI completion API directly, instead of going through LlamaIndex, works correctly. Another user, logan-markewich, suggested that the problem might be in the frontend code.

Could you please let us know if this issue is still relevant to the latest version of the LlamaIndex repository? If it is, please comment on this issue to let us know. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.

Thank you for your understanding and cooperation!