Closed: vishalp-simplecrm closed this issue 1 year ago
Are you using streaming? tbh I've never seen anyone have this issue before. Might be a problem with your frontend code?
```python
import openai

response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Write a tagline for an ice cream shop.",
)
```

When I call OpenAI directly like this instead of going through LlamaIndex, the response comes back correctly, with no broken words.
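For a closer comparison with the streaming path, here is a minimal sketch of the same call with `stream=True` (this assumes the pre-1.0 `openai` Python SDK that `Completion.create` belongs to; the prompt is just the example above):

```python
import openai

# Sketch: stream the same completion and print tokens as they arrive.
# The chunks are concatenated directly; no separator is inserted between them.
response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Write a tagline for an ice cream shop.",
    stream=True,
)
for chunk in response:
    print(chunk["choices"][0]["text"], end="", flush=True)
print()
```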
```python
from flask import Flask, Response, request, stream_with_context

# `app` (the Flask instance) and `index` (the LlamaIndex index) are assumed
# to be created elsewhere in the application.

@app.route('/query', methods=['POST'])
def query():
    # Build a streaming query engine over the index.
    query_engine = index.as_query_engine(similarity_top_k=3, streaming=True)
    prompt = request.json
    print('prompt given is:', prompt)

    response_stream = query_engine.query(prompt['prompt'])

    def generate_response():
        # Forward each streamed token as a server-sent event.
        for text in response_stream.response_gen:
            yield f"data: {text}\n\n"

    return Response(stream_with_context(generate_response()), mimetype='text/event-stream')
```
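If the frontend joins these SSE chunks with any separator, words will split exactly as described in this issue, because the chunks are sub-word tokens. Below is a minimal Python sketch of a client that consumes the `/query` endpoint above and concatenates the chunks as-is (the localhost URL and the prompt are assumptions for illustration):

```python
import requests

# Sketch of an SSE consumer for the /query endpoint defined above.
# The important detail is that chunks are appended with no separator.
with requests.post(
    "http://localhost:5000/query",
    json={"prompt": "What is BERT?"},
    stream=True,
) as resp:
    answer = ""
    for line in resp.iter_lines(decode_unicode=True):
        if line and line.startswith("data: "):
            answer += line[len("data: "):]  # keep the chunk exactly as sent
    print(answer)
```

This simple parser assumes each token fits on one `data:` line; a real SSE client library would handle multi-line events.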
Hi, @vishalp-simplecrm! I'm Dosu, and I'm here to help the LlamaIndex team manage their backlog. I wanted to let you know that we are marking this issue as stale.
From what I understand, the issue is that `response_stream.response_gen` adds extra spaces within words when used to stream responses to a React frontend. You mentioned that the issue is resolved when using a different method, chat.completion. Another user, logan-markewich, suggested that the problem might be in the frontend code.
Could you please let us know if this issue is still relevant to the latest version of the LlamaIndex repository? If it is, please comment on this issue to let us know. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.
Thank you for your understanding and cooperation!
Getting Broken Words
The streamed response adds unnecessary spaces within words; please resolve this issue. For example, BERT is a single word but is returned as "B ERT", and Bidirectional comes back as "Bid irectional".
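The symptom matches sub-word tokens being joined with an extra separator on the way to the UI. A hypothetical illustration (the chunk list below is made up, not taken from the actual stream):

```python
# LLM streams usually emit sub-word pieces; any leading space a word needs
# is already part of the token. Joining chunks with " " breaks words apart,
# while plain concatenation reproduces the original text.
chunks = ["B", "ERT", " is", " Bid", "irectional"]
print(" ".join(chunks))  # B ERT  is  Bid irectional  <- broken words
print("".join(chunks))   # BERT is Bidirectional      <- correct
```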