langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

Cannot convert IterableReadableStream into ReadableStream due to missing `.values()` API #4506

Open movila opened 4 months ago

movila commented 4 months ago

I'm getting an error in typescript: Property 'values' is missing in type 'IterableReadableStream<BaseMessageChunk>' but required in type 'ReadableStream<any>'

dosubot[bot] commented 4 months ago

🤖

Hey there @movila! 😄 Great to see you diving deep into the code again. How's everything going on your end?

The error message you're seeing indicates that the ReadableStream type you're compiling against requires a values member, which is not present in the IterableReadableStream class.

However, in the provided context, there is no direct usage or requirement for a values property in the ReadableStream interface. This might be due to the TypeScript version you're using or the specific environment where your code is running.

If you're sure that your environment requires a values property in ReadableStream, you can add it to your IterableReadableStream class. Here's how you can do it:

export class IterableReadableStream<T>
  extends ReadableStream<T>
  implements IterableReadableStreamInterface<T>
{
  public reader: ReadableStreamDefaultReader<T>;

  // Add a values() method; Node's ReadableStream typings declare
  // values(options?) as returning an async iterator over the chunks.
  values(options?: { preventCancel?: boolean }): AsyncIterableIterator<T> {
    return this[Symbol.asyncIterator]();
  }

  // Rest of your code...
}

In this code, we've added a values() method that delegates to the stream's existing async iterator, matching the signature Node's ReadableStream type declares (note that the preventCancel option is not honored here). This should satisfy the TypeScript error you're seeing. However, please make sure this implementation aligns with your specific use case and environment requirements.

You can find more details about the IterableReadableStream class in the langchain-core/src/utils/stream.ts file.

Please let me know if this helps or if you need further assistance.

jacoblee93 commented 4 months ago

Can you share your code and env?

movila commented 4 months ago

Below is a simplified version of my code; it runs in an Azure Function:

import { ChatOpenAI } from "@langchain/openai";
import {
  HttpRequest,
  HttpResponse,
  InvocationContext,
  app
} from '@azure/functions';

interface ReqProps {
  input: string
}

export async function streamtest(
  request: HttpRequest,
  context: InvocationContext
): Promise<HttpResponse> {
  const model = new ChatOpenAI({});
  const req = await request.json() as ReqProps;

  const stream = await model.stream(req.input);
  return {
    body: stream
  };
}

app.post('streamtest', {
  authLevel: 'anonymous',
  handler: streamtest
});
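One workaround sketch for a handler like the one above: re-wrap the async-iterable chunk stream as a binary ReadableStream before returning it as the response body. `toBinaryStream` and `extractText` are hypothetical helpers, not LangChain APIs, and this assumes Node 18+, where ReadableStream and TextEncoder are globals:

```typescript
// Hypothetical helper (not a LangChain API): wrap an async-iterable
// chunk stream as a binary ReadableStream suitable for an HTTP body.
// extractText pulls the text out of whatever chunk type is yielded,
// e.g. (chunk) => String(chunk.content) for chat model chunks.
function toBinaryStream<T>(
  source: AsyncIterable<T>,
  extractText: (chunk: T) => string
): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  const iterator = source[Symbol.asyncIterator]();
  return new ReadableStream<Uint8Array>({
    // Pull one chunk at a time, so consumer backpressure is respected.
    async pull(controller) {
      const result = await iterator.next();
      if (result.done) {
        controller.close();
      } else {
        controller.enqueue(encoder.encode(extractText(result.value)));
      }
    },
    // Propagate cancellation back to the source iterator.
    async cancel() {
      await iterator.return?.();
    },
  });
}
```

In the handler above this might look like `return { body: toBinaryStream(stream, (chunk) => String(chunk.content)) }`, assuming the chunks carry string content.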
furquanh commented 4 months ago

> Can you share your code and env?

Hi Jacob! I have a similar problem. The StreamingTextResponse from the Vercel AI package requires a ReadableStream, but RemoteRunnable's .stream() returns an IterableReadableStream. I checked the code base and found that IterableReadableStream simply extends ReadableStream, but the stream is still not rendered to the UI. Here's the code:

const remoteChain = new RemoteRunnable({
  url: 'http://172.28.224.1:8000/chat'
})
const remotestream = await remoteChain.stream({

  'chat_history': tupledPreviousMessages,
  'question': lastUserQuery,
  'supabase_authorization_token': authorization,
  'config': {}, // Additional configuration if needed
  'kwargs': {}  // Additional keyword arguments if needed
});

// This works!
//for await (const chunk of remotestream) {
//  console.log(chunk)
//}

// This doesn't
return new StreamingTextResponse(remotestream, { headers: corsHeaders});

There's nothing wrong with my UI because the OpenAIStream renders perfectly:

const completionStream = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo-0613',
    messages: completionMessages,
    max_tokens: 1024,
    temperature: 0,
    stream: true,
  })
const stream = OpenAIStream(completionStream)
// Runs flawlessly
return new StreamingTextResponse(stream, { headers: corsHeaders})

Hope you can help me out with this.
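Since iterating the stream with for await works, one possible workaround (a sketch, not an official Vercel AI or LangChain API) is to replay that iteration into a fresh byte stream before handing it to StreamingTextResponse. `reencodeTextStream` is a hypothetical helper; it assumes the remote chain yields plain string chunks (StrOutputParser output) and that ReadableStream and TextEncoder are globals (Node 18+):

```typescript
// Hypothetical helper: replay the working for-await iteration into a
// fresh byte stream, so the consumer receives a plain
// ReadableStream<Uint8Array>. Note this sketch drains the source
// eagerly in start() and does not handle backpressure.
function reencodeTextStream(
  source: AsyncIterable<unknown>
): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream<Uint8Array>({
    async start(controller) {
      for await (const chunk of source) {
        controller.enqueue(encoder.encode(String(chunk)));
      }
      controller.close();
    },
  });
}

// Possible usage in the handler above:
// return new StreamingTextResponse(reencodeTextStream(remotestream), { headers: corsHeaders });
```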

jacoblee93 commented 4 months ago

Oh, yeah HTTP responses require binary data streams. What does your remote runnable return as chunks?

https://js.langchain.com/docs/modules/model_io/output_parsers/types/http_response

furquanh commented 4 months ago

Oh. my response schema is just the string from the StrOutputParser()!

_inputs = RunnableMap(
    input=RunnablePassthrough(),
    standalone_question=RunnablePassthrough.assign(
        chat_history=lambda x: _format_chat_history(x["chat_history"])
    )
    | CONDENSE_QUESTION_PROMPT
    | ChatOpenAI(temperature=0)
    | StrOutputParser(),
)

_context_with_supabase = {
    "context": ({"question": itemgetter("standalone_question"), "input": itemgetter("input")} |  RunnableLambda(_get_supabase_retriever) | _combine_documents),
    "question": lambda x: x["standalone_question"],
}

# User input
class ChatHistory(BaseModel):
    """Chat history with the bot."""

    chat_history: List[Tuple[str, str]] = Field(
        ...,
        extra={"widget": {"type": "chat", "input": "question"}},
    )
    question: str
    supabase_authorization_token: str

# Chain output
class LangServeResponse(BaseModel):
    """Language model server response."""
    response: str

conversational_qa_chain_with_supabase = (
    _inputs| _context_with_supabase | ANSWER_PROMPT | ChatOpenAI() | StrOutputParser()
).with_types(input_type=ChatHistory, output_type=LangServeResponse)

add_routes(app, conversational_qa_chain_with_supabase, path='/chat', enable_feedback_endpoint=True)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
jacoblee93 commented 4 months ago

Ah ok. If it's a string I think this will work:

import { HttpResponseOutputParser } from "langchain/output_parsers";

const httpOutputParser = new HttpResponseOutputParser();

const stream = await remoteRunnable.pipe(httpOutputParser).stream(...);

Then pass that stream to the HTTP response.

mgreystone commented 2 months ago

I'm running into the same issue.

It looks like IterableReadableStream implements the ReadableStream type from the dom lib, not the ReadableStream from @types/node. The two typings are slightly different: the web (dom) one has no definition for values().

This makes handling streams within Node awkward. IterableReadableStream is not assignable to a ReadableStream imported from 'stream/web', causing compile errors. For example, TypeScript will not let us pipe through to a TransformStream imported from 'stream/web'.
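One way around the missing definition, sketched below, is a standalone helper that reproduces what values() does using only getReader(), which both the dom and Node typings declare. `streamValues` is a hypothetical name, not part of either API:

```typescript
// Hypothetical helper: iterate any web-style ReadableStream using only
// getReader(), sidestepping the missing values()/AsyncIterable typings.
async function* streamValues<T>(
  stream: ReadableStream<T>
): AsyncGenerator<T> {
  const reader = stream.getReader();
  try {
    for (;;) {
      const result = await reader.read();
      if (result.done) return;
      yield result.value;
    }
  } finally {
    // Release the lock even if the consumer breaks out early.
    reader.releaseLock();
  }
}
```

With this, `for await (const chunk of streamValues(someStream))` works regardless of which lib's ReadableStream type the stream was declared with.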

jacoblee93 commented 2 months ago

Got it - hopefully implementing .values() will fix it?

And just to make sure, you're on > Node 16?

mgreystone commented 2 months ago

@jacoblee93 Yes, i am on node 20.

mgreystone commented 2 months ago

And yes, adding the values() method to IterableReadableStream should fix it. You may want to consider implementing or directly utilizing the Node implementation of ReadableStream, as it is currently a superset of the dom's ReadableStream and already implements AsyncIterable. Or implement both the dom's and Node's interfaces to guarantee both Node and browser environments are always supported.
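A minimal sketch of that suggestion: a ReadableStream subclass that adds a values() method with the shape Node's typings declare (values(options?) returning an async iterator). `IterableStream` is a hypothetical name, and the preventCancel option is accepted but ignored here:

```typescript
// Hypothetical sketch of a ReadableStream subclass satisfying Node's
// values()/AsyncIterable expectations via getReader().
class IterableStream<T> extends ReadableStream<T> {
  [Symbol.asyncIterator](): AsyncIterableIterator<T> {
    const reader = this.getReader();
    return {
      async next(): Promise<IteratorResult<T>> {
        const result = await reader.read();
        if (result.done) {
          reader.releaseLock();
          return { done: true, value: undefined };
        }
        return { done: false, value: result.value };
      },
      async return(): Promise<IteratorResult<T>> {
        reader.releaseLock();
        return { done: true, value: undefined };
      },
      [Symbol.asyncIterator]() {
        return this;
      },
    };
  }

  // Matches the signature in Node's typings; preventCancel is ignored
  // in this sketch.
  values(options?: { preventCancel?: boolean }): AsyncIterableIterator<T> {
    return this[Symbol.asyncIterator]();
  }
}
```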

jacoblee93 commented 2 months ago

We can't directly utilize the node implementation due to web environment and bundler constraints 😕 we can and should add an export test though.

muditjai commented 2 weeks ago

I'm running into this issue as well. What's the fix for it?

jacoblee93 commented 2 weeks ago

The fix will be updating our custom streaming class to fit the Node definition. Will keep you all posted, although we went with the web ReadableStream by choice.

jacoblee93 commented 2 weeks ago

So it's a bit more complicated than just implementing .values(); there are a bunch of other small differences.

Will keep looking