movila opened this issue 4 months ago
🤖
Hey there @movila! 😄 Great to see you diving deep into the code again. How's everything going on your end?
The error message you're seeing indicates that the `ReadableStream` interface requires a `values` property, but it's not present in your `IterableReadableStream` class.

However, in the provided context, there is no direct usage or requirement for a `values` property in the `ReadableStream` interface. This might be due to the TypeScript version you're using or the specific environment where your code is running.

If you're sure that your environment requires a `values` property in `ReadableStream`, you can add it to your `IterableReadableStream` class. Here's how you can do it:
```typescript
export class IterableReadableStream<T>
  extends ReadableStream<T>
  implements IterableReadableStreamInterface<T>
{
  public reader: ReadableStreamDefaultReader<T>;

  // Add the values property
  get values(): ReadableStreamDefaultReader<T> {
    this.ensureReader();
    return this.reader;
  }

  // Rest of your code...
}
```
In this code, we've added a `values` getter that ensures a reader is available and then returns it. This should satisfy the TypeScript error you're seeing. However, please make sure this implementation aligns with your specific use case and environment requirements.

You can find more details about the `IterableReadableStream` class in the `langchain-core/src/utils/stream.ts` file.
Please let me know if this helps or if you need further assistance.
Can you share your code and env?
Below is a simplified version of my code; it's running on an Azure Function:
```typescript
import { ChatOpenAI } from "@langchain/openai";
import {
  HttpRequest,
  HttpResponse,
  InvocationContext,
  app
} from '@azure/functions';

interface ReqProps {
  input: string
}

export async function streamtest(
  request: HttpRequest,
  context: InvocationContext
): Promise<HttpResponse> {
  const model = new ChatOpenAI({});
  const req = await request.json() as ReqProps;
  const stream = await model.stream(req.input);
  return {
    body: stream
  };
}

app.post('streamtest', {
  authLevel: 'anonymous',
  handler: streamtest
});
```
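One likely culprit in the snippet above (an editor's sketch, not from the thread): `model.stream()` yields message chunk objects, while an HTTP response body needs text or bytes. A minimal adapter, assuming each chunk exposes its text as a string `content` property, could look like this:

```typescript
// Sketch (assumption: each chunk exposes its text as a string `content`
// property): adapt an async-iterable of message chunks into a byte stream
// that an HTTP response body can consume.
function chunksToByteStream(
  chunks: AsyncIterable<{ content: string }>
): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream<Uint8Array>({
    async start(controller) {
      for await (const chunk of chunks) {
        controller.enqueue(encoder.encode(chunk.content));
      }
      controller.close();
    },
  });
}
```

The handler would then `return { body: chunksToByteStream(stream) };` instead of returning the chunk stream directly.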
> Can you share your code and env?
Hi Jacob! I have a similar problem. The `StreamingTextResponse` from the Vercel AI package requires a `ReadableStream`, but the RemoteRunnable's `.stream` returns an `IterableReadableStream`. I checked the code base and found that `IterableReadableStream` simply extends `ReadableStream`, but the stream is still not rendered to the UI. Here's the code:
```typescript
const remoteChain = new RemoteRunnable({
  url: 'http://172.28.224.1:8000/chat'
})

const remotestream = await remoteChain.stream({
  'chat_history': tupledPreviousMessages,
  'question': lastUserQuery,
  'supabase_authorization_token': authorization,
  'config': {}, // Additional configuration if needed
  'kwargs': {} // Additional keyword arguments if needed
});

// This works!
// for await (const chunk of remotestream) {
//   console.log(chunk)
// }

// This doesn't
return new StreamingTextResponse(remotestream, { headers: corsHeaders });
```
There's nothing wrong with my UI because the OpenAIStream renders perfectly:
```typescript
const completionStream = await openai.chat.completions.create({
  model: 'gpt-3.5-turbo-0613',
  messages: completionMessages,
  max_tokens: 1024,
  temperature: 0,
  stream: true,
})

const stream = OpenAIStream(completionStream)

// Runs flawlessly
return new StreamingTextResponse(stream, { headers: corsHeaders })
```
Hope you can help me out with this.
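One workaround for the message above (an editor's sketch with a hypothetical helper name, not from the thread): re-expose the async-iterable stream as a plain web `ReadableStream` of strings, pulling one chunk per read so backpressure is respected. This assumes the remote chain emits string chunks:

```typescript
// Hypothetical helper: wrap any async iterable as a web ReadableStream of
// strings. Non-string chunks are coerced with String() as a fallback.
function toTextStream(chunks: AsyncIterable<unknown>): ReadableStream<string> {
  const iterator = chunks[Symbol.asyncIterator]();
  return new ReadableStream<string>({
    async pull(controller) {
      const { done, value } = await iterator.next();
      if (done) {
        controller.close();
      } else {
        controller.enqueue(typeof value === "string" ? value : String(value));
      }
    },
    async cancel() {
      // Stop the underlying iterator if the consumer bails out early.
      await iterator.return?.();
    },
  });
}
```

`new StreamingTextResponse(toTextStream(remotestream), { headers: corsHeaders })` would then receive a stream type it recognizes.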
Oh yeah, HTTP responses require binary data streams. What does your remote runnable return as chunks?
https://js.langchain.com/docs/modules/model_io/output_parsers/types/http_response
Oh. my response schema is just the string from the StrOutputParser()!
```python
_inputs = RunnableMap(
    input=RunnablePassthrough(),
    standalone_question=RunnablePassthrough.assign(
        chat_history=lambda x: _format_chat_history(x["chat_history"])
    )
    | CONDENSE_QUESTION_PROMPT
    | ChatOpenAI(temperature=0)
    | StrOutputParser(),
)

_context_with_supabase = {
    "context": (
        {"question": itemgetter("standalone_question"), "input": itemgetter("input")}
        | RunnableLambda(_get_supabase_retriever)
        | _combine_documents
    ),
    "question": lambda x: x["standalone_question"],
}

# User input
class ChatHistory(BaseModel):
    """Chat history with the bot."""

    chat_history: List[Tuple[str, str]] = Field(
        ...,
        extra={"widget": {"type": "chat", "input": "question"}},
    )
    question: str
    supabase_authorization_token: str

# Chain output
class LangServeResponse(BaseModel):
    """Language model server response."""

    response: str

conversational_qa_chain_with_supabase = (
    _inputs | _context_with_supabase | ANSWER_PROMPT | ChatOpenAI() | StrOutputParser()
).with_types(input_type=ChatHistory, output_type=LangServeResponse)

add_routes(app, conversational_qa_chain_with_supabase, path='/chat', enable_feedback_endpoint=True)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```
Ah ok. If it's a string I think this will work:
```typescript
const httpOutputParser = new HttpOutputParser();
const stream = remoteRunnable.pipe(httpOutputParser).stream(...);
```
Then pass that stream to the http response.
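For readers without the parser class at hand, what that extra pipe step effectively needs to do is encode string chunks to bytes. A standalone sketch (an editor's illustration, not the library's implementation) using a `TransformStream`:

```typescript
// Sketch: a TransformStream that encodes string chunks to UTF-8 bytes,
// which is the shape a binary HTTP response body expects.
const encodeText = () => {
  const encoder = new TextEncoder();
  return new TransformStream<string, Uint8Array>({
    transform(chunk, controller) {
      controller.enqueue(encoder.encode(chunk));
    },
  });
};
```

A stream of strings piped through `encodeText()` then yields `Uint8Array` chunks.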
I'm running into the same issue.
It looks like `IterableReadableStream` is implementing `ReadableStream` from the DOM lib, not `ReadableStream` from `@types/node`. These two implementations are a little different. The web implementation is missing a definition for `values()`.
This does make handling streams within Node awkward. `IterableReadableStream` is not assignable to a `ReadableStream` imported from `stream/web`, causing compile issues. For example, TypeScript does not allow us to pipe through to a `TransformStream` imported from `stream/web`.
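The runtime objects are compatible even though the type declarations disagree, so a common stopgap for the compile error above is an explicit cast (an editor's sketch under that assumption; the double cast is deliberately unchecked):

```typescript
import type { ReadableStream as NodeReadableStream } from "node:stream/web";

// Bridge the dom-lib ReadableStream type to the node:stream/web type.
// Assumption: at runtime both resolve to the same (or a compatible)
// implementation, so only the compiler needs convincing.
function asNodeStream<T>(stream: ReadableStream<T>): NodeReadableStream<T> {
  return stream as unknown as NodeReadableStream<T>;
}
```

After the cast, Node-typed APIs such as `pipeThrough` with a `stream/web` `TransformStream` type-check again.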
Got it - hopefully implementing `.values()` will fix it?
And just to make sure, you're on > Node 16?
@jacoblee93 Yes, I am on Node 20.

And yes, adding the `values()` method to `IterableReadableStream` should fix it. You may want to consider implementing or directly utilizing the Node implementation of `ReadableStream`, as it is currently a superset of the DOM's `ReadableStream` and already implements `AsyncIterable`. Or implement both the DOM's and Node's interfaces to guarantee that both Node and browser environments will always be supported.
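For reference, Node's typings declare `values()` as a method returning an async iterator, not a getter. A free-standing sketch of that shape over a default reader (the `preventCancel` handling is this editor's assumption, not the patch that eventually landed):

```typescript
// Sketch: iterate a ReadableStream's chunks the way Node's
// ReadableStream.values(options?) is typed to behave.
async function* streamValues<T>(
  stream: ReadableStream<T>,
  options?: { preventCancel?: boolean }
): AsyncIterableIterator<T> {
  const reader = stream.getReader();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) return;
      yield value as T;
    }
  } finally {
    // By default, cancel the stream if iteration stops early.
    if (!options?.preventCancel) {
      await reader.cancel().catch(() => {});
    }
    reader.releaseLock();
  }
}
```

`for await (const chunk of streamValues(stream))` then behaves like `for await (const chunk of stream)` does under Node's implementation.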
We can't directly utilize the Node implementation due to web environment and bundler constraints 😕 but we can and should add an export test.
I'm running into this issue as well. What's the fix for it?
The fix will be updating our custom streaming class to fit the Node definition. Will keep you all posted, although we went with the web `ReadableStream` by choice.
So it's a bit more complicated than just implementing `.values()`. There are a bunch of other small differences.

Will keep looking.
I'm getting an error in TypeScript:

```
Property 'values' is missing in type 'IterableReadableStream<BaseMessageChunk>' but required in type 'ReadableStream<any>'
```