Closed: palmm closed this issue 1 year ago
Answer generated by a 🤖
From your description, it appears that you're encountering a `TypeError: sdk_1.Anthropic is not a constructor` error when trying to instantiate an Anthropic streaming chat model using the LangChain framework. This error typically occurs when an object is not correctly instantiated, and in this case it seems to be related to the `Anthropic` object from the `@anthropic-ai/sdk` package.
Based on the error message and the code you've shared, the `Anthropic` object is being instantiated in two places: once for the `batchClient` and once for the `streamingClient`. The error is likely occurring in one of these two places. The instantiation of the `Anthropic` object looks like this:
```ts
this.streamingClient = new AnthropicApi({
  ...options,
  apiKey: this.anthropicApiKey,
});
```

and
```ts
this.batchClient = new AnthropicApi({
  ...options,
  apiKey: this.anthropicApiKey,
});
```
The error message suggests that `AnthropicApi` is not a constructor, which means it's not a class or function that can be used with the `new` keyword.
This could be due to several reasons:

1. The `@anthropic-ai/sdk` package might have been updated and the `Anthropic` class removed or renamed. I recommend checking the documentation or source code of the `@anthropic-ai/sdk` package to confirm this.
2. There might be an issue with the way the `Anthropic` class is being imported. The import statement is `import { Anthropic as AnthropicApi } from "@anthropic-ai/sdk";`. Please confirm that the `Anthropic` class is being exported correctly from the `@anthropic-ai/sdk` package.
3. There might be an issue with the TypeScript configuration or the build process that is causing the `Anthropic` class not to be recognized as a constructor. I suggest checking your TypeScript configuration and build process to confirm this.
Please note that this is just one potential solution based on the information provided. If you believe this is a bug that could impact other users, feel free to make a pull request with the necessary changes. If my suggestion contradicts your experience, I recommend upgrading to the latest version of the relevant packages and trying again.
This response is meant to be useful, save you time, and share context. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
The above @dosu-beta comment is actually more or less what I would look at.
Can you tell me what environment you're running in? `new sdk_1.Anthropic` sounds like there's some bundling going on.
Do non-streaming requests work?
Also, I can't imagine this wouldn't be the case, but can you double check that the version of Anthropic's SDK is `0.5.3`?
I'm running in a Node.js env with the bundling done via webpack, but I think this is some esoteric issue related to module resolution. I can see that the Node runtime is picking up the CommonJS output version of langchain, and I can repro this in a clean Node.js repo with the following code in a JavaScript file, run directly in Node (no bundling required):
```js
const { ChatAnthropic } = require('langchain/chat_models/anthropic');
const { HumanMessage } = require('langchain/schema');

const model = new ChatAnthropic({
  anthropicApiKey: "KEY_HERE",
  modelName: 'claude-instant-1',
  streaming: true,
});

const response = model.call(
  [
    new HumanMessage(
      'Please, write me a list of 10 good name options for a company that makes colorful socks.'
    ),
  ],
  {},
  [
    {
      handleLLMNewToken: (token) => {
        console.log({ token });
      },
    },
  ]
);
```
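For context, Node decides between a package's CJS and ESM builds via the conditional `exports` map in its package.json: a `require()` call selects the `require` condition, while `import` selects the `import` condition. A simplified, hypothetical sketch of such a map (paths are placeholders, not langchain's actual layout):

```json
{
  "exports": {
    "./chat_models/anthropic": {
      "import": "./dist/chat_models/anthropic.js",
      "require": "./dist/chat_models/anthropic.cjs"
    }
  }
}
```

This is why the `require`-based repro above hits the CJS output, while the `import`-based version further down does not.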
I think the issue is that, if the CJS version of langchain is used, the anthropic import in the CJS code is:

```js
const sdk_1 = require("@anthropic-ai/sdk");
```

which is then used like this:

```js
this.streamingClient = new sdk_1.Anthropic({
  ...options,
  apiKey: this.anthropicApiKey,
});
```
The issue arises from the fact that there is a collision on `Anthropic` in the Anthropic SDK. It starts as an export of the client API:

```ts
export class Anthropic extends Core.APIClient {
```

but then below it is re-exported as a namespace:

```ts
export namespace Anthropic {
```

and then default-exported:

```ts
export default Anthropic;
```

So I think when Node runs the langchain CJS code, `sdk_1.Anthropic` doesn't resolve to the desired class. Instead it resolves to the namespace (I'm assuming).
The issue does not repro if I use the `.mjs` ECMAScript module version of langchain, e.g.:
```js
import { ChatAnthropic } from 'langchain/chat_models/anthropic';
import { HumanMessage } from 'langchain/schema';

const model = new ChatAnthropic({
  anthropicApiKey: "KEY_HERE",
  modelName: 'claude-instant-1',
  streaming: true,
});

const response = model.call(
  [
    new HumanMessage(
      'Please, write me a list of 10 good name options for a company that makes colorful socks.'
    ),
  ],
  {},
  [
    {
      handleLLMNewToken: (token) => {
        console.log({ token });
      },
    },
  ]
);
```
Raised a fix here: https://github.com/hwchase17/langchainjs/pull/1969. Explicitly using the default export in langchain emits the correct CJS code, solving this issue.
Calling an Anthropic streaming chat model results in:
AFAICT, https://github.com/hwchase17/langchainjs/pull/1932 appears to have broken streaming Anthropic chat models.
Probable repro (untested):