langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

Cannot get an answer from Amazon Titan Text Express #2812

Closed Kirchen99 closed 11 months ago

Kirchen99 commented 11 months ago

I followed this documentation: https://js.langchain.com/docs/modules/model_io/models/llms/integrations/bedrock and created the following code:

import { Bedrock } from "langchain/llms/bedrock";

const model = new Bedrock({
    model: "amazon.titan-text-express-v1",
    region: "us-east-1",
});

const res = await model.invoke("Tell me a joke");
console.log(res);

The res I received is undefined.

Compared to the JavaScript version, the equivalent Python code works fine:

from langchain.llms import Bedrock

llm = Bedrock(
    credentials_profile_name="default",
    model_id="amazon.titan-text-express-v1"
)

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

conversation = ConversationChain(
    llm=llm, verbose=True, memory=ConversationBufferMemory()
)

result = conversation.predict(input="What is the most exciting question you can ask an LLM?")
print(result)

Did I miss something in my javascript code?

dosubot[bot] commented 11 months ago

🤖

Hello,

Thank you for reaching out and providing detailed information about your issue.

Based on the code you've shared, it seems like you're trying to use the invoke method of the Bedrock class in the langchainjs framework to get a response from the "amazon.titan-text-express-v1" model. However, the response you're getting is undefined.

The invoke method on the Bedrock class calls the Bedrock service model and returns the string generated by the model. The model ID is used as part of the URL for a POST request to the Bedrock service: the URL is constructed as https://${endpointHost}/model/${this.model}/${bedrockMethod}. The _signedFetch method prepares the input, creates the request, signs it using AWS Signature V4, and sends it to the Bedrock service model.
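As an illustration of the URL template above (the host and method segment here are assumed placeholder values, not confirmed from the langchainjs source):

```typescript
// Illustrative sketch only: endpointHost and bedrockMethod are assumed
// placeholder values, not taken from the langchainjs implementation.
const endpointHost = "bedrock-runtime.us-east-1.amazonaws.com";
const model = "amazon.titan-text-express-v1";
const bedrockMethod = "invoke";

// The request URL follows the template described above.
const url = `https://${endpointHost}/model/${model}/${bedrockMethod}`;
console.log(url);
// -> https://bedrock-runtime.us-east-1.amazonaws.com/model/amazon.titan-text-express-v1/invoke
```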

However, the "amazon.titan-text-express-v1" model might not be supported in the current implementation of the Bedrock class in the langchainjs framework. The Bedrock class checks if the model name (the part before the dot) is included in the allowedModels array, which only includes "ai21", "anthropic", and "amazon". Therefore, if you're trying to use the "amazon.titan-text-express-v1" model, it will not be recognized as a valid model, and an error will be thrown.

To resolve this issue, you could modify the allowedModels array to include "amazon.titan-text-express-v1", or you could modify the check to allow any model that starts with "amazon.". However, please note that this might not be enough to make the "amazon.titan-text-express-v1" model work correctly, as there might be other parts of the code that are not compatible with this model.
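A minimal sketch of the model validation described above (names like allowedModels follow this thread's description, not necessarily the actual langchainjs source):

```typescript
// Sketch of the model-name check described in this thread; the real
// langchainjs implementation may differ.
const allowedModels = ["ai21", "anthropic", "amazon"];

function validateModel(model: string): void {
  // The provider is the part of the model ID before the first dot,
  // e.g. "amazon" for "amazon.titan-text-express-v1".
  const provider = model.split(".")[0];
  if (!allowedModels.includes(provider)) {
    throw new Error(`Unknown model: ${model}`);
  }
}

validateModel("amazon.titan-text-express-v1"); // provider "amazon" is allowed
```

Note that if the check compares the full model ID rather than just the provider prefix, a newly released model such as "amazon.titan-text-express-v1" would fail it until the list is updated, which would match the behavior reported here.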

I hope this helps! If you have any other questions or need further clarification, please don't hesitate to ask.

Best, Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.



jacoblee93 commented 11 months ago

Thanks for the report, will have a look!

jacoblee93 commented 11 months ago

I actually don't have access to this one, perhaps @cwoolum can help?

cwoolum commented 11 months ago

I'll take a look and see what's up

cwoolum commented 11 months ago

@Kirchen99, can you please confirm this resolves your issue?

Kirchen99 commented 11 months ago

> @Kirchen99, can you please confirm this resolves your issue?

It works! Thank you very much!

jacoblee93 commented 11 months ago

Awesome!

Thank you again @cwoolum!