run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: Bedrock converse LLM #15366

Closed MeghaWalia-eco closed 2 months ago

MeghaWalia-eco commented 2 months ago

Bug Description

I am trying to use the BedrockConverse LLM to generate a summary in the following way:

llm = BedrockConverse(
    model=llm_prediction_model,
    region_name=region_name,
    timeout=llm_api_timeout,
    callback_manager=self.__rag_svc.get_llm_callback_manager(
        handlers=[llm_debug_callback, llm_counter_callback]
    ),
)

resp = llm.complete(query)

The same prompt works fine on my local machine, but as soon as I move to the dev environment it gives me the error: Invalid payload! Need prompt and completion or messages and response

My ECS service runs in a VPC in eu-west-2, while the model I am using, anthropic.claude-3-5-sonnet-20240620-v1:0, is in us-east-1; the error message does not indicate that this is the issue, though.
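To rule the region question in or out, one minimal check (a sketch using the same model ID; the Bedrock client region must be the one where the model is enabled, independent of where the ECS task's VPC lives) is:

from llama_index.llms.bedrock_converse import BedrockConverse

# Sketch: pin region_name to where the model is enabled (us-east-1 here),
# regardless of the region the ECS task's VPC runs in.
llm = BedrockConverse(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    region_name="us-east-1",
)
print(llm.complete("ping"))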

The prompt used is:

The Economist Style guide is below:
- Do not say "The article " in any part of the summary
- DO NOT mention any information like "Here is a summary in The Economist style, using bullet points" in the summary at all. Just start with the summary.
- Create summary without telling the user that the article is being summarised.
- Do not use clichés, idioms or figures of speech that are widely found in journalism. For example do not use phrases like "Much ink has been spilled".
- Do not use long, rare or high-register words where short and common alternatives are available. For example the word "prestidigitation" should be replaced with "magic" or "sleight of hand".
- Remove extra words, phrases and sentences that add no extra meaning. Phrases that include such extra words are sometimes called tautologies, pleonasms or redundancies. For example, each of the phrases below should be replaced with the word or shorter phrase in parentheses that follow. absolute certainty (certainty), policymaking process (policymaking), past, lived or personal experience (experience), empirical research (research, or a study), in close proximity to (close to). In general, be as simple and concise as possible.
- In general, be as concise and simple as possible and remove all redundancies.
- The passive voice is a grammatical structure like “The ball was kicked by the boy” or “The building was burned down by arsonists.” In these sentences the grammatical subjects are “the ball” and “the building”. But the performer of the action is “the boy” or “arsonists”. Do not use the passive voice when you can use the active voice. The two sentences above should be replaced with “The boy kicked the ball” and “Arsonists burned down the building.” Some passive-voice sentences do not mention who performed the action, as in “Several protesters were killed.” Such sentences should be reworded as active ones, with the performer of the action named, as in “Police killed several protesters.”
- Do not use years before nouns as though they were adjectives, as in "a 1999 election", "a 2013 paper", "the 1973 decision that legalised abortion". Instead use "an election in 1999", "a paper published in 2013" or "the decision that legalised abortion in 1973".
- Do not use “likely” as an adverb, as in “He will likely meet with the prime minister next week.” Use “likely” only as an adjective, as in “New elections are now likely.”
- Do not use Americanisms and instead use the British spellings of words. For example use "colour" not "color", and "specialised" in place of "specialized". In general, words with "ize" should be replaced with "ise". Use "car" not "automobile", "company" not "corporation", "transport" not "transportation", "district" not "neighborhood". There are many other Americanisms that should be replaced with British English variants. Please find all of them.
- Do not prefix the summary with “The article ” or similar phrases. Instead, start the summary with the main points of the article.

The expected output format:
- Use bullet points to list out three key points

Please summarise using the Economist style guide:
Rama
Significance of Rama

Ram Lalla, the five-year-old form of Rama, is the principal deity of the Ram Mandir.
Rama is a prominent Hindu deity who is regarded as a Pūrṇāvatāra (lit. 'complete avatar') of the God Vishnu.[c] and some Hindus view Rama as Para Brahman (lit. 'The ultimate Brahman'). Rama holds huge significance in Hindu culture and religion. In the Rama avatar, Vishnu is not supposed to exhibit any of his divine potencies and lead life as a human.[30] As Rama is said to have possessed sixteen ideal qualities, Hindus view Rama as Puruṣottama (lit. 'The ideal man'), Vigrahavān dharmaḥ (lit. 'Embodiment of Dharma') and Ādi Puruṣa .[31][32][d] According to the Hindu epic Ramayana, Rama was born in Ayodhya.[33] Ayodhya is among seven most sacred cities to Hindus.[34]

Pilgrim site
Three places in Ayodhya are considered as prominent puṇya kṣētra (lit. 'virtued pilgrimage site') by Hindus. They are Janmasthāna (lit. 'The birthplace'), Svargadvāra (lit. 'The gateway to Heaven') and Yajñasthala (lit. 'Place where holy sacrifice was performed'). Temples commemorating these events used to exist in olden days as Janmasthān temple, Svargadvār temple and Trētā-kā-Thākūr temple respectively. However, they were destroyed during Islamic rule in the medieval ages and mosques were built.[35][36] While Marxist historians like Irfan Habib agree about the fate of Svargadvar and Treta-ka-Thakur temples, there is some lack of consensus among scholars whether Janmasthan temple was destroyed or not during Medieval rule which eventually led to Ayodhya dispute that was resolved legally.[37] The Trayōdaśakṣarī maṁtra (lit. '13-character mantra') also known as Śrī Rāma Tāraka Mantra (lit. 'The Rama mantra for Salvation') is Śrī Rāma Jaya Rāma Jaya Jaya Rāma and many Hindus believe that chanting it repeatedly lead to salvation.[38][39] Tāraka means something that carries or helps a Jīva (lit. 'a living being') cross the ocean of samsara. Devotees of Rama such as Valmiki, Tulsidas, Bhadrachala Ramadasu, Samartha Ramadas, Tyagaraja, Purandara Dasa, Gondavalekar Maharaj and Mahatma Gandhi popularised this mantra.[40]

Deity
See also: Balak Ram
Ram Mandir is being built to commemorate the birth of Rama at his Janmasthan. Therefore, the presiding deity of the temple is supposed to be the infant form of Rama, an avatar of Vishnu. Rama in that infant form was referred as Ram Lalla (lit. 'Child Rama') by Tulsidas. However, the idol of Rama that was placed in 1949 referred to as Ram Lalla Virajman (lit. 'Installed Child Rama') by local Hindus.[41] Ram Lalla was a litigant in the court case over the disputed site in 1989, being considered a "juristic person" by the law.[3] He was represented by Triloki Nath Pandey, a senior VHP leader who was considered Ram Lalla's closest 'human' friend.[41] As a new idol of the deity got installed in the sanctum sanctorum as the Mūlavirāt mūrti (lit. The main presiding deity),[e] the Mandir Trust has informed that the Ram Lalla Virajman idol of 1949 shall henceforth be used as Utsava mūrti (lit. idol for festivals).[44]

History
See also: Ram Janmabhoomi and Ayodhya dispute
The site is the former location of the Babri Masjid, which was built in the 16th century. The mosque was attacked and demolished in 1992.[13][14][15] In 2019, the Supreme Court of India delivered the verdict to give the disputed land to Hindus for the construction of a temple, while Muslims would be given land elsewhere to construct a mosque.[45]

Medieval

1717 CE Map of Ayodhya's Ram Mandir at Jaipur City Palace
In 1528, the commander of the Mughal Empire, Mir Baqi, constructed the Babri Masjid mosque, under the order of Babur, on the site of the Ram Janmabhoomi, the mythical birthplace of Rama.[46][page needed] The earliest record of the mosque may be traced back to 1767, in the Latin book Descriptio Indiae, authored by the Jesuit missionary Joseph Tiefenthaler. According to him, the local population believed that the mosque was constructed by destroying the Ramkot temple, believed to be the fortress of Rama in Ayodhya, and the Bedi, where the birthplace of Rama is situated.[47][48]

The Ram Mandir is being built to commemorate the birth of Rama at his Janmasthan. The presiding deity of the temple is supposed to be the infant form of Rama, an avatar of Vishnu. Rama is a prominent Hindu deity who is regarded as a Pūrṇāvatāra of the God Vishnu and some Hindus view Rama as Para Brahman. Three places in Ayodhya are considered as prominent puṇya kṣētra by Hindus. They are Janmasthāna, Svargadvāra and Yajñasthala. The site is the former location of the Babri Masjid, which was built in the 16th century and demolished in 1992. In 2019, the Supreme Court of India delivered the verdict to give the disputed land to Hindus for the construction of a temple, while Muslims would be given land elsewhere to construct a mosque.

Rama, a Hindu deity and avatar of Vishnu, is the principal deity of Ram Mandir in Ayodhya. Considered the ideal man and embodiment of dharma, Rama's birthplace Ayodhya is sacred to Hindus. The temple site, formerly Babri Masjid, was disputed until a 2019 Supreme Court verdict allowed temple construction. The infant form, Ram Lalla, is the presiding deity.
The Ram Mandir is being built to commemorate the birth of Rama at his Janmasthan. The presiding deity of the temple is supposed to be the infant form of Rama, an avatar of Vishnu. The site is the former location of the Babri Masjid, which was built in the 16th century and demolished in 1992. In 2019, the Supreme Court of India delivered the verdict to give the disputed land to Hindus for the construction of a temple.

Version

0.1.30

Steps to Reproduce

mentioned above

Relevant Logs/Tracebacks

No response

MeghaWalia-eco commented 2 months ago

llama_index==0.10.30
llama-index-core==0.10.57
langchain==0.0.312
tiktoken
python-dotenv
fastapi==0.98.0
loguru
pymongo==4.6.1
motor
bs4
newrelic==8.8.1
okta_jwt_verifier
sqlalchemy==2.0.25
pgvector>=0.2.4,<0.3.0  # Adjusted for compatibility
psycopg2-binary==2.9.9
asyncpg==0.29.0
tenacity==8.2.3
opensearch-py==2.5.0
greenlet==2.0.2
uvicorn==0.24.0.post1
nltk==3.8.1
xmltodict==0.13.0
llama-hub==0.0.12
boto3==1.34.122
certifi==2023.7.22
alembic==1.11.3
python-jose==3.3.0
redis
cachetools==5.3.2
unstructured==0.10.19

# LLaMA related packages, ensure no version conflicts
llama-index-llms-openai==0.1.28
llama-index-agent-openai
llama-index-vector-stores-postgres==0.1.5
llama-index-vector-stores-opensearch==0.1.11
llama-index-embeddings-text-embeddings-inference==0.1.2
llama-index-storage-docstore-mongodb
llama-index-readers-s3
llama-index-storage-index-store-mongodb
llama-index-storage-kvstore-mongodb==0.1.2
llama-index-embeddings-openai
llama-index-llms-bedrock-converse==0.1.6
PyJWT==2.0.0

# For testing
pytest==7.2.1
pytest-asyncio==0.23.7
parameterized==0.9.0
aioresponses==0.7.6
pre-commit
sagemaker==2.223.0

# For CDK
aws-cdk-lib==2.146.0
constructs>=10.0.0,<11.0.0
setuptools==70.2.0

dosubot[bot] commented 2 months ago

To resolve the "Invalid payload! Need prompt and completion or messages and response" error when using the BedrockConverse LLM in your dev environment, ensure that you are providing the required parameters correctly. The error indicates that either the prompt and completion or messages and response parameters are missing or incorrectly formatted.

Here is an example of how to correctly use the complete method:

Using complete Method

Ensure you provide the prompt parameter:

from llama_index.llms.bedrock_converse import BedrockConverse

llm = BedrockConverse(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    region_name="us-east-1",
    timeout=60,
    # Callback manager wired up as in your service code
    callback_manager=self.__rag_svc.get_llm_callback_manager(
        handlers=[llm_debug_callback, llm_counter_callback]
    ),
)

resp = llm.complete(prompt="The Economist Style guide is below: ...")
print(resp)

Configuration Differences

The specific differences in configuration or environment between the local and dev environments that could affect the BedrockConverse LLM's behavior include the following (a runtime check is sketched after the list):

  1. AWS Credentials:

    • aws_access_key_id
    • aws_secret_access_key
    • aws_session_token
  2. AWS Region:

    • region_name
  3. AWS Profile:

    • profile_name
  4. Botocore Session and Configuration:

    • botocore_session
    • botocore_config
  5. Timeout and Retries:

    • timeout
    • max_retries
  6. Additional Configuration:

    • additional_kwargs
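To compare these between local and dev at runtime, a small check (a sketch, independent of LlamaIndex) is to log what boto3 actually resolves inside the container:

import boto3

# Sketch: print the region and credential source boto3 resolves inside the
# container, plus the identity of the assumed role, to compare with local.
session = boto3.Session()
creds = session.get_credentials()
print("region:", session.region_name)
print("credential method:", creds.method if creds else None)
print("caller identity:", boto3.client("sts").get_caller_identity()["Arn"])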

Logging and Debugging

Enable additional logging to get more insight into the error:

import logging
import sys

logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))

Additionally, you can enable the LlamaLogger in the service context to get detailed logs.
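If the failure happens below LlamaIndex, raising the botocore and boto3 loggers to DEBUG (a standard logging tweak, shown as a sketch) exposes the actual Bedrock request and response in the dev logs:

import logging

# Sketch: surface the raw HTTP calls made to Bedrock so permission or endpoint
# problems become visible in the container logs.
logging.getLogger("botocore").setLevel(logging.DEBUG)
logging.getLogger("boto3").setLevel(logging.DEBUG)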

By ensuring the correct parameters are provided and checking the configuration differences, you should be able to resolve the "Invalid payload" error [1][2][3].

To continue talking to Dosu, mention @dosu.

MeghaWalia-eco commented 2 months ago

Oh, I think you are right, but llm.complete already has prompt as its first parameter: def complete(self, prompt: str, formatted: bool = False, **kwargs: Any) -> CompletionResponse:

I don't think we need to pass it explicitly. The config is absolutely in sync; please drill down a bit more, I don't think it is something that basic.

dosubot[bot] commented 2 months ago

To further investigate the "Invalid payload! Need prompt and completion or messages and response" error, let's focus on the converse_with_retry function and ensure the payload is correctly formed before making the API call.

Debugging the Payload

Add debugging statements to verify the payload content before invoking the converse_with_retry function. This will help identify if the payload is correctly formed:

def chat(self, messages: Sequence[ChatMessage], **kwargs: Any) -> ChatResponse:
    converse_messages, system_prompt = messages_to_converse_messages(messages)
    if len(system_prompt) > 0 or self.system_prompt is None:
        self.system_prompt = system_prompt
    all_kwargs = self._get_all_kwargs(**kwargs)

    # Debugging statement
    print(f"Payload: messages={converse_messages}, system_prompt={self.system_prompt}, kwargs={all_kwargs}")

    response = converse_with_retry(
        client=self._client,
        messages=converse_messages,
        system_prompt=self.system_prompt,
        max_retries=self.max_retries,
        stream=False,
        **all_kwargs,
    )
    ...

Ensuring Correct Payload Structure

The converse_with_retry function requires a specific payload structure. Ensure the payload includes the necessary fields:

Here is the relevant code snippet for converse_with_retry:

def converse_with_retry(
    client: Any,
    model: str,
    messages: Sequence[Dict[str, Any]],
    max_retries: int = 3,
    system_prompt: Optional[str] = None,
    max_tokens: int = 1000,
    temperature: float = 0.1,
    stream: bool = False,
    **kwargs: Any,
) -> Any:
    """Use tenacity to retry the completion call."""
    retry_decorator = _create_retry_decorator(client=client, max_retries=max_retries)
    converse_kwargs = {
        "modelId": model,
        "messages": messages,
        "inferenceConfig": {
            "maxTokens": max_tokens,
            "temperature": temperature,
        },
    }
    if system_prompt:
        converse_kwargs["system"] = [{"text": system_prompt}]
    if tool_config := kwargs.get("tools"):
        converse_kwargs["toolConfig"] = tool_config
    converse_kwargs = join_two_dicts(
        converse_kwargs, {k: v for k, v in kwargs.items() if k != "tools"}
    )

    @retry_decorator
    def _conversion_with_retry(**kwargs: Any) -> Any:
        if stream:
            return client.converse_stream(**kwargs)
        return client.converse(**kwargs)

    return _conversion_with_retry(**converse_kwargs)
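For reference, the messages passed into this helper must already follow the Bedrock Converse API shape; a minimal hand-built payload (a sketch, useful for comparing against the debug print above) looks like this:

# Sketch: the structure messages_to_converse_messages is expected to produce
# and which client.converse() accepts.
converse_messages = [
    {"role": "user", "content": [{"text": "Please summarise the article ..."}]},
]
system_prompt = "Follow the Economist style guide."  # sent as [{"text": system_prompt}]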

Environment Configuration

Ensure that the environment configurations are consistent between local and dev environments, especially the AWS credentials, region, profile, botocore settings, timeouts and additional kwargs listed earlier.

By ensuring the correct payload structure and consistent environment configurations, you should be able to resolve the "Invalid payload" error [1][2].

To continue talking to Dosu, mention @dosu.

MeghaWalia-eco commented 2 months ago

As I already mentioned, I debugged it locally and it is working fine. I cannot attach a debugger on dev because packages are installed at runtime there.

MeghaWalia-eco commented 2 months ago

What permissions shall I add to the ECS container to access the Bedrock Converse API? Right now I have added the ones below:

for k in sm_secrets:
    sm_secrets[k].grant_read(task_definition.task_role)

content_drop_bucket.grant_read(task_definition.task_role)

bedrock_policy = iam.PolicyStatement(
    actions=[
        "bedrock:InvokeEndpoint",
        "bedrock:ListEndpoints"
    ],
    resources=["*"]
)
task_definition.add_to_task_role_policy(bedrock_policy)
dosubot[bot] commented 2 months ago

To ensure your ECS container can access the Bedrock Converse API correctly, you need to configure the following permissions and settings:

  1. AWS Credentials: Ensure that your ECS task has the necessary AWS credentials to access the Bedrock Converse API. This can be done by providing the aws_access_key_id, aws_secret_access_key, and optionally aws_session_token in your configuration. Alternatively, you can use an IAM role with the necessary permissions.

  2. IAM Role: Attach an IAM role to your ECS task with the necessary permissions to access the Bedrock Converse API. The role should have policies that allow actions such as bedrock:InvokeModel.

  3. Region Configuration: Ensure that the region_name is correctly set to the AWS region where the Bedrock Converse API is available.

  4. Network Configuration: Ensure that your ECS task is running in a VPC with the necessary network configurations to access the Bedrock Converse API endpoints.

Here is an example of how you might configure the BedrockConverse class with the necessary AWS credentials and region:

from llama_index.llms.bedrock_converse import BedrockConverse

llm = BedrockConverse(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    aws_access_key_id="YOUR_AWS_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_AWS_SECRET_ACCESS_KEY",
    aws_session_token="YOUR_AWS_SESSION_TOKEN",  # Optional
    region_name="us-east-1",
)

resp = llm.complete("Paul Graham is ")
print(resp)

Additionally, ensure that your ECS task definition includes the necessary IAM role and network configurations. Here is an example of an IAM policy that you might attach to the role:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": "*"
        }
    ]
}
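Since the task role in this setup is defined with CDK, an equivalent statement (a sketch reusing the task_definition and iam imports from the earlier snippet; the Converse API authorizes against the InvokeModel actions) would be:

from aws_cdk import aws_iam as iam

# Sketch: grant the actions the Converse API actually checks, instead of
# InvokeEndpoint/ListEndpoints; narrow Resource to model ARNs where possible.
bedrock_policy = iam.PolicyStatement(
    actions=[
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream",
    ],
    resources=["*"],
)
task_definition.add_to_task_role_policy(bedrock_policy)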

By ensuring these configurations and permissions, your ECS container should be able to access the Bedrock Converse API correctly [1].

To continue talking to Dosu, mention @dosu.

MeghaWalia-eco commented 2 months ago

Do you think the error "Error in summarization Invalid payload! Need prompt and completion or messages and response" could in any way be related to the ECS container not being able to access the Converse API? My prompt structure is right, as the Bedrock Converse API itself converts the prompt into the correct format.

dosubot[bot] commented 2 months ago

Yes, the error "Invalid payload! Need prompt and completion or messages and response" could be related to the ECS container's inability to access the Bedrock Converse API. To resolve this issue, ensure the following:

  1. IAM Role and Permissions:

    • Attach an IAM role to the ECS task with the necessary permissions to access the Bedrock Converse API.
    • The IAM role should have a policy that allows the bedrock:InvokeModel action. Here is an example policy:
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "bedrock:InvokeModel"
                ],
                "Resource": "*"
            }
        ]
    }
  2. Network Configuration:

    • Ensure the ECS task is running in a VPC with internet access or a VPC endpoint configured for the Bedrock Converse API.
  3. Environment Variables:

    • Ensure the necessary AWS credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_SESSION_TOKEN) and region (AWS_REGION) are correctly set in the ECS task definition or provided through the IAM role.

By ensuring these configurations, the ECS container should be able to access the Bedrock Converse API without issues [1].
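To separate an IAM or network problem from a payload problem, one direct check (a sketch that bypasses LlamaIndex; it assumes a boto3 version recent enough to expose converse, which the boto3==1.34.122 in the requirements above should be) is:

import boto3

# Sketch: call the Converse API directly from the container. An
# AccessDeniedException or connection timeout here points at the task role or
# VPC networking, not at the prompt payload built by LlamaIndex.
client = boto3.client("bedrock-runtime", region_name="us-east-1")
resp = client.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    messages=[{"role": "user", "content": [{"text": "ping"}]}],
)
print(resp["output"]["message"]["content"][0]["text"])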

To continue talking to Dosu, mention @dosu.

MeghaWalia-eco commented 2 months ago

OK, my ECS task is running in a VPC and I want to access the Bedrock Converse API from that ECS task. Tell me what configurations I need to change in addition to the ones above.

dosubot[bot] commented 2 months ago

To access the Bedrock Converse API from an ECS task running in a VPC, ensure the following configurations are in place:

  1. IAM Role and Policies:

    • Ensure the ECS task role has the necessary permissions to access AWS Bedrock services. This includes permissions for bedrock:InvokeModel and any other required actions.
    • Example policy:
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "bedrock:InvokeModel"
                ],
                "Resource": "*"
            }
        ]
    }
  2. VPC Configuration:

    • Subnets: Ensure the subnets have access to the internet if you are using public endpoints for AWS services.
    • Security Groups: Ensure the security groups allow outbound traffic to the necessary AWS service endpoints.
    • NAT Gateway: If your ECS tasks are in private subnets, ensure there is a NAT Gateway configured to allow outbound internet access.
  3. Endpoint Configuration:

    • If you are using VPC endpoints for AWS services, ensure the VPC endpoint for the Bedrock service is correctly configured (a CDK sketch follows this list).
  4. AWS Credentials and Region:

    • Ensure the necessary AWS credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_SESSION_TOKEN) and region (AWS_REGION) are correctly set in the ECS task definition or provided through the IAM role.
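For item 3, an interface endpoint in CDK (a sketch assuming a vpc construct already exists in the stack) might look like this; note that an endpoint in eu-west-2 only reaches the eu-west-2 Bedrock runtime, so a us-east-1 model still needs egress to that region:

from aws_cdk import aws_ec2 as ec2

# Sketch: private connectivity to the regional Bedrock runtime (Converse
# included) for tasks in private subnets without a NAT gateway.
vpc.add_interface_endpoint(
    "BedrockRuntimeEndpoint",
    service=ec2.InterfaceVpcEndpointAwsService("bedrock-runtime"),
)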

Here is an example of how you might configure the BedrockConverse class to use specific AWS credentials and region:

from llama_index.llms.bedrock_converse import BedrockConverse

llm = BedrockConverse(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    aws_access_key_id="YOUR_AWS_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_AWS_SECRET_ACCESS_KEY",
    aws_session_token="YOUR_AWS_SESSION_TOKEN",  # Optional
    region_name="us-east-1",
)

resp = llm.complete("Paul Graham is ")
print(resp)

By ensuring these configurations, your ECS container should be able to access the Bedrock Converse API correctly [1].

To continue talking to Dosu, mention @dosu.

logan-markewich commented 2 months ago

Going to close this out. The issue was that the token counter was masking the real error.

Removing the token counter would have shown the real error.

This is fixed in the latest versions (releasing the fix soon).
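For anyone hitting this before the fix lands, a quick way to confirm it (a minimal sketch: the same call as in the bug description, just with the callback manager removed) is to run the model once without the token counter so the underlying exception surfaces:

from llama_index.llms.bedrock_converse import BedrockConverse

# Sketch: no callback_manager here, so any boto3/Bedrock error (permissions,
# region, payload) is raised directly instead of being masked by the token
# counting handler.
llm = BedrockConverse(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    region_name="us-east-1",
)
print(llm.complete("ping"))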