aws / sagemaker-python-sdk

A library for training and deploying machine learning models on Amazon SageMaker
https://sagemaker.readthedocs.io/
Apache License 2.0

Unable to deploy huggingface-llm 1.3.3 #4332

Closed LvffY closed 10 months ago

LvffY commented 11 months ago

Describe the bug

I'd like to deploy the Mistral 0.2 LLM on SageMaker, and it seems this requires Hugging Face LLM container version 1.3.3. For now, the huggingface-llm image is limited to a set of versions that does not include it.

To reproduce

Run the following code:

#!/usr/bin/env python3
import json
import re

import boto3
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

try:
    role = sagemaker.get_execution_role()
except ValueError:
    iam = boto3.client("iam")
    role = iam.get_role(RoleName="exec-role")["Role"]["Arn"]

# Hub Model configuration. https://huggingface.co/models
hub = {
    "HF_MODEL_ID": "mistralai/Mistral-7B-Instruct-v0.2",
    "SM_NUM_GPUS": json.dumps(1),
    # "HF_MODEL_QUANTIZE": "gptq",
    # 'HF_TASK':'question-answering',
    # Enable to have long input length, and override default sagemaker values
    # See https://github.com/facebookresearch/llama/issues/450#issuecomment-1645247796
    "MAX_INPUT_LENGTH": json.dumps(4095),
    "MAX_TOTAL_TOKENS": json.dumps(4096),
}

# Ensure endpoint name will be compliant for AWS
regex = r"[^\-a-zA-Z0-9]+"

compliant_name = re.sub(regex, "-", hub["HF_MODEL_ID"])

# create Hugging Face Model Class
huggingface_model = HuggingFaceModel(
    # Here we'd like to have at least 1.3.3
    # See https://github.com/huggingface/text-generation-inference/issues/1342
    image_uri=get_huggingface_llm_image_uri("huggingface", version="1.3.3"),
    env=hub,
    role=role,
    name=compliant_name,
)

# deploy model to SageMaker Inference
predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",
    container_startup_health_check_timeout=300,
    endpoint_name=compliant_name,
)

Expected behavior

Being able to deploy the huggingface llm version 1.3.3.

Screenshots or logs

[screenshot of the error attached]

System information

Additional context

If it's a quick fix I could probably help for the PR if needed.

cfregly commented 11 months ago

I believe @philschmid mentioned that we're waiting for this PR to be accepted: https://github.com/aws/sagemaker-python-sdk/pull/4314

[screenshot attached]

LvffY commented 11 months ago

@cfregly I don't think we're waiting for the same version: @philschmid seemed to be waiting for version 1.3.1, while I'd like to see version 1.3.3.

But I may look into this PR to see if I can update the SDK myself if no one answers here :)
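In the meantime, one way to see which TGI container versions the installed SDK already knows about is to probe `get_huggingface_llm_image_uri` and catch the `ValueError` it raises for unknown versions. A small sketch under that assumption (the guarded import, the hypothetical `supported` helper, and the version list are just for illustration):

```python
# Probe which TGI container versions the installed sagemaker SDK can resolve.
# Unsupported versions raise ValueError from the SDK's version lookup.
try:
    from sagemaker.huggingface import get_huggingface_llm_image_uri
except ImportError:
    get_huggingface_llm_image_uri = None  # sagemaker not installed

def supported(version: str) -> bool:
    """Return True if the installed SDK can resolve this TGI version."""
    if get_huggingface_llm_image_uri is None:
        return False
    try:
        # region only affects the account/hostname of the returned URI
        get_huggingface_llm_image_uri(
            "huggingface", version=version, region="us-east-1"
        )
        return True
    except ValueError:
        return False

if __name__ == "__main__":
    for v in ("1.3.1", "1.3.3"):
        print(v, "supported:", supported(v))
```

The lookup reads a version table bundled with the SDK, so the result depends on which `sagemaker` release you have installed, not on anything server-side.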

philschmid commented 11 months ago

1.3.1 should be released in the SDK by now. https://github.com/aws/sagemaker-python-sdk/pull/4314

@LvffY can you share why you need 1.3.3? This could help us accelerate the release.

LvffY commented 11 months ago

@philschmid The main idea is to be able to run Mistral 0.2 models. For now, all supported versions throw the issue described in the huggingface repository.

Looking at the comments, this should be fixed by this PR, which is included in the latest released version, 1.3.3.

cfregly commented 11 months ago

Confirmed that 1.3.1 (SageMaker Python SDK 2.200.1) still throws the same error.

amzn-choeric commented 11 months ago

Noting that the reason we are unable to fetch a v1.3.3 image through the SDK is that there is no actual DLC release for v1.3.3 itself. It is not a bug in the SDK from what I have read so far.

LvffY commented 11 months ago

> Noting that the reason we are not able to fetch a v1.3.3 image through the SDK is because there is no actual DLC release in itself for v1.3.3. It is not a bug in the SDK from what I have read so far.

So what's the way to go here?

amzn-choeric commented 11 months ago

We will release 1.3.3 through DLC with an ETA of tomorrow.

The associated SDK change can be tracked through https://github.com/aws/sagemaker-python-sdk/pull/4335. However, you can also just reference the image URIs specifically to avoid waiting for an SDK release. Available tags and sample image URI can be found here: https://github.com/aws/deep-learning-containers/releases?q=tgi&expanded=true.
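To illustrate that workaround, the image URI can be pinned by hand instead of resolved through `get_huggingface_llm_image_uri`. A minimal sketch, assuming the public AWS DLC ECR account `763104351884` and an illustrative region and tag (take the real tag from the deep-learning-containers release notes linked above):

```python
# Sketch: pin an explicit TGI DLC image URI instead of resolving it via
# get_huggingface_llm_image_uri(). 763104351884 is the public AWS DLC
# registry account; the region and tag below are illustrative -- confirm
# the exact tag on the deep-learning-containers releases page.
region = "us-east-1"
tag = "2.1.1-tgi1.3.3-gpu-py310-cu121-ubuntu20.04"  # illustrative tag
image_uri = (
    f"763104351884.dkr.ecr.{region}.amazonaws.com/"
    f"huggingface-pytorch-tgi-inference:{tag}"
)
# This string can then be passed as image_uri=... to HuggingFaceModel.
print(image_uri)
```

Pinning this way avoids waiting for an SDK release, at the cost of updating the tag yourself when a new container version ships.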

pangyiwei commented 10 months ago

@amzn-choeric Is it possible to release 1.3.4 through DLC as well?

1.3.4 has a fix for this issue, which will allow some Mistral models (with flash attention v2) to run on instances with non-Ampere GPUs.

amzn-choeric commented 10 months ago

I believe HuggingFace has requested that we hold until they are able to merge the fixes in for https://github.com/huggingface/text-generation-inference/issues/1334.

Michellehbn commented 10 months ago

hi! A fix has been applied for https://github.com/huggingface/text-generation-inference/issues/1334. cc @philschmid

amzn-choeric commented 10 months ago

I believe that fix would need to be included in a new release version as deemed appropriate, and we can then discuss the next steps with HuggingFace.

Regarding the issue at hand, though, the SDK change has been merged. Thus, closing the issue.