-
**Is your feature request related to a problem?**
There is no way to start or stop SageMaker notebook instances via a CR.
**Describe the solution you'd like**
```
apiVersion: sagemaker.service…
```
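Whatever shape the CR spec ends up taking, the underlying lifecycle calls a controller would have to issue already exist in the SageMaker API. A minimal boto3 sketch (the notebook instance name is a placeholder, and the helper takes the client as a parameter so it can be exercised without AWS credentials):

```python
# Hypothetical notebook instance name, used for illustration only.
NOTEBOOK_NAME = "example-notebook"


def set_notebook_state(client, name: str, running: bool) -> str:
    """Start or stop a SageMaker notebook instance; return the action taken."""
    if running:
        client.start_notebook_instance(NotebookInstanceName=name)
        return "started"
    client.stop_notebook_instance(NotebookInstanceName=name)
    return "stopped"


if __name__ == "__main__":
    import boto3  # deferred import so the helper is importable without AWS access

    sm = boto3.client("sagemaker")
    print(set_notebook_state(sm, NOTEBOOK_NAME, running=False))
```

A reconciler watching the CR would call this on every spec change, diffing the desired state against `describe_notebook_instance`.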
-
Traceback (most recent call last):
File "/home/sagemaker-user/CogVLM/basic_demo/cli_demo_sat.py", line 162, in
main()
File "/home/sagemaker-user/CogVLM/basic_demo/cli_demo_sat.py", line 37…
-
Please address this issue: if a SageMaker endpoint deployment hits a resource limit, it gets stuck forever and there is no option to delete it: https://stackoverflow.com/questions/65678237/sagemaker-e…
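For reference, the deletion workaround people usually attempt looks like the sketch below; the point of the issue is that the `delete_endpoint` call can be rejected while the endpoint is still transitioning, so the helper and endpoint name here are illustrative only:

```python
def cleanup_endpoint(client, endpoint_name: str) -> str:
    """Look up the endpoint's status, then request deletion.

    Note: delete_endpoint may be rejected while the endpoint is still
    transitioning (e.g. stuck in Creating), which is exactly the state
    this issue describes.
    """
    status = client.describe_endpoint(EndpointName=endpoint_name)["EndpointStatus"]
    client.delete_endpoint(EndpointName=endpoint_name)
    return status


if __name__ == "__main__":
    import boto3  # deferred import so the helper is testable without AWS access

    print(cleanup_endpoint(boto3.client("sagemaker"), "example-endpoint"))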
-
Hi, I hope someone can help.
I'm getting this error when I try to deploy my Bento to SageMaker:
```
2024-02-07T14:58:58+0000 [INFO] [cli] Service loaded from Bento directory: bentoml.Service(t…
```
-
Currently, the SageMaker plugin SDK cannot start properly without a code patch to the SDK. The following is the error users get:
```
2022-06-28T16:29:02.737432Z [rstudio-sagemaker-launcher] INFO Re…
```
-
## Error on local search during embedding; global search works fine
```
creating embedding llm client with {'api_key': 'REDACTED,len=32', 'type': "openai_embedding", 'model': 'nomic-embed-t…
```
-
It is desirable to collect snow-patch ground truth (GT), independent of the SCL labels, at 10 m resolution. AWS SageMaker appears to have the capability. This issue concerns creation of the tool/environment …
-
This causes Exception https://github.com/BerriAI/litellm/blob/cace0bd6fbd77e3abf4723db7c1d459c90e5abe2/litellm/llms/sagemaker.py#L593
```
{
  "error": {
    "message": "Received client error (…
```
-
Hi, my team and I are trying to implement Flowise for our GenAI projects. We are wondering if there is a way to invoke ChatModels/LLMs/Embeddings through AWS SageMaker Endpoints similar to how Hugging…
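Outside of any framework integration, a SageMaker real-time endpoint can be called directly through the `sagemaker-runtime` client. A minimal sketch, where the endpoint name is a placeholder and the `{"inputs": ...}` payload shape is an assumption (it matches the Hugging Face TGI containers; other serving containers expect different request schemas):

```python
import json


def invoke_llm(runtime_client, endpoint_name: str, prompt: str):
    """Send a JSON payload to a SageMaker real-time endpoint and parse the reply.

    The {"inputs": ...} shape is assumed from the Hugging Face TGI
    convention; adjust to whatever schema your serving container expects.
    """
    resp = runtime_client.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        Body=json.dumps({"inputs": prompt}),
    )
    return json.loads(resp["Body"].read())


if __name__ == "__main__":
    import boto3  # deferred import so the helper is testable without AWS access

    rt = boto3.client("sagemaker-runtime")
    print(invoke_llm(rt, "example-endpoint", "Hello"))
```

A Flowise custom node (or a LangChain `SagemakerEndpoint` LLM wrapper) would sit on top of essentially this call.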
-
For the Type 3 AUC23 challenge we are organizing, it would be nice to give participants the option to develop their training algorithms in the SageMaker environment during the T2 part of the challenge…