-
Neither tool calling nor structured output works in ChatBedrock or ChatBedrockConverse when using the Llama 3.1 model.
Both work fine with Claude.
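For context on what the report is exercising, this is roughly the tool definition shape the underlying Bedrock Converse API expects when tools are bound; the `get_weather` tool and its schema here are invented for illustration:

```javascript
// Illustrative toolConfig in the shape the Bedrock Converse API expects;
// the get_weather tool and its schema are made up for this example.
const toolConfig = {
  tools: [
    {
      toolSpec: {
        name: "get_weather",
        description: "Return the current weather for a city.",
        inputSchema: {
          json: {
            type: "object",
            properties: { city: { type: "string" } },
            required: ["city"],
          },
        },
      },
    },
  ],
};

// ChatBedrockConverse forwards a structure like this as the request's
// toolConfig; whether the model then emits toolUse blocks in its reply
// is model-dependent, which is where Llama 3.1 appears to fall short.
console.log(Object.keys(toolConfig.tools[0].toolSpec).sort().join(","));
```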
-
### Describe the need of your request
Is it possible to connect to AWS Bedrock LLM models?
### Proposed solution
Add AWS Bedrock configuration to config list
### Additional context
_No response_
-
### Bug Description
I was trying to create a property graph using LlamaIndex and store those graphs in AWS Neptune DB, but I was getting a connection timed out error. I read that the applicat…
-
With `aws.sdk.kotlin:bedrockruntime` I can set up the client like this:
```kts
val client = BedrockRuntimeClient {
region = "eu-central-1"
credentialsProvider = ProfileCredentials…
```
-
-
**Is your feature request related to a problem? Please describe.**
The Dockerfile_ecs hardcodes port `80`. Although this can be remapped to any port in a local environment, this is not an option …
-
I would like to host the Docker container locally and pass the relevant environment variables. Is this possible, and do you have a list of the variable names which would be needed to allow access to A…
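A sketch of what such an invocation could look like, assuming the standard environment variables the AWS SDKs read for credentials and region; the image name and port mapping are placeholders, not from the project:

```shell
# Hypothetical image name and port mapping; the AWS_* variables are the
# standard ones the AWS SDKs resolve for credentials and region.
docker run -p 8080:80 \
  -e AWS_ACCESS_KEY_ID="..." \
  -e AWS_SECRET_ACCESS_KEY="..." \
  -e AWS_DEFAULT_REGION="eu-central-1" \
  my-bedrock-app:latest
```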
-
I'm trying to access the llama 3.1 405B model on AWS bedrock. Here's the code in my Sveltekit server side file:
```js
import { json } from '@sveltejs/kit';
import {PRIVATE_AWS_ACCESS_KEY, PRIVATE_…
```
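For reference, a minimal sketch of the Converse API request body such server-side code would send for that model; the model ID below is an assumption (verify the exact identifier for your region in the Bedrock console):

```javascript
// Sketch of a Converse API request body for Llama 3.1 405B.
// The model ID is an assumption; check the Bedrock console for the
// identifier available in your region.
const converseInput = {
  modelId: "meta.llama3-1-405b-instruct-v1:0",
  messages: [
    {
      role: "user",
      content: [{ text: "Hello from SvelteKit" }],
    },
  ],
  inferenceConfig: { maxTokens: 512, temperature: 0.5 },
};

// This object would be passed to `new ConverseCommand(converseInput)`
// and sent via BedrockRuntimeClient.send(...) from @aws-sdk/client-bedrock-runtime.
console.log(converseInput.messages[0].content[0].text);
```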
-
### Problem Statement
Add support for https://docs.sentry.io/product/insights/llm-monitoring/ in JavaScript
### Solution Brainstorm
Things we can support:
- https://github.com/getsentry/sentry-jav…
-
### Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the [Continue Discord](https://discord.gg/NWtdYexhMs) for questions
- [X] I'm not able to find an [open issue](ht…