markusz opened this issue 2 weeks ago
v4.0.14 works, but `main` still points to `v3.1.3`, so this likely affects everyone following the deploy instructions in the docs.
Added a fix in #594
That was fast. Thanks!
Well, when I run `npm run config` in Cloud9 in eu-central-1 and select RAG with Knowledge Base, it exits without asking me to save a config file, and no config.json is created. Same with `npm run create`. I suspect this is related to the change made to config.ts for this fix.
@yoavchaws, I added a fix. Sorry about that.
Thanks @charles-marion. The fix works fine. However, when I create a Bedrock-based RAG configuration, it does not ask me which embedding model (`embeddingModels[]`) to use. As a result, the cdk bootstrap operation fails later with "no embedding model set". A workaround is to set `default` to `true` on one of the `embeddingModels[]` entries.
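For reference until the fix lands, here is a minimal sketch of that workaround, assuming the generated config.json sits in the current directory (the actual path and schema in this repo may differ):

```ts
import * as fs from "fs";

// Load the config.json produced by `npm run config` (path assumed),
// mark one embeddingModels[] entry as the default, and write it back.
const config = JSON.parse(fs.readFileSync("config.json", "utf8"));

if (Array.isArray(config.embeddingModels) && config.embeddingModels.length > 0) {
  config.embeddingModels[0].default = true; // the workaround: one model must have default: true
}

fs.writeFileSync("config.json", JSON.stringify(config, null, 2));
```

Running this once (or editing the file by hand to the same effect) before the cdk bootstrap/deploy step should clear the "no embedding model set" error.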
Added a fix in PR https://github.com/aws-samples/aws-genai-llm-chatbot/pull/598. Please use the workaround listed above until it is merged.
Thank you for reporting the issue, @yoavchaws. The change is merged.
Running `npm run config` fails reproducibly when following the deploy instructions. I get the following error:
I verified `npm run config` failing in different environments and configurations.

Steps to reproduce:
Check out `main`. This will point to `v3.1.3` of the aws-genai-llm-chatbot.