Closed N-E-W-T-O-N closed 1 year ago
If it works, I will open a PR against the main project.
I am currently deploying this example using Docker. One caveat: you need to override the `Kestrel__Endpoints__Https__Url` setting. HTTPS inside containers can cause problems, so I set the environment variable `Kestrel__Endpoints__Https__Url=http://+:80` to make Kestrel listen on plain HTTP instead.
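For context, that variable overrides a Kestrel endpoint defined in appsettings.json; the `__` in the variable name maps onto the nested JSON keys. Based on the port mentioned further down (40443), the section being overridden presumably looks something like this (exact values are an assumption):

```json
{
  "Kestrel": {
    "Endpoints": {
      "Https": {
        "Url": "https://localhost:40443"
      }
    }
  }
}
```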
webapp

```dockerfile
FROM node:lts-alpine AS build
WORKDIR /build
COPY . ./
RUN --mount=type=cache,target=/root/.yarn YARN_CACHE_FOLDER=/root/.yarn \
    yarn && yarn build

FROM httpd:alpine
WORKDIR /usr/local/apache2/htdocs/
COPY --from=build /build/build/ .
```
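One thing to watch with this image: Create React App inlines `REACT_APP_*` variables at build time, so the backend URI has to be available when `yarn build` runs, not when the container starts. A sketch of the build stage with that wired in (the default value here is an assumption matching the compose port mapping below):

```dockerfile
FROM node:lts-alpine AS build
WORKDIR /build
COPY . ./
# Hypothetical: pass the backend URI as a build arg so CRA can inline it
ARG REACT_APP_BACKEND_URI=http://localhost:8090/
ENV REACT_APP_BACKEND_URI=$REACT_APP_BACKEND_URI
RUN yarn && yarn build
```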
webapi

```dockerfile
FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build
WORKDIR /src
# Publish
COPY . ./
RUN --mount=type=cache,id=nuget,target=/root/.nuget/packages \
    dotnet build -c Release && dotnet publish -c Release --no-restore -o /out

# Build runtime image
FROM mcr.microsoft.com/dotnet/aspnet:6.0
WORKDIR /app
COPY --from=build /out .
ENTRYPOINT ["dotnet", "CopilotChatWebApi.dll"]
```
docker-compose.yaml

```yaml
version: '3'
services:
  backend:
    container_name: webapi
    build: ./webapi
    ports:
      # replace your webapp's .env 'REACT_APP_BACKEND_URI' with this address
      - 8090:80
    environment:
      - Kestrel__Endpoints__Https__Url=http://+:80
      - AIService__Endpoint=REPLACE_YOUR_ENDPOINT
      - AIService__Key=REPLACE_YOUR_KEY
      - AIService__Models__Completion=gpt-35-turbo-0301
      - AIService__Models__Embedding=text-embedding-ada-002
      - AIService__Models__Planner=gpt-35-turbo-0301
      - AIService__Type=AzureOpenAI
      - AllowedOrigins__0=http://localhost:3000
      - ChatStore__Filesystem__FilePath=./data/chatstore.json
      - ChatStore__Type=filesystem
      - MemoriesStore__Qdrant__Host=http://qdrant
      - MemoriesStore__Type=qdrant
    volumes:
      - chat_store_data:/app/data
    depends_on:
      - qdrant
  frontend:
    container_name: webapp
    build: ./webapp
    ports:
      - 3000:80
  qdrant:
    container_name: qdrant
    image: qdrant/qdrant:latest
    ports:
      - 6333:6333
volumes:
  chat_store_data:
```
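With the three files above in place, the stack can be brought up in one step (commands assume Docker Compose v2; service names are taken from the compose file above):

```shell
# build all three images and start the stack in the background
docker compose up --build -d

# tail the backend logs to verify Kestrel bound to http://+:80
docker compose logs -f backend
```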
Thanks to @JadynWong for the Dockerfiles.
I am working on containerizing the Copilot Chat sample using docker-compose. Running Copilot requires two images: 1) webapi (backend), written in .NET, which uses Semantic Kernel to call the LLM and generate an answer based on the user's query, and 2) webapp (frontend), written in TypeScript.
I was able to build the webapp image successfully; it runs on localhost:3000.
Now I am facing an issue running the webapi image. The first thing I observed is that the address localhost:40443 is being passed into the build through the appsettings.json file. I don't know what I am doing wrong. Can anyone tell me?
Note: currently, I am passing all credentials directly in appsettings.json.
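A way out of hard-coding credentials: ASP.NET Core's environment-variable configuration provider treats `__` in a variable name as the `:` section separator, which is why the compose entries above (e.g. `AIService__Key`) override the matching appsettings.json keys without editing the file. A minimal sketch of that name mapping in Python (illustrative only, not the actual provider):

```python
def env_to_config_key(name: str) -> str:
    """Mimic ASP.NET Core's env-var provider: '__' becomes the ':' separator."""
    return name.replace("__", ":")

# The compose file's variables target appsettings.json paths like this:
print(env_to_config_key("AIService__Key"))                  # AIService:Key
print(env_to_config_key("Kestrel__Endpoints__Https__Url"))  # Kestrel:Endpoints:Https:Url
```

So the credentials can stay out of appsettings.json entirely and be supplied via the `environment:` section (or an env file) in docker-compose.yaml.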