Closed FrankIversen closed 9 months ago
Hi @FrankIversen,
How do you build your Docker image? Can you share your Dockerfile, please?
Kr,
Hi @LGouellec
This is the dockerfile:
```dockerfile
FROM mcr.microsoft.com/dotnet/aspnet:7.0-alpine AS base
WORKDIR /app
EXPOSE 3333
EXPOSE 443

FROM mcr.microsoft.com/dotnet/sdk:7.0-alpine AS build
COPY ["./nuget.config", "./streamService/"]
ENV PATH="${PATH}:/root/.dotnet/tools"
RUN dotnet tool install -g RecreateSolutionStructure
RUN export PATH="$PATH:/root/.dotnet/tools"
COPY ["Directory.Build.props", ".editorconfig", "stream.service.sln", "./streamService/"]
COPY ["stream.service.sln", "src/*/*.csproj", "src/*/*/*.csproj", "tests/*/*.csproj", "tests/*/*/*.csproj", "./streamService/"]
COPY ["./src/stream.service.Application/", "./streamService/src/stream.service.Application/"]
COPY ["./src/stream.service.Client.WebApi/", "./streamService/src/stream.service.Client.WebApi/"]
COPY ["./src/stream.service.Domain/", "./streamService/src/stream.service.Domain/"]
COPY ["./src/stream.service.Infrastructure/", "./streamService/src/stream.service.Infrastructure/"]
COPY ["./src/logging.lib/", "./streamService/src/logging.lib/"]
COPY ["./src/util.schema.events/", "./streamService/src/util.schema.events/"]
COPY ["./src/uuid.generator.lib/", "./streamService/src/uuid.generator.lib/"]
RUN recreate-sln-structure "./streamService/stream.service.sln"
RUN dotnet restore "./streamService/stream.service.sln" --configfile "./streamService/nuget.config"
RUN dotnet build "./streamService/src/stream.service.Client.WebApi/stream.service.Client.WebApi.csproj"

FROM build AS publish
RUN dotnet publish "./streamService/src/stream.service.Client.WebApi/stream.service.Client.WebApi.csproj" -c Release --property:PublishDir=/app/publish

FROM base AS final
ARG ROCKSDB_VERSION=v7.4.3
WORKDIR /app
COPY --from=publish /app/publish .
COPY --from=build "./streamService/src/stream.service.Client.WebApi/Properties" ./Properties/
RUN apk add --no-cache rocksdb libstdc++ bzip2 lz4
RUN ln -s /usr/lib/librocksdb.so.7 /usr/lib/librocksdb.so
RUN chmod a+x "stream.service.Client.WebApi.dll"
EXPOSE 3333
ENV ASPNETCORE_URLS=http://+:3333
ENV SERVICE_ENVIRONMENT=dev3
ENTRYPOINT ["sh", "-c", "dotnet stream.service.Client.WebApi.dll --environment=$SERVICE_ENVIRONMENT"]
```
@FrankIversen Can you enable the debug logs, rebuild your Docker image, and share all the logs, please?
```csharp
var config = new StreamConfig<StringSerDes, StringSerDes>
{
    ApplicationId = "test-app",
    BootstrapServers = "localhost:9092",
    AutoOffsetReset = AutoOffsetReset.Earliest,
    Logger = LoggerFactory.Create(b =>
    {
        b.SetMinimumLevel(LogLevel.Debug);
        b.AddConsole();
    }),
    Debug = "all"
};
```
yes, here we go:
```
info: streamService.Program[0]
      Starting up streamService....
info: streamService.Program[0]
      ENV=dev3
info: Streamiz.Kafka.Net.KafkaStream[0]
      stream-application[...]
      inner.exception.handler: System.Func`2[System.Exception,Streamiz.Kafka.Net.ExceptionHandlerResponse]
      production.exception.handler: System.Func`2[Confluent.Kafka.DeliveryReport`2[System.Byte[],System.Byte[]],Streamiz.Kafka.Net.ExceptionHandlerResponse]
      deserialization.exception.handler: System.Func`4[Streamiz.Kafka.Net.ProcessorContext,Confluent.Kafka.ConsumeResult`2[System.Byte[],System.Byte[]],System.Exception,Streamiz.Kafka.Net.ExceptionHandlerResponse]
      rocksdb.config.setter: System.Action`2[System.String,Streamiz.Kafka.Net.State.RocksDb.RocksDbOptions]
      follow.metadata: False
      state.dir: /tmp/streamiz-kafka-net
      replication.factor: 1
      windowstore.changelog.additional.retention.ms: 86400000
      offset.checkpoint.manager:
      metrics.interval.ms: 30000
      metrics.recording.level: INFO
      log.processing.summary: 00:01:00
      metrics.reporter: System.Action`1[System.Collections.Generic.IEnumerable`1[Streamiz.Kafka.Net.Metrics.Sensor]]
      expose.librdkafka.stats: False
      start.task.delay.ms: 5000
      parallel.processing: False
      max.degree.of.parallelism: 8
      application.id:
dbug: Streamiz.Kafka.Net.Kafka.Internal.KafkaLoggerAdapter[0]
      Log admin Unknown - [thrd:app]: Selected provider PLAIN (builtin) for SASL mechanism PLAIN
dbug: Streamiz.Kafka.Net.Kafka.Internal.KafkaLoggerAdapter[0]
      Log admin Unknown - [thrd:app]: Using statically linked OpenSSL version OpenSSL 3.0.8 7 Feb 2023 (0x30000080, librdkafka built with 0x30000080)
dbug: Streamiz.Kafka.Net.Kafka.Internal.KafkaLoggerAdapter[0]
      Log admin Unknown - [thrd:app]: Setting default CA certificate location to /etc/ssl/certs/ca-certificates.crt, override with ssl.ca.location
Segmentation fault (core dumped)
```
That looks strange: your application seg faults right at startup. @FrankIversen Does your application run fine as a plain executable, for instance on a virtual machine? What is the host environment where this container is running?
@LGouellec we traced the problem to the StreamConfig. When we disabled these two lines, as seen below, everything ran smoothly.
It is probably only one of the two lines that is the problem, and most likely the implementation of those handlers themselves, but it was enough to grind the entire thing to a halt with the segmentation fault as the result.
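For reference, a minimal sketch of the kind of change described, assuming the two disabled lines were the custom exception handlers (the property names follow Streamiz.Kafka.Net and the handler signatures match the types shown in the log above, but the bodies are hypothetical placeholders, not the actual code from this issue):

```csharp
// Sketch only: StreamConfig with the two suspected custom handlers disabled.
// Handler bodies are illustrative placeholders.
var config = new StreamConfig<StringSerDes, StringSerDes>
{
    ApplicationId = "stream-application",
    BootstrapServers = "localhost:9092",

    // Disabling these two custom handlers made the segfault go away:
    // ProductionExceptionHandler = report =>
    //     ExceptionHandlerResponse.FAIL,
    // DeserializationExceptionHandler = (context, record, exception) =>
    //     ExceptionHandlerResponse.FAIL,
};
```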
Issue closed, as the root cause turned out not to be in Streamiz.
Description
We are trying to use kafka-streams-dotnet in a Docker container, following the examples in kafka-streams-dotnet-samples for the container setup.
We are, however, hitting a snag. The Docker container starts without issue, but the application dies when it tries to create the stream application.
The last thing we see in the log is an info message printing the entire stream config, and after that we get nothing.
The log message begins with "Start creation of the stream application with this configuration:" ![image](https://github.com/LGouellec/kafka-streams-dotnet-samples/assets/33311049/42e8aa1f-ea37-49c5-88f5-55935f7a5288)
Bottom of the log message: ![image](https://github.com/LGouellec/kafka-streams-dotnet-samples/assets/33311049/fb836e68-08ea-4693-a5ac-6dbecf72e2e4)
Is this something you have experienced when setting up streams in a Docker container? Currently we have very little to go on. Do you have an idea how we can move forward? It works fine as long as we don't run it in a container.