Closed. fanliang11 closed this issue 6 years ago.
@fanliang11 do you have some step to reproduce the problem?
@fanliang11 On Linux it may be https://github.com/Azure/DotNetty/pull/336
@caozhiyuan it looks like a read issue, not the outbound write.
It's because the sent bytes get mixed up, so packets parse incorrectly. When I send 40 MB, on Linux the socket returns EWOULDBLOCK, and there was a bug where only about 1 MB might actually be sent; Netty treated the send as complete and sent the next message, so the server fails to parse. PR #336 fixed it. The code runs fine on .NET Core SDK 2.0.3 or later; the bug is present with .NET Core SDK 2.0.0 on Linux.
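The failure mode described above (a partial non-blocking write treated as a complete send) can be sketched like this. This is an illustrative sketch, not DotNetty's actual internals; `trySend` stands in for a non-blocking socket write that may accept fewer bytes than requested.

```csharp
using System;

// Hypothetical sketch of the partial-send hazard. If the sender treats the
// first (possibly partial) write as complete and moves on to the next message,
// the receiver sees a truncated/interleaved byte stream and every later frame
// fails to parse. The fix is essentially this loop: keep writing the remainder.
static class PartialSendSketch
{
    public static int SendAll(Func<int, int> trySend, int total)
    {
        int sent = 0;
        while (sent < total)
        {
            // trySend returns how many bytes were actually accepted;
            // it may be fewer than requested (e.g. after EWOULDBLOCK).
            sent += trySend(total - sent);
        }
        return sent;
    }
}
```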
@caozhiyuan anyway, if you could share some steps to reproduce, that would be a bit more helpful.
ServerBootstrap pipeline addition:

```csharp
pipeline.AddLast(new TransportMessageChannelHandlerAdapter(_transportMessageDecoder));
```

TransportMessageChannelHandlerAdapter:

```csharp
class TransportMessageChannelHandlerAdapter : ChannelHandlerAdapter
{
    private readonly ITransportMessageDecoder _transportMessageDecoder;

    public TransportMessageChannelHandlerAdapter(ITransportMessageDecoder transportMessageDecoder)
    {
        _transportMessageDecoder = transportMessageDecoder;
    }

    #region Overrides of ChannelHandlerAdapter

    public override void ChannelRead(IChannelHandlerContext context, object message)
    {
        var buffer = (IByteBuffer)message;
        try
        {
            // Copy the readable bytes out of the buffer and decode them.
            var data = new byte[buffer.ReadableBytes];
            buffer.ReadBytes(data);
            var transportMessage = _transportMessageDecoder.Decode(data); // throws here (the reported bug)
            context.FireChannelRead(transportMessage);
        }
        finally
        {
            // Release in finally so the buffer is not leaked when Decode throws.
            ReferenceCountUtil.Release(buffer);
        }
    }

    #endregion Overrides of ChannelHandlerAdapter
}
```
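Note that the handler above assumes every `ChannelRead` delivers exactly one complete frame, so a `LengthFieldBasedFrameDecoder` must sit before it in the pipeline. A minimal sketch of that ordering follows; the 4-byte length prefix and the field offsets are assumptions for illustration, not surging's actual wire format.

```csharp
// Sketch: the frame decoder must come before the message handler so that
// ChannelRead only ever sees one complete frame, never a partial or
// coalesced buffer. Field sizes here are illustrative assumptions.
bootstrap.ChildHandler(new ActionChannelInitializer<IChannel>(channel =>
{
    var pipeline = channel.Pipeline;
    pipeline.AddLast(new LengthFieldPrepender(4));            // outbound: prepend 4-byte length
    pipeline.AddLast(new LengthFieldBasedFrameDecoder(
        int.MaxValue,  // maxFrameLength: must exceed the largest expected message
        0,             // lengthFieldOffset
        4,             // lengthFieldLength
        0,             // lengthAdjustment
        4));           // initialBytesToStrip: drop the prefix before ChannelRead
    pipeline.AddLast(new TransportMessageChannelHandlerAdapter(_transportMessageDecoder));
}));
```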
Can big data not be decoded? Is it caused by sending in batches?
windows or linux? @fanliang11
@caozhiyuan Thank you for your answer. I have only tested on Windows, where the problem is thrown. I get decoding errors whether I use Json.NET, MessagePack, or protobuf.
@fanliang11 could you also share the data send to the server?
Just by looking at this, have you considered ByteOrder, as @caozhiyuan mentioned above? Also, if the data is sent from another platform, there might be byte-order differences. @fanliang11 could you capture the received data?
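A captured hex dump makes a byte-order mismatch easy to spot. As an illustrative sketch (not surging's actual framing): a 4-byte length prefix of 300 written big-endian reads back as a huge bogus frame length when interpreted little-endian, which is exactly the kind of thing the capture would reveal.

```csharp
using System;

// Illustrative only: shows how a big-endian length prefix is laid out in
// memory and why reading it with the opposite byte order corrupts framing.
static class ByteOrderDemo
{
    // Encode a 32-bit length as big-endian (network order) bytes.
    public static byte[] BigEndian(uint value)
    {
        var bytes = BitConverter.GetBytes(value);
        // BitConverter uses the platform's byte order (little-endian on
        // x86/x64), so reverse to get network order.
        if (BitConverter.IsLittleEndian) Array.Reverse(bytes);
        return bytes;
    }

    // Misread the prefix with the platform's (little-endian) byte order.
    public static uint ReadPlatformOrder(byte[] prefix)
    {
        return BitConverter.ToUInt32(prefix, 0);
    }
}
```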
@StormHub Thank you for your answer. I encode with a third-party component on the sending side, and then receive and decode on the other side.
Maybe this bug? https://github.com/Azure/DotNetty/pull/326/files . The published NuGet package has a bug in LengthFieldPrepender.cs. @fanliang11, git clone DotNetty, use the dev branch, and try again.
@fanliang11 could you capture the data causing the error? I am wondering whether it is a byte-order issue or not.
@StormHub I am sorry for the late reply. No error was thrown; an incomplete packet was received, resulting in a decoding failure.
@caozhiyuan Thank you for your answer. This seems to be caused by that problem. When will it be released on NuGet?
DotNetty is very good, so surging, a microservice framework, is based on DotNetty: https://github.com/dotnetcore/surging/tree/master/src/Surging.Core/Surging.Core.DotNetty . But there is a problem: I decode in the ChannelRead method, and decoding anything over 40 KB throws an error. I have set LengthFieldBasedFrameDecoder; do I also need to set other options? Please help me, this problem has troubled me for a long time.
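One thing worth double-checking for messages over 40 KB: `LengthFieldBasedFrameDecoder` rejects any frame longer than its `maxFrameLength` (throwing a TooLongFrameException), so that limit must exceed the largest message you send. A minimal sketch with illustrative numbers (the offsets and prefix size are assumptions, not surging's actual settings):

```csharp
// Illustrative: if maxFrameLength is smaller than the payload, the decoder
// rejects the frame instead of delivering it, so large messages never reach
// ChannelRead. Raise the first argument above your largest expected message.
var decoder = new LengthFieldBasedFrameDecoder(
    64 * 1024,  // maxFrameLength: must be >= largest expected message
    0,          // lengthFieldOffset
    4,          // lengthFieldLength
    0,          // lengthAdjustment
    4);         // initialBytesToStrip
```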