Open kgparmar739 opened 2 years ago
@mahdighorbanpour I am getting below error
Response status code does not indicate success: BadRequest (400); Substatus: 0; ActivityId: d84019bf-37a2-4bde-9c88-acdb95b0ee56; Reason: (Message: {"Errors":["Encountered exception while executing function. Exception = TypeError: Object doesn't support property or method 'warn'\r\nStack trace: TypeError: Object doesn't support property or method 'warn'\n at Anonymous function (script.js:21:7)\n at Anonymous function (script.js:687:29)"]}
@kgparmar739 are you using v 1.0 or 2.0?
@kgparmar739 I've created a PR that will bubble the insert exceptions you are encountering up to the Serilog SelfLog, where you can get more information about what the issue is. Once it's accepted and pushed you can diagnose the issue. Please follow that PR for updates.
Thanks @tghamm, I will check with the latest version.
I have a question: will bulk import work if every record has its own partition key value? In my case, each record has a different partition key value.
@kgparmar739 that's probably your problem. This sink assumes all logs in a single batch will have the same partition key value. It's derived from the first event in the batch, so if others in the same batch don't have the same value, it might cause your error.
@kgparmar739 this can probably be worked around as well. Can you provide more insight into your partition provider and value?
@tghamm my partition provider is a custom one that reads the value from the message context, and the value is a new GUID.
@kgparmar739 ok thanks, give me a bit and I'll see what I can cook up to deal with it. The easy fix is to group by the key that's created, but in your case that means losing bulk insert as it's written, so I want to see if there's any way to rewrite the bulk query to handle your scenario.
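For illustration, a minimal sketch of that grouping approach could look like the code below. The names EmitBatchGroupedAsync, getPartitionKey, and bulkInsert are hypothetical stand-ins, not the sink's actual API.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Serilog.Events;

// Hypothetical helper: split a batch by partition key and issue one bulk call per group,
// so every document in a single call shares the partition key Cosmos DB expects.
static async Task EmitBatchGroupedAsync(
    IEnumerable<LogEvent> batch,
    Func<LogEvent, string> getPartitionKey,                  // however the sink resolves a key per event
    Func<string, IReadOnlyList<LogEvent>, Task> bulkInsert)  // stand-in for the sink's existing bulk call
{
    foreach (var group in batch.GroupBy(getPartitionKey))
    {
        await bulkInsert(group.Key, group.ToList());
    }
}

The trade-off mentioned above is visible here: a batch containing N distinct partition keys becomes N smaller bulk calls instead of one.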
@kgparmar739 is "PartitionKey" an object on your LogEvent? I notice you are using it as your partition key.
@kgparmar739 I'm having trouble reproducing. Even when using a guid it seems to insert; it just uses the first guid it comes across in the batch for all events in that batch, so they don't actually fail to insert. Would you be able to run my PR version and enable Serilog.Debugging.SelfLog.Enable(Console.WriteLine) to see what error message is being returned, to help further diagnose?
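For reference, a minimal sketch of enabling SelfLog at application startup (the Cosmos DB sink configuration itself is omitted) looks like this:

using System;
using Serilog;
using Serilog.Debugging;

// Route Serilog's internal errors (including sink insert failures) to the console.
// Call this once, before building the logger.
SelfLog.Enable(Console.WriteLine);

Log.Logger = new LoggerConfiguration()
    // ... add the Cosmos DB sink and any enrichers here ...
    .CreateLogger();

SelfLog.Enable also accepts any other Action<string> (or a TextWriter), so the output can be redirected to a file if console output is not convenient.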
No, PartitionKey is not an object on the LogEvent. Actually we don't want to use that property as our partition key; the real partition key resides inside the context, so it is something like properties/track/partitionkey, where track is our custom object that contains the information.
The model looks like this:
public class Log
{
    public Int64 EventIdHash { get; set; }
    public DateTime TimeStamp { get; set; }
    public string Level { get; set; }
    public string Message { get; set; }
    public string MessageTemplate { get; set; }
    public Properties Properties { get; set; }
    public Guid Id { get; set; }
    public string PartitionKey { get; set; }
}

public class Properties
{
    public Track Track { get; set; }
}

public class Track
{
    public string PartitionKey { get; set; }
}
While inserting, the PartitionKey value will be a GUID, i.e. always unique. When logging I am doing this:
Log.ForContext("Track", trackObj, true).Information($"{trackObj.PartitionKey}")
Let me know if you need more information.
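As a point of reference, reading that nested value back out of a Serilog LogEvent (for example inside a custom partition key provider) could look roughly like the sketch below; GetTrackPartitionKey is a hypothetical helper, not part of the sink's API. Because ForContext("Track", trackObj, true) destructures the object, the "Track" property is captured as a StructureValue whose members can be inspected.

using System.Linq;
using Serilog.Events;

// Hypothetical helper: dig Track.PartitionKey out of a destructured "Track" property.
static string GetTrackPartitionKey(LogEvent logEvent)
{
    if (logEvent.Properties.TryGetValue("Track", out var value) &&
        value is StructureValue track)
    {
        var pk = track.Properties.FirstOrDefault(p => p.Name == "PartitionKey");
        return (pk?.Value as ScalarValue)?.Value?.ToString();
    }

    // The event doesn't carry a Track object.
    return null;
}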
I will try running the PR version with SelfLog enabled and let you know by tomorrow EOD.
Thanks!
Hi, when I try to insert 10 records using the Postman runner, only 2 or 3 of the 10 get added to Cosmos DB; the rest fail with 400 Bad Request. Here is my code.
I have a simple POST API.
Is there any configuration or setting that I missed here?
Can you also guide me on how to enable detailed logging to capture more information about the failure?