supabase-community / postgrest-csharp

A C# Client library for Postgrest
https://supabase-community.github.io/postgrest-csharp/api/Postgrest.html
MIT License

Issue sending non-string data through RPC #74

Closed · corrideat closed 1 year ago

corrideat commented 1 year ago

Bug report

Describe the bug

Line 100 in Postgrest/Client.cs seems wrong, but also intentional.

It is taking a Dictionary<string, object> in, serialising it as JSON and then deserialising it as a Dictionary<string, string>.

Why is this being done? Forcing every value to a string seems to break nested structures, with no good alternative, unless I'm missing something.
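
A minimal sketch of that round-trip as I read it (paraphrased with Newtonsoft.Json, not the exact code in Client.cs):

using System.Collections.Generic;
using Newtonsoft.Json;

var parameters = new Dictionary<string, object> {
    { "bar", new Dictionary<string, object> { { "baz", "qux" } } }
};

// Serialising gives {"bar":{"baz":"qux"}} ...
string json = JsonConvert.SerializeObject(parameters);

// ...and deserialising that back as Dictionary<string, string> throws
// ("Unexpected character encountered while parsing value: {"),
// because the value of "bar" is a JSON object, not a string.
var flattened = JsonConvert.DeserializeObject<Dictionary<string, string>>(json);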

To Reproduce

For example, the following gets a deserialization error:

await db.Rpc("foo", new Dictionary<string, object> {
    {
        "bar",
        new Dictionary<string, object> { { "baz", "qux" } }
    }
});

Expected behavior

Objects should work. I believe the fix could be as simple as (1) not doing the serialise/deserialise round-trip at all, or (2) deserialising as Dictionary<string, object>.
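
As a sketch of option (2), assuming the surrounding code still wants a dictionary to hand to the request (nested values come back as JObject, which re-serialises correctly):

using System.Collections.Generic;
using Newtonsoft.Json;

var parameters = new Dictionary<string, object> {
    { "bar", new Dictionary<string, object> { { "baz", "qux" } } }
};

// Option (2): keep the round-trip but preserve object values, so nested
// dictionaries survive instead of being forced into strings.
var parsedParams = JsonConvert.DeserializeObject<Dictionary<string, object>>(
    JsonConvert.SerializeObject(parameters));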

wiverson commented 1 year ago

What kind of non-string/JSON data are you trying to send? blob/bytea? Off the cuff I think most folks are using the storage APIs for that rather than PostgREST...

corrideat commented 1 year ago

@wiverson That was fast! I'm trying to send JSON data, just not strings (and typically I don't do C#, so I may have missed something obvious).

Am I missing something for sending nested JSON data?

Simplified, this is my actual code:

var data = new Dictionary<string, object> {
    {
        "data",
        new Dictionary<string, object> {
            { "year", 2023 },
            { "start_date", "2023-01-01" },
            { "end_date", "2023-12-31" },
            { "currency", "USD" },
        }
    },
    { "source_id", 1 },
    {
        "organization_external_id",
        new Dictionary<string, object> {
            { "name", "foo" },
            { "value", "bar" },
        }
    },
};

await supabase.Rpc("import_data", data);

Running that results in this stacktrace:

Unhandled exception. Newtonsoft.Json.JsonReaderException: Unexpected character encountered while parsing value: {. Path 'data', line 1, position 9.
   at Newtonsoft.Json.JsonTextReader.ReadStringValue(ReadType readType)
   at Newtonsoft.Json.JsonTextReader.ReadAsString()
   at Newtonsoft.Json.JsonReader.ReadForType(JsonContract contract, Boolean hasConverter)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.PopulateDictionary(IDictionary dictionary, JsonReader reader, JsonDictionaryContract contract, JsonProperty containerProperty, String id)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateObject(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateValueInternal(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent)
   at Newtonsoft.Json.JsonSerializer.DeserializeInternal(JsonReader reader, Type objectType)
   at Newtonsoft.Json.JsonSerializer.Deserialize(JsonReader reader, Type objectType)
   at Newtonsoft.Json.JsonConvert.DeserializeObject(String value, Type type, JsonSerializerSettings settings)
   at Newtonsoft.Json.JsonConvert.DeserializeObject[T](String value, JsonSerializerSettings settings)
   at Newtonsoft.Json.JsonConvert.DeserializeObject[T](String value)
   at Postgrest.Client.Rpc(String procedureName, Dictionary`2 parameters)
   at Supabase.Client.Rpc(String procedureName, Dictionary`2 parameters)

If I were using the HTTP API directly, I'd send a JSON payload looking like this:

{
    "data": {
        "year": 2023,
        "start_date": "2023-01-01",
        "end_date": "2023-12-31",
        "currency": "USD"
    },
    "source_id": 1,
    "organization_external_id": {
        "name": "foo",
        "value": "bar"
    }
}
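
Roughly what that direct call looks like from C# (a sketch only; the project URL and key are placeholders, and depending on the setup an Authorization header may be needed as well):

using System.Net.Http;
using System.Text;

// Sketch: post the payload above straight to the PostgREST RPC endpoint.
var http = new HttpClient();
http.DefaultRequestHeaders.Add("apikey", "<anon-or-service-role-key>");

var payload = @"{
    ""data"": { ""year"": 2023, ""start_date"": ""2023-01-01"",
                ""end_date"": ""2023-12-31"", ""currency"": ""USD"" },
    ""source_id"": 1,
    ""organization_external_id"": { ""name"": ""foo"", ""value"": ""bar"" }
}";

var response = await http.PostAsync(
    "https://<project>.supabase.co/rest/v1/rpc/import_data",
    new StringContent(payload, Encoding.UTF8, "application/json"));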

Or, using the JavaScript API, it'd look like:

await supabase
  .rpc('import_data', {
    data: {
      year: 2023,
      start_date: '2023-01-01',
      end_date: '2023-12-31',
      currency: 'USD',
    },
    source_id: 1,
    organization_external_id: {
      name: 'foo',
      value: 'bar',
    },
  });

wiverson commented 1 year ago

Oh, you just caught me right while I was cleaning out email ;)

Here's what I'm doing; it might be helpful. Getting the RPC stuff working was one of the first things I did when I started working with Supabase C#, and once I got it working I kind of moved on to the next thing, which was sorting out auth.

public async Task<Child> Create(Family family, string nickname)
{
    Task<BaseResponse> createChild = _supabase.Rpc("child_add",
        new Dictionary<string, object>
        {
            { "family_uuid", family.Id },
            { "child_nickname", nickname }
        });
    try
    {
        await createChild;
        if (createChild.Exception != null)
            throw new ServerException(createChild.Exception.Message, createChild.Exception);

        Child child = new()
        {
            Id = GuidUtil.ToGuid(createChild.Result),
            FamilyId = family.Id,
            Nickname = nickname
        };

        return child;
    }
    catch (PostgrestException requestException)
    {
        throw new ServerException(requestException.Message, requestException);
    }
}

TBH this is probably not right, in that I think I'm doing more work than I should in terms of the marshaling back. IIRC there is a version of calling the rpc stuff that lets you do fancier things (e.g. filtering if it's a table result set), but this is working for me. Now I have a bunch of services and test cases all working like this and I don't know if I'm going to bother cleaning up. Ahem.

corrideat commented 1 year ago

Thanks! Well, your example works because your datatypes are strings and you aren't nesting dictionaries. While I could fall back to strings and tuple types, it's not quite as convenient as named parameters, especially if the DB schema changes.

I was mostly wondering if there's a reason for the double marshalling (maybe some kind of implicit validation?). I could make a PR with the changes to support this; it looks simple enough, probably changing just that one line plus testing.

wiverson commented 1 year ago

Yeah, I think a PR with other ways to call the RPC would be nice. It's a bit confusing, as there's calling a function with a string argument and then there's calling it with JSON, and I'm not quite sure how that all matches up. I'm also just focused on writing PL/pgSQL and not using JavaScript/V8. At first I was interested, but then I realized the JS stuff isn't like the modern TS stuff I'm used to with SvelteKit and vite, lol. At first I wasn't sure about PL/pgSQL (my background is more hardcore Java & ORMs) but tbh ChatGPT worked really well to help me come up to speed.

At one point I was thinking a method that just takes a string and returns the response as a string would be nice. That way you could control the processing independently.
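
Something along these lines, as a rough sketch; this helper is not part of postgrest-csharp, and baseUrl/apiKey are placeholders (e.g. https://<project>.supabase.co/rest/v1 and an anon key):

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

// Sketch of a string-in/string-out RPC call, NOT an existing library API.
public static class RawRpc
{
    private static readonly HttpClient Http = new HttpClient();

    public static async Task<string> Call(string baseUrl, string apiKey,
        string procedureName, string jsonBody)
    {
        // PostgREST exposes functions at /rpc/<name> and accepts a JSON body.
        var request = new HttpRequestMessage(HttpMethod.Post, $"{baseUrl}/rpc/{procedureName}")
        {
            Content = new StringContent(jsonBody, Encoding.UTF8, "application/json")
        };
        request.Headers.Add("apikey", apiKey);

        var response = await Http.SendAsync(request);
        // Hand back the raw body so the caller controls (de)serialisation.
        return await response.Content.ReadAsStringAsync();
    }
}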

corrideat commented 1 year ago

I've now made a PR (#75) which addresses this and adds a new test case.

I'm also just focused on writing PL/pgSQL

I get you. I'm using this to call into some PL/pgSQL to insert data and process it. The good thing is that you can do quite a bit with it, and it's nice to use with PostgREST because you get a pretty good REST API, and you're free to switch the frontend to whatever works.

Initially, I was going to write the application I'm using this for in JS, but as it turns out it wasn't a good choice for reading huge Excel files, which actual Excel can do well.

At one point I was thinking a method that just takes a string

I thought of that too, and it would've solved my issue here! Since PostgREST (mostly) expects JSON, I think the current approach is fine, with the fix to support arbitrary JSON.