marmelab / ra-supabase

Supabase adapter for react-admin, the frontend framework for building admin applications on top of REST/GraphQL services.

Initial file storage support #52

Open · kav opened 9 months ago

kav commented 9 months ago

Initial implementation for review. @slax57 what do you think about something like this?

Adds the following options:

  ...
  storagePath?: string | ((resource: string) => string)
  filenameFromData?: (args: {
    data: string
    resource: string
    field: string
    filename: string
  }) => string
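
For illustration only (assuming these options end up on the supabaseDataProvider params, which is not final), usage could look something like:

// Hypothetical usage of the proposed options; paths and names are placeholders
const dataProvider = supabaseDataProvider({
    instanceUrl: REACT_APP_SUPABASE_URL,
    apiKey: REACT_APP_SUPABASE_KEY,
    supabaseClient,
    storagePath: resource => `${resource}-attachments`,
    filenameFromData: ({ resource, field, filename }) =>
        `${resource}/${field}/${Date.now()}-${filename}`,
});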

Fixes: #51

adguernier commented 9 months ago

Thank you for this contribution @kav ! :pray:

We need to keep in mind that ra-supabase is a library used by other people, so it should offer customization. We need to think about how the user will use this library smoothly. I see some pitfalls in this first implementation, such as:

  • it automatically adds support for file upload, which may be something the user does not want,
  • it targets all 'resources' (*), which may be something the user doesn't want either,
  • it does not allow the user to manipulate data.

Based on @slax57's comment, we could implement it as follows:

Expose a helper method to upload files:

import { SupabaseClient } from '@supabase/supabase-js';

export const storeInSupabase = async ({
    supabaseClient,
    bucket,
    data,
}: {
    supabaseClient: SupabaseClient;
    bucket: string;
    data: any;
}) => {
    // Upload the file(s) found in data to the bucket,
    // and return the new values (e.g. URLs) for the uploaded fields
    return Promise.resolve({ /* newFiles */ });
};
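
For illustration, the body could be fleshed out roughly like this (a sketch only: it assumes supabase-js v2's storage API and react-admin's { rawFile, src, title } file input values):

import { SupabaseClient } from '@supabase/supabase-js';

// Sketch: upload every field carrying a rawFile and return the new field values (public URLs)
export const storeInSupabase = async ({
    supabaseClient,
    bucket,
    data,
}: {
    supabaseClient: SupabaseClient;
    bucket: string;
    data: any;
}) => {
    const newFiles: Record<string, string> = {};
    for (const [field, value] of Object.entries(data)) {
        // react-admin file inputs produce values shaped like { rawFile: File, src, title }
        if (!((value as any)?.rawFile instanceof File)) continue;
        const rawFile = (value as any).rawFile as File;
        const path = `${field}/${Date.now()}-${rawFile.name}`;
        const { error } = await supabaseClient.storage
            .from(bucket)
            .upload(path, rawFile);
        if (error) throw error;
        const { data: urlData } = supabaseClient.storage
            .from(bucket)
            .getPublicUrl(path);
        newFiles[field] = urlData.publicUrl;
    }
    return newFiles;
};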

With this in place, the user can use this method in the withLifecycleCallbacks helper on their own:

import { withLifecycleCallbacks } from 'react-admin';
import { supabaseDataProvider } from 'ra-supabase';

// supabaseClient and the REACT_APP_* constants are assumed to be defined elsewhere
export const dataProvider = withLifecycleCallbacks(
    supabaseDataProvider({
        instanceUrl: REACT_APP_SUPABASE_URL,
        apiKey: REACT_APP_SUPABASE_KEY,
        supabaseClient,
    }),
    [
        {
            resource: 'posts',
            beforeSave: async (data: any, dataProvider: any) => {
                const newFiles = await storeInSupabase({
                    supabaseClient,
                    bucket: 'my-bucket',
                    data,
                });
                return { ...data, ...newFiles };
            },
        },
    ]
);

We could even go further by exposing another helper that enhances the dataProvider directly:

import { DataProvider, withLifecycleCallbacks } from 'react-admin';
import { SupabaseClient } from '@supabase/supabase-js';

export const addUploadCapabilities = ({
    supabaseClient,
    dataProvider,
    resource,
}: {
    supabaseClient: SupabaseClient;
    dataProvider: DataProvider;
    resource: string;
}) =>
    withLifecycleCallbacks(dataProvider, [
        {
            resource,
            beforeUpdate: async params => {
                const newFiles = await storeInSupabase({
                    supabaseClient,
                    bucket: 'my-bucket',
                    data: params.data,
                });
                // merge the uploaded file values (e.g. URLs) back into the record
                return { ...params, data: { ...params.data, ...newFiles } };
            },
            },
        },
    ]);

With this in place, the user can add file upload capabilities to their dataProvider as follows:

// Wrap an existing base dataProvider (e.g. the one returned by supabaseDataProvider)
const dataProvider = addUploadCapabilities({
    supabaseClient,
    dataProvider: baseDataProvider,
    resource: 'posts',
});

What do you think about that? This code is not set in stone, feel free to update it.

Do not forget to document it as well please :smile:

Thanks again for contributing!

kav commented 9 months ago

> Thank you for this contribution @kav ! 🙏
>
> We need to keep in mind that ra-supabase is a library used by other people, so it should offer customization. We need to think about how the user will use this library smoothly.

Yes, I am deeply aware of the thought that must go into building libraries and frameworks for a wide range of use cases. That was the point of my initial issue and this draft: discussing the areas where we need to allow customization, to make it both easy and flexible enough for all use cases of Supabase storage while not interfering with folks not using Supabase storage at all.

I think a key goal here for me is that users of the supabaseDataProvider should have an "easy" time adding file handling. By "easy" I mean not having to read more than the TypeScript signature or the README for the provider init to get basic file handling working. That means solutions that require the user to have a lot of experience with react-admin, for example adding withLifecycleCallbacks themselves, are frankly too complex. Conversely, solutions that don't support users' use cases and force them to patch or roll their own withLifecycleCallbacks are also... not great. I think the wrapper you suggest might offer the right balance here.

> I see some pitfalls in this first implementation such as:
>
> • it automatically adds support for file upload, which may be something the user does not want,

It's a no-op if they don't provide the storagePath parameter; we could go further and drop the withLifecycleCallbacks wrapper entirely when the option is not provided, or, as you note, add the functionality with a wrapper. It's worth noting it only modifies fields with RawFiles attached. Sending RawFiles directly to ra-postgrest is almost certainly incorrect, as that just ends up sending the local path and various other properties as serialized JSON; the file itself is lost.
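
For reference (illustrative values only), a react-admin file input value looks roughly like this, which is why forwarding it straight through loses the actual file:

// Approximate shape produced by react-admin's <FileInput> / <ImageInput>
const record = {
    title: 'My post',
    attachment: {
        rawFile: new File(['hello'], 'report.pdf'), // the actual browser File object
        src: 'blob:http://localhost:3000/some-object-url', // local object URL, meaningless server-side
        title: 'report.pdf',
    },
};
// JSON.stringify(record.attachment) gives {"rawFile":{},"src":"blob:...","title":"report.pdf"}
// — the file contents never leave the browser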

That said, I'm with you that a user might want to handle some files via this mechanism and some via another file management system, so further granularity on which files to upload is definitely needed.

It's possible a user could handle that by wrapping the existing implementation with a withLifecycleCallbacks of their own for the other file systems. That would let them handle other storage systems with their own hooks before we get the data; since those fields would no longer carry RawFiles, we'd skip anything already handled.
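
Roughly like this (a sketch with hypothetical names: supabaseDataProviderWithStorage would be the provider from this PR, uploadToOtherStorage the user's own helper):

import { withLifecycleCallbacks } from 'react-admin';

const dataProvider = withLifecycleCallbacks(supabaseDataProviderWithStorage, [
    {
        resource: 'posts',
        beforeSave: async (data: any, dataProvider: any) => {
            if (data.video?.rawFile instanceof File) {
                // this field is handled by another storage system, so upload it there...
                const url = await uploadToOtherStorage(data.video.rawFile);
                // ...and strip the rawFile so the Supabase storage handling skips the field
                return { ...data, video: url };
            }
            return data;
        },
    },
]);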

> • it targets all 'resources' (*), which may be something the user doesn't want either,

Good point. Given the user might want to handle different resources with different mechanisms, I suspect this is the best place to handle the attachment of the hooks. I can add the ability to specify which resources use the file upload. I suspect this justifies making the addUploadCapabilities wrapper user-facing, since the storagePath parameter would become required if and only if a resource is provided.

The counterargument to all that: as the last stop before ra-postgrest, any file we don't handle won't be handled elsewhere and will therefore be lost, no?

> • it does not allow the user to manipulate data

This is an excellent point. I'll look for a good place to put this, but I suspect something similar to the existing functions makes sense. Do you have any further thoughts there?

To the opening objective: I don't know that the storeInSupabase helper on its own is particularly valuable as a user-facing function, but I do really like the structure it provides alongside addUploadCapabilities, regardless of how it's exposed to the user. I'll restructure to that.

I think allowing composition of addUploadCapabilities, or perhaps splitting that out into something like withSupabaseStorage, might also be compelling for folks using only the storage component with a different underlying data provider. To that end, maybe a different package is in order? Any thoughts there? The biggest obvious downside would be both providers needing a supabaseClient passed, but that's not too bad.

One key bit of missing functionality, and unfortunately I think it's intrinsic to the existing lifecycle callbacks implementation, is that we don't offer any way to handle the previous file when a user changes it. Obviously the user can structure their files in such a way that the old path can be inferred, but that's something to consider. A related question is upsert; we'll likely need an option with a sensible default there.

djhi commented 9 months ago

> One key bit of missing functionality, and unfortunately I think it's intrinsic to the existing lifecycle callbacks implementation, is that we don't offer any way to handle the previous file when a user changes it. Obviously the user can structure their files in such a way that the old path can be inferred, but that's something to consider. A related question is upsert; we'll likely need an option with a sensible default there.

Don't use beforeSave for that. Prefer beforeCreate and beforeUpdate: beforeUpdate has access to all the update params, including previousData.
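
For example, something like this sketch (field names hypothetical; it assumes the record keeps the storage path of the current file, and that baseDataProvider and supabaseClient are in scope):

import { withLifecycleCallbacks } from 'react-admin';

const dataProvider = withLifecycleCallbacks(baseDataProvider, [
    {
        resource: 'posts',
        beforeUpdate: async params => {
            // hypothetical field holding the storage path of the previously uploaded file
            const previousPath = params.previousData?.attachment_path;
            const newFile = params.data.attachment?.rawFile;
            if (previousPath && newFile instanceof File) {
                // a new file replaces the old one: remove the previous object from the bucket
                await supabaseClient.storage.from('my-bucket').remove([previousPath]);
            }
            return params;
        },
    },
]);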