BingzhaoZhu / XTab


A finetuning issue #1

Open arandomgoodguy opened 12 months ago

arandomgoodguy commented 12 months ago

Hi, highly interesting work, I gave XTab a try immediately after I saw it.

But there is an issue about finetuning.

I tried to install AutoGluon==0.5.3 on Colab, but the installation failed.

So I installed 0.8.2 instead to test XTab on Colab, but at the start of training the `finetune_on` parameter raised a KeyError. There seems to be no place to specify the initial checkpoint directory.

Do you know how to solve this?

Thank you.

BingzhaoZhu commented 11 months ago

Thanks for your interest in our work. XTab has not been merged into the official AG releases. I uploaded my own branch as a .zip file in this repo. Could you try installing AG from that?

arandomgoodguy commented 11 months ago

Hi, thanks for your quick response.

You mean download autogluon-submission.zip and access the files in it, right?

Which file contains your modified XTab that accepts the `finetune_on` parameter? Is it ft_transformer.py under the "automm" folder?

Can I just extract the core code of your modified model as a custom model and then train it under AG==0.8.2?

Will this cause numerous errors since the code hasn't been merged, or must I use the AG version in your zip for the code to work at all?

BingzhaoZhu commented 11 months ago

I am not sure whether the pretrained checkpoint will work with AG-0.8.2, since I am not aware of the latest updates. A safe approach is to just use the version in this repo.

Or you may try this: initialize a Transformer in AG-0.8.2, save the weights, partly replace the saved weights with the pretrained checkpoint (where compatible), and continue training with AG-0.8.2. Sorry that this can be a bit inconvenient.
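The partial weight replacement described above can be sketched in plain PyTorch. This is only an illustration of the general technique, not AutoGluon's internals: the helper name `load_compatible_weights` and the toy models are hypothetical, and the real FT-Transformer checkpoint would have different parameter names.

```python
import torch
import torch.nn as nn

def load_compatible_weights(model: nn.Module, pretrained_state: dict) -> list:
    """Copy pretrained tensors into `model` wherever name and shape match.

    Returns the list of parameter names that were replaced.
    """
    own_state = model.state_dict()
    replaced = []
    for name, tensor in pretrained_state.items():
        # Only overwrite parameters that exist in both models with identical shapes
        if name in own_state and own_state[name].shape == tensor.shape:
            own_state[name] = tensor
            replaced.append(name)
    # strict=False tolerates keys that exist on only one side
    model.load_state_dict(own_state, strict=False)
    return replaced

# Hypothetical usage: a freshly initialized model plus a "pretrained" one
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
pretrained = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
replaced = load_compatible_weights(model, pretrained.state_dict())
```

After the surgery, you would save the patched state dict back to disk and resume training from it; automating exactly this handoff is what the `finetune_on` parameter was meant to do.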

arandomgoodguy commented 11 months ago

I don't see any method for saving or replacing the weights of the FTT in the AG documentation. So I think I will have to use your version, but I am still not entirely sure how to go from 0 to 1. Can you show me the steps in detail? Also, do we need to install any other required packages?

Innixma commented 8 months ago

Hi @arandomgoodguy, we are currently in the process of adding XTab's pretrained checkpoint to AutoGluon officially!

You can see the PR that adds the logic here: https://github.com/autogluon/autogluon/pull/3859, which you can try out yourself (refer to the added unit test that uses the XTab weights).