google-research / big_transfer

Official repository for the "Big Transfer (BiT): General Visual Representation Learning" paper.
https://arxiv.org/abs/1912.11370
Apache License 2.0

Extract Features #27

Open wyp19930313 opened 4 years ago

wyp19930313 commented 4 years ago

I read your paper. The upstream pre-training input resolution is 224, while downstream fine-tuning uses input resolutions ranging from 128 to 480. If I want to use your pre-trained model to extract features, should I set my input to 224, or should I use a larger input such as 300 or even 480? Looking forward to your reply.

lucasb-eyer commented 4 years ago

Good question! I have no idea what would work best, since we have not tried this setting. My guess is that it will depend on your dataset and the task you are considering, so it is best to just try it out!
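One thing worth noting when trying different resolutions: BiT-style ResNet backbones end in global average pooling, so the extracted feature vector has the same dimensionality regardless of input size; only the spatial grid that gets averaged changes. A minimal numpy sketch (the 7×7 / 15×15 grids and 2048 channels below assume a ResNet-50-like backbone with a 32× spatial reduction):

```python
import numpy as np

def global_avg_pool(feature_map):
    """Average over the spatial dimensions of an (H, W, C) feature map,
    as BiT-style backbones do before the classification head."""
    return feature_map.mean(axis=(0, 1))

# Hypothetical backbone outputs: a 32x spatial reduction turns a 224px
# input into a 7x7 grid and a 480px input into a 15x15 grid. The channel
# count (2048 here) is fixed by the architecture, not by the input size.
feat_224 = np.random.rand(7, 7, 2048)
feat_480 = np.random.rand(15, 15, 2048)

vec_224 = global_avg_pool(feat_224)
vec_480 = global_avg_pool(feat_480)

# Both resolutions yield a feature vector of the same dimensionality.
print(vec_224.shape, vec_480.shape)  # (2048,) (2048,)
```

So larger inputs do not change the feature dimension, only how much spatial context each pooled feature summarizes, which is why the best resolution is an empirical question per dataset.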

baqarhussain11-ux commented 4 years ago

I have read the paper and also reviewed the code shared for fine-tuning. In my opinion, this work and the idea behind it are fantastic. I have understood your code to some extent, but I have some queries regarding fine-tuning.

  1. Can I use this code to fine-tune on my own custom dataset from a local directory or from Google Drive? From my experiments, the code only works with datasets available in TFDS (TensorFlow Datasets); it downloads only the datasets listed on the TensorFlow website. What changes would I need to make to the shared code to use a custom dataset from a local directory?
  2. Can I use this code to fine-tune only selected layers, e.g. the last layer or the last 5 layers? If so, how would I do that with the available code? Looking forward to your reply.
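On the first question, one common adaptation is to replace the tfds-based input pipeline with a list of (filepath, label) pairs built from a class-per-subdirectory layout, which can then be fed to something like `tf.data.Dataset.from_tensor_slices`. The directory-scanning part can be sketched with the standard library alone (paths and layout below are illustrative, not the repo's actual API):

```python
import pathlib
import tempfile

def list_image_pairs(root):
    """Scan a class-per-subdirectory layout (root/<class_name>/<image>.jpg)
    and return (filepath, label_index) pairs plus the sorted class names.
    These pairs could then replace the tfds pipeline, e.g. via
    tf.data.Dataset.from_tensor_slices."""
    root = pathlib.Path(root)
    classes = sorted(d.name for d in root.iterdir() if d.is_dir())
    pairs = []
    for idx, name in enumerate(classes):
        for img in sorted((root / name).glob("*.jpg")):
            pairs.append((str(img), idx))
    return pairs, classes

# Demo on a throwaway directory with two classes and one image each.
with tempfile.TemporaryDirectory() as tmp:
    for cls in ("cat", "dog"):
        d = pathlib.Path(tmp) / cls
        d.mkdir()
        (d / "example.jpg").write_bytes(b"")  # empty placeholder file
    pairs, classes = list_image_pairs(tmp)
    print(classes)                 # ['cat', 'dog']
    print([p[1] for p in pairs])   # [0, 1]
```

The rest of the pipeline (decoding, resizing, augmentation) can stay as in the repo once the dataset source is swapped.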
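On the second question, fine-tuning only the last few layers usually comes down to deciding, per parameter name, whether it stays trainable (in the PyTorch variant this decision would be applied by setting `p.requires_grad = False` on the frozen parameters). A minimal sketch of that selection logic, using illustrative parameter names rather than the model's real ones:

```python
# Hypothetical parameter names of a BiT-style ResNet, in forward order.
PARAM_NAMES = [
    "root.conv.weight",
    "block1.unit1.conv1.weight",
    "block2.unit1.conv1.weight",
    "block3.unit1.conv1.weight",
    "block4.unit1.conv1.weight",
    "head.conv.weight",
    "head.conv.bias",
]

def trainable(name, last_blocks):
    """Return True if this parameter should stay trainable when
    fine-tuning only the head plus the last `last_blocks` of the
    four residual blocks; everything else would be frozen."""
    if name.startswith("head."):
        return True
    return any(name.startswith(f"block{b}.") for b in range(5 - last_blocks, 5))

# Fine-tune only the head (the "last one layer" case):
print([n for n in PARAM_NAMES if trainable(n, 0)])
# Head plus the last residual block:
print([n for n in PARAM_NAMES if trainable(n, 1)])
```

With this decision function, the training loop only needs to pass the trainable parameters to the optimizer; the frozen ones keep their pre-trained values.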