huawei-noah / Efficient-AI-Backbones

Efficient AI Backbones including GhostNet, TNT and MLP, developed by Huawei Noah's Ark Lab.

how to train ghostnet? #9

Open gtfaiwxm opened 4 years ago

gtfaiwxm commented 4 years ago

How can I train GhostNet on my own dataset?

iamhankai commented 4 years ago

I assume you already have a training script for your dataset. Just replace your network with GhostNet and train it.
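For example, a minimal PyTorch sketch of dropping GhostNet into an existing classification loop (the import path and constructor below are assumed from the ghostnet.pytorch repo, and the rest is a generic training skeleton, not the authors' script):

import torch
import torch.nn.functional as F
from ghost_net import ghost_net  # assumed: constructor from ghostnet.pytorch

# Build GhostNet with the number of classes of your own dataset.
model = ghost_net(num_classes=10, width_mult=1.0)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=4e-5)

def train_one_epoch(train_loader):
    model.train()
    for images, labels in train_loader:
        logits = model(images)
        loss = F.cross_entropy(logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()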

gtfaiwxm commented 4 years ago

OK, thanks

JayFu commented 4 years ago

> OK, thanks

Hi, I was going to replace my network with GhostNet, but it keeps throwing an error in the tensorpack part. The same example works fine with MobileNetV2:

File "/media/e/hujiang/anaconda3/envs/tf/lib/python3.6/site-packages/tensorpack/models/batch_norm.py", line 176, in BatchNorm
    training = ctx.is_training
AttributeError: 'NoneType' object has no attribute 'is_training'

Can you tell me how to handle it?

iamhankai commented 4 years ago

Which versions of TensorFlow and Tensorpack are you using? I recommend TensorFlow 1.13.1 and Tensorpack 0.9.7.

JayFu commented 4 years ago

> Which versions of TensorFlow and Tensorpack are you using? I recommend TensorFlow 1.13.1 and Tensorpack 0.9.7.

Oh, thanks! I was indeed using TF 1.5.0 and Tensorpack 0.9.9. But it still doesn't seem to work after switching versions; I get the same error. I'm guessing that's because my training example doesn't suit it? Could you recommend a training example?

ppwwyyxx commented 4 years ago

You need https://tensorpack.readthedocs.io/tutorial/symbolic.html#use-models-outside-tensorpack
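In short, the graph has to be built under a TowerContext so tensorpack layers such as BatchNorm know whether they are in training mode. A rough sketch of the training-side wrapping (the GhostNet constructor arguments and the input shape are placeholders):

import tensorflow as tf
from tensorpack import TowerContext

# Without this context, tensorpack's BatchNorm sees ctx = None, hence the AttributeError.
with TowerContext('', is_training=True):
    images = tf.placeholder(tf.float32, (None, 224, 224, 3), name='images')
    model = GhostNet(...)  # construct the model as in this repo
    logits = model.get_logits(images)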

iamhankai commented 4 years ago

Thanks @ppwwyyxx

JayFu commented 4 years ago

@ppwwyyxx TY

KnightWin123 commented 4 years ago

@gtfaiwxm Have you figured out how to train GhostNet?

PistonY commented 4 years ago

Hi @iamhankai, thanks for sharing this good work. I successfully trained GhostNet 1.3x to 75.78/92.77 top-1/top-5, which is almost what your paper reports. Details here. But I used the same training settings as for MobileNetV3, with some tricks: label smoothing, no weight decay on biases, and dropout. From your paper and your reply in https://github.com/iamhankai/ghostnet.pytorch/issues/18#, it seems you don't use any tricks when training it. I've found that MobileNetV3 can't reach such high accuracy without these tricks. I wonder which tricks you used during training, and whether it's possible to remove them and still get the same result?

iamhankai commented 4 years ago

@PistonY Thanks for your attention. We trained GhostNet using tricks similar to the MobileNetV3 paper, including label smoothing, no weight decay on biases, and dropout.
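For anyone reproducing this, a rough PyTorch sketch of those three tricks (the smoothing value, weight decay, and the commented-out model line are placeholders, not the paper's exact settings):

import torch
import torch.nn as nn

# Label smoothing (built into CrossEntropyLoss since PyTorch 1.10).
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)

def param_groups_no_bias_decay(model, weight_decay=4e-5):
    """Weight decay on conv/linear weights only; biases and BN parameters get none."""
    decay, no_decay = [], []
    for name, p in model.named_parameters():
        if not p.requires_grad:
            continue
        # 1-D parameters are biases or BN scales/shifts.
        (no_decay if p.ndim == 1 else decay).append(p)
    return [{"params": decay, "weight_decay": weight_decay},
            {"params": no_decay, "weight_decay": 0.0}]

# Dropout sits before the classifier in the GhostNet code; the constructor call
# below is hypothetical, adjust to the implementation you use.
# model = ghost_net(num_classes=1000, width_mult=1.3, dropout=0.2)
# optimizer = torch.optim.SGD(param_groups_no_bias_decay(model), lr=0.4, momentum=0.9)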

ghost commented 3 years ago

import numpy as np
import tensorflow as tf
from tensorpack import TowerContext

# Build the graph inside a TowerContext so tensorpack's BatchNorm can read is_training.
with TowerContext('', is_training=False):
    model = GhostNet(....)
    model.data_format = 'NHWC'
    image_tensor = tf.placeholder(dtype=np.float32, shape=(7, 320, 320, 3), name='image_tensor')
    logits = model.get_logits(image_tensor)
    print(logits)