-
How to train custom datasets with EfficientNetV2?
-
Hi, I couldn't find any tutorials on how to train a custom dataset with EfficientNetV2.
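In the absence of an official tutorial, here is a minimal fine-tuning sketch using the tf.keras port of EfficientNetV2 (available in TF 2.8+) rather than this repo's own training script. The class count, image size, and dataset path are placeholders.

```python
import tensorflow as tf

NUM_CLASSES = 5   # placeholder: set to your dataset's class count
IMG_SIZE = 224

# Backbone without the ImageNet classifier head. In practice use
# weights="imagenet" to start from pretrained weights (triggers a download);
# weights=None keeps this sketch self-contained.
base = tf.keras.applications.EfficientNetV2B0(
    include_top=False, weights=None, input_shape=(IMG_SIZE, IMG_SIZE, 3))
base.trainable = False  # freeze the backbone for the first training phase

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "path/to/your/dataset", image_size=(IMG_SIZE, IMG_SIZE))
# model.fit(train_ds, epochs=5)
```

After the new head converges, the usual second step is to unfreeze the backbone and fine-tune end to end with a much lower learning rate.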
-
Amazing paper, especially so soon after the last one, great work!
I had one question regarding the oidv6 dataset: did you do any preprocessing to filter out bad classes etc? How many of the 9 million…
-
Hi, I would like to know why you use a sequential ConvBnAct block instead of EdgeResidual at the start of the EfficientNetV2 architecture. In my understanding, the code in https://github.com/google/automl/b…
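For reference, the stem in question is just a convolution, batch norm, and activation in sequence; EdgeResidual (fused-MBConv) blocks only appear from the first stage onward, and the stem has no residual branch to fuse. A minimal tf.keras sketch of such a block (the function name and 24-filter width are illustrative, not the repo's code):

```python
import tensorflow as tf

def conv_bn_act(filters, kernel_size=3, strides=1):
    """Illustrative Conv-BN-Activation block, with the SiLU/swish
    activation used throughout EfficientNetV2."""
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(filters, kernel_size, strides=strides,
                               padding="same", use_bias=False),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.Activation("swish"),
    ])

# Stem: stride-2 conv halves the spatial resolution.
stem = conv_bn_act(24, strides=2)
```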
-
https://colab.research.google.com/github/google/automl/blob/master/efficientnetv2/tfhub.ipynb
```
'efficientnetv2-b0-21k-ft1k': f'gs://cloud-tpu-checkpoints/efficientnet/v2/hub/efficientnetv2-b0-2…
```
-
Hi @leondgarse
Can you please share the training configuration of the **Mobilenet_emb** model you shared?
https://drive.google.com/file/d/1i0B6Hy1clGgfeOYtUXVPNveDEe2DTIBa/view
(Scheduler para…
-
Hi,
I tried running the sample inference code in efficientnetV2/tutorial.ipynb locally and got the error `slice index -1 of dimension 0 out of bounds`.
Running the notebook on Google Colab gives t…
-
I ran a notebook last night, woke up this morning and re-ran it and it is giving me this error:
`RuntimeError: running_mean should contain 4304 elements not 8608`
Here is what happened with my froz…
-
Hi,
Thanks for your code. I find the model's parameter count is around 6 million, but most researchers report around 78 million parameters for DeepLabV3+. Will you guide me about this probl…
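The gap is most likely down to the backbone: a DeepLabV3+ built on a depthwise-separable (MobileNet-style) backbone is far smaller than the commonly reported Xception variant. A back-of-envelope comparison of a single layer shows why (the 256-channel, 3x3 shapes are illustrative, not the actual architecture):

```python
def conv2d_params(c_in, c_out, k):
    """Weight count of a standard k x k convolution (bias omitted)."""
    return k * k * c_in * c_out

def separable_conv2d_params(c_in, c_out, k):
    """Depthwise k x k conv followed by a 1 x 1 pointwise conv."""
    return k * k * c_in + c_in * c_out

standard = conv2d_params(256, 256, 3)             # 589,824 weights
separable = separable_conv2d_params(256, 256, 3)  # 67,840 weights
print(standard / separable)                       # roughly 8.7x smaller
```

Stacked over dozens of layers, that per-layer factor easily accounts for a 6 M vs. 78 M difference.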
-
Hi
I made a TFRecord dataset for training.
The dataset was made with the TF-Slim code (download_and_convert_data.py) and custom Python files.
So I have TFRecord files, but this is a grayscale dataset…
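If the model expects 3-channel input, one common fix is to replicate the gray channel three times, either when writing the TFRecords or in the input pipeline (TensorFlow also provides tf.image.grayscale_to_rgb for the latter). A NumPy sketch with a placeholder image:

```python
import numpy as np

# Hypothetical decoded grayscale image of shape (H, W, 1).
gray = np.random.randint(0, 256, size=(224, 224, 1), dtype=np.uint8)

# Repeat the single channel along the last axis to get (H, W, 3).
rgb = np.repeat(gray, 3, axis=-1)
```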