Hello @st1312st, thank you for your interest in our work! Please visit our Custom Training Tutorial to get started, and see our Google Colab Notebook, Docker Image, and GCP Quickstart Guide for example environments.
If this is a bug report, please provide screenshots and minimum viable code to reproduce your issue, otherwise we can not help you.
@st1312st that is a very good discovery!
It is also a bit strange, as our mAP results are great with both the darknet-trained yolo3.weights and our newer pytorch-trained yolov3-spp-ultralytics.pt models, so it appears this omission has not negatively impacted anything.
Perhaps this means that leaky(leaky(x) + leaky(y)) is not that different from leaky(x) + leaky(y).
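A quick way to sanity-check that intuition is to compare the two expressions numerically. This is only a minimal sketch; the tensor shapes here are made up for illustration:

```python
import torch
import torch.nn as nn

leaky = nn.LeakyReLU(0.1)

# x and y stand in for the two feature maps being fused by the shortcut;
# in the network they are already outputs of leaky-activated conv layers.
x = leaky(torch.randn(1, 64, 32, 32))
y = leaky(torch.randn(1, 64, 32, 32))

with_act = leaky(x + y)   # what the cfg asks for: activation after the fusion
without_act = x + y       # what the current code produces

print((with_act - without_act).abs().max())  # differs only where x + y < 0
```

Because x and y have already passed through a LeakyReLU, their negative values are small, so the extra activation after the sum changes relatively few elements, and only slightly.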
OK, that is amazing, thank you!
#########################################################
elif mdef['type'] == 'shortcut':  # nn.Sequential() placeholder for 'shortcut' layer
    layers = mdef['from']
    filters = output_filters[-1]
    routs.extend([i + l if l < 0 else l for l in layers])
    modules = WeightedFeatureFusion(layers=layers, weight='weights_type' in mdef)
    if mdef['activation'] == 'leaky':
        modules.add_module('activation', nn.LeakyReLU(0.1, inplace=True))
#########################################################
Maybe this code can work.
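One caveat, hedged since it depends on how the parent model's forward pass dispatches these modules: add_module only registers the LeakyReLU as a submodule of WeightedFeatureFusion, so something still has to call it after the fusion. A minimal sketch of a wrapper that makes that ordering explicit is below; FusionWithActivation is a hypothetical name used for illustration, not a class from the repo:

```python
import torch.nn as nn

class FusionWithActivation(nn.Module):
    """Hypothetical wrapper: run the weighted fusion, then the optional activation."""
    def __init__(self, fusion, activation=None):
        super(FusionWithActivation, self).__init__()
        self.fusion = fusion          # assumed to take (x, outputs), like WeightedFeatureFusion
        self.activation = activation  # e.g. nn.LeakyyReLU(0.1) replaced below if None
        self.activation = activation  # e.g. nn.LeakyReLU(0.1) or None

    def forward(self, x, outputs):
        x = self.fusion(x, outputs)   # sum of the current layer and the 'from' layers
        return self.activation(x) if self.activation is not None else x
```

With a wrapper like this, the activation is guaranteed to run on the fused sum regardless of how the fusion module itself is implemented.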
@st1312st can you please validate the effects of the change by running test.py on coco mAP before and after? Thank you!
This issue is stale because it has been open 30 days with no activity. Remove Stale label or comment or this will be closed in 5 days.
Hello!
I see [shortcut] has an activation in the .cfg file, but I cannot see the shortcut use the activation in the code.
#######################################
in .cfg

[shortcut]
from=-4
activation=leaky

#######################################
in model.py
#######################################
in utils/layers.py

class WeightedFeatureFusion(nn.Module):
    def __init__(self, layers, weight=False):
        super(WeightedFeatureFusion, self).__init__()
        self.layers = layers  # layer indices
        self.weight = weight  # apply weights boolean
        self.n = len(layers) + 1  # number of layers
        if weight:
            self.w = nn.Parameter(torch.zeros(self.n), requires_grad=True)  # layer weights
########################################
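For reference, here is a simplified stand-in for this fusion class, not the exact repository code, showing the kind of forward pass it performs: a (optionally weighted) sum of layer outputs with no activation anywhere, which is why the activation=leaky entry in the cfg never takes effect:

```python
import torch
import torch.nn as nn

class SimplifiedFeatureFusion(nn.Module):
    """Simplified stand-in for WeightedFeatureFusion: sums layer outputs, no activation."""
    def __init__(self, layers, weight=False):
        super(SimplifiedFeatureFusion, self).__init__()
        self.layers = layers                 # indices of the layers to fuse
        self.weight = weight                 # apply learned weights boolean
        self.n = len(layers) + 1             # number of fused layers
        if weight:
            self.w = nn.Parameter(torch.zeros(self.n), requires_grad=True)

    def forward(self, x, outputs):
        # Optionally scale each input by a learned weight, then sum.
        # Note: no activation is applied anywhere in this method.
        if self.weight:
            w = torch.sigmoid(self.w) * (2 / self.n)  # normalized fusion weights
            x = x * w[0]
            for i, l in enumerate(self.layers):
                x = x + outputs[l] * w[i + 1]
        else:
            for l in self.layers:
                x = x + outputs[l]
        return x
```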
So can I use [shortcut] with an activation? If anyone can help me, thank you!