cxiang26 / mygluon_cv


__init__() got an unexpected keyword argument 'filters' #3

Open satchelwu opened 4 years ago

satchelwu commented 4 years ago

Thanks for your great work on yolact. When I start training on the COCO dataset, I get this error:

__init__() got an unexpected keyword argument 'filters'

I traced through and debugged it, and it seems that the class below does not accept the parameters "filters" and "fpn" in its constructor, so I wonder if something is missing. Hope you can reply, thanks very much. (A minimal sketch of how the error arises, and a possible workaround, are included after the quoted code below.)

class YOLACT(HybridBlock):
    """Single-shot Object Detection Network: https://arxiv.org/abs/1512.02325.

Parameters
----------
network : string or None
    Name of the base network, if `None` is used, will instantiate the
    base network from `features` directly instead of composing.
base_size : int
    Base input size; it is specified so YOLACT can support dynamic input shapes.
features : list of str or mxnet.gluon.HybridBlock
    Intermediate features to be extracted or a network with multi-output.
    If `network` is `None`, `features` is expected to be a multi-output network.
num_filters : list of int
    Number of channels for the appended layers, ignored if `network` is `None`.
sizes : iterable of float
    Sizes of anchor boxes. This should be a list of floats, in incremental order.
    The length of `sizes` must be len(layers) + 1. For example, a two stage YOLACT
    model can have ``sizes = [30, 60, 90]``, and it converts to `[30, 60]` and
    `[60, 90]` for the two stages, respectively. For more details, please refer
    to original paper.
ratios : iterable of list
    Aspect ratios of anchors in each output layer. Its length must be equal
    to the number of YOLACT output layers.
steps : list of int
    Step size of anchor boxes in each output layer.
classes : iterable of str
    Names of all categories.
use_1x1_transition : bool
    Whether to use 1x1 convolution as a transition layer between attached layers;
    it is effective in reducing model capacity.
use_bn : bool
    Whether to use BatchNorm layer after each attached convolutional layer.
reduce_ratio : float
    Channel reduce ratio (0, 1) of the transition layer.
min_depth : int
    Minimum channels for the transition layers.
global_pool : bool
    Whether to attach a global average pooling layer as the last output layer.
pretrained : bool or str
    Boolean value controls whether to load the default pretrained weights for model.
    String value represents the hashtag for a certain version of pretrained weights.
stds : tuple of float, default is (0.1, 0.1, 0.2, 0.2)
    Std values to be divided/multiplied to box encoded values.
nms_thresh : float, default is 0.45.
    Non-maximum suppression threshold. You can specify < 0 or > 1 to disable NMS.
nms_topk : int, default is 400
    Apply NMS to top k detection results; use -1 to disable so that every detection
    result is used in NMS.
post_nms : int, default is 100
    Only return top `post_nms` detection results, the rest is discarded. The number is
    based on COCO dataset which has maximum 100 objects per image. You can adjust this
    number if expecting more objects. You can use -1 to return all detections.
anchor_alloc_size : tuple of int, default is (128, 128)
    For advanced users. Define `anchor_alloc_size` to generate large enough anchor
    maps, which will later be saved in parameters. During inference, we support arbitrary
    input images by cropping the corresponding area of the anchor map. This allows us
    to export to symbol so we can run it in C++, Scala, etc.
ctx : mx.Context
    Network context.
norm_layer : object
    Normalization layer used (default: :class:`mxnet.gluon.nn.BatchNorm`)
    Can be :class:`mxnet.gluon.nn.BatchNorm` or :class:`mxnet.gluon.contrib.nn.SyncBatchNorm`.
    This will only apply to base networks that have `norm_layer` specified; it will be
    ignored if the base network (e.g. VGG) does not accept this argument.
norm_kwargs : dict
    Additional `norm_layer` arguments, for example `num_devices=4`
    for :class:`mxnet.gluon.contrib.nn.SyncBatchNorm`.

"""

    def __init__(self, network, base_size, features, sizes, ratios, steps, classes,
                 num_prototypes=64, global_pool=False, pretrained=False,
                 stds=(0.1, 0.1, 0.2, 0.2), nms_thresh=0.45, nms_topk=400,
                 post_nms=100, anchor_alloc_size=128, sge=False, **kwargs):
        super(YOLACT, self).__init__(**kwargs)
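
For reference, here is a minimal, self-contained sketch (with stand-in names, not this repo's exact code) of one likely way this TypeError arises given the signature quoted above: keyword arguments that YOLACT's __init__ does not declare fall into **kwargs and are forwarded to the parent HybridBlock constructor, which rejects them.

class Block:                          # stand-in for mxnet.gluon.HybridBlock
    def __init__(self, prefix=None, params=None):
        self.prefix = prefix
        self.params = params


class YolactLike(Block):              # stand-in for the YOLACT class above
    def __init__(self, network, base_size, **kwargs):
        # 'filters' and 'fpn' are not declared here, so they land in **kwargs
        # and get forwarded to a parent __init__ that does not accept them.
        super(YolactLike, self).__init__(**kwargs)
        self.network = network
        self.base_size = base_size


try:
    YolactLike("resnet50", 550, filters=[256, 256, 256], fpn=True)
except TypeError as e:
    print(e)   # __init__() got an unexpected keyword argument 'filters'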
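
If "filters" and "fpn" really are unused by this class (only the author can confirm whether they were meant to configure the FPN), one possible workaround is to pop them out of **kwargs before they reach the parent constructor. A hedged sketch, again with stand-in names rather than the repo's exact code:

class Block:                          # stand-in for mxnet.gluon.HybridBlock
    def __init__(self, prefix=None, params=None):
        self.prefix = prefix
        self.params = params


class YolactLike(Block):
    def __init__(self, network, base_size, **kwargs):
        # Remove the keywords the parent would reject, but keep the values
        # around in case they are meant to be consumed here later.
        self._filters = kwargs.pop('filters', None)
        self._use_fpn = kwargs.pop('fpn', False)
        super(YolactLike, self).__init__(**kwargs)
        self.network = network
        self.base_size = base_size


net = YolactLike("resnet50", 550, filters=[256, 256, 256], fpn=True)
print(net._filters, net._use_fpn)     # [256, 256, 256] True -- no TypeError

Alternatively, the offending keywords could simply be dropped from wherever the training script builds the network; which of the two is intended is a question for the author.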