paperswithcode / paperswithcode-data

The full dataset behind paperswithcode.com

Duplicate method in papers-with-abstracts #10

Open jmelot opened 3 years ago

jmelot commented 3 years ago

Thanks for this great resource! We ingest PWC data daily, and as of about a week ago (April 13) one of our automated checks, which verifies that method full_names are unique within each paper in papers-with-abstracts.json.gz, started failing. I haven't checked whether this affects more than one paper, but for arXiv id 1912.07651 there appear to be two versions of the same method, both:

      {
        "name": "DNAS",
        "full_name": "Differentiable Neural Architecture Search",
        "description": "**DNAS**, or **Differentiable Neural Architecture Search**, uses gradient-based methods to optimize ConvNet architectures, avoiding enumerating and training individual architectures separately as in previous methods. DNAS allows us to explore a layer-wise search space where we can choose a different block for each layer of the network. DNAS represents the search space by a super net whose operators execute stochastically. It relaxes the problem of finding the optimal architecture to find a distribution that yields the optimal architecture. By using the Gumbel Softmax technique, it is possible to directly train the architecture distribution using gradient-based optimization such as SGD.\r\n\r\nThe loss used to train the stochastic super net consists of both the cross-entropy loss that leads to better accuracy and the latency loss that penalizes the network's latency on a target device. To estimate the latency of an architecture, the latency of each operator in the search space is measured and a lookup table model is used to compute the overall latency by adding up the latency of each operator. Using this model allows for estimation of the latency of architectures in an enormous search space. More importantly, it makes the latency differentiable with respect to layer-wise block choices.",
        "introduced_year": 2000,
        "source_url": null,
        "source_title": null,
        "code_snippet_url": "",
        "main_collection": {
          "name": "Neural Architecture Search",
          "description": "**Neural Architecture Search** methods are search methods that seek to learn architectures for machine learning tasks, including the underlying build blocks. Below you can find a continuously updating list of neural architecture search algorithms. ",
          "parent": null,
          "area": "General"
        }
      },

and

      {
        "name": "Differentiable NAS",
        "full_name": "Differentiable Neural Architecture Search",
        "description": "",
        "introduced_year": 2000,
        "source_url": null,
        "source_title": null,
        "code_snippet_url": null,
        "main_collection": {
          "name": "Neural Architecture Search",
          "description": "**Neural Architecture Search** methods are search methods that seek to learn architectures for machine learning tasks, including the underlying build blocks. Below you can find a continuously updating list of neural architecture search algorithms. ",
          "parent": null,
          "area": "General"
        }
      },
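For reference, the uniqueness check described above can be sketched roughly as follows. The field names (`methods`, `full_name`) mirror the JSON excerpts, but the helper function itself is hypothetical, not our actual pipeline code:

```python
from collections import Counter

def duplicate_method_full_names(paper):
    """Return full_names that appear on more than one method entry of a paper."""
    counts = Counter(
        m["full_name"] for m in paper.get("methods", []) if m.get("full_name")
    )
    return sorted(name for name, n in counts.items() if n > 1)

# Minimal example mirroring the two arXiv 1912.07651 entries quoted above.
paper = {
    "arxiv_id": "1912.07651",
    "methods": [
        {"name": "DNAS", "full_name": "Differentiable Neural Architecture Search"},
        {"name": "Differentiable NAS", "full_name": "Differentiable Neural Architecture Search"},
    ],
}
print(duplicate_method_full_names(paper))
# ['Differentiable Neural Architecture Search']
```

Running something like this over every paper in papers-with-abstracts.json.gz is what started flagging duplicates.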