dmlc / gluon-nlp

NLP made easy
https://nlp.gluon.ai/
Apache License 2.0

[CI] AWS batch job tool for GluonNLP (Part I) #1251

Closed: szha closed this pull request 4 years ago

szha commented 4 years ago

Description

AWS batch job tool for GluonNLP

Checklist

Essentials

Changes

Comments

cc @dmlc/gluon-nlp-team

codecov[bot] commented 4 years ago

Codecov Report

Merging #1251 into numpy will not change coverage. The diff coverage is n/a.

Impacted file tree graph

```diff
@@           Coverage Diff           @@
##            numpy    #1251   +/-   ##
=======================================
  Coverage   82.44%   82.44%
=======================================
  Files          38       38
  Lines        5450     5450
=======================================
  Hits         4493     4493
  Misses        957      957
```

sxjscience commented 4 years ago

Would you pin the MXNet version to `mxnet>=2.0.0b20200604,<2.0.0b20200621` in https://github.com/dmlc/gluon-nlp/blob/numpy/.github/workflows/unittests.yml#L38?
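
For illustration, here is a minimal sketch of what that version window admits, checked with the third-party `packaging` library. Only the pip specifier itself comes from the comment above; the rest is an assumption for demonstration and is not part of the workflow file:

```python
# Hypothetical illustration of the suggested pin, not the actual
# workflow content. The requirement specifier
#     mxnet>=2.0.0b20200604,<2.0.0b20200621
# restricts CI to nightly builds from that window.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

spec = SpecifierSet(">=2.0.0b20200604,<2.0.0b20200621")
print(Version("2.0.0b20200610") in spec)  # True: build inside the window
print(Version("2.0.0b20200621") in spec)  # False: the upper bound is exclusive
```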

chenw23 commented 4 years ago

I have looked at the CI error log. I find that https://github.com/dmlc/gluon-nlp/blob/numpy/tests/test_attention_cell.py#L32 triggers the error, and it appears to be a TypeError:

```
self = <[AttributeError("'MultiHeadAttentionCell' object has no attribute '_query_units'",) raised in repr()] MultiHeadAttentionCell object at 0x7f905e3062e8>
query_units = 16, num_heads = 2, attention_dropout = 0.0, scaled = False
normalized = False, eps = 1e-06, dtype = 'float32', layout = 'NKT'
use_einsum = False, prefix = None, params = None

    def __init__(self, query_units=None, num_heads=None, attention_dropout=0.0,
                 scaled: bool = True, normalized: bool = False, eps: float = 1E-6,
                 dtype='float32', layout='NTK', use_einsum=False,
                 prefix=None, params=None):
>       super().__init__(prefix=prefix, params=params)
E       TypeError: __init__() got an unexpected keyword argument 'prefix'

src/gluonnlp/attention_cell.py:606: TypeError
____ test_multi_head_dot_attention_cell[no_share_head-False-False-False-3] _____
```

Perhaps the owner of this file could take a look and fix it so that CI passes.

sxjscience commented 4 years ago

This is due to the recent upgrade of the Gluon API.
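
For context, a minimal sketch of the breaking change, assuming the MXNet 2.0 Gluon behavior in which `Block.__init__` no longer accepts `prefix` or `params`; the class body below is illustrative, not the actual gluon-nlp code:

```python
from mxnet.gluon import HybridBlock

class MultiHeadAttentionCell(HybridBlock):
    def __init__(self, query_units=None, num_heads=None, dtype='float32'):
        # Pre-2.0 Gluon forwarded naming arguments to the parent:
        #   super().__init__(prefix=prefix, params=params)
        # In MXNet 2.0, Block.__init__ takes no such arguments, so the
        # old call raises TypeError; drop them instead:
        super().__init__()
        self._query_units = query_units
        self._num_heads = num_heads
```

Under that assumption, dropping the forwarded `prefix`/`params` arguments in `MultiHeadAttentionCell.__init__` (src/gluonnlp/attention_cell.py:606) should clear the TypeError above.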
