awslabs / gluonts

Probabilistic time series modeling in Python
https://ts.gluon.ai
Apache License 2.0

RNNEncoder: "HybridBlock, we do not support mixed NDArrays and Symbols" #583

Closed: jaheba closed this issue 4 years ago

jaheba commented 4 years ago

I've tried installing mxnet 1.6 (pre-release) and running our tests with it.

There seems to be an issue with the RNNEncoder:

src/gluonts/evaluation/backtest.py:177: in backtest_metrics
    predictor = forecaster.train(train_dataset)
src/gluonts/model/estimator.py:223: in train
    return self.train_model(training_data, validation_data).predictor
src/gluonts/model/estimator.py:208: in train_model
    validation_iter=validation_data_loader,
src/gluonts/trainer/_base.py:297: in __call__
    epoch_loss = loop(epoch_no, train_iter)
src/gluonts/trainer/_base.py:237: in loop
    output = net(*inputs)
../../.virtualenvs/gluon-ts/lib/python3.7/site-packages/mxnet/gluon/block.py:694: in __call__
    out = self.forward(*args)
../../.virtualenvs/gluon-ts/lib/python3.7/site-packages/mxnet/gluon/block.py:1152: in forward
    return self._call_cached_op(x, *args)
../../.virtualenvs/gluon-ts/lib/python3.7/site-packages/mxnet/gluon/block.py:982: in _call_cached_op
    self._build_cache(*args)
../../.virtualenvs/gluon-ts/lib/python3.7/site-packages/mxnet/gluon/block.py:934: in _build_cache
    data, out = self._get_graph(*args)
../../.virtualenvs/gluon-ts/lib/python3.7/site-packages/mxnet/gluon/block.py:926: in _get_graph
    out = self.hybrid_forward(symbol, *grouped_inputs, **params)  # pylint: disable=no-value-for-parameter
src/gluonts/model/seq2seq/_forking_network.py:93: in hybrid_forward
    past_target, feat_static_real, past_feat_dynamic_real
../../.virtualenvs/gluon-ts/lib/python3.7/site-packages/mxnet/gluon/block.py:694: in __call__
    out = self.forward(*args)

self = gluonts.block.encoder.RNNEncoder(bidirectional=True, hidden_size=50, mode="gru", num_layers=1, prefix="encoder_"), x = <Symbol data0>
args = (
[]
<NDArray 0 @cpu(0)>, 
[]
<NDArray 0 @cpu(0)>), has_symbol = True, has_ndarray = True, ctx_set = {cpu(0)}

    def forward(self, x, *args):
        """Defines the forward computation. Arguments can be either
        :py:class:`NDArray` or :py:class:`Symbol`."""

        has_symbol, has_ndarray, ctx_set, first_ctx = _gather_type_ctx_info([x] + list(args))
        if has_symbol and has_ndarray:
>           raise ValueError('In HybridBlock, we do not support mixed NDArrays and Symbols'
                             ' types for the input. Please check the type of the args.\n')
E           ValueError: In HybridBlock, we do not support mixed NDArrays and Symbols types for the input. Please check the type of the args.
lostella commented 4 years ago

The issue could be related to these objects being NDArrays regardless of F: https://github.com/awslabs/gluon-ts/blob/c44f0a0041092be82302ab76356c2a5ee08ea3ee/src/gluonts/model/seq2seq/_forking_network.py#L88
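
For illustration, here is a minimal, self-contained sketch (not the actual GluonTS code; class names are made up) of how a pre-constructed empty NDArray passed into a child HybridBlock reproduces the mixed-types error once the outer block is hybridized:

import mxnet as mx
from mxnet import gluon, nd


class Inner(gluon.HybridBlock):
    def hybrid_forward(self, F, x, extra):
        return x


class Outer(gluon.HybridBlock):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        with self.name_scope():
            self.inner = Inner()

    def hybrid_forward(self, F, x):
        # Created with nd.* rather than F.*, so after hybridize() this stays
        # an NDArray while x has become a Symbol -- analogous to the empty
        # feature arrays handed to the encoder in _forking_network.py.
        empty = nd.array([])
        return self.inner(x, empty)


net = Outer()
net.initialize()
net.hybridize()
net(nd.ones((2, 3)))  # ValueError: ... mixed NDArrays and Symbols ...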

lostella commented 4 years ago

@jaheba @lovvge @jgasthaus One possible fix is to simply not support hybridization for the models using this block (and therefore not test it), but that's not ideal. Another option is to check whether MXNet now has some way of encoding an empty array as a Symbol.

A third option would be to reduce the interface of encoders and decoders to something simpler: a single tensor containing the stacked target + dynamic features + (repeated, embedded) static features. That would of course remove some flexibility, since the three channels could no longer be treated independently. A rough sketch of the stacking is below.
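
As a sketch of that third option (the function name and signature are hypothetical, not an existing GluonTS interface), the three inputs could be assembled into one encoder input along the feature axis:

def assemble_encoder_input(F, past_target, past_feat_dynamic_real,
                           feat_static_real, context_length):
    # past_target: (batch, T) -> (batch, T, 1)
    target = F.expand_dims(past_target, axis=2)
    # feat_static_real: (batch, C_static) -> (batch, T, C_static),
    # repeated along the time axis
    static = F.repeat(
        F.expand_dims(feat_static_real, axis=1),
        repeats=context_length,
        axis=1,
    )
    # resulting shape: (batch, T, 1 + C_dynamic + C_static)
    return F.concat(target, past_feat_dynamic_real, static, dim=2)

Since nothing here is created outside of F, the same code would work with both mx.nd and mx.sym, i.e. it would stay hybridizable.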