Closed — dnguyen1196 closed this 4 years ago
Could you comment a bit on the general purpose of these helper functions? I thought our `pinot.Net` object already has these functionalities, with its `expectation`, `condition`, and `loss` methods?
Maybe @karalets has more thoughts on this?
Some of these errors are in tests that I did not modify.
```
    npt.assert_almost_equal(
>       net.parametrization.weight[0].detach().numpy(), -2.0, decimal=3
    )
E   AssertionError:
E   Arrays are not almost equal to 3 decimals
E
E   Mismatched elements: 1 / 1 (100%)
E   Max absolute difference: 0.94424546
E   Max relative difference: 0.47212273
E    x: array([-1.056], dtype=float32)
E    y: array(-2.)
```
and
```
>   npt.assert_almost_equal(torch.exp(net.forward(x)[:, 1].mean()) + 1e-5, torch.std(y))
pinot/tests/test_nets_linear_regression.py:125:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
<__array_function__ internals>:6: in iscomplexobj
    ???
/usr/share/miniconda/envs/test/lib/python3.6/site-packages/numpy/lib/type_check.py:317: in iscomplexobj
    type_ = asarray(x).dtype.type
/usr/share/miniconda/envs/test/lib/python3.6/site-packages/numpy/core/_asarray.py:85: in asarray
    return array(a, dtype, copy=False, order=order)
```
Should I repush or is something else going on here? @yuanqing-wang
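For what it's worth, the two failures above look like different problems. The first is a plain value mismatch (`-1.056` vs `-2.0`, i.e. the weight hasn't converged), while the second dies inside `np.iscomplexobj`, whose `asarray` call appears to be choking on a torch tensor rather than a numpy value. A small sketch of both, using a hypothetical `FakeTensor` stand-in so torch itself isn't required:

```python
import numpy.testing as npt

# decimal=3 means abs(desired - actual) < 1.5 * 10**-3 must hold,
# so -1.056 vs -2.0 fails regardless of dtype -- that's a fit issue.
npt.assert_almost_equal(1.0004, 1.0, decimal=3)  # passes

class FakeTensor:
    """Hypothetical stand-in for a torch scalar tensor."""
    def __init__(self, value):
        self.value = value
    def item(self):
        # torch tensors expose .item() to get a plain Python float
        return self.value

# Converting to a plain float before the numpy assertion sidesteps
# the asarray/iscomplexobj code path that fails in the traceback.
pred_std = FakeTensor(1.00002)
npt.assert_almost_equal(pred_std.item(), 1.0, decimal=3)  # passes
```

So detaching and converting the torch expression (e.g. via `.item()` or `.detach().numpy()`) before handing it to `npt.assert_almost_equal` may be enough for the second failure, assuming the values themselves agree.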
> Could you comment a bit on the general purpose of these helper functions? I thought our `pinot.Net` object already has these functionalities, with its `expectation`, `condition`, and `loss` methods?
I see. OK, in that case it makes more sense for the model to compute its own condition and loss. My intention was to cover the case where we have an ensemble of predictors, or multiple "samples" of a predictor.
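To make the ensemble case concrete, here is a minimal sketch of the kind of helper I had in mind; the function name and signature are hypothetical, not part of `pinot.Net`'s actual API:

```python
import numpy as np

def ensemble_expectation(predictors, x):
    """Pool predictions from an ensemble of predictors.

    `predictors` is any iterable of callables mapping x -> prediction,
    which also covers multiple "samples" of a single predictor.
    (Hypothetical helper, for illustration only.)
    """
    samples = np.stack([p(x) for p in predictors])
    # mean over the ensemble axis is the pooled expectation;
    # std gives a crude spread across ensemble members
    return samples.mean(axis=0), samples.std(axis=0)

# toy ensemble: three linear predictors with slightly different weights
predictors = [lambda x, w=w: w * x for w in (0.9, 1.0, 1.1)]
x = np.array([1.0, 2.0])
mean, std = ensemble_expectation(predictors, x)
# mean -> array([1., 2.])
```

If each predictor already exposes its own `expectation`/`condition`, this reduces to looping over members and averaging, which is why having the model compute those itself makes sense.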
But then the error is odd. When you last pushed, these errors didn't appear?
Ah, it has always been failing; I was going to pause it. See #20.
Some basic metrics for prediction with ensembles
Mostly a test pull request to ensure the workflow is smooth.
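As a sketch of the "basic metrics for prediction with ensembles" this PR describes, one simple metric is the RMSE of the ensemble-mean prediction; the name and shape conventions below are illustrative, not necessarily what the PR implements:

```python
import numpy as np

def ensemble_rmse(sample_predictions, y_true):
    """RMSE of the ensemble-mean prediction.

    `sample_predictions` has shape (n_predictors, n_points); the
    ensemble prediction is the mean over the predictor axis.
    (Illustrative only; actual metric names in the PR may differ.)
    """
    y_hat = sample_predictions.mean(axis=0)
    return float(np.sqrt(np.mean((y_hat - y_true) ** 2)))

# two ensemble members whose mean happens to match the targets exactly
preds = np.array([[1.0, 2.0],
                  [1.2, 1.8]])
rmse = ensemble_rmse(preds, np.array([1.1, 1.9]))
# ensemble mean is [1.1, 1.9], so rmse -> 0.0
```

Other metrics in the same spirit (predictive std, coverage of credible intervals) follow the same pattern of reducing over the predictor axis first.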