Closed: adamian closed this issue 9 years ago.
Did you do the unlink/link combination for the kernel?
On 22 Sep 2015, at 17:25, Andreas Damianou notifications@github.com wrote:
When I run an MRD/BGPLVM with a sum of kernels (one for which we have the cross-statistics, e.g. linear plus bias) I get an error. I don't remember this being a problem before the release, and from the error trace (below) it seems the problem is with the interface rather than the actual core (so I'm marking it as a bug).
To replicate the error, run GPy.examples.dimensionality_reduction.mrd_simulation() after changing k = kern.Linear(Q, ARD=True) to k = kern.Linear(Q, ARD=True) + kern.Bias(Q). You get this error:
```
GPy.examples.dimensionality_reduction.mrd_simulation()

IndexError                                Traceback (most recent call last)
in ()
----> 1 GPy.examples.dimensionality_reduction.mrd_simulation()

/home/andreas/Dropbox/_PhD/Software/github/GPy/GPy/examples/dimensionality_reduction.pyc in mrd_simulation(optimize, verbose, plot, plot_sim, **kw)
    410     # Ylist = [Ylist[0]]
    411     k = kern.Linear(Q, ARD=True) + kern.Bias(Q)
--> 412     m = MRD(Ylist, input_dim=Q, num_inducing=num_inducing, kernel=k, initx="PCA_concat", initz='permute', **kw)
    413
    414     m['.*noise'] = [Y.var() / 40. for Y in Ylist]

/home/andreas/Dropbox/_PhD/Software/github/GPy/GPy/core/parameterization/parameterized.pyc in __call__(self, *args, **kw)
     17         self._in_init_ = True
     18         #import ipdb;ipdb.set_trace()
---> 19         self = super(ParametersChangedMeta, self).__call__(*args, **kw)
     20         logger.debug("finished init")
     21         self._in_init_ = False

/home/andreas/Dropbox/_PhD/Software/github/GPy/GPy/models/mrd.pyc in __init__(self, Ylist, input_dim, X, X_variance, initx, initz, num_inducing, Z, kernel, inference_method, likelihoods, name, Ynames, normalizer, stochastic, batchsize)
    149             missing_data=md,
    150             stochastic=stochastic,
--> 151             batchsize=bs)
    152         spgp.kl_factr = 1./len(Ynames)
    153         spgp.unlink_parameter(spgp.Z)

/home/andreas/Dropbox/_PhD/Software/github/GPy/GPy/core/parameterization/parameterized.pyc in __call__(self, *args, **kw)
     25         self._highest_parent_._connect_fixes()
     26         logger.debug("calling parameters changed")
---> 27         self.parameters_changed()
     28         return self
     29

/home/andreas/Dropbox/_PhD/Software/github/GPy/GPy/models/bayesian_gplvm_minibatch.pyc in parameters_changed(self)
    111
    112     def parameters_changed(self):
--> 113         super(BayesianGPLVMMiniBatch,self).parameters_changed()
    114
    115         kl_fctr = self.kl_factr

/home/andreas/Dropbox/_PhD/Software/github/GPy/GPy/models/sparse_gp_minibatch.pyc in parameters_changed(self)
    315             self.psi0 = self.kern.psi0(self.Z, self.X)
    316             self.psi1 = self.kern.psi1(self.Z, self.X)
--> 317             self.psi2 = self.kern.psi2n(self.Z, self.X)
    318         else:
    319             self.psi0 = self.kern.Kdiag(self.X)

/home/andreas/Dropbox/_PhD/Software/github/GPy/GPy/kern/_src/kernel_slice_operations.pyc in wrap(self, Z, variational_posterior)
    138     def wrap(self, Z, variational_posterior):
    139         with _Slice_wrap(self, Z, variational_posterior) as s:
--> 140             ret = f(self, s.X, s.X2)
    141         return ret
    142     return wrap

/home/andreas/Dropbox/_PhD/Software/github/GPy/GPy/util/caching.pyc in __call__(self, *args, **kwargs)
    182         except KeyError:
    183             cacher = caches[self.f] = Cacher(self.f, self.limit, self.ignore_args, self.force_kwargs)
--> 184         return cacher(*args, **kwargs)
    185
    186 class Cache_this(object):

/home/andreas/Dropbox/_PhD/Software/github/GPy/GPy/util/caching.pyc in __call__(self, *args, **kw)
    118             # 3: This is when we never saw this chache_id:
    119             self.ensure_cache_length(cache_id)
--> 120             self.add_to_cache(cache_id, inputs, self.operation(*args, **kw))
    121         except:
    122             self.reset()

/home/andreas/Dropbox/_PhD/Software/github/GPy/GPy/kern/_src/add.py in psi2n(self, Z, variational_posterior)
    153         elif isinstance(p2, Bias) and isinstance(p1, (RBF, Linear)):
    154             tmp = p1.psi1(Z, variational_posterior).sum(axis=0)
--> 155             psi2 += p2.variance * (tmp[:, :, None] + tmp[:, None, :])
    156         elif isinstance(p2, (RBF, Linear)) and isinstance(p1, (RBF, Linear)):
    157             assert np.intersect1d(p1.active_dims, p2.active_dims).size == 0, "only non overlapping kernel dimensions allowed so far"

IndexError: too many indices for array
```
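The failing frame explains the error: `p1.psi1(Z, variational_posterior)` is an N×M array, and `.sum(axis=0)` collapses it to a 1-D array of length M, so the three-index expression `tmp[:, :, None]` cannot work. A minimal NumPy sketch (the sizes and the per-point variant are illustrative assumptions, not GPy code):

```python
import numpy as np

N, M = 5, 3                       # illustrative sizes: data points, inducing points
psi1 = np.random.rand(N, M)       # stands in for p1.psi1(Z, variational_posterior)
bias_variance = 0.7               # stands in for p2.variance

# What add.py's psi2n branch did: summing over the data axis leaves a
# 1-D array of length M, which cannot be indexed with three indices.
tmp = psi1.sum(axis=0)            # shape (M,)
try:
    tmp[:, :, None] + tmp[:, None, :]
except IndexError:
    pass                          # IndexError: too many indices for array

# Hypothetical per-point version: keep the data axis, so broadcasting
# yields one M x M cross-term matrix per data point, shape (N, M, M).
psi2n_cross = bias_variance * (psi1[:, :, None] + psi1[:, None, :])
print(psi2n_cross.shape)          # (5, 3, 3)
```

The summed form is only valid for the aggregated `psi2` statistic; the per-point `psi2n` needs the data axis preserved, which is exactly where the two code paths diverge.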
I think he means adding RBF + Linear on the same active dimensions, which requires the cross-terms that we've yet to compute. We should really do this! I think both Zhenwen and Javier (and possibly you, Max?) have the maths; it just needs to be put into code!
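For reference, the missing cross-terms come from expanding the Ψ₂ statistic of a sum kernel (standard psi-statistics notation, with q(xₙ) the variational posterior and Z the inducing inputs; a sketch of the maths, not GPy's implementation):

```latex
% Psi_2 statistic of a sum kernel k = k_a + k_b:
(\Psi_2)_{nmm'} = \mathbb{E}_{q(x_n)}\!\left[k(z_m, x_n)\, k(x_n, z_{m'})\right]
  = (\Psi_2^a)_{nmm'} + (\Psi_2^b)_{nmm'}
  + \mathbb{E}_{q(x_n)}\!\left[k_a(z_m, x_n)\, k_b(x_n, z_{m'})
                             + k_b(z_m, x_n)\, k_a(x_n, z_{m'})\right].
% For a bias kernel, k_b \equiv \sigma_b^2, the cross-term collapses to
%   \sigma_b^2 \left[(\Psi_1^a)_{nm} + (\Psi_1^a)_{nm'}\right],
% which is the case add.py already handles. For RBF + Linear on shared
% active dimensions the expectation does not simplify this way and the
% cross-term still needs deriving.
```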
Alan
On 23 September 2015 at 23:17, Max Zwiessele notifications@github.com wrote:
Did you do the unlink/link combination for the kernel?
No, I did mean the Linear + Bias (I know the cross-terms for Lin + RBF don't work yet, but Lin + Bias should work).
Alan and Carl looked into this a bit more today; perhaps Alan can give an update?
Andreas
On Thu, Sep 24, 2015 at 9:35 AM, Alan Saul notifications@github.com wrote:
I think he means adding RBF + Linear on the same active dimensions, which requires the cross-terms that we've yet to compute. We should really do this! I think both Zhenwen and Javier (and possibly you, Max?) have the maths; it just needs to be put into code!
Alan
Fixed with Alan's commit.