Miles-Garnsey closed this issue 8 years ago
This issue might be related to #535, but I'm uncertain what the current state of play is for the random walk type.
The gradient is not implemented for T.repeat. You could work around this by making alpha and beta the right shape, incorporating your rep_pattern. I don't really know what your test data looks like, but broadcasting is implemented in Theano, and repeat would probably not be necessary if you put your data in the right format to make use of broadcasting. Cheers!
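For illustration (this sketch isn't from the original reply): one way broadcasting could replace repeat, assuming the observations can be arranged into a rectangular (n_days, obs_per_day) array, and using hypothetical names alpha and x:

```python
import theano.tensor as T

# Hypothetical shapes, assuming the data has been arranged one row per day:
#   alpha: (n_days,)              one coefficient per day
#   x:     (n_days, obs_per_day)  observations, padded/reshaped into a rectangle
alpha = T.dvector("alpha")
x = T.dmatrix("x")

# dimshuffle(0, 'x') makes alpha a broadcastable (n_days, 1) column, so it is
# effectively repeated across each row by broadcasting, without T.repeat.
mu = alpha.dimshuffle(0, "x") * x
```

Broadcasting has a working gradient, so the missing T.repeat gradient never comes into play.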
I've managed to resolve this one by referring back to some material included in #535. If anyone else is having a similar issue:
The code A.repeat(B) fails because there is no gradient implemented when B has ndim > 0.
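A minimal sketch (not from the thread) of the failure mode, using hypothetical variables a and reps:

```python
import theano.tensor as T

a = T.dvector("a")
reps = T.ivector("reps")  # ndim > 0: one repeat count per element of a

y = a.repeat(reps).sum()
# Expected to raise here, since the gradient of repeat() is not implemented
# for a non-scalar repeats argument (as described above).
g = T.grad(y, a)
```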
But you can use a list comprehension inside T.concatenate, like so:
N = len(dates_series)  # number of groups, i.e. the length of rep_pattern
T.concatenate([alpha[i].repeat(rep_pattern[i]) for i in range(N)])
This appears to work, although it uses a lot of memory and has thrown some errors (a constant folding error, and a few MemoryErrors where Python itself has crashed suddenly). I'll keep this thread up to date if this solution proves insufficient.
My only outstanding question is whether I should raise this in the Theano project, in case this would be a better way to handle things generally in T.repeat().
You can also use T.alloc as an alternative to repeat; its gradient is implemented. They are well aware of this, so you don't need to report it ;). Cheers!
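For example, a minimal sketch of the T.alloc route (not code from the thread; rep_counts is a hypothetical list of per-day observation counts):

```python
import theano.tensor as T

alpha = T.dvector("alpha")
rep_counts = [3, 2, 4]  # hypothetical: number of observations on each day

# T.alloc(value, n) builds a length-n vector filled with `value`, and its
# gradient is implemented, so this plays the role of alpha.repeat(rep_counts)
# without hitting the missing repeat gradient.
alpha_r = T.concatenate(
    [T.alloc(alpha[i], rep_counts[i]) for i in range(len(rep_counts))]
)
```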
Hi there,
I'm running the bleeding-edge versions of Theano and PyMC3, and I'm coming across an odd (and uninformative) error; I'm having trouble determining where the bug is.
I'm effectively trying to do a rolling regression as described here. In my case, I have multiple observations per day, and I'd like observations from the same day to share the same regression coefficients.
The only change from the code in that example is in how I repeat alpha and beta to get alpha_r and beta_r, which I've changed to:
alpha_r = T.repeat(alpha, rep_pattern)
beta_r = T.repeat(beta, rep_pattern)
where
rep_pattern = T.constant(dates_series.values, dtype="int32")
which is an array of the observation counts grouped by day. Sorry if this is an obvious question; I've been working on it for some time and have hit a wall. There's still a good chance it's user error, but I wanted to check with the experts.
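For context, here is a hypothetical sketch of how such a counts series might be constructed (the actual construction of dates_series isn't shown in this issue):

```python
import pandas as pd
import theano.tensor as T

# Hypothetical raw data: several observations per date.
df = pd.DataFrame({
    "date": ["2016-01-01", "2016-01-01", "2016-01-02", "2016-01-03", "2016-01-03"],
    "y":    [1.2, 0.7, 3.1, 0.4, 2.2],
})

# One count per day, in date order: this becomes the repeat pattern for alpha/beta.
dates_series = df.groupby("date").size()
rep_pattern = T.constant(dates_series.values, dtype="int32")
```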
Full code:
The error I get back is as below:
I'm assuming this is because the gradient is not available for some component of my model. Does anyone have any idea which part might be causing the error?