Closed Einvon closed 2 months ago
:tada: Welcome to PyMC! :tada: We're really excited to have your input into the project! :sparkling_heart:
If you haven't done so already, please make sure you check out our Contributing Guidelines and Code of Conduct.
That example is a bit silly. Usually, you would write
beta = pm.Normal(..., shape=4)
mu = x @ beta
Which is also more efficient, because it builds a one-node graph.
Otherwise, you want to use pm.math.sum; Python's built-in sum would still create an inefficient graph with as many nodes as there are items in the iterable:
mu = pm.math.sum(list(iterable))
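For a concrete check, the matmul form and the sum-of-products form compute the same values. This can be verified with plain NumPy, which follows the same broadcasting and reduction semantics (the data here is made up for illustration):

```python
import numpy as np

# Made-up data: 5 observations, 4 predictors.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 4))
beta = rng.normal(size=4)

# One-node matmul form.
mu_matmul = x @ beta

# Sum-of-products form, one term per predictor.
mu_sum = sum(beta[j] * x[:, j] for j in range(4))

assert np.allclose(mu_matmul, mu_sum)
```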
That example is just silly. You would write
beta = pm.Normal(..., shape=4)
mu = beta.sum()
Which is also more efficient, because it's a two-node graph. Of course, even this example is contrived; you would probably have predictors and do
mu = x @ beta
I tried mu = x @ beta, but it didn't work, and the x values are needed, not just the betas. 'beta.sum()' is of no use to me.
For the inefficient alternatives:
If you want a loop, you can; just initialize mu to 0:
mu = 0
for beta, x in zip(betas, xs):
mu += beta * x
This is totally valid. I don't know where you got the concern about TensorVariables not supporting +.
Or pm.math.add, which accepts an arbitrary number of inputs:
mu = pm.math.add(*[x * b for x, b in zip(xs, betas)])
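Both of these alternatives produce the same numbers; here is a quick NumPy sanity check of the two patterns (np.add.reduce stands in for pm.math.add, and the arrays are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
xs = [rng.normal(size=3) for _ in range(4)]  # made-up predictor columns
betas = rng.normal(size=4)                   # made-up coefficients

# Loop form: initialize to 0 and accumulate with +=.
mu_loop = 0
for beta, x in zip(betas, xs):
    mu_loop += beta * x

# Many-input add form (np.add.reduce stands in for pm.math.add).
mu_add = np.add.reduce([x * b for x, b in zip(xs, betas)])

assert np.allclose(mu_loop, mu_add)
```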
I tried 'mu = 0', 'mu = ''', 'mu = None', etc. in such loops before; none worked. Maybe pm.math.add will fix it, thanks.
If you want to understand PyTensor a bit better I suggest just browsing the tutorial: https://pytensor.readthedocs.io/en/latest/tutorial/index.html
And the docs on how PyMC uses PyTensor: https://www.pymc.io/projects/docs/en/stable/learn/core_notebooks/pymc_pytensor.html
thank you, i will check these out.
I updated the first example, I missed the xs originally because of how the message was formatted with everything inside backticks.
In general you can use math.sum, math.add, and math.matmul. Sometimes @ fails if x is not a PyTensor object yet and NumPy gets called first. You can use matmul, or force x to be a TensorVariable with pt.as_tensor(x) @ beta
Can you share a small example? It should definitely work; += is supported by TensorVariables. To be on the safe side you can initialize mu = pt.as_tensor(0) so everything is a TensorVariable from the get-go.
I used '+=' about two months ago, and I'm not sure it 'definitely works' yet; I will check this now. Thank you for your exhaustive explanation, which solves my question.
I'll close this issue for now, let us know if you come across a bug
The '+=' form works; it seems the problem was fixed earlier. Thank you!
Before
PyMC's documentation implies that it uses Theano's tensors to represent the 'betas'. In the official PyMC examples, expressions are mostly entered manually, which can be very annoying when there are many 'betas'.
Say:
Note that the more betas there are, the more complicated the input to the 'mu' parameter becomes, like 'beta_0 + beta_1*x1 + beta_2*x2 + beta_3*x3 + beta_4*x4 ...' etc.
After
At the very beginning, my solution to this was a for loop. Say:
-- However, this does not work, because the symbolic tensors used to represent betas in PyMC are not plain Theano objects but instances of a class called 'TensorVariable', which may not support the '+' (add) operator in for loops. This drove me to write this issue: overloading the add (+) operator of the 'TensorVariable' class may be needed in future versions. Nevertheless, I found another solution, which is quite simple. But it may require a warning (or an AttributeError) in the 'TensorVariable' class, or in 'Normal' (and the other distributions), to guarantee that users do not use the add (+) operator in loops, and to check whether users are using Python's built-in 'sum' function. Say:
(initialization of betas and x/y)
-- Python's built-in 'sum' function does not change the properties of the objects it is called on, which means the 'TensorVariable' objects remain unchanged, so this solution is simpler and more workable. However, it needs a comment (or warning/raise) suggesting that users use the 'sum' function.