Replies: 8 comments
-
Related to #5440. The original PR was also requesting discussion: #5386
-
Issue already opened in #5446. Edit: That discussion is subtly different; reopening.
-
But we broadcast elsewhere, right? e.g.:
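The example referenced above was lost in extraction; as a minimal numpy sketch of the kind of batch broadcasting that is generally allowed (the shapes here are illustrative assumptions, not taken from the PR):

```python
import numpy as np

# A batch of 4 mean vectors combined with a single scale vector:
# numpy broadcasts the (3,) array across the leading batch dimension.
mu = np.zeros((4, 3))
sigma = np.ones(3)
result = mu + sigma
print(result.shape)  # (4, 3)
```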
-
That's allowed: one is a broadcast from valid dimensions, the other is a broadcast to valid dimensions. Another way of thinking about it is that we are basically doing a numpy vectorize from a base case. This fails:

```python
import numpy as np

def func(mu, cov):
    # Draw one sample from a multivariate normal with mean mu and covariance cov
    return np.random.multivariate_normal(mu, cov)

vfunc = np.vectorize(func, signature='(n),(n,n)->(n)')
vfunc(np.ones(1), np.eye(3))
# ValueError: inconsistent size for core dimension 'n': 3 vs 1
```

Having said that, it doesn't bother me much if we do this helper broadcast, because I don't think there's ever an ambiguity as to the intention. But we have to make up our minds :)
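For contrast, a sketch (my own addition, not from the thread) of how the vectorized call succeeds once `mu` is explicitly broadcast to the core dimension of `cov` beforehand:

```python
import numpy as np

def func(mu, cov):
    # Draw one sample from a multivariate normal with mean mu and covariance cov
    return np.random.multivariate_normal(mu, cov)

vfunc = np.vectorize(func, signature='(n),(n,n)->(n)')

# Explicitly broadcasting mu to cov's core dimension removes the ambiguity:
mu = np.broadcast_to(0.0, (3,))
out = vfunc(mu, np.eye(3))
print(out.shape)  # (3,)
```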
Beta Was this translation helpful? Give feedback.
-
I also expect many people to run into this problem, including myself.
Could the error be added automatically for any RV? Continuing this line of thought, does the situation apply to time series distributions too?
-
For time series distributions I currently have no strong opinion or direction; it should eventually match whatever the decision ends up being here.
-
If it counts, I do think that removing the auto-broadcasting step and adding an error message would be helpful. Coming to the error message:
Furthermore, if we decide on removing the broadcasting step, will we have enough information to add an error message? I am wondering because, at the time of distribution initialization, could we do `assert mu.shape[-1].eval() == cov.shape[-1].eval()`?
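As a rough sketch of the eager check being asked about, using plain numpy arrays (the helper name `validate_mvnormal_shapes` and its error text are my own assumptions; with symbolic inputs the sizes may not be known this early):

```python
import numpy as np

def validate_mvnormal_shapes(mu, cov):
    # Hypothetical helper: eagerly compare the core dimensions of mu and cov.
    mu, cov = np.asarray(mu), np.asarray(cov)
    if mu.shape[-1] != cov.shape[-1]:
        raise ValueError(
            f"inconsistent size for core dimension 'n': "
            f"{cov.shape[-1]} vs {mu.shape[-1]}"
        )

validate_mvnormal_shapes(np.zeros(3), np.eye(3))  # passes silently
try:
    validate_mvnormal_shapes(np.zeros(1), np.eye(3))
except ValueError as e:
    print(e)  # inconsistent size for core dimension 'n': 3 vs 1
```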
-
I guess this could be done in the base class of RandomVariable.
I don't think we should do that. We usually only validate broadcasting shapes at runtime. Also, inputs may be shared variables, in which case the shapes (but not the ndims) can change between model evaluations.
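To illustrate that caveat, here is a toy stand-in for a shared variable (this class is purely illustrative, not the actual aesara API): the ndim is fixed at creation, but the sizes can change between evaluations, so a size check done once at initialization could later be invalidated.

```python
import numpy as np

class ToySharedVariable:
    """Illustrative stand-in: ndim is fixed at creation, sizes are not."""
    def __init__(self, value):
        self.value = np.asarray(value)
        self.ndim = self.value.ndim

    def set_value(self, new):
        new = np.asarray(new)
        if new.ndim != self.ndim:
            raise ValueError("ndim must stay constant")
        self.value = new  # same ndim, but possibly different sizes

mu = ToySharedVariable(np.zeros(2))
mu.set_value(np.zeros(3))  # allowed: same ndim, different size
print(mu.value.shape)  # (3,)
```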
-
See PR for reference
Originally posted by @ricardoV94 in #5437 (comment)
`mu=[0, 0]`
Technically, it's incorrect to specify a scalar mu for the multivariate normal, although for backwards compatibility we reshape mu behind the scenes.
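A sketch of what such a backwards-compatibility reshape might amount to (assumed behavior for illustration, not the actual PyMC code): broadcast a scalar `mu` to the core dimension of `cov`.

```python
import numpy as np

cov = np.eye(2)
mu = np.asarray(0.0)
# Broadcast the scalar mean to the core dimension of cov:
mu_full = np.broadcast_to(mu, cov.shape[-1:])
print(mu_full)  # [0. 0.]
```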