This issue was moved to a discussion.
You can continue the conversation there. Go to discussion →
Drop scalar parameters for multivariate dists #5447
Comments
Issue already opened in #5446. Edit: that discussion is subtly different; reopening.

But we broadcast elsewhere, right? e.g.
That's allowed: one is a broadcast *from* valid dimensions, the other is a broadcast *to* valid dimensions. Another way of thinking about it is that we are basically doing a numpy vectorize from a base case. This fails:

```python
import numpy as np

def func(mu, cov):
    # fixed: np.random.multivariate(...).rvs() doesn't exist
    return np.random.multivariate_normal(mu, cov)

vfunc = np.vectorize(func, signature='(n),(n,n)->(n)')
vfunc(np.ones(1), np.eye(3))
```

```
ValueError: inconsistent size for core dimension 'n': 3 vs 1
```

Having said that, it doesn't bother me much if we do this helper broadcast, because I don't think there's ever an ambiguity as to the intention. But we have to make up our minds :)
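For contrast, the same vectorized function succeeds when the core dimension matches, and batch ("from") broadcasting of the leading dimensions remains available; this is a plain-NumPy sketch of the gufunc analogy above, not PyMC code:

```python
import numpy as np

def func(mu, cov):
    # plain NumPy stand-in for a multivariate draw (seeded for determinism)
    return np.random.default_rng(0).multivariate_normal(mu, cov)

vfunc = np.vectorize(func, signature='(n),(n,n)->(n)')

# Core dimension n matches (3 == 3): this works.
out = vfunc(np.ones(3), np.eye(3))
assert out.shape == (3,)

# Batch ("from") broadcasting over the leading dimensions is also fine:
# mu has batch shape (2,), cov has batch shape (), broadcasting to (2,).
batched = vfunc(np.ones((2, 3)), np.eye(3))
assert batched.shape == (2, 3)
```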
I also expect many people to run into this problem, including myself.

Could the error be added automatically for any RV? Continuing this line of thought, does the situation apply to time series distributions too?
For time series distributions I currently have no strong opinion or direction; it should eventually match whatever the decision ends up being here.
If it counts, I do think that removing the auto-broadcasting step and adding an error message would be helpful. Coming to the error message: if we decide to remove the broadcasting step, will we have enough information to add one? I am wondering whether, at the time of distribution initialization, we could `assert mu.shape[-1].eval() == cov.shape[-1].eval()`?
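As a sketch of what such an error could look like, here is a hypothetical eager check on plain NumPy arrays (an illustration of the idea, not PyMC's actual validation code):

```python
import numpy as np

def check_mvnormal_params(mu, cov):
    """Hypothetical eager check: reject scalar mu and mismatched core dims."""
    mu = np.asarray(mu)
    cov = np.asarray(cov)
    if mu.ndim < 1:
        raise ValueError(
            "mu must be at least 1-dimensional for a multivariate normal; "
            "got a scalar. Broadcast it explicitly, e.g. np.full(n, mu)."
        )
    if mu.shape[-1] != cov.shape[-1]:
        raise ValueError(
            f"inconsistent size for core dimension 'n': "
            f"{cov.shape[-1]} vs {mu.shape[-1]}"
        )
    return mu, cov

# check_mvnormal_params(np.zeros(3), np.eye(3)) passes;
# check_mvnormal_params(0.0, np.eye(3)) raises ValueError.
```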
I guess this could be done in the base class of RandomVariable.
I don't think we should do that. We usually only validate broadcasting shapes at runtime. Also, inputs may be shared variables, in which case the shapes (but not ndims) can change between model evaluations.
See PR for reference.

Originally posted by @ricardoV94 in #5437 (comment):

> `mu=[0, 0]`
>
> Technically, it's incorrect to specify a scalar `mu` for the multivariate normal, although for backwards compatibility we reshape `mu` behind the scenes.
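A minimal sketch of the backwards-compatible reshape described above (an assumption about the behaviour, not PyMC's actual implementation) is just a broadcast of `mu` to `cov`'s last dimension:

```python
import numpy as np

def legacy_resize_mu(mu, cov):
    """Hypothetical sketch: silently broadcast a scalar mu to match cov."""
    mu = np.asarray(mu)
    cov = np.asarray(cov)
    # A scalar mu=0 with a (2, 2) cov becomes [0., 0.] behind the scenes.
    return np.broadcast_to(mu, (cov.shape[-1],))

assert np.array_equal(legacy_resize_mu(0, np.eye(2)), np.zeros(2))
```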