Remove NormalMixture comp_shape kwarg #6986

Closed
Tracked by #7053
ricardoV94 opened this issue Nov 2, 2023 · 8 comments · Fixed by #7098
Comments

@ricardoV94
Member

Description

This kwarg has no effect beyond what `shape` already does.

CC @lucianopaz, do you remember why this existed? I imagine it had something to do with the old way the weights used to work.

@mohammed052
Contributor

@ricardoV94 Can you elaborate on this issue? I wish to contribute to it.

@lucianopaz
Contributor

@ricardoV94, in version 3.x it used to be necessary to supply the component distribution shape. This could be used to do weird things like making multivariate mixture components out of scalar distributions (what I mean is that the shape of the support dimensions could effectively be changed). As far as I understand, this is no longer the case since version 4.x: mixtures reshape their component batch dimensions so that they broadcast correctly with the mixture shape, and there is no way to change the support dimensions.
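The broadcasting point can be sketched outside PyMC with plain NumPy (toy values, hypothetical helper; this is not PyMC's implementation): the number of components is read off the last axis of the parameter arrays, and batch dimensions broadcast as usual, so no separate `comp_shape` is needed.

```python
import numpy as np

# Toy normal log-pdf, NumPy only (stand-in, not PyMC internals).
def norm_logpdf(x, mu, sigma):
    return -0.5 * np.log(2 * np.pi) - np.log(sigma) - 0.5 * ((x - mu) / sigma) ** 2

w = np.array([0.2, 0.5, 0.3])       # mixture weights; last axis = 3 components
mu = np.array([-1.0, 0.0, 2.0])     # component means
sigma = np.array([0.5, 1.0, 1.5])   # component standard deviations

x = np.array([0.0, 1.0])            # two observations

# x[:, None] has shape (2, 1); broadcasting against (3,) gives (2, 3):
# per-observation, per-component log-densities, with no comp_shape involved.
comp_logp = norm_logpdf(x[:, None], mu, sigma)
mix_logp = np.log(np.sum(w * np.exp(comp_logp), axis=-1))  # shape (2,)
```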

@ricardoV94
Member Author

Thanks for the context Luciano!

We may reintroduce something like that later down the road, but for now it seems safe to remove the useless kwarg.

@ricardoV94
Member Author

@mohammed052 sorry for the delay. If you are still interested, it's as simple as removing any mention of `comp_shape` in:

```python
class NormalMixture:
    R"""
    Normal mixture log-likelihood

    .. math::

        f(x \mid w, \mu, \sigma^2) = \sum_{i = 1}^n w_i N(x \mid \mu_i, \sigma^2_i)

    ========  =======================================
    Support   :math:`x \in \mathbb{R}`
    Mean      :math:`\sum_{i = 1}^n w_i \mu_i`
    Variance  :math:`\sum_{i = 1}^n w_i (\sigma^2_i + \mu_i^2) - \left(\sum_{i = 1}^n w_i \mu_i\right)^2`
    ========  =======================================

    Parameters
    ----------
    w : tensor_like of float
        w >= 0 and w <= 1
        the mixture weights
    mu : tensor_like of float
        the component means
    sigma : tensor_like of float
        the component standard deviations
    tau : tensor_like of float
        the component precisions
    comp_shape : shape of the Normal component
        notice that it should be different than the shape
        of the mixture distribution, with the last axis representing
        the number of components.

    Notes
    -----
    You only have to pass in sigma or tau, but not both.

    Examples
    --------
    .. code-block:: python

        n_components = 3

        with pm.Model() as gauss_mix:
            μ = pm.Normal(
                "μ",
                mu=data.mean(),
                sigma=10,
                shape=n_components,
                transform=pm.distributions.transforms.ordered,
                initval=[1, 2, 3],
            )
            σ = pm.HalfNormal("σ", sigma=10, shape=n_components)
            weights = pm.Dirichlet("w", np.ones(n_components))

            y = pm.NormalMixture("y", w=weights, mu=μ, sigma=σ, observed=data)
    """

    def __new__(cls, name, w, mu, sigma=None, tau=None, comp_shape=(), **kwargs):
        _, sigma = get_tau_sigma(tau=tau, sigma=sigma)
        return Mixture(name, w, Normal.dist(mu, sigma=sigma, size=comp_shape), **kwargs)

    @classmethod
    def dist(cls, w, mu, sigma=None, tau=None, comp_shape=(), **kwargs):
        _, sigma = get_tau_sigma(tau=tau, sigma=sigma)
        return Mixture.dist(w, Normal.dist(mu, sigma=sigma, size=comp_shape), **kwargs)
```
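For illustration, the result of the removal might look like the sketch below. `Mixture`, `Normal`, and `get_tau_sigma` are replaced by minimal stand-in stubs so the snippet is self-contained; in the real code the fix is simply deleting `comp_shape` (and the `size=comp_shape` forwarding) from both methods.

```python
def get_tau_sigma(tau=None, sigma=None):
    # Stand-in for PyMC's helper: derive sigma from tau when needed.
    if sigma is None:
        sigma = tau ** -0.5
    return tau, sigma

class Normal:
    @staticmethod
    def dist(mu, sigma=None, **kwargs):
        # Stub: return a plain dict instead of a PyMC distribution.
        return {"dist": "Normal", "mu": mu, "sigma": sigma, **kwargs}

class Mixture:
    @staticmethod
    def dist(w, comp_dists, **kwargs):
        # Stub: record weights and component distribution.
        return {"dist": "Mixture", "w": w, "comp_dists": comp_dists, **kwargs}

class NormalMixture:
    # __new__ changes the same way; only dist is shown here.
    @classmethod
    def dist(cls, w, mu, sigma=None, tau=None, **kwargs):  # comp_shape removed
        _, sigma = get_tau_sigma(tau=tau, sigma=sigma)
        # No size=comp_shape forwarded to Normal.dist anymore.
        return Mixture.dist(w, Normal.dist(mu, sigma=sigma), **kwargs)
```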

@mohammed052
Contributor

Thank you @ricardoV94 and @lucianopaz for the explanation of the issue and its solution.
As I see it, comp_shape is passed as a parameter to the __new__ and dist functions.
So removing comp_shape from the argument list of these functions, and removing the size passed to Normal.dist, should solve this issue.
Correct me if I am wrong, I am new to the open source community.

@ricardoV94
Member Author

Yes, that's about it.

@ricardoV94
Member Author

If it's your first time, you may find this guide useful: https://www.pymc.io/projects/docs/en/latest/contributing/pr_tutorial.html

@mohammed052
Contributor

@ricardoV94 Can you review the pull request I opened on January 11th to close this issue?
