Update case study: LKJ #206
Conversation
chiral-carbon commented on 2021-08-09T17:11:22Z
I get too many warnings here, but I am not sure how to address them.

OriolAbril commented on 2021-08-10T07:28:20Z
See which variables are the culprits of the warnings. I think only the diagonal of … Once using coords via …

chiral-carbon commented on 2021-08-10T22:43:38Z
Sure, will try that.
OriolAbril commented on 2021-08-10T07:31:45Z
Line #2: trace = pm.sample(random_seed=RANDOM_SEED, init="adapt_diag", return_inferencedata=True)
It would be good to show how one can use named dims and coords for Cholesky matrices. There is one example in model …
Line #3: az.summary(trace, var_names=["~chol"], round_to=2)
Here I am not sure this …

chiral-carbon commented on 2021-08-12T16:22:25Z
Yup, there is a variable named chol: chol, corr, stds = pm.LKJCholeskyCov("chol", n=2, eta=2.0, sd_dist=pm.Exponential.dist(1.0), compute_corr=True)

OriolAbril commented on 2021-08-12T16:36:25Z
You have to look for this in the InferenceData, not in the model. I am sure this line creates the …

chiral-carbon commented on 2021-08-12T19:15:46Z
Oh yes, it does. I have printed the entire data here.
OriolAbril commented on 2021-08-10T07:31:46Z
As a personal preference, I would use non-filled ellipses here so it is easier to see the differences. I would also use some other colors, because those two are hard to distinguish for people with tritanomaly.

chiral-carbon commented on 2021-08-12T19:17:38Z
I removed the fill color and wanted to use colors that were contrasting enough, but red and yellow, for example, kind of blended together. I used black and yellow in the end; these are also not very clear, but better than before. If you have a better suggestion, though, let me know.
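The unfilled-ellipse suggestion could look like the sketch below. The centers, sizes, angles, and labels are illustrative; the black/yellow pair echoes the colors mentioned in the exchange above:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripting
import matplotlib.pyplot as plt
from matplotlib.patches import Ellipse

fig, ax = plt.subplots()
# fill=False draws only the outline, so overlapping ellipses stay distinguishable
e1 = Ellipse((0, 0), width=4, height=2, angle=30,
             fill=False, edgecolor="black", lw=2, label="posterior")
e2 = Ellipse((0.5, 0.2), width=3, height=1.5, angle=25,
             fill=False, edgecolor="gold", lw=2, label="true")
ax.add_patch(e1)
ax.add_patch(e2)
ax.set_xlim(-3, 3)
ax.set_ylim(-2, 2)
ax.legend(handles=[e1, e2])
```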
Also fix the comment in the issue description about using …
MarcoGorelli commented on 2021-08-13T10:35:03Z
"no divergences and good r-hats": one of the r-hats is NaN. Do you know why?

OriolAbril commented on 2021-08-13T12:37:31Z
The diagonal of the correlation matrix is not a random variable; it is always 1. rhat compares the distributions from different chains to see whether they are compatible with being the same, using terms like the within-chain variance and the between-chain variance. Both are 0 for constants, and some of them appear in the denominator, so rhat should be NaN; it does not really make sense to use the diagnostic on constants. What I don't understand is why …

MarcoGorelli commented on 2021-08-15T16:40:36Z
I'm seeing the same in NumPyro. It looks like, for that element, there are some small numerical errors which make it not exactly 1 everywhere: ipdb> x.shape …
Perhaps a quick explanation would make the "no divergences and good r-hats" statement clearer? Something like: "no divergences and good r-hats (except for the diagonal elements of the correlation matrix; these are not a concern, because they should be equal …)"

chiral-carbon commented on 2021-08-17T13:09:51Z
@MarcoGorelli should I add this explanation in the notebook?

MarcoGorelli commented on 2021-08-17T13:32:59Z
Hey @chiral-carbon, yeah, I think so. Just here in the markdown cell would be fine; no need to re-run the whole notebook.
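OriolAbril's point can be checked numerically with a plain NumPy version of the classic (non-split) Gelman-Rubin formula. This is a sketch, not the rank-normalized split-rhat that ArviZ actually computes: for constant chains, both the within-chain variance W and the between-chain variance B are exactly 0, so the ratio is 0/0 and the statistic is NaN.

```python
import numpy as np

def simple_rhat(samples):
    """Basic Gelman-Rubin rhat for samples shaped (chains, draws).

    A simplified sketch; ArviZ uses rank-normalized split-rhat instead.
    """
    n = samples.shape[1]
    chain_means = samples.mean(axis=1)
    W = samples.var(axis=1, ddof=1).mean()  # mean within-chain variance
    B = n * chain_means.var(ddof=1)         # between-chain variance
    var_hat = (n - 1) / n * W + B / n       # pooled variance estimate
    with np.errstate(invalid="ignore"):     # allow 0/0 -> NaN for constants
        return np.sqrt(var_hat / W)

rng = np.random.default_rng(0)
ok = simple_rhat(rng.standard_normal((4, 1000)))  # well-mixed chains: close to 1
const = simple_rhat(np.ones((4, 1000)))           # constant, like the corr diagonal: NaN
```

This also illustrates MarcoGorelli's observation: if floating-point noise makes the "constant" diagonal not exactly 1, W becomes tiny but nonzero and a (meaningless) finite rhat is reported instead of NaN.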
Looks good to me (@OriolAbril not sure how actionable the comment about all the warnings is at the moment, or whether that's OK to leave as a follow-up; will leave this with you 😄).
OriolAbril commented on 2021-08-24T17:12:38Z
Line #29: rect = plt.Rectangle((0, 0), 1, 1, fc="C0", alpha=0.5)
This should be updated to use …
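The comment above is truncated, so the intended replacement is not specified here. One hypothetical modernization (an assumption, not necessarily what the reviewer meant) is a matplotlib.patches.Patch legend proxy instead of a bare plt.Rectangle:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripting
import matplotlib.pyplot as plt
from matplotlib.patches import Patch

fig, ax = plt.subplots()
# Patch is a lightweight legend proxy for a filled color swatch;
# the "posterior" label is illustrative, not from the notebook
handle = Patch(facecolor="C0", alpha=0.5, label="posterior")
legend = ax.legend(handles=[handle])
```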
We'll go over the two minor nits still missing in my opinion and merge in the next few days: making the final plot a bit clearer, and seeing whether we can remove some (or even all) of the warnings in the plot_trace figure.
OriolAbril commented on 2021-08-26T15:26:49Z
Just seeing this now: the mean should use named dimension names; same in the cell below. It should also use …
OriolAbril commented on 2021-08-26T15:26:50Z
Line #14: e.set_alpha(0.5)
I would not make the lines transparent here. And I would make the dashed line sit above the continuous line; this can be modified with the …
Line #16: e.set_zorder(10)
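The zorder suggestion above can be sketched as follows (the data is illustrative). Artists with a higher zorder are drawn later, i.e. on top of lower-zorder artists:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripting
import matplotlib.pyplot as plt
import numpy as np

x = np.linspace(0, 1, 50)
fig, ax = plt.subplots()
(solid,) = ax.plot(x, x, linestyle="-", zorder=1)     # continuous line underneath
(dashed,) = ax.plot(x, x, linestyle="--", zorder=10)  # dashed line drawn on top
# Equivalently, after creation: dashed.set_zorder(10), as in the review comment
```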
Addresses issue #53 and aims to advance it to best practices. I did not see much to update here apart from using numpy.random.default_rng(), so let me know if any improvements could be made in the code or plotting.
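The numpy.random.default_rng() change mentioned above replaces the legacy global-state API (np.random.seed plus module-level draw functions) with an explicit Generator object, so reproducibility is local rather than dependent on hidden global state. The seed value below is illustrative, not the one used in the notebook:

```python
import numpy as np

# Legacy pattern being moved away from:
#   np.random.seed(RANDOM_SEED); np.random.normal(...)
# New pattern: an explicit Generator carrying its own state
RANDOM_SEED = 8927  # illustrative seed value
rng = np.random.default_rng(RANDOM_SEED)
draws = rng.normal(loc=0.0, scale=1.0, size=5)

# Same seed -> same stream, independent of any global state
rng2 = np.random.default_rng(RANDOM_SEED)
draws2 = rng2.normal(loc=0.0, scale=1.0, size=5)
```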