@fonnesbeck, as I think he'll be interested in batch-processing Bayesian models anyway. I am trying to use LogNormal as priors for both. I want to run lots of NumPyro models in parallel.
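For reference, a toy sketch of what I mean by LogNormal priors on both parameters (the Normal likelihood and the parameter names here are placeholders, not my actual model):

```python
import numpyro
import numpyro.distributions as dist

def model(y=None):
    # LogNormal priors keep both parameters positive
    mu = numpyro.sample("mu", dist.LogNormal(0.0, 1.0))
    sigma = numpyro.sample("sigma", dist.LogNormal(0.0, 1.0))
    numpyro.sample("y", dist.Normal(mu, sigma), obs=y)
```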
I created a new post because this one uses NumPyro instead of Pyro, I'm doing sampling instead of SVI, I'm using Ray instead of Dask, that post was from 2021, and I'm running a simple Neal's funnel.
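Roughly, the setup I have in mind looks like the sketch below: one independent NumPyro NUTS run on Neal's funnel per Ray task. This is only an illustration; the task count, warmup/sample numbers, and function names are made up, not taken from the earlier post.

```python
import jax
import jax.numpy as jnp
import numpy as np
import numpyro
import numpyro.distributions as dist
import ray
from numpyro.infer import MCMC, NUTS

def neals_funnel(dim=9):
    # Neal's funnel: the scale of x depends exponentially on y
    y = numpyro.sample("y", dist.Normal(0.0, 3.0))
    numpyro.sample("x", dist.Normal(jnp.zeros(dim), jnp.exp(y / 2)))

@ray.remote
def run_one(seed):
    # Each Ray task runs its own independent MCMC
    mcmc = MCMC(NUTS(neals_funnel), num_warmup=500, num_samples=1000, progress_bar=False)
    mcmc.run(jax.random.PRNGKey(seed))
    # Convert to NumPy so the results serialize cheaply back to the driver
    return {k: np.asarray(v) for k, v in mcmc.get_samples().items()}

if __name__ == "__main__":
    ray.init()
    results = ray.get([run_one.remote(seed) for seed in range(8)])
```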
Hello Pyro community, I'm trying to build a Bayesian CNN for MNIST classification using Pyro, but despite seeing the ELBO loss decrease to around 10 during training, the model's predictive accuracy remains at chance level (~10%). Could you help me understand why the loss improves while performance doesn't, and suggest potential fixes?

import torch
import pyro
import pyro.…
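One thing worth checking (a guess, since the full script isn't shown): if test-time predictions sample weights from the prior instead of from the trained guide, accuracy stays at chance even while the ELBO drops. Below is a minimal sketch of posterior-predictive evaluation with pyro.infer.Predictive; the BayesianMLP here is a stand-in for the actual CNN, and the SVI training loop is omitted.

```python
import torch
import torch.nn as nn
import pyro
import pyro.distributions as dist
from pyro.nn import PyroModule, PyroSample
from pyro.infer import Predictive
from pyro.infer.autoguide import AutoNormal

class BayesianMLP(PyroModule):
    # Stand-in for the CNN: one Bayesian linear layer with Normal weight priors
    def __init__(self):
        super().__init__()
        self.fc = PyroModule[nn.Linear](784, 10)
        self.fc.weight = PyroSample(dist.Normal(0.0, 1.0).expand([10, 784]).to_event(2))
        self.fc.bias = PyroSample(dist.Normal(0.0, 1.0).expand([10]).to_event(1))

    def forward(self, x, y=None):
        logits = self.fc(x.flatten(1))
        with pyro.plate("data", x.shape[0]):
            pyro.sample("obs", dist.Categorical(logits=logits), obs=y)
        return logits

model = BayesianMLP()
guide = AutoNormal(model)
# ... SVI training loop goes here ...

# Draw weight samples from the (trained) guide and average class probabilities
predictive = Predictive(model, guide=guide, num_samples=20, return_sites=["_RETURN"])
x_test = torch.randn(8, 1, 28, 28)                # stand-in batch; use the real test set
logits = predictive(x_test)["_RETURN"]            # (num_samples, batch, 10)
probs = torch.softmax(logits, dim=-1).mean(0)     # average over posterior draws
pred = probs.argmax(-1)                           # compare against true labels for accuracy
```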
This would appear to be a bug/unsupported feature. If you like, you can make a feature request on GitHub (please include a code snippet and a stack trace). However, in the short term your best bet would be to try to do what you want in Pyro, which should support this.

I am running NUTS/MCMC (on multiple CPU cores) on a quite large dataset (400k samples), with 4 chains x 2000 steps. I assume the memory blow-up happens upon trying to gather all results (there might be some unnecessary memory duplication going on in this step?). Are there any "quick fixes" to reduce the memory footprint of MCMC?
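If the model can be run with NumPyro, two knobs that usually shrink what gets stored and gathered are thinning and sequential chain execution. A sketch with a stand-in model (the real 400k-row model and its priors are not shown in the post):

```python
import jax
import jax.numpy as jnp
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

def model(y):
    # Placeholder model standing in for the real one
    mu = numpyro.sample("mu", dist.Normal(0.0, 1.0))
    sigma = numpyro.sample("sigma", dist.HalfNormal(1.0))
    with numpyro.plate("obs", y.shape[0]):
        numpyro.sample("y", dist.Normal(mu, sigma), obs=y)

y = jnp.zeros(400_000)  # stand-in data

mcmc = MCMC(
    NUTS(model),
    num_warmup=1000,
    num_samples=2000,
    num_chains=4,
    thinning=5,                  # keep every 5th draw: 400 stored draws per chain
    chain_method="sequential",   # run chains one after another instead of all at once
)
mcmc.run(jax.random.PRNGKey(0), y)
samples = mcmc.get_samples()     # already thinned, so much smaller to gather
```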
Apologies for the rather long post. This is the GMM code that works when I fit with both HMC and SVI.

Hi everyone, I am very new to NumPyro and hierarchical modeling. There is another prior (theta_part) which should be centered around theta_group.
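A minimal sketch of the hierarchical structure described, with theta_part centered on its group's theta_group. The priors, the Normal likelihood, and the names group_of_part / sigma_part are placeholders, not the original model:

```python
import numpy as np
import jax
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

def model(group_of_part, n_groups, y=None):
    # Group-level locations
    with numpyro.plate("groups", n_groups):
        theta_group = numpyro.sample("theta_group", dist.Normal(0.0, 1.0))

    # How far part-level thetas may stray from their group's theta
    sigma_part = numpyro.sample("sigma_part", dist.HalfNormal(1.0))

    with numpyro.plate("parts", group_of_part.shape[0]):
        # theta_part is centered around the theta_group of the group each part belongs to
        theta_part = numpyro.sample(
            "theta_part", dist.Normal(theta_group[group_of_part], sigma_part)
        )
        numpyro.sample("y", dist.Normal(theta_part, 0.5), obs=y)

# Toy usage: 6 parts spread over 2 groups
group_of_part = np.array([0, 0, 0, 1, 1, 1])
y = np.array([0.1, 0.2, 0.0, 1.1, 0.9, 1.2])
mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
mcmc.run(jax.random.PRNGKey(0), group_of_part, 2, y=y)
```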