
emcee proposal #2

Open
JohannesBuchner wants to merge 6 commits into adammoss:master from JohannesBuchner:master

Conversation

@JohannesBuchner
Contributor

Hi Adam,

I played with three modifications to the MCMC sampling:

  1. Iteratively train the network, sample, train, etc.
  2. I store multiple networks during the training process, then sample from each in proportion to its measured sampling efficiency. This avoids relying on a seemingly good but actually overfitted "best" network.
  3. Use emcee as the MCMC sampler. Its ensemble proposal explores a Gaussian ball (which is what we train the network towards) more efficiently in moderate dimensions than a Gaussian Metropolis-Hastings proposal.
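The mixture over stored networks in point 2 can be sketched in a few lines. This is a minimal illustration, not the code in this PR: the efficiency values and the number of stored networks are made up, and I assume "sampling efficiency" means the fraction of each network's proposals that get accepted.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-network sampling efficiencies measured during training
# (fraction of proposals accepted for each stored network snapshot).
efficiencies = np.array([0.05, 0.30, 0.20, 0.45])

# Choose which network generates each of the next proposals, with
# probability proportional to its measured efficiency. An overfitted
# network with a deceptively good loss but poor acceptance gets
# correspondingly little weight.
weights = efficiencies / efficiencies.sum()
choices = rng.choice(len(efficiencies), size=10000, p=weights)

counts = np.bincount(choices, minlength=len(efficiencies))
print(counts / counts.sum())  # close to `weights`
```

The point of weighting by empirical acceptance rather than training loss is that acceptance is measured on fresh samples, so it acts as a cheap cross-validation of each snapshot.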

For Rosenbrock, `--num_blocks=5 --hidden_dim=40 --num_layers=1` seems to work reliably in 2-20 dimensions. After a few sample-train iterations, the sampler becomes much more efficient than standard emcee (tested by setting `num_blocks=1 num_layers=0`).
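For reference, here is a minimal re-implementation of the Goodman-Weare stretch move that emcee uses by default, run on a 2-D Rosenbrock density. This is a sketch, not the PR's code: the Rosenbrock scaling (factor 100) and the walker initialisation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_rosenbrock(x):
    # Unnormalised Rosenbrock log-density (assumed scaling; the repo's
    # test problem may use different constants).
    return -np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2)

def stretch_step(walkers, log_prob, a=2.0):
    """One sequential Goodman-Weare stretch-move sweep over the ensemble."""
    n, ndim = walkers.shape
    for k in range(n):
        # Pick a complementary walker j != k.
        j = rng.integers(n - 1)
        j = j if j < k else j + 1
        # Stretch factor z ~ g(z) proportional to 1/sqrt(z) on [1/a, a].
        z = (1 + (a - 1) * rng.random()) ** 2 / a
        prop = walkers[j] + z * (walkers[k] - walkers[j])
        log_ratio = (ndim - 1) * np.log(z) + log_prob(prop) - log_prob(walkers[k])
        if np.log(rng.random()) < log_ratio:
            walkers[k] = prop
    return walkers

walkers = rng.normal(size=(32, 2)) * 0.1 + 1.0  # start near the mode (1, 1)
for _ in range(200):
    walkers = stretch_step(walkers, log_rosenbrock)
print(walkers.mean(axis=0))
```

Because each proposal is a stretch along the line through two walkers, the ensemble adapts to the local covariance automatically, which is why it handles the Gaussian-ball shape of the trained space better than a fixed-width Metropolis-Hastings kernel.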

For Himmelblau, the sampler tends to lose modes. I think this is because I restart the sampler from scratch each iteration; I should instead initialise it with the last sampler population (the reshape in my sample() flattens everything, and I haven't figured out how to return the walker positions as-is).
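The fix being described would look something like this. The shapes below follow emcee's chain layout (steps, walkers, dim); in emcee 3 the equivalent of `last_population` is available directly as `sampler.get_last_sample()`. The chain here is random placeholder data.

```python
import numpy as np

rng = np.random.default_rng(0)
nwalkers, ndim, nsteps = 32, 2, 100

# Placeholder chain in emcee's layout: (nsteps, nwalkers, ndim).
chain = rng.normal(size=(nsteps, nwalkers, ndim))

# Flattening discards walker identity -- restarting from scratch after
# this is what loses modes:
flat = chain.reshape(-1, ndim)  # (nsteps * nwalkers, ndim)

# Keep the final step instead: it is exactly the (nwalkers, ndim)
# population needed to seed the next run, and walkers sitting on
# separate Himmelblau modes stay there.
last_population = chain[-1]

# Next iteration would then be seeded with it, e.g.
# sampler.run_mcmc(last_population, nsteps).
print(last_population.shape)
```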

Later I want to look at nested sampling.

Cheers,
Johannes

