Understanding Event-Generation Networks via Uncertainties

Following the growing success of generative neural networks in LHC simulations, the crucial question is how to control these networks and assign uncertainties to their event output. We show how Bayesian normalizing flows capture uncertainties from the training. Fundamentally, the interplay between density and uncertainty estimates indicates that these networks learn functions, in analogy to parameter fits, rather than binned event counts.
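As a rough illustration of the idea (not the authors' implementation), a Bayesian normalizing flow can be sketched in PyTorch: every network weight carries a learned Gaussian, and resampling the weights yields a spread of density estimates whose width serves as the uncertainty. All names below (BayesianLinear, BayesianCoupling, density_with_uncertainty) are hypothetical, and the flow is reduced to a single affine coupling block.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    """Linear layer with an independent Gaussian over every weight and bias."""
    def __init__(self, n_in, n_out):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(n_out, n_in))
        self.w_rho = nn.Parameter(torch.full((n_out, n_in), -3.0))
        self.b_mu = nn.Parameter(torch.zeros(n_out))
        self.b_rho = nn.Parameter(torch.full((n_out,), -3.0))

    def forward(self, x):
        # Reparameterization trick: draw a fresh weight sample per call.
        w = self.w_mu + F.softplus(self.w_rho) * torch.randn_like(self.w_mu)
        b = self.b_mu + F.softplus(self.b_rho) * torch.randn_like(self.b_mu)
        return F.linear(x, w, b)

class BayesianCoupling(nn.Module):
    """One affine coupling block whose scale/shift net has Bayesian weights."""
    def __init__(self, dim=2, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            BayesianLinear(dim // 2, hidden), nn.Tanh(),
            BayesianLinear(hidden, dim))           # outputs (s, t)

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)
        s, t = self.net(x1).chunk(2, dim=-1)
        s = torch.tanh(s)                          # keep scales bounded
        z2 = (x2 - t) * torch.exp(-s)              # data -> latent direction
        return torch.cat([x1, z2], dim=-1), -s.sum(dim=-1)

def density_with_uncertainty(flow, x, n_draws=50):
    """Resample the weights to get a central density value and its spread."""
    base = torch.distributions.Normal(0.0, 1.0)
    log_ps = []
    for _ in range(n_draws):
        z, log_det = flow(x)                       # new weight sample each pass
        log_ps.append(base.log_prob(z).sum(dim=-1) + log_det)
    log_ps = torch.stack(log_ps)
    return log_ps.mean(dim=0), log_ps.std(dim=0)   # estimate and uncertainty

events = torch.randn(1000, 2)                      # stand-in for phase-space points
mean_logp, sigma_logp = density_with_uncertainty(BayesianCoupling(), events)
```

An actual training loop would add the KL divergence between the weight posterior and its prior to the likelihood loss (Bayes by Backprop); the sketch above only shows how the per-event uncertainty emerges from resampling the weights.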

Wednesday, 16th March 2021, 14:30 — Zoom seminar