Generative machine learning has shown great potential to transform LHC simulation methods. We introduce two diffusion models and an autoregressive transformer for LHC event generation. Bayesian versions of these networks allow us to capture training uncertainties and to gain insight into how the models learn the target density. The diffusion models show learning patterns similar to those of normalizing flows, the current state of the art, while matching or even surpassing their precision. The transformer shows the most promising scaling with phase space dimensionality. Given their distinct strengths and weaknesses, we expect LHC physics to benefit from dedicated use cases for all three models.