19.02.2025 12:15 Jane Coons (Max Planck Institute of Molecular Cell Biology and Genetics, Dresden): Iterative Proportional Scaling and Log-Linear Models with Rational Maximum Likelihood Estimator
In the field of algebraic statistics, we view statistical models as algebraic varieties and use tools from algebra, geometry, and combinatorics to learn statistically relevant information about these models. In this talk, we discuss the algebraic interpretation of likelihood inference for discrete statistical models. We present recent work on the iterative proportional scaling (IPS) algorithm, which is used to compute the maximum likelihood estimate (MLE), and give algebraic conditions under which this algorithm outputs the exact MLE in one cycle. Next, we introduce quasi-independence models, which describe the joint distribution of two random variables where some combinations of their states cannot co-occur, but which are otherwise independent. We combinatorially classify the quasi-independence models whose MLEs are rational functions of the data. We show that each of these has a parametrization that satisfies the conditions guaranteeing one-cycle convergence of the IPS algorithm.
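To illustrate the one-cycle phenomenon the abstract describes, here is a minimal sketch of iterative proportional scaling for the two-way independence model (the simplest log-linear model with a rational MLE). The function name and setup are illustrative, not taken from the talk; the table is assumed to have strictly positive margins.

```python
import numpy as np

def ips_two_way(counts, cycles=1):
    """Iterative proportional scaling for the independence model of a
    two-way contingency table: alternately rescale the fitted
    distribution to match the observed row and column margins."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    row = counts.sum(axis=1)  # observed row margins
    col = counts.sum(axis=0)  # observed column margins
    # start from the uniform distribution on the table
    p = np.full(counts.shape, 1.0 / counts.size)
    for _ in range(cycles):
        # scale rows so that p's row margins match row / n
        p *= (row / n)[:, None] / p.sum(axis=1, keepdims=True)
        # scale columns so that p's column margins match col / n
        p *= (col / n)[None, :] / p.sum(axis=0, keepdims=True)
    return p
```

For the independence model the output after a single cycle is already the rational MLE, p_ij = (row_i * col_j) / n^2, and further cycles leave it unchanged; this is the behavior the algebraic conditions in the talk characterize for more general quasi-independence models.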
Source
19.03.2025 12:15 Vincent Fortuin (Helmholtz/TUM): Recent Advances in Bayesian Deep Learning
Combining Bayesian principles with the power of deep learning has long been an attractive direction of research, but its real-world impact has fallen short of the promises. Especially in the context of uncertainty estimation, there seem to be simpler methods that perform at least as well. In this talk, I want to argue that uncertainties are not the only reason to use Bayesian deep learning models, but that they also offer improved model selection and incorporation of prior knowledge. I will showcase these benefits supported by the results of two recent papers and situate them in the context of current research trends in Bayesian deep learning.
Bio: Vincent Fortuin is a tenure-track research group leader at Helmholtz AI in Munich, leading the group for Efficient Learning and Probabilistic Inference for Science (ELPIS), and a faculty member at the Technical University of Munich. He is also a Branco Weiss Fellow, an ELLIS Scholar, a Fellow of the Konrad Zuse School of Excellence in Reliable AI, and a Senior Researcher at the Munich Center for Machine Learning. His research focuses on reliable and data-efficient AI approaches, leveraging Bayesian deep learning, deep generative modeling, meta-learning, and PAC-Bayesian theory. Before that, he did his PhD in Machine Learning at ETH Zürich and was a Research Fellow at the University of Cambridge. He is a regular reviewer and area chair for all major machine learning conferences, an action editor for TMLR, and a co-organizer of the Symposium on Advances in Approximate Bayesian Inference (AABI) and the ICBINB initiative.
Source