
Predictive Uncertainty Quantification with Compound Density Networks
Despite the huge success of deep neural networks (NNs), finding good mechanisms for quantifying their prediction uncertainty is still an open problem. Bayesian neural networks are one of the most popular approaches to uncertainty quantification. On the other hand, it was recently shown that ensembles of NNs, which belong to the class of mixture models, can be used to quantify prediction uncertainty. In this paper, we build upon these two approaches. First, we increase the mixture model's flexibility by replacing the fixed mixing weights with an adaptive, input-dependent distribution (specifying the probability of each component) represented by NNs, and by considering uncountably many mixture components. The resulting class of models can be seen as the continuous counterpart to mixture density networks and is therefore referred to as compound density networks (CDNs). We employ both maximum likelihood and variational Bayesian inference to train CDNs, and empirically show that they yield better uncertainty estimates on out-of-distribution data and are more robust to adversarial examples than the previous approaches.
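The continuous mixture the abstract describes has the form p(y|x) = ∫ p(y|x, θ) p(θ|x) dθ, where p(θ|x) is the input-dependent mixing distribution over component parameters. A minimal sketch of how such a compound predictive density can be approximated by Monte Carlo sampling is shown below; the specific mixing distribution, component predictor, and all function names here are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def mixing_distribution(x, rng, n_samples=200):
    """Stand-in for the input-dependent mixing distribution p(theta|x):
    here a Gaussian whose mean and scale depend on the input x.
    (In a CDN this would be parameterized by a neural network.)"""
    mean = 0.5 * x
    scale = 0.1 + 0.05 * abs(x)
    return rng.normal(mean, scale, size=n_samples)

def component_mean(theta, x):
    """Each sampled theta defines one mixture component's mean predictor
    f_theta(x); a linear map is used purely for illustration."""
    return theta * x

def predictive_density(y, x, rng, sigma=0.2, n_samples=200):
    """Monte Carlo estimate of the compound (continuous-mixture) density:
    p(y|x) ~= (1/S) * sum_s N(y | f_{theta_s}(x), sigma^2),
    with theta_s drawn from the mixing distribution."""
    thetas = mixing_distribution(x, rng, n_samples)
    mus = component_mean(thetas, x)
    comps = np.exp(-0.5 * ((y - mus) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return comps.mean()

p = predictive_density(y=0.5, x=1.0, rng=rng)
```

Because each sampled θ contributes a full component density, the averaged predictive spreads out wherever the mixing distribution is wide, which is the mechanism by which input-dependent mixing can express higher uncertainty (e.g., on out-of-distribution inputs).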