Trick, Susanne Gabriele (2024)
Bayesian fusion of probabilistic forecasts.
Technische Universität Darmstadt
doi: 10.26083/tuprints-00027385
Ph.D. Thesis, Primary publication, Publisher's Version
Full text (PDF): dissertation_susanne_trick_2024.pdf (16 MB)
Copyright Information: In Copyright
Item Type: Ph.D. Thesis
Type of entry: Primary publication
Title: Bayesian fusion of probabilistic forecasts
Language: English
Referees: Rothkopf, Prof. PhD Constantin A. ; Endres, Prof. Dr. Dominik
Date: 29 May 2024
Place of Publication: Darmstadt
Collation: xxiii, 196 pages
Date of oral examination: 26 April 2024
DOI: 10.26083/tuprints-00027385
Abstract:

Due to pervasive noise and ambiguity, our world is dominated by uncertainty. To cope with uncertainty in perception, action, and decision making, humans maintain an internal representation of their uncertainty and communicate it in interactions with other humans. Furthermore, humans combine information from different information sources to reduce their uncertainty about the world's state. Specifically, in perceptual tasks, people have been shown to integrate redundant sensory cues by weighting the cues according to their uncertainty, thereby maximally reducing the uncertainty of the integrated sensory estimate, as described by Bayes' theorem.

Like humans, Artificial Intelligence (AI) systems, and robots as their embodied form, should represent, consider, reduce, and communicate uncertainty in order to cope with our uncertain world. In particular, this can increase the safety of critical AI applications and improve the quality of the interaction between humans and AI, e.g., in AI-supported human decision making or human-robot collaboration in industry or caregiving settings. Quantifying an AI system's uncertainty can be realized with probabilistic methods, e.g., probabilistic classifiers that output categorical distributions over all classes. AI systems also often combine different information sources: classifier ensembles, which combine multiple individual classifiers in order to improve classification performance, are among the most successful classification methods. However, classifier ensembles are usually optimized merely to maximize classification performance rather than to reduce uncertainty. An open question is therefore how to optimally combine probabilistic forecasts provided by classifiers while explicitly considering and correctly reducing their uncertainty, similar to how humans combine multiple cues in perception. Since the individual classifiers in an ensemble are usually correlated, particular focus should be put on the combination of correlated classifiers.

This thesis investigates how to optimally combine the outputs of probabilistic classifiers. It provides a normative Bayesian model that formalizes how to optimally fuse individual classifiers according to their properties, such as uncertainty, bias, and variance, under different assumptions. Moreover, our model explicitly considers the correlation of the individual classifiers. It models the classifiers' correlations with a newly introduced probability distribution, the correlated Dirichlet distribution. The resulting Correlated Fusion Model quantifies how classification uncertainty should be reduced through Bayes optimal classifier combination, depending on the individual classifiers' uncertainty, bias, variance, and correlation, and outperforms related Bayesian classifier fusion models on simulated and real data sets.

A special case of the correlated Dirichlet distribution introduced for modeling correlated probabilistic classifiers is the bivariate beta distribution. The bivariate beta distribution models two beta-distributed random variables with a positive correlation. It is therefore particularly interesting for modeling binary probabilistic classifiers, but it is also of general interest in statistics. While the bivariate beta distribution has been proposed before, previous work used an approximate and sometimes inaccurate method to compute the distribution's covariance and correlation and to estimate its parameters.
Therefore, in this thesis, we derive all product moments and the exact covariance and introduce an algorithm for estimating the bivariate beta distribution's parameters using moment matching.

A promising application of Bayes optimal fusion of multiple probabilistic classifiers is multimodal human-robot interaction. Since humans interact multimodally using modalities such as speech, gestures, and gaze directions, an intuitive and natural interaction between humans and robots requires robots to also interact multimodally. In particular, a robot should be able to process people's uncertain multimodal signals, e.g., about their intentions, and correctly combine its uncertainties about them. However, present approaches for multimodal intention recognition in human-robot interaction do not focus on how to correctly consider individual modalities' uncertainties and reduce uncertainty. Therefore, in this thesis, we recognize human intentions from multimodal data using probabilistic classifiers for each modality whose output distributions are combined Bayes optimally.

We present three applications of Bayes optimal classifier fusion to different human-robot interaction scenarios. We first detect human intentions from multimodal data including speech, gestures, gaze directions, and scene objects. In an interaction task between a human and a 7-Degrees-of-Freedom robot arm, we show that adding more modalities contributes to increased detection performance and reduced uncertainty. Second, we apply Bayesian fusion to enable humans to teach a 7-Degrees-of-Freedom robot arm using multimodal action advice given by speech and gestures for interactive reinforcement learning. Evaluations with human participants show that the learning speed can be improved significantly compared to other methods. Third, we learn to detect humans' intention to start an interaction with a robot, the intention for interaction, from natural human behavior. We recorded a multimodal data set including speech and body poses with human participants in a collaborative task with a two-armed assistive robot. We compare different unimodal and multimodal classifiers and show that the intention for interaction can be detected better from multimodal data using Bayesian classifier fusion.

Bayes optimal fusion methods can not only be applied to combine classifiers but also to combine subjective probability estimates provided by humans. Such probabilistic estimates or forecasts, e.g., provided by experts, are of particular importance in many domains, such as finance, politics, engineering, meteorology, and public health, and can further be used to build rule-based AI systems. While combining forecasts is known to increase forecasting performance, as for classifier fusion, there is a need for a normative model that defines how to correctly combine human forecasts while explicitly considering their uncertainty. In this thesis, we present a family of normative Bayesian models for the aggregation of subjective probability estimates, which are closely related to our normative Bayesian model for classifier fusion. We model the forecasting behavior of individual forecasters with beta distributions, implicitly calibrate their probability estimates, and combine them accordingly in order to obtain the Bayes optimal uncertainty of the fused forecast. However, the proposed fusion models disregard the correlation between the forecasters, reduce too much uncertainty, and are thus overconfident.
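(The correlation-agnostic aggregation just described can be illustrated with a minimal sketch: each forecaster's report is treated as beta-distributed given the true binary outcome, and the reports are combined under an independence assumption. The function, parameter values, and example reports below are hypothetical placeholders, not values fitted in the thesis.)

```python
import numpy as np
from scipy.stats import beta

def fuse_binary_forecasts(reports, forecaster_models, prior=0.5):
    """Independence-based Bayesian fusion of binary-event forecasts.

    Each forecaster k is described by beta distributions over the probabilities
    it tends to report when the event occurs (a1, b1) and when it does not
    (a0, b0); fusing the reports accumulates the resulting log-likelihood ratios.
    """
    log_odds = np.log(prior) - np.log1p(-prior)
    for q, (a1, b1, a0, b0) in zip(reports, forecaster_models):
        log_odds += beta.logpdf(q, a1, b1) - beta.logpdf(q, a0, b0)
    return 1.0 / (1.0 + np.exp(-log_odds))  # fused probability of the event

# Two forecasters report 0.7 and 0.8; both are (hypothetically) calibrated as
# Beta(3, 1.5) given the event and Beta(1.5, 3) given its absence.
print(fuse_binary_forecasts([0.7, 0.8], [(3, 1.5, 1.5, 3)] * 2))
```

Because the forecasters are treated as independent, such a model accumulates evidence from every report and becomes overconfident whenever the forecasters are in fact correlated, which motivates the extension described next.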
Therefore, in a second step, we extend these models to a Bayesian model for the combination of correlated subjective probability estimates. By explicitly representing the skills of the individual forecasters and the difficulties of the individual queries for which forecasts are provided, this model can represent the correlation between individual forecasters and can consider it when fusing forecasts. As a consequence, its fusion performance is improved compared to our previous models and related fusion models.

In summary, this thesis investigates the fundamental computational problem of combining uncertain probabilistic forecasts that humans, robots, and AI systems in general are facing. While human perception unconsciously integrates multiple sensory cues to provide an optimal percept, here we develop normative Bayesian fusion models for combining probabilistic forecasts provided by classifiers or humans. The proposed models define how probabilistic forecasts should be fused Bayes optimally, in particular if the forecasts are correlated. We demonstrate that the developed algorithms outperform related fusion methods and successfully apply them in multimodal human-robot interaction.
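(As a concrete illustration of the basic operation the thesis generalizes, the following sketch fuses categorical classifier outputs under a conditional-independence assumption. This is only the simplest baseline; the thesis's Correlated Fusion Model additionally accounts for the classifiers' uncertainty, bias, variance, and correlation via the correlated Dirichlet distribution. The function and example values are illustrative assumptions, not taken from the thesis.)

```python
import numpy as np

def independent_fusion(posteriors, prior):
    """Fuse categorical classifier outputs assuming conditional independence:
    p(c | x_1, ..., x_K) is proportional to prior(c) * prod_k [p(c | x_k) / prior(c)].
    """
    posteriors = np.asarray(posteriors, dtype=float)  # shape (K, n_classes)
    prior = np.asarray(prior, dtype=float)
    fused = prior * np.prod(posteriors / prior, axis=0)
    return fused / fused.sum()  # renormalize to a categorical distribution

# Two illustrative classifiers over three classes, uniform prior:
p1 = [0.7, 0.2, 0.1]
p2 = [0.6, 0.3, 0.1]
print(independent_fusion([p1, p2], prior=[1/3, 1/3, 1/3]))  # ~[0.86, 0.12, 0.02]
```

The fused distribution is sharper than either input, i.e., uncertainty is reduced; when the classifiers are correlated, this reduction is too aggressive, which is exactly the situation a correlation-aware fusion model is meant to handle.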
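(The parameter estimation mentioned for the bivariate beta distribution builds on moment matching. As a hedged sketch of that idea, the univariate case below matches the first two sample moments to recover beta parameters; the thesis's algorithm extends this to the bivariate case using the exact covariance derived there. Function name and data are illustrative.)

```python
import numpy as np

def beta_moment_match(samples):
    """Method-of-moments estimate of Beta(alpha, beta) parameters.

    Matches the sample mean and variance to the analytical beta moments;
    valid only when the sample variance is smaller than mean * (1 - mean).
    """
    m = np.mean(samples)
    v = np.var(samples)
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common

rng = np.random.default_rng(0)
samples = rng.beta(2.0, 5.0, size=10_000)  # synthetic probability estimates
print(beta_moment_match(samples))           # approximately (2, 5)
```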
Status: Publisher's Version
URN: urn:nbn:de:tuda-tuprints-273857
Additional Information: In reference to IEEE copyrighted material which is used with permission in this thesis, the IEEE does not endorse any of Technical University of Darmstadt's products or services. Internal or personal use of this material is permitted. If interested in reprinting/republishing IEEE copyrighted material for advertising or promotional purposes or for creating new collective works for resale or redistribution, please go to http://www.ieee.org/publications_standards/publications/rights/rights_link.html to learn how to obtain a License from RightsLink. If applicable, University Microfilms and/or ProQuest Library, or the Archives of Canada may supply single copies of the dissertation.
Classification DDC: 000 Generalities, computers, information > 004 Computer science; 100 Philosophy and psychology > 150 Psychology; 500 Science and mathematics > 500 Science
Divisions: 03 Department of Human Sciences > Institute for Psychology > Psychology of Information Processing
TU-Projects: Bund/BMBF|16SV7984|KoBo34
Date Deposited: 29 May 2024 12:27
Last Modified: 06 Jun 2024 06:27
URI: https://tuprints.ulb.tu-darmstadt.de/id/eprint/27385
PPN: 518827518