Date of Completion

7-6-2015

Embargo Period

7-6-2015

Keywords

Bayesian model diagnostics, Bayesian model selection, Bregman clustering, Bregman divergence, Gaussian and Diffused-gamma (GD) prior, Iterated Conditional Modes, Posterior consistency, Rank reduction, Sparse high-dimensional data

Major Advisor

Dipak K. Dey

Associate Advisor

Ming-Hui Chen

Associate Advisor

Xiaojing Wang

Field of Study

Statistics

Degree

Doctor of Philosophy

Open Access

Open Access

Abstract

This dissertation focuses on the development of statistical theory, methodology, and applications from a Bayesian perspective using a general class of divergence measures (or loss functions) called Bregman divergence. Bregman divergence has played a key role in recent advances in machine learning, and my goal is to turn the spotlight on it and its applications in Bayesian modeling. Because Bregman divergence includes many well-known loss functions, such as squared error loss, Kullback-Leibler divergence, Itakura-Saito distance, and Mahalanobis distance, the theoretical and methodological developments presented here unify and extend many existing Bayesian methods. Combining the broad applicability of Bregman divergence with the Bayesian approach makes it possible to handle diverse types of data, including circular, high-dimensional, multivariate, and functional data. Furthermore, the methods developed here are flexible enough to be applied to real problems in various scientific fields, including biology, the physical sciences, and engineering.
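For reference, the standard definition underlying the abstract (not a formulation specific to this dissertation): given a strictly convex, differentiable function \varphi, the Bregman divergence between points x and y is

D_\varphi(x, y) = \varphi(x) - \varphi(y) - \langle \nabla\varphi(y),\, x - y \rangle.

The named special cases follow from particular choices of \varphi: taking \varphi(x) = \|x\|^2 gives D_\varphi(x, y) = \|x - y\|^2 (squared error loss); \varphi(x) = \sum_i x_i \log x_i on the probability simplex gives \sum_i x_i \log(x_i / y_i) (Kullback-Leibler divergence); \varphi(x) = -\sum_i \log x_i gives \sum_i \left[ x_i / y_i - \log(x_i / y_i) - 1 \right] (Itakura-Saito distance); and \varphi(x) = x^\top A x for a positive definite matrix A gives (x - y)^\top A (x - y) (squared Mahalanobis distance).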
