
A Bayesian decision theoretical approach to supervised learning, selective sampling, and empirical function optimization

Posted on: 2011-09-28    Degree: Ph.D.    Type: Thesis
University: Brigham Young University    Candidate: Carroll, James L.
GTID: 2448390002951199    Subject: Computer Science
Abstract/Summary:
Many have used the principles of statistics and Bayesian decision theory to model specific learning problems. It is less common to see models of the process of learning in general. One exception is the model of the supervised learning process known as the "Extended Bayesian Formalism," or EBF. This model is descriptive, in that it can describe and compare learning algorithms; thus the EBF is capable of modeling both effective and ineffective learning algorithms.

We extend the EBF to model unsupervised learning, semi-supervised learning, supervised learning, and empirical function optimization. We also generalize the utility model of the EBF to deal with non-deterministic outcomes and with utility functions other than 0-1 loss. Finally, we modify the EBF to create a "prescriptive" learning model, meaning that, instead of describing existing algorithms, our model defines how learning should optimally take place. We call the resulting model the Unified Bayesian Decision Theoretical Model, or the UBDTM. We show that this model can serve as a cohesive theory and framework in which a broad range of questions can be analyzed and studied. Such a broadly applicable unified theoretical framework is one of the major missing ingredients of machine learning theory.

Using the UBDTM, we concentrate on supervised learning and empirical function optimization. We then use the UBDTM to reanalyze many important theoretical issues in machine learning, including No-Free-Lunch, utility implications, and active learning. We also point forward to future directions for using the UBDTM to model learnability, sample complexity, and ensembles.
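The decision-theoretic core summarized above, choosing actions to maximize expected utility under a posterior, with utility functions more general than 0-1 loss, can be sketched in a few lines. This is an illustrative example only, not code from the thesis; the posterior values and utility tables are invented assumptions.

```python
# Minimal sketch (not from the thesis): Bayes-optimal action selection
# under an arbitrary utility function, the prescriptive decision rule
# that the UBDTM-style analysis formalizes. All numbers below are
# illustrative assumptions.

def bayes_optimal_action(posterior, utility, actions):
    """Return the action maximizing expected utility under the posterior.

    posterior: dict mapping outcome -> probability (sums to 1)
    utility:   function (action, outcome) -> real-valued utility
    actions:   iterable of candidate actions
    """
    def expected_utility(a):
        return sum(p * utility(a, y) for y, p in posterior.items())
    return max(actions, key=expected_utility)

posterior = {"spam": 0.6, "ham": 0.4}

# Under 0-1 gain (utility 1 iff the predicted label is correct),
# the optimal action is simply the posterior mode...
zero_one = lambda a, y: 1.0 if a == y else 0.0
print(bayes_optimal_action(posterior, zero_one, ["spam", "ham"]))  # spam

# ...but an asymmetric utility (filing ham as spam is very costly)
# can flip the decision, something 0-1 loss alone cannot express.
asym = lambda a, y: {("spam", "spam"): 1.0, ("ham", "ham"): 1.0,
                     ("spam", "ham"): -5.0, ("ham", "spam"): 0.0}[(a, y)]
print(bayes_optimal_action(posterior, asym, ["spam", "ham"]))  # ham
```

The second call illustrates why generalizing the EBF's utility model matters: the posterior is unchanged, yet the prescribed action changes once the utility function encodes asymmetric consequences.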
We also provide practical applications of the UBDTM by using the model to train a Bayesian variant of the CMAC supervised learner in closed form, to perform a practical empirical function optimization task, and as part of the guiding principles behind an ongoing project to create an electronic and print corpus of tagged ancient Syriac texts using active learning.

Keywords: Machine Learning, Supervised Learning, Function Optimization, Empirical Function Optimization, Statistics, Bayes, Bayes Law, Bayesian, Bayesian Learning, Decision Theory, Utility Theory, Unified Bayesian Model, UBM, Unified Bayesian Decision Theoretical Model, UBDTM, Learning Framework, No-Free-Lunch, NFL, a priori Learning, Extended Bayesian Formalism, EBF, Bias, Inductive Bias, Hypothesis Space, Function Class, Active Learning, Uncertainty Sampling, Query by Uncertainty, Query by Committee, Expected Value of Sample Information, EVSI.