Graph-based Estimation of Information Divergence Functions

Posted on: 2018-01-09    Degree: Ph.D    Type: Dissertation
University: Arizona State University    Candidate: Wisler, Alan    Full Text: PDF
GTID: 1478390017489796    Subject: Computer Science
Abstract/Summary:
Information divergence functions, such as the Kullback-Leibler divergence or the Hellinger distance, play a critical role in statistical signal processing and information theory; however, estimating them can be challenging. Most often, parametric assumptions are made about the two distributions in order to estimate the divergence of interest. In cases where no parametric model fits the data, non-parametric density estimation is used instead. In statistical signal processing applications, Gaussianity is usually assumed, since closed-form expressions for common divergence measures have been derived for this family of distributions. Parametric assumptions are preferred when the data are known to follow the model; however, this is rarely the case in real-world scenarios. Non-parametric density estimators, on the other hand, are characterized by a very large number of parameters that must be tuned through costly cross-validation. In this dissertation we focus on a specific family of non-parametric estimators, called direct estimators, that bypass density estimation completely and estimate the quantity of interest directly from the data. We introduce a new divergence measure, the...
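For reference, the two divergence functions named above have standard definitions; for distributions with densities p and q,

    D_{\mathrm{KL}}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx,
    \qquad
    H(p, q) = \sqrt{\tfrac{1}{2} \int \bigl(\sqrt{p(x)} - \sqrt{q(x)}\bigr)^2 \, dx}.

To illustrate the direct-estimation idea, the sketch below computes a well-known graph-based statistic, the Friedman-Rafsky cross-count: the number of edges in the Euclidean minimum spanning tree of the pooled samples that join points from the two different samples. This count drives a Henze-Penrose-type divergence estimate of the form 1 - R(m+n)/(2mn), used here only as a generic example of an estimator that needs no density model; it is not necessarily the specific measure introduced in the dissertation, whose name is truncated in this abstract, and the function names are illustrative.

    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from scipy.sparse.csgraph import minimum_spanning_tree

    def friedman_rafsky_count(X, Y):
        """Number of edges in the Euclidean MST of the pooled sample
        that connect a point of X to a point of Y."""
        Z = np.vstack([X, Y])
        labels = np.concatenate([np.zeros(len(X)), np.ones(len(Y))])
        D = squareform(pdist(Z))                 # pairwise Euclidean distances
        mst = minimum_spanning_tree(D).tocoo()   # MST of the pooled points
        return int(np.sum(labels[mst.row] != labels[mst.col]))

    def direct_divergence_estimate(X, Y):
        """Density-free divergence estimate from the cross-count:
        near 0 when X and Y are drawn from the same distribution,
        near 1 when the two samples are well separated."""
        m, n = len(X), len(Y)
        R = friedman_rafsky_count(X, Y)
        # Clip at 0: chance fluctuations can push the raw value
        # slightly negative for identically distributed samples.
        return max(0.0, 1.0 - R * (m + n) / (2.0 * m * n))

Note that no densities are estimated anywhere: the samples enter only through pairwise distances and the spanning tree, which is what distinguishes direct estimators from plug-in approaches.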
Keywords/Search Tags: Divergence, Estimation