
Parametric Learning in Collaborative Signal Processing

Posted on: 2013-06-30
Degree: Ph.D
Type: Dissertation
University: University of California, Los Angeles
Candidate: Lee, Juo-Yu
Full Text: PDF
GTID: 1458390008484715
Subject: Engineering
Abstract/Summary:
Advances in embedded computing and communication technology have fostered the genesis of a new digital era that embraces a plethora of innovative applications. Central to the realization of these immersive innovations is exploiting the functions that bridge cyberspace and the physical world. The efficacy of these instrumental functions can be leveraged only if the key parameters are learned efficiently. We study several applications of interconnected devices that rely on calibration or initialization of the fundamental parameters in the required affine-like functions, detectors, probability distributions, and clusters. Depending on the objectives of the applications, the methods of learning parametric functions or functionals may be developed or improved through collaboration via computing and communication.

Our first topic entails a unified framework for calibrating the parameters of the affine-like functions that appear in systems of devices with counters and directional sensing modalities. We propose a conservation law that pursues a robust solution set by taking advantage of the ring structure. The challenge of calibrating parameters defined on a manifold is also described.

In the second topic, we address the detection performance of a star-like data fusion system in the presence of interference. Given constraints on the interference power, we quantify tight upper and lower bounds on the probabilistic detection metrics by employing a linear programming approach. The accuracy of the bounds can be guaranteed to within a finite degree of numerical approximation.

The third topic pertains to the initialization of random-finite-set posterior density functions over a tree-like structure. The root node launches messages that traverse downward to the leaf nodes, which may return updated messages upward.
Encapsulated in the messages are weights and particles that represent multi-Bernoulli random finite sets. We show that the initialization process gracefully aggregates the weights and particles in the posterior density functions.

The fourth topic involves a hierarchical system of clustered devices. All clusters aim to reach consensus on a support vector machine defined over a data universe; however, the data subsets acquired by the clusters are not identical. We propose two optimization problems whose solutions yield efficient device clustering and distributed support vector machines. Two novel metrics are proposed to quantify clustering fairness and classification fairness.

The notation in each chapter should be treated independently of that in the other chapters.
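The linear-programming idea behind the second topic's detection bounds can be illustrated as follows. This is a minimal sketch, not the dissertation's formulation: we assume a hypothetical threshold detector on a unit-variance test statistic, discretize the unknown interference distribution onto a grid, and solve two LPs over the probability masses, one maximizing and one minimizing the expected detection probability subject to a power-budget constraint.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

# Hypothetical discretization: interference amplitude grid and the
# detection probability at each grid point for a threshold test on a
# unit-variance Gaussian statistic (all values here are assumptions).
v = np.linspace(-3.0, 3.0, 61)      # interference amplitudes
tau = 1.5                           # detection threshold
d = norm.sf(tau - v)                # P(detect | interference = v)

P_max = 1.0                         # interference power budget
# LP variables: probability mass p_i placed on each grid point.
A_ub = np.vstack([v ** 2])          # power constraint: E[v^2] <= P_max
b_ub = np.array([P_max])
A_eq = np.ones((1, v.size))         # masses sum to one
b_eq = np.array([1.0])
bounds = [(0.0, 1.0)] * v.size

# Upper bound: maximize expected detection probability
# (linprog minimizes, so negate the objective).
hi = linprog(-d, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
# Lower bound: minimize the same expectation directly.
lo = linprog(d, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)

print("detection probability bounds:", lo.fun, -hi.fun)
```

Because the objective and constraints are both linear in the masses, each bound is the exact optimum over all interference distributions supported on the grid; refining the grid tightens the approximation to the continuous problem.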
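The fourth topic's setting, clusters that each see a different subset of one data universe yet want a common classifier, can be sketched with a deliberately naive baseline: train a linear SVM per cluster and average the weight vectors. This is not the dissertation's optimization formulation, and the Pegasos-style subgradient trainer, the synthetic data, and the three-way split are all assumptions for illustration.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.1, epochs=200, seed=0):
    """Pegasos-style subgradient training of a linear SVM (hinge loss)."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            t += 1
            eta = 1.0 / (lam * t)
            if y[i] * (X[i] @ w) < 1.0:
                w = (1.0 - eta * lam) * w + eta * y[i] * X[i]
            else:
                w = (1.0 - eta * lam) * w
    return w

# Hypothetical data universe: 2-D points labeled by the sign of x1 + x2.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

# Three clusters acquire disjoint, non-identical subsets of the universe.
subsets = np.array_split(rng.permutation(300), 3)
local_w = [train_linear_svm(X[idx], y[idx]) for idx in subsets]

# Naive consensus: average the local weight vectors.
w_bar = np.mean(local_w, axis=0)
accuracy = np.mean(np.sign(X @ w_bar) == y)
print("consensus accuracy on the full universe:", accuracy)
```

Plain averaging ignores how unevenly the subsets cover the universe, which is precisely the kind of imbalance the clustering-fairness and classification-fairness metrics above are meant to expose; a consensus formulation would instead couple the local problems through explicit agreement constraints.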
Keywords/Search Tags: Posterior density functions, Computing and communication