Research Summary

I work in statistical signal processing and machine learning. I am particularly interested in stochastic-resonance or noise-benefit phenomena in these domains.

I proved in my dissertation (summary chapter here) that controlled noise injection can improve the convergence speed of Expectation-Maximization (EM) algorithms. This is an example of a noise benefit, or stochastic resonance, in statistical signal processing. We call our noise-assisted improvement to the EM algorithm the Noisy Expectation-Maximization (NEM) algorithm. Many iterative statistical estimation and learning algorithms are instances of the EM algorithm, so our noise-induced convergence speed-up applies to them as well. The most notable such algorithm is backpropagation training for feedforward neural networks. Recent work with K. Audhkhasi showed that backpropagation is a generalized EM algorithm and thus benefits from noise injection that satisfies the conditions of our NEM theorem.
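A minimal sketch of the idea for a two-component Gaussian mixture, with illustrative parameter names and annealing schedule rather than the dissertation's implementation: before each E-step, a noise sample is added to a data point only if it satisfies a Gaussian-mixture form of the NEM positivity condition, n(n - 2(mu_j - y)) <= 0 for every component mean mu_j, and the noise scale decays with the iteration count.

```python
import numpy as np

rng = np.random.default_rng(0)

def em_gmm_means(y, mu_init, sigma=1.0, noise_scale=0.0, decay=2.0, iters=50):
    """EM for the two component means of an equal-weight Gaussian mixture
    with known common variance.  With noise_scale > 0, NEM-style noise is
    added to each sample before the E-step: a noise draw n is kept only if
    n * (n - 2*(mu_j - y)) <= 0 for every component j, and its standard
    deviation is annealed as k**(-decay) over iterations k."""
    mu = np.asarray(mu_init, dtype=float)
    for k in range(1, iters + 1):
        y_k = y
        if noise_scale > 0.0:
            n = rng.normal(0.0, noise_scale * k ** (-decay), size=y.shape)
            # Zero out noise draws that violate the NEM condition for any component.
            ok = np.all(
                n[:, None] * (n[:, None] - 2.0 * (mu[None, :] - y[:, None])) <= 0.0,
                axis=1,
            )
            y_k = y + np.where(ok, n, 0.0)
        # E-step: responsibilities under equal weights and equal variances.
        d = y_k[:, None] - mu[None, :]
        logp = -0.5 * d ** 2 / sigma ** 2
        w = np.exp(logp - logp.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted means.
        mu = (w * y_k[:, None]).sum(axis=0) / w.sum(axis=0)
    return mu
```

With data drawn from an equal mix of N(-2, 1) and N(2, 1), both the plain run (noise_scale=0) and the noisy run recover means near plus and minus 2; the NEM theorem's claim is that the noisy version reaches the EM fixed point faster on average.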

I have also worked on the effects of using approximate model functions (priors and likelihood functions) in Bayesian inference. I proved that Bayes' theorem produces posterior pdfs whose approximation quality matches that of the approximate model functions. This is the Bayesian Approximation Theorem (BAT). I also demonstrated a robust method for approximating arbitrary priors or likelihood functions of compact support, either from data or from expert contributions. The method can represent bounded closed-form model functions exactly and efficiently, and so it subsumes many traditional applications of Bayesian inference.
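A toy grid-based illustration of the BAT's flavor, with entirely illustrative choices (a Beta-shaped prior, a binomial likelihood for hypothetical coin-flip data, and a crude piecewise-constant prior approximation rather than the approximation method above): the posterior computed from the approximate prior stays close to the exact posterior, with L1 error on the order of the prior's L1 approximation error.

```python
import numpy as np

theta = np.linspace(0.0, 1.0, 1001)   # grid over the compact support [0, 1]
dt = theta[1] - theta[0]

def posterior(prior, like):
    """Grid-based Bayes' theorem: normalize prior * likelihood."""
    p = prior * like
    return p / (p.sum() * dt)

# Exact prior with a Beta(2, 2) shape and a binomial likelihood
# for 7 heads in 10 flips (hypothetical data).
prior_exact = theta * (1.0 - theta)
prior_exact /= prior_exact.sum() * dt
like = theta ** 7 * (1.0 - theta) ** 3

# Crude 20-bin piecewise-constant approximation of the prior.
bins = np.minimum((theta * 20).astype(int), 19)
bin_means = (np.bincount(bins, weights=prior_exact, minlength=20)
             / np.bincount(bins, minlength=20))
prior_approx = bin_means[bins]

post_exact = posterior(prior_exact, like)
post_approx = posterior(prior_approx, like)

prior_err = np.abs(prior_exact - prior_approx).sum() * dt   # L1 prior error
post_err = np.abs(post_exact - post_approx).sum() * dt      # L1 posterior error
```

Refining the prior approximation (more bins) shrinks both errors together, which is the qualitative content of the theorem: posterior approximation quality tracks model-function approximation quality.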