I am interested in statistical signal processing and machine learning. I am particularly interested in stochastic-resonance and noise-benefit phenomena in these domains.
I proved in my dissertation (summary chapter here) that controlled noise injection can improve the convergence times of Expectation-Maximization (EM) algorithms. This is an example of a noise benefit, or stochastic resonance, in statistical signal processing. We call our noise-assisted improvement to the EM algorithm the Noisy Expectation-Maximization (NEM) algorithm. Many iterative statistical estimation and learning algorithms are instances of the EM algorithm, so our noise-induced convergence speed-up also applies to them. The most notable of these estimation algorithms is backpropagation training for feedforward neural networks. Recent work with K. Audhkhasi showed that backpropagation is a generalized EM algorithm and thus benefits from noise injection subject to our NEM theorem.
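The NEM idea can be sketched on a one-dimensional Gaussian mixture. The sketch below is illustrative only: it injects annealed additive noise into the data before each E-step and keeps only noise samples passing a screening test of the form n(n − 2(μ_j − y)) ≤ 0 for all components j. The screening condition, the annealing schedule, and all function and parameter names here are assumptions for the sake of the example, not the dissertation's exact construction.

```python
import numpy as np

def em_gmm_1d(y, mu, sigma, pi, n_iters=50, noise_scale=0.0, decay=0.95, rng=None):
    """EM for a 1D Gaussian mixture with optional NEM-style noise injection.

    Illustrative sketch (not the dissertation's exact algorithm): a noise
    sample n is added to data point y only if it passes the screening
    condition n * (n - 2*(mu_j - y)) <= 0 for every component mean mu_j,
    and the noise power is annealed by `decay` each iteration.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    pi = np.asarray(pi, dtype=float)
    y = np.asarray(y, dtype=float)
    for t in range(n_iters):
        y_t = y
        if noise_scale > 0:
            n = rng.normal(0.0, noise_scale * decay**t, size=y.shape)
            # Keep only noise samples satisfying the screening condition
            ok = np.all(n[:, None] * (n[:, None] - 2.0 * (mu[None, :] - y[:, None])) <= 0,
                        axis=1)
            y_t = y + np.where(ok, n, 0.0)
        # E-step: responsibilities from component log-densities
        d = y_t[:, None] - mu[None, :]
        log_p = -0.5 * (d / sigma)**2 - np.log(sigma) + np.log(pi)
        r = np.exp(log_p - log_p.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: reweighted means, spreads, and mixing weights
        nk = r.sum(axis=0)
        mu = (r * y_t[:, None]).sum(axis=0) / nk
        d = y_t[:, None] - mu[None, :]
        sigma = np.sqrt((r * d**2).sum(axis=0) / nk) + 1e-9
        pi = nk / len(y_t)
    return mu, sigma, pi
```

Setting `noise_scale=0.0` recovers plain EM, which makes it easy to compare convergence with and without the injected noise.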
I have also worked on the effects of using approximate model functions (priors and likelihood functions) in Bayesian inference. I proved that Bayes' theorem produces posterior pdfs whose approximation quality matches that of the approximate model functions. This is the Bayesian Approximation Theorem (BAT). I also demonstrated a robust method for approximating arbitrary priors or likelihood functions of compact support, either from data or from expert contributions. The method can represent bounded closed-form model functions exactly and efficiently, and so it subsumes many traditional applications of Bayesian inference.
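The flavor of the BAT can be illustrated with a small grid-based coin-bias example: replacing an exact Beta prior with a coarse staircase approximation perturbs the posterior by roughly the same amount that it perturbs the prior. The specific prior, likelihood, bin count, and error metric below are assumptions chosen for illustration, not the theorem's formal statement.

```python
import numpy as np

# Grid over the coin-bias parameter theta in (0, 1)
theta = np.linspace(1e-3, 1.0 - 1e-3, 1000)
dx = theta[1] - theta[0]

def posterior(prior, likelihood):
    """Grid-based Bayes' theorem: normalize prior * likelihood to a pdf."""
    p = prior * likelihood
    return p / (p.sum() * dx)

# Exact Beta(2, 2) prior and a crude 10-bin staircase approximation of it
exact_prior = 6.0 * theta * (1.0 - theta)
centers = np.floor(theta * 10) / 10 + 0.05          # bin-center values
approx_prior = 6.0 * centers * (1.0 - centers)

# Binomial likelihood: 7 heads in 10 flips
lik = theta**7 * (1.0 - theta)**3

post_exact = posterior(exact_prior, lik)
post_approx = posterior(approx_prior, lik)

# L1 distances: the posterior error tracks the prior approximation error
prior_err = np.abs(exact_prior - approx_prior).sum() * dx
post_err = np.abs(post_exact - post_approx).sum() * dx
```

Refining the staircase (more bins) shrinks `prior_err`, and `post_err` shrinks with it, which is the qualitative behavior the BAT formalizes.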
