State-of-the-art machine learning (ML) methods are not designed to run on small, ubiquitous devices; to fully exploit the capabilities of ML, we need methods that adapt to resource-constrained systems. A classic approach to giving preference to a model with desirable properties is regularization. Particular types of regularization help us solve ill-posed problems, avoid overfitting of ML models, and select relevant (groups of) features. In my talk, I demonstrate how regularization identifies exponential family members with reduced computational requirements. More precisely, we will see how to (1) reduce the memory requirements of time-variant spatio-temporal probabilistic models, (2) reduce the arithmetic requirements of undirected probabilistic models, and (3) trade quality against run-time with a new quadrature-based inference procedure.
Nico studied computer science and economics (double major) at TU Dortmund in Germany. He completed his PhD (graded "summa cum laude") in the collaborative research center SFB 876, "Providing Information by Resource-Constrained Data Analysis". His PhD thesis, "Exponential Families on Resource-Constrained Systems", brings probabilistic machine learning methods to ultra-low-power devices. Nico is currently a research fellow at the new ML2R center for machine learning in North Rhine-Westphalia, funded by the German Federal Ministry of Education and Research.