Dagstuhl Seminar on the Future of Learning with Kernels and Gaussian Processes
- 03 December 2016
- Dagstuhl Castle (Leibniz Centre for Computer Science)
- Probabilistic Numerics
A recent meeting at the Leibniz Centre for Computer Science highlights the ongoing significance of analytic nonparametric models for machine learning.
From November 27 to December 2, 2016, an international group of experts on kernel methods and Gaussian process models congregated at Dagstuhl Castle to discuss the role of their areas in the rapidly changing landscape of machine learning. Organized by Philipp Hennig, Arthur Gretton, Carl Rasmussen and Bernhard Schölkopf, the seminar highlighted recent developments and fostered informal discussion of new ideas.
Kernel models and Gaussian processes are two closely related frameworks for analytic machine learning from a broad class of data types. One of their key advantages over other model classes is a well-established body of mathematical theory, which provides a deep understanding of their expressivity, modelling power and computational complexity. These properties are also the reason why these methods have recently enjoyed renewed interest from emerging areas like probabilistic programming and probabilistic numerics.
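As an illustration of the kind of model the seminar centred on, the following is a minimal Gaussian process regression sketch in plain NumPy. The squared-exponential kernel, its hyperparameters, and the toy data are illustrative assumptions, not anything specific to the seminar; the closed-form posterior it computes is one reason these models are so well understood theoretically.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of 1-D inputs.
    sq = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    # Posterior mean and pointwise variance of a zero-mean GP
    # with an RBF kernel, conditioned on noisy observations.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    # Cholesky factorisation for a numerically stable solve.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)
    return mean, var

# Toy example: regress a sine curve from five observations.
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.sin(x)
mean, var = gp_posterior(x, y, np.array([0.5]))
```

The predictive mean interpolates the data and the variance quantifies uncertainty away from the observations, both available in closed form at cubic cost in the number of training points.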
In the secluded setting of the Leibniz Centre for Computer Science, seminar participants discussed opportunities for new research directions. The participants agreed that recent developments in machine learning have exposed some weaknesses of kernel models, in particular the need to choose the form of the kernel flexibly and automatically. Nevertheless, the theoretical and computational strengths of Gaussian process and kernel models mean these methods will remain relevant to the field for the foreseeable future.