27 April 2016
San Francesco - Via della Quarquonia 1 (Classroom 1)
The identification of nonlinear models can be broadly divided into two categories: parametric models, which typically require significant insight into and understanding of the system, and nonparametric models, which often require no prior information and therefore permit a more flexible model class.
The emergence of kernel methods in recent years has added to the repertoire of available nonparametric identification methods. Popularised in statistics and machine learning through approaches such as Support Vector Machines (SVMs), Gaussian Processes (GPs) and Kriging, kernel-based modelling finds a unified framework in the theory of Reproducing Kernel Hilbert Spaces (RKHS).
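For context, a standard formulation of kernel-based identification (textbook material, not specific to this talk) is regularized least squares in the RKHS $\mathcal{H}_k$ induced by a kernel $k$:

$$
\hat{f} = \arg\min_{f \in \mathcal{H}_k} \sum_{i=1}^{N} \big(y_i - f(x_i)\big)^2 + \lambda \|f\|_{\mathcal{H}_k}^2 ,
$$

whose solution, by the representer theorem, is a finite kernel expansion over the data,

$$
\hat{f}(x) = \sum_{i=1}^{N} \alpha_i \, k(x, x_i), \qquad \alpha = (K + \lambda I)^{-1} y, \quad K_{ij} = k(x_i, x_j).
$$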
In this presentation, after a brief introduction to identification in the RKHS, the problem of model selection in kernel-based identification will be discussed. Any such method depends upon the choice of a suitable kernel. However, despite the importance of the kernel function in the definition of the model class, this choice is largely overlooked in the literature. It will be shown that penalizing the derivative of the function in the optimization criterion can greatly simplify the aforementioned kernel selection problem. From a system identification perspective, this facilitates the search for a suitable model structure and opens many other interesting possibilities, which will be illustrated through a series of examples.
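To make the baseline concrete, the sketch below implements the standard RKHS estimator above with a Gaussian kernel on a toy static nonlinearity. It is a minimal illustration, not the derivative-penalized method discussed in the talk; the function and parameter names (gaussian_kernel, fit_kernel_estimator, lam, lengthscale) are illustrative choices, and the kernel and its lengthscale are exactly the selection problem the talk addresses.

```python
import numpy as np

def gaussian_kernel(X1, X2, lengthscale=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between two 1-D sample sets.
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

def fit_kernel_estimator(x, y, lam=1e-2, lengthscale=1.0):
    # Representer theorem: the regularized least-squares minimizer is a
    # kernel expansion over the data, alpha = (K + lam*I)^{-1} y.
    K = gaussian_kernel(x, x, lengthscale)
    return np.linalg.solve(K + lam * np.eye(len(x)), y)

def predict(x_train, alpha, x_new, lengthscale=1.0):
    # Evaluate f_hat(x) = sum_i alpha_i k(x, x_i) at new inputs.
    return gaussian_kernel(x_new, x_train, lengthscale) @ alpha

# Toy example: identify a static nonlinearity from noisy samples.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-3, 3, 60))
y = np.sin(x) + 0.1 * rng.standard_normal(60)
alpha = fit_kernel_estimator(x, y)
y_hat = predict(x, alpha, np.linspace(-3, 3, 200))
```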
Speaker:
Laurain, Vincent
Units:
DYSCO