Overview
The modelling of continuous-time dynamical systems from uncertain observations is an important task arising in a wide range of applications, from numerical weather prediction and finance to genetic networks and motion capture in video. Often, we may assume that the dynamical models are formulated as systems of differential equations. In a Bayesian approach, we can then incorporate a priori knowledge about the dynamics by placing probability distributions on the unknown functions, which correspond, for example, to driving forces and appear as coefficients or parameters in the differential equations. Hence, in a probabilistic Bayesian framework such functions become stochastic processes.
Gaussian processes (GPs) provide a natural and flexible framework in such circumstances. The use of GPs for learning functions from data is now a well-established technique in Machine Learning. Nevertheless, their application to dynamical systems becomes highly nontrivial when the dynamics is nonlinear in the (Gaussian) parameter functions. This happens naturally for nonlinear systems driven by a Gaussian noise process, or when a nonlinearity is needed to impose necessary constraints (e.g., positivity) on the parameter functions. In such cases, the prior process over the system's dynamics is non-Gaussian right from the start. This means that closed-form analytical posterior predictions are no longer possible, even for Gaussian observation noise. Moreover, computing them requires the entire underlying Gaussian latent process at all times, not just at the discrete observation times. Hence, inference of the dynamics requires nontrivial sampling methods or approximation techniques.
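As a minimal sketch of this point (not taken from the workshop material; the grid, RBF kernel, and all parameters below are illustrative assumptions), one can sample a latent GP path, map it through an exponential to enforce positivity, and let the result drive a simple ODE. The trajectory then depends nonlinearly on the Gaussian latent function, so the induced prior over trajectories is no longer Gaussian:

```python
import numpy as np

def gp_sample(ts, lengthscale=0.5, variance=1.0, rng=None):
    """Draw a sample path of a zero-mean GP with an RBF kernel on the grid ts."""
    rng = np.random.default_rng(0) if rng is None else rng
    d = ts[:, None] - ts[None, :]
    K = variance * np.exp(-0.5 * (d / lengthscale) ** 2)
    # small jitter on the diagonal for a numerically stable Cholesky factor
    L = np.linalg.cholesky(K + 1e-8 * np.eye(len(ts)))
    return L @ rng.standard_normal(len(ts))

ts = np.linspace(0.0, 2.0, 200)
f = gp_sample(ts)   # Gaussian latent function on a discretised grid
g = np.exp(f)       # positivity constraint: g is a non-Gaussian process

# Euler discretisation of dx/dt = -x + g(t); x depends nonlinearly on f,
# so the prior over trajectories x is non-Gaussian as well.
x = np.zeros_like(ts)
dt = ts[1] - ts[0]
for i in range(1, len(ts)):
    x[i] = x[i - 1] + dt * (-x[i - 1] + g[i - 1])
```

Note that even this sketch only evaluates the latent process on a finite grid; exact inference would need the process at all times, which is precisely the difficulty described above.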
This raises the following questions:
 What is the practical relevance of nonlinear effects, i.e., could we just ignore them?
 How should we sample randomly from posterior continuous-time processes?
 How should we deal with large data sets and/or very high dimensional data?
 Are functional Laplace approximations suitable?
 Can we think of variational approximations?
 Can we do parameter and hyperparameter estimation?
 Etc.
The aim of this workshop is to provide a forum for discussing open problems related to continuous-time stochastic dynamical systems, their links to Bayesian inference, and their relevance to Machine Learning. The workshop will be of interest to researchers in both Bayesian inference and stochastic processes. We hope that it will provide new insights into continuous-time stochastic processes and serve as a starting point for new research perspectives and future collaborations.

Program
All presentations are available as video lectures from videolectures.net/dsb06_whistler!
Morning Session
07:45 - 08:35
A Tutorial Introduction to Stochastic Differential Equations: Continuous-time Gaussian Markov Processes
C. Williams (University of Edinburgh)

08:35 - 09:00
Variational Bayes for Continuous-time Nonlinear State-space Models
A. Honkela, M. Tornio and T. Raiko (Helsinki University of Technology)
Talk & paper.

09:00 - 09:20
Coffee break

09:25 - 09:45
How Random is a Coin Toss? Bayesian Inference and the Symbolic Dynamics of Deterministic Chaos
C. Strelioff and J. Crutchfield (University of California Davis)
Talk & paper.

09:45 - 10:30
The Gaussian Variational Approximation of Stochastic Differential Equations
M. Opper (Technical University Berlin)

Afternoon Session
15:30 - 16:15
Path Integral Method for Estimation of Time Series
J. Restrepo (University of Arizona), PASCAL invited speaker

16:15 - 16:40
A Gaussian Approximation for Stochastic Nonlinear Dynamical Processes with Annihilation
W. Wiegerinck and B. Kappen (Radboud University Nijmegen)
Talk & paper.

16:40 - 17:00
Coffee break

17:00 - 17:25
The Empirical Bayes Estimation of an Instantaneous Spike Rate with a Gaussian Process Prior
S. Koyama, T. Shimokawa and S. Shinomoto (Kyoto University)
Talk & paper.

17:25 - 18:10
Inferring Latent Functions with Gaussian Processes in Differential Equations
Neil Lawrence (University of Sheffield)

People
Organizers:
Program committee:

Sponsors
