**Keynote Speakers**

## Professor Alfred Hero - Biography

**Abstract** - *Learning to Benchmark*

Using mathematical models to benchmark the capability of a sensor platform to provide data for accurate signal detection, classification, or estimation has been an essential part of performance-driven system design. When a mathematical model is unreliable, or not available, a natural question to ask is whether it is possible to use machine learning to accurately benchmark the capability of a sensor solely from experimental data collected from the sensor. In this talk we will answer this question in the affirmative. For example, in the context of classification, empirical estimation of the minimal achievable classification error, i.e., the Bayes error rate, from labeled experimental sensor data can be framed as the meta-learning problem of estimating the Bayes-optimal misclassification error rate without having to estimate the Bayes-optimal classifier. The talk will cover relevant background, theory, algorithms, and applications of benchmark learning.
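One classical route to this kind of benchmark learning (offered here only as an illustrative sketch, not as the estimator from the talk) is to bound the Bayes error rate from a nearest-neighbour error computed directly on labelled data, via the Cover-Hart inequality, without ever fitting a classifier. All data and parameters below are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class "sensor" data: two overlapping Gaussian clouds.
n = 500
X = np.vstack([rng.normal(0.0, 1.0, (n, 2)), rng.normal(2.0, 1.0, (n, 2))])
y = np.repeat([0, 1], n)

# Leave-one-out 1-NN error from pairwise squared distances.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
np.fill_diagonal(d2, np.inf)            # exclude each point as its own neighbour
err_1nn = float(np.mean(y[d2.argmin(axis=1)] != y))

# Cover-Hart (two classes): BER <= E_1NN <= 2*BER*(1 - BER), which inverts
# to a distribution-free interval for the Bayes error rate.
lower = 0.5 * (1.0 - np.sqrt(max(0.0, 1.0 - 2.0 * err_1nn)))
upper = err_1nn
print(f"1-NN error {err_1nn:.3f} -> Bayes error in [{lower:.3f}, {upper:.3f}]")
```

The interval is produced purely from data, which is the spirit of the meta-learning problem described above; the talk's estimators are more refined than this bound.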

## Professor Andy Bell - Biography

**Abstract** - *Coming Shortly*

**Invited Speakers**

## Professor Simon Maskell - Biography

**Abstract** - *Big Hypotheses: A Generic Tool for Fast and Good Bayesian Machine Learning*

There are many machine learning tasks that would ideally involve global optimisation across some parameter space. Researchers often pose such problems in terms of sampling from a distribution and favour Markov chain Monte Carlo (MCMC) or its derivatives (e.g., Gibbs sampling, Hamiltonian Monte Carlo (HMC), and simulated annealing). While these techniques can offer the good results that are so important in defence contexts, they are stereotypically slow. We describe an alternative numerical Bayesian algorithm, the Sequential Monte Carlo (SMC) sampler. SMC samplers are closely related to particle filters and are reminiscent of genetic algorithms. More specifically, an SMC sampler replaces the single Markov chain considered by MCMC with a population of samples. This inherent parallelism makes the SMC sampler a promising starting point for developing a scalable Bayesian global optimiser, e.g., one that might be up to 86,400 times more computationally efficient than MCMC. The University of Liverpool and STFC's Hartree Centre have recently started work on a £2.5M EPSRC-funded project (with significant support from IBM, NVidia, Intel and Atos) to develop SMC samplers into a general-purpose, scalable numerical Bayesian optimiser and to embody them as a back-end in the software package Stan. This talk will summarise recent developments, initial results (on a subset of problems posed by AstraZeneca, AWE, Dstl, Unilever, physicists, chemists, biologists and psychologists) and work planned over the next four years towards a high-performance parallel Bayesian inference implementation that can be used for a wide range of problems across application domains, including defence.

## Professor Peter Willett - Biography

**Abstract** - *Navigation- and Destination-Aware Modeling for Highly-Maneuvering Threats*

Concern has largely shifted from ballistic threats to those that execute high-speed and seemingly-random maneuvers prior to final target engagement. For target protection it is vital that a method for very accurate tracking of such objects be developed. The scheme discussed here has four key ingredients: it adheres to physics; it assumes a Proportional Navigation feedback guidance model when the threat is targeted toward its destination; it estimates the parameters of that feedback; and it allows for periods in which there are other, temporary sham destinations. It is applied to a 3D maneuvering target state estimation problem with a target capable of high-magnitude, random lateral and vertical accelerations under a Proportional Navigation control policy. It is shown that, due to the observability of the feedback control parameters, the filter significantly reduces position, velocity, and prediction errors.
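The Proportional Navigation feedback law assumed by the tracker commands a lateral acceleration proportional to the line-of-sight rate, a = N·Vc·λ̇. The 2D toy below (gains, speeds, and geometry are all illustrative, not from the talk) shows the law steering a pursuer onto a non-maneuvering destination:

```python
import numpy as np

N_gain, dt = 3.0, 0.01                  # navigation gain and time step (illustrative)
p_t, v_t = np.array([1000.0, 500.0]), np.array([-50.0, 0.0])  # destination/target
p_m, v_m = np.array([0.0, 0.0]), np.array([300.0, 0.0])       # pursuer

best = np.inf                           # closest approach seen so far
for _ in range(2000):
    r = p_t - p_m                       # line-of-sight (LOS) vector
    best = min(best, float(np.linalg.norm(r)))
    if np.linalg.norm(r) < 2.0:
        break                           # effectively intercepted
    v_rel = v_t - v_m
    lam_dot = (r[0] * v_rel[1] - r[1] * v_rel[0]) / (r @ r)    # LOS turn rate
    vc = -(r @ v_rel) / np.linalg.norm(r)                      # closing speed
    a_cmd = N_gain * vc * lam_dot                              # PN lateral command
    n_hat = np.array([-v_m[1], v_m[0]]) / np.linalg.norm(v_m)  # normal to velocity
    v_m = v_m + a_cmd * n_hat * dt      # turn the pursuer; speed is ~constant
    p_m, p_t = p_m + v_m * dt, p_t + v_t * dt

print(f"closest approach: {best:.1f} m")
```

Because the commanded acceleration is a deterministic function of the observable LOS geometry and a small set of parameters (here just N), those parameters become estimable from track data, which is the observability property the abstract exploits.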