What attracted you to TSE?
I joined TSE in 2014, mainly motivated by the research group's academic ambition, which is to be at the forefront of the main issues in mathematics, and by my desire to join an institution that gives researchers adequate resources for substantial projects. TSE is a place where things move forward and work well.
Since my arrival, the mathematics and statistics group has expanded and progressed in various fields, and I think that we currently cover all the major research trends in applied mathematics, such as big data, artificial intelligence, optimization and machine learning, as well as the latest progress in game theory with repeated games. This stimulating scientific environment is a real asset of TSE.
What are you working on?
I am currently working on several research topics, notably statistical learning, which is the tool behind most artificial intelligence. One example is understanding the geometry of very large graphs, a problem we encounter when working with big data. Facebook is a perfect example of this type of data: if we try to represent interactions between users, we can use mathematical tools to identify the major trends, the centers and the main axes of the graph.
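To give a concrete flavour of this, here is a minimal sketch of one standard way to study the geometry of an interaction graph: a spectral embedding of its Laplacian, whose leading nontrivial eigenvectors expose the main axes and cluster structure. The toy "planted partition" graph and the networkx/numpy calls are illustrative assumptions, not the specific methods or data discussed in the interview.

```python
# Minimal sketch: spectral embedding of an interaction graph.
# The graph and library calls are illustrative assumptions; the interview
# does not specify which tools or datasets were actually used.
import numpy as np
import networkx as nx

# Toy "social" graph: three communities linked by a few bridge edges.
G = nx.planted_partition_graph(l=3, k=100, p_in=0.1, p_out=0.005, seed=0)

# Normalized graph Laplacian, taken dense here since the graph is small.
L = nx.normalized_laplacian_matrix(G).toarray()

# The smallest nontrivial eigenvectors give a low-dimensional embedding:
# they reveal the main axes and the cluster structure of the graph.
vals, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order
embedding = vecs[:, 1:3]                # drop the trivial first eigenvector

# Nodes that are close in this 2-D embedding interact densely with each other.
print(embedding.shape)  # (300, 2)
```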
I am also looking at the problem of optimizing functions, not necessarily convex ones, with a sequential algorithm. Our contribution is an approach that sets aside both the convexity assumption and the deterministic measurement framework, so that the results remain valid in a wide range of situations. I should also emphasize the exceptional groundwork done by Jérôme Bolte on this topic. This type of work can be used in sequential decision-making, machine learning and finance.
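As a rough illustration of sequential optimization without convexity, here is a sketch of stochastic gradient descent on a simple non-convex function, where each gradient is observed with noise. The objective, noise model and step-size schedule are placeholder choices, not the algorithms developed with Jérôme Bolte.

```python
# Minimal sketch: stochastic gradient descent on a non-convex function.
# The objective, noise model and step sizes are illustrative assumptions;
# they are not the specific algorithms mentioned in the interview.
import numpy as np

rng = np.random.default_rng(0)

def grad_f(x):
    # Gradient of the non-convex objective f(x) = sum(x**4 - x**2).
    return 4 * x**3 - 2 * x

x = rng.normal(size=5)
for t in range(1, 10_001):
    noisy_grad = grad_f(x) + rng.normal(scale=0.1, size=x.shape)  # noisy measurement
    step = 0.01 / np.sqrt(t)          # decreasing step size, standard for SGD
    x = x - step * noisy_grad

print(x)  # components settle near the local minimizers +-1/sqrt(2)
```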
More specifically, we have worked with Airbus to use these sequential decision-making algorithms to define an optimal flight path, taking into account uncertainty about weather conditions or the exact weight of the aircraft, and thus working out the route that consumes the least fuel. We have successfully tested these algorithms on Airbus flight simulators.
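One toy way to picture this kind of sequential decision under uncertainty is a backward dynamic program over a small grid of waypoints with a random headwind. Every number and the fuel model below are invented for illustration; this is not the Airbus study itself.

```python
# Toy sketch of sequential decision-making under uncertainty:
# choose, stage by stage, the waypoint that minimizes expected fuel burn.
# The fuel model, wind scenarios and probabilities are invented numbers,
# not the actual Airbus application described in the interview.
import numpy as np

n_stages, n_levels = 5, 3           # 5 flight legs, 3 candidate altitudes per leg
winds = np.array([0.0, 1.0, 2.0])   # possible headwind penalties
p_wind = np.array([0.5, 0.3, 0.2])  # their probabilities

def fuel(level, next_level, wind):
    # Invented cost: base burn + climb/descent cost + wind penalty.
    return 10.0 + 2.0 * abs(next_level - level) + wind

# Backward dynamic programming: expected cost-to-go at each stage and level.
V = np.zeros((n_stages + 1, n_levels))
policy = np.zeros((n_stages, n_levels), dtype=int)
for t in range(n_stages - 1, -1, -1):
    for lvl in range(n_levels):
        costs = [
            np.dot(p_wind, [fuel(lvl, nxt, w) for w in winds]) + V[t + 1, nxt]
            for nxt in range(n_levels)
        ]
        policy[t, lvl] = int(np.argmin(costs))
        V[t, lvl] = min(costs)

print("expected fuel from level 0:", V[0, 0])
```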
We are also working on immunology with the Oncopole de Toulouse to model the remission of chronic lymphocytic leukaemia, a blood cancer. We are analyzing and processing data to propose a causal model that accounts for the different variables. Over time, we hope to improve our understanding of this illness and its possible treatments.
Finally, I am currently working on the deconvolution of mixing laws via super-resolution. The idea is to disentangle the underlying distributions that generate the observations. This work links optimization (particularly the notion of duality) and statistics. It's an exciting and promising field.
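To convey the flavour of mixture deconvolution in a very crude form, here is a sketch that recovers the weights of a discrete mixing law from Gaussian-blurred observations by non-negative least squares on a fixed grid. The grid, bandwidth and sample size are arbitrary assumptions, and the off-the-grid super-resolution approach mentioned above, with its duality arguments, is considerably more refined than this.

```python
# Crude sketch of deconvolving a mixing law on a fixed grid.
# Observations are draws from a Gaussian mixture; we recover non-negative
# weights over candidate locations by non-negative least squares.
# All sizes and parameters are arbitrary assumptions for illustration.
import numpy as np
from scipy.optimize import nnls
from scipy.stats import norm

rng = np.random.default_rng(0)

# True mixing law: two spikes at -1 and 2 with weights 0.3 and 0.7.
true_locs, true_weights, sigma = np.array([-1.0, 2.0]), np.array([0.3, 0.7]), 0.5
comp = rng.choice(2, size=5000, p=true_weights)
obs = true_locs[comp] + sigma * rng.normal(size=5000)

# Empirical density on a coarse histogram of the observations.
edges = np.linspace(-4, 5, 61)
centers = 0.5 * (edges[:-1] + edges[1:])
hist, _ = np.histogram(obs, bins=edges, density=True)

# Candidate spike locations and the convolution (design) matrix.
grid = np.linspace(-4, 5, 181)
A = norm.pdf(centers[:, None] - grid[None, :], scale=sigma)

# Non-negative least squares recovers a weight vector concentrated near the spikes.
weights, _ = nnls(A, hist)
weights /= weights.sum()
print("estimated mass near -1:", weights[np.abs(grid + 1) < 0.3].sum())
print("estimated mass near  2:", weights[np.abs(grid - 2) < 0.3].sum())
```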
What are the major trends in mathematics at the moment?
Currently, sequential methods are a key concern of applied mathematics (optimization and statistics). They allow real-time decision-making under uncertainty. These issues are increasingly important due to the constant growth in the amount of data collected, notably to build the algorithms that govern the digital world. This is what is known as machine learning.
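A minimal sketch of one classical sequential method is the exponentially weighted average forecaster, which updates its decision after each observation arrives. The losses and learning rate below are placeholder choices used only to make the example run.

```python
# Minimal sketch of a sequential (online) learning method: the exponentially
# weighted average forecaster. At each round it picks a distribution over
# experts, observes losses, and updates. Losses and learning rate are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_experts, n_rounds, eta = 4, 1000, 0.1

weights = np.ones(n_experts)
cum_loss, expert_cum = 0.0, np.zeros(n_experts)

for t in range(n_rounds):
    probs = weights / weights.sum()           # current decision (a distribution)
    losses = rng.uniform(size=n_experts)      # losses revealed after deciding
    losses[0] *= 0.5                          # expert 0 is secretly better
    cum_loss += probs @ losses
    expert_cum += losses
    weights *= np.exp(-eta * losses)          # multiplicative update

regret = cum_loss - expert_cum.min()          # gap to the best expert in hindsight
print(f"regret after {n_rounds} rounds: {regret:.1f}")
```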
Deep learning is also an ongoing issue. It is a sub-field of machine learning that involves implementing a cascade of extremely complex models. While it provides reliable predictions, these are very opaque, and the final algorithm is often not readable by a human. This sets it apart from other machine learning methods, which we can understand and visualize.
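As a toy picture of what "a cascade of models" means, here is an untrained two-layer network where each layer's output feeds the next; the sizes, random weights and activation are arbitrary assumptions, and the intermediate representation has no obvious human interpretation, which is exactly the opacity mentioned above.

```python
# Toy illustration of a "cascade" of models: each layer transforms the
# previous layer's output before the final prediction. Sizes, weights and
# activation are arbitrary; this is not a trained model.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 16))                 # one input with 16 features

layers = [rng.normal(size=(16, 32)),         # layer 1: 16 -> 32
          rng.normal(size=(32, 32)),         # layer 2: 32 -> 32
          rng.normal(size=(32, 1))]          # output layer: 32 -> 1

h = x
for W in layers[:-1]:
    h = np.maximum(h @ W, 0.0)               # intermediate features: hard to interpret
prediction = h @ layers[-1]
print(prediction)
```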
We will always need new algorithms to keep processing big data, which arrive every second and can signal very rapid changes in all the fields we measure. These problems will be at the heart of mathematics and machine learning for the next 10 years, and our research group is actively pursuing these crucial topics.
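A small example of the kind of streaming computation such data call for is Welford's online update of a mean and variance, which processes each observation exactly once and can be monitored for rapid changes. The simulated stream and the abrupt shift halfway through are illustrative only.

```python
# Small example of a streaming computation: Welford's online update of
# mean and variance, processing each new observation exactly once.
# The simulated stream and the change at t = 500 are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

count, mean, m2 = 0, 0.0, 0.0
for t in range(1000):
    x = rng.normal(loc=0.0 if t < 500 else 3.0)  # abrupt change mid-stream
    count += 1
    delta = x - mean
    mean += delta / count
    m2 += delta * (x - mean)

variance = m2 / (count - 1)
print(f"running mean {mean:.2f}, variance {variance:.2f}")
```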
Extract from TSE Mag #17, Summer 2018