Many aspects of our lives today are possible thanks to machine learning – where a machine is trained to do a specific, yet complex, job. We no longer think twice about speaking to our digital devices, clicking on recommended products from online stores, or using language translation apps and websites.
Since the 1980s, neural networks have been used as a mathematical model for machine learning. Inspired by the structure of our brains, each "neuron" is a simple mathematical calculation, taking numbers as input and producing a single number as output. Originally, neural networks consisted of just one or two layers of neurons, due to the computational complexity of the training process. But since the early 2000s, deep neural networks consisting of many layers have become possible, and they are now used for tasks ranging from pre-screening job applications to revolutionary approaches in health care.
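To make the idea concrete, here is a minimal sketch (an illustration, not code from any particular system) of a single artificial neuron: it takes several numbers as input, computes a weighted sum plus a bias, and squashes the result with a sigmoid activation so the output is a single number. The particular weights and inputs below are made up for the example.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, shifted by the bias term
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    # The sigmoid activation maps the result into the range (0, 1)
    return 1 / (1 + math.exp(-total))

# Three inputs, three weights, one bias: one number comes out
output = neuron([0.5, -1.2, 3.0], [0.8, 0.1, -0.4], bias=0.2)
```

A network is built by feeding the outputs of one layer of such neurons into the next; "deep" networks simply stack many of these layers.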
Deep learning is increasingly important in many areas, both inside and outside science. Its usefulness has been proven, but there are still many unanswered questions about why such deep learning approaches work. That is why the Isaac Newton Institute (INI) in Cambridge is running a research programme called Mathematics of deep learning (MDL), which aims to understand the mathematical foundations of deep learning.
This collection of articles and podcasts will introduce you to the ideas involved and uncover what the MDL programme is all about!
Opening the black box — The organisers of the MDL programme explain why we need to know more about the maths behind deep learning.
Opening the black box – the video — In this video from the INI, Dan Aspel speaks to the organisers of the MDL programme about the motivations and potential outcomes of their research programme.
Maths in a minute: Artificial neurons — When trying to build an artificial intelligence, it makes sense to mimic the human brain. Artificial neurons do just that.
Maths in a minute: Machine learning and neural networks — Machine learning makes many daily activities possible, but how does it work? Find out in this brief introduction.
Maths in a minute: Gradient descent algorithms — Whether you're lost on a mountainside, or training a deep neural network, you can rely on the gradient descent algorithm to show you the way!
Maths in a minute: Semi-supervised machine learning — Machine learning started with supervised learning, where humans provide all the training materials, but we are finding ways for algorithms to learn with far fewer resources.
What is machine learning? — Find out how a little bit of maths can enable a machine to learn from experience in this more in-depth introduction from the Plus library.
The agent perspective — In this collection from the Plus library, deep learning pioneer Yoshua Bengio explains why he thinks that true artificial intelligence will only be possible once machines have something babies are born with: the ability to interact with the world, observe what happens, and adapt to the consequences of their actions.
Seeing traffic through new eyes — Traffic is not just annoying, it can also come at a high cost to human and environmental health. This article from the Plus library explores how a form of deep learning is being used to help understand traffic, so that suitable city planning can tame it.
Artificial intelligence takes on COVID-19 — In this article from the Plus library, we find out how researchers from the AIX-COVNET project are developing a tool that will utilise deep learning to help diagnose COVID-19.
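Several of the pieces above mention gradient descent, the workhorse for training neural networks. As a rough illustration (a toy example, not code from the articles), gradient descent repeatedly steps "downhill" against the slope of a function; here it finds the minimum of f(x) = (x - 3)^2, whose gradient is 2(x - 3) and whose minimum sits at x = 3.

```python
def gradient_descent(start, learning_rate=0.1, steps=100):
    x = start
    for _ in range(steps):
        gradient = 2 * (x - 3)         # slope of f(x) = (x - 3)^2 at x
        x -= learning_rate * gradient  # step against the slope, i.e. downhill
    return x

# Starting from x = 0, the iterates converge towards the minimum at x = 3
print(gradient_descent(start=0.0))
```

Training a deep network works the same way in principle, except that x becomes millions of weights and f becomes a measure of how badly the network is doing on its training data.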
We produced this collection of content as part of our collaboration with the Isaac Newton Institute for Mathematical Sciences (INI), an international research centre and our neighbour here on the University of Cambridge's maths campus. INI attracts leading mathematical scientists from all over the world, and is open to all. Visit www.newton.ac.uk to find out more.