Join ACENET's Mat Larade in this session as he introduces various machine learning and mathematical concepts leading up to GCNNs. This includes various architectures of neural networks, such as feed-forward, convolutional, and recurrent, and some of the mathematical theory underpinning their operations, with explanations of why each network is used. These topics will be explored using rigorous mathematics, approachable metaphor, and attempts at humour. All are welcome, regardless of their level of technical expertise, as we will begin with the basics and build towards an understanding of a complex neural network architecture. There will be time reserved at the end to answer any questions you may have.
We will be recording the presentation part of this session; the question period at the end will not be recorded.
Participants must register using their institutional / organizational email address (not a personal email address, e.g. Gmail).