07 Perceptrons and Neural Networks
In this block we cover:
- Introduction
- Neurons
- Single layer perceptron
- Learning algorithms
- Deep Neural Networks
- Multi-layer perceptron and the feed-forward neural network
- Learning for deep neural networks
- CNNs and Transformers
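As a taster for the single-layer perceptron and its learning algorithm covered in this block, here is a minimal sketch in plain Python (illustrative only; the lectures give the canonical definitions). It trains a perceptron on the AND function, which is linearly separable, using the classic update rule w ← w + η(target − output)·x.

```python
def step(z):
    """Heaviside step activation: outputs 1 when the weighted sum is non-negative."""
    return 1 if z >= 0 else 0

def predict(weights, bias, x):
    """Weighted sum of inputs plus bias, passed through the step activation."""
    return step(sum(w * xi for w, xi in zip(weights, x)) + bias)

def train(data, lr=0.1, epochs=20):
    """Perceptron learning rule: w <- w + lr * (target - output) * x."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in data:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# AND gate: linearly separable, so the perceptron converges to a correct boundary.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(and_data)
print([predict(w, b, x) for x, _ in and_data])  # -> [0, 0, 0, 1]
```

As Minsky and Papert's *Perceptrons* (in the references below) showed, no such single-layer model can represent XOR, which motivates the multi-layer networks covered later in the block.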
Lectures
Assessments:
- Assessment 2 will be set this week; see Assessments. This is a summative assessment (i.e. it contributes to your grade) and will be due in Week 12.
- Portfolio 07 of the full Portfolio.
- Block07 on Noteable, via Blackboard.
Workshop:
References
Neural Networks textbooks
- Chapter 11 of The Elements of Statistical Learning: Data Mining, Inference, and Prediction (Hastie, Tibshirani and Friedman).
- Russell and Norvig Artificial Intelligence: A Modern Approach
Theoretical practicalities
- Bengio 2012 Practical Recommendations for Gradient-Based Training of Deep Architectures (in the book “Neural Networks: Tricks of the Trade”)
- Kull et al 2019 NeurIPS Beyond temperature scaling: Obtaining well-calibrated multiclass probabilities with Dirichlet calibration
- Swish: Ramachandran, Zoph and Le Searching for Activation Functions
Important historical papers
- McCulloch and Pitts (1943) A logical calculus of the ideas immanent in nervous activity
- Minsky and Papert 1969 Perceptrons
- Hecht-Nielsen 1992 Theory of the backpropagation neural network (in the book "Neural Networks for Perception", Academic Press, pp. 65-93)
- Bishop 1994 Mixture Density Networks
Likelihood and modelling applications of Neural Networks
- Chilinski and Silva Neural Likelihoods via Cumulative Distribution Functions
- Albawi, Mohammed and Al-Zawi Understanding of a convolutional neural network
- Omi, Ueda and Aihara Fully Neural Network based Model for General Temporal Point Processes
Implementations and Examples
Worksheets (unassessed)
Historical content
- Note that these have been superseded by the above lectures and workshop.