09 Perceptrons and Neural Networks
In this block we cover:
- Introduction
- Neurons
- Single-layer perceptron
- Learning algorithms (a minimal perceptron learning-rule sketch follows this list)
- Deep Neural Networks
- Multi-layer perceptron and the feed-forward neural network
- Learning for deep neural networks
- Other types of neural networks and their value:
- Feed-forward
- Convolutional
- Recurrent
- Recursive
- Auto-encoders
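As a companion to the topics above, here is a minimal sketch of the single-layer perceptron and its learning rule, assuming only NumPy. The toy data, learning rate and epoch count are illustrative choices, not taken from the course materials:

    import numpy as np

    def train_perceptron(X, y, lr=0.1, epochs=20):
        """Learn weights w and bias b so that sign(X @ w + b) matches labels y in {-1, +1}."""
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                # Update only on a misclassified point: nudge the decision
                # boundary towards xi in the direction of its label.
                if yi * (xi @ w + b) <= 0:
                    w += lr * yi * xi
                    b += lr * yi
        return w, b

    # Linearly separable toy problem (AND-like), labels in {-1, +1}.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([-1, -1, -1, 1])
    w, b = train_perceptron(X, y)
    print(np.sign(X @ w + b))  # expected: [-1. -1. -1.  1.]

The perceptron convergence theorem guarantees this loop terminates with zero training errors on linearly separable data; on non-separable data (e.g. XOR, as Minsky and Papert showed) it never converges, which motivates the multi-layer networks covered later in the block.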
Lectures:
- Neural Nets and the Perceptron
- Practicalities of Neural Nets
Worksheets:
Workshop:
- 9.3 Workshop on TensorFlow and Keras (31:30)
- Neural network run as a script.
- Appendix 5 on Bluecrystal has some specific advice on getting TensorFlow to work on Bluecrystal Phase 3.
- You need to understand how to run Python as a script.
- This is run (for me) with
    /usr/local/opt/python/bin/python3.7 block09-NeuralNetworksScript.py
- You will have to give it the right location of Python.
- On Bluecrystal, if you followed the instructions to create tf-env, then
    python3
  will do.
- Remember to put the data and Python script in the directory you are running in!
- A script to run non-interactively could be called
    block09-NeuralNetworksScript.sh
  and would be submitted with
    qsub block09-NeuralNetworksScript.sh
  containing the following:

    #!/bin/bash
    #PBS -l nodes=1:ppn=2,walltime=1:00:00
    module load languages/python-anaconda3-2019.10
    eval "$(conda shell.bash hook)"  # in interactive mode, conda activate doesn't get set up without this
    conda activate tf-env
    cd $PBS_O_WORKDIR
    python3 block09-NeuralNetworksScript.py
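For reference, a minimal stand-in for a script of this kind (a small feed-forward network trained with Keras, printing its results rather than displaying anything, so it runs cleanly under qsub) might look like the following. The synthetic data and architecture are illustrative assumptions, not the contents of the course's actual block09-NeuralNetworksScript.py:

    import numpy as np
    from tensorflow import keras

    # Illustrative synthetic data: 4 features, binary target.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 4))
    y = (X.sum(axis=1) > 0).astype("float32")

    # A small multi-layer perceptron (feed-forward network).
    model = keras.Sequential([
        keras.layers.Input(shape=(4,)),
        keras.layers.Dense(16, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])

    # verbose=2 logs one line per epoch, which reads well in batch output files.
    model.fit(X, y, epochs=5, batch_size=32, verbose=2)
    loss, acc = model.evaluate(X, y, verbose=0)
    print(f"final loss={loss:.3f}, accuracy={acc:.3f}")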
References
Neural Networks textbooks
- Chapter 11 of The Elements of Statistical Learning: Data Mining, Inference, and Prediction (Hastie, Tibshirani and Friedman).
- Russell and Norvig, Artificial Intelligence: A Modern Approach.
Theoretical practicalities
- Bengio (2012), Practical Recommendations for Gradient-Based Training of Deep Architectures (in the book "Neural Networks: Tricks of the Trade").
- Kull et al. (2019, NeurIPS), Beyond temperature scaling: Obtaining well-calibrated multiclass probabilities with Dirichlet calibration.
- Ramachandran, Zoph and Le, Searching for Activation Functions (the Swish activation).
Important historical papers:
- McCulloch and Pitts (1943), A logical calculus of the ideas immanent in nervous activity.
- Minsky and Papert (1969), Perceptrons.
- Hecht-Nielsen (1992), Theory of the backpropagation neural network, in Neural Networks for Perception, Academic Press, 65-93.
- Bishop (1994), Mixture Density Networks.
Likelihood and modelling applications of Neural Networks:
- Chilinski and Silva, Neural Likelihoods via Cumulative Distribution Functions.
- Albawi, Mohammed and Al-Zawi, Understanding of a convolutional neural network.
- Omi, Ueda and Aihara, Fully Neural Network based Model for General Temporal Point Processes.