Teacher: Gabriel TURINICI
Summary:
1/ Deep learning: major applications, references, culture
2/ Types of learning: supervised, reinforcement, unsupervised
3/ Neural networks, main objects: neurons, operations, loss function, optimization, architecture
4/ Stochastic optimization algorithms and a convergence proof for SGD
5/ Gradient computation by "back-propagation"
6/ Pure Python implementation of a fully connected sequential network
7/ Convolutional networks (CNN): filters, layers, architectures
8/ Keras implementation of a CNN
9/ Techniques: regularization, hyper-parameters, particular networks, recurrent networks (RNN, LSTM)
10/ Unsupervised deep learning: generative AI, GAN, VAE, Stable Diffusion
11/ Keras VAE implementation; "Hugging Face" Stable Diffusion
12/ If time allows: LLM & NLP: word2vec, GloVe (example: woman - man + king ≈ queen)
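The SGD convergence result of topic 4 can be illustrated on a toy problem. A minimal sketch in pure Python (standard library only); the quadratic objective, the noise model, and the step-size schedule are illustrative assumptions, not the course's actual proof setting:

```python
import random

def sgd_quadratic(x0=5.0, steps=2000, seed=0):
    """Minimize f(x) = (x - 2)^2 from noisy gradient estimates.

    The exact gradient is 2*(x - 2); zero-mean Gaussian noise mimics a
    stochastic mini-batch gradient.  The decreasing steps gamma_t = 1/(t+1)
    satisfy the classical Robbins-Monro conditions used in SGD convergence
    proofs: sum(gamma_t) = infinity, sum(gamma_t**2) < infinity.
    """
    random.seed(seed)
    x = x0
    for t in range(steps):
        noisy_grad = 2.0 * (x - 2.0) + random.gauss(0.0, 1.0)
        x -= noisy_grad / (t + 1.0)  # gamma_t = 1/(t+1)
    return x

print(sgd_quadratic())  # ends near the minimizer x* = 2 despite the noise
```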
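The word-analogy test of topic 12 can be sketched with hand-made toy vectors. The 2-D embeddings below are constructed by hand so that the analogy holds exactly; real word2vec/GloVe vectors are learned from corpora and typically have 100-300 dimensions:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def analogy(emb, a, b, c):
    """Nearest word (by cosine similarity) to vec(a) - vec(b) + vec(c),
    excluding the three query words themselves."""
    target = [x - y + z for x, y, z in zip(emb[a], emb[b], emb[c])]
    candidates = [w for w in emb if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(emb[w], target))

# Toy 2-D embeddings: axis 1 ~ "royalty", axis 2 ~ "gender" (hand-made).
toy = {"man": [1.0, 0.0], "woman": [1.0, 2.0],
       "king": [5.0, 0.0], "queen": [5.0, 2.0], "apple": [0.1, -3.0]}
print(analogy(toy, "woman", "man", "king"))  # → queen
```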
Documents
- MAIN document (theory): see your Teams channel (no distribution is authorized without WRITTEN consent from the author)
- back-propagation
- SGD convergence proof
Implementations
- Function approximation by NN: notebook version, Python version. Results (approximation & convergence); after 5 times more epochs. Official code reference: https://doi.org/10.5281/zenodo.7220367
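The function-approximation exercise above can be sketched in pure Python, in the spirit of the "no Keras, no TensorFlow, no PyTorch" implementation: a one-hidden-layer tanh network fitted by hand-coded back-propagation. The target function, network size, learning rate, and epoch count below are illustrative assumptions, not those of the official code:

```python
import math, random

def train_fc_approximator(f=lambda x: math.sin(math.pi * x),
                          n_hidden=8, epochs=5000, lr=0.1, seed=1):
    """Fit y(x) = sum_j v_j * tanh(w_j*x + b_j) + c to f on a grid in
    [-1, 1], by full-batch gradient descent with hand-coded back-propagation
    of the mean-squared error.  Returns the trained prediction function."""
    random.seed(seed)
    xs = [i / 10.0 for i in range(-10, 11)]       # 21 training points
    ys = [f(x) for x in xs]
    w = [random.uniform(-1, 1) for _ in range(n_hidden)]
    b = [random.uniform(-1, 1) for _ in range(n_hidden)]
    v = [random.uniform(-1, 1) for _ in range(n_hidden)]
    c = 0.0

    def predict(x):
        return sum(v[j] * math.tanh(w[j] * x + b[j]) for j in range(n_hidden)) + c

    n = len(xs)
    for _ in range(epochs):
        gw = [0.0] * n_hidden; gb = [0.0] * n_hidden
        gv = [0.0] * n_hidden; gc = 0.0
        for x, y in zip(xs, ys):
            h = [math.tanh(w[j] * x + b[j]) for j in range(n_hidden)]
            err = (sum(v[j] * h[j] for j in range(n_hidden)) + c) - y
            gc += err
            for j in range(n_hidden):
                gv[j] += err * h[j]
                dh = err * v[j] * (1.0 - h[j] ** 2)   # back-prop through tanh
                gw[j] += dh * x
                gb[j] += dh
        c -= lr * gc / n
        for j in range(n_hidden):
            v[j] -= lr * gv[j] / n
            w[j] -= lr * gw[j] / n
            b[j] -= lr * gb[j] / n
    return predict
```

After training, the mean-squared error on the grid should be well below that of the constant zero predictor (about 0.5 for this target).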
- Pure Python (no Keras, no TensorFlow, no PyTorch) implementation (cf. also the theoretical document): version "to implement" (with Dense/FC layers, dataset = iris); version with the solution. If needed: iris dataset here.
- Keras/Iris implementation. CNN example: https://www.tensorflow.org/tutorials/images/cnn . To do: use it on MNIST, try to obtain high accuracy on MNIST and CIFAR10.
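The building block behind the CNN example above is the convolutional filter (topic 7). A minimal pure-Python sketch of a "valid" 2-D convolution as used in CNN layers (cross-correlation, i.e. without kernel flipping); the tiny image and the edge-detecting filter are made up for illustration:

```python
def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation: slide the filter over the image and
    take the sum of element-wise products at each position."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A vertical-edge detector on a tiny image with a dark/bright boundary:
image = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
edge_filter = [[1, -1],
               [1, -1]]
print(conv2d(image, edge_filter))  # → [[0, -18, 0], [0, -18, 0]]: responds only at the edge
```

A CNN layer applies many such filters, whose coefficients are learned rather than fixed by hand as here.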
- VAE, latent-space visualisation: CVAE Python version (rename *.py), CVAE ipynb version
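A key ingredient of the VAE (topics 10-11) is the reparameterization trick, which makes sampling from the latent space differentiable. A minimal pure-Python sketch; the particular mean and variance values are illustrative:

```python
import math, random

def reparameterize(mu, log_var, eps=None):
    """VAE reparameterization trick: instead of sampling
    z ~ N(mu, sigma^2) directly (not differentiable w.r.t. mu, sigma),
    draw eps ~ N(0, 1) and set z = mu + sigma * eps, so that z is a
    deterministic, differentiable function of the encoder outputs."""
    if eps is None:
        eps = random.gauss(0.0, 1.0)
    sigma = math.exp(0.5 * log_var)   # encoder outputs log(sigma^2)
    return mu + sigma * eps

# Empirical check: samples have mean ~ mu and std ~ exp(log_var / 2).
random.seed(123)
samples = [reparameterize(1.0, math.log(0.25)) for _ in range(50000)]
mean = sum(samples) / len(samples)
print(round(mean, 2))  # close to mu = 1.0
```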
- Stable Diffusion: working example (19/1/2024) on Google Colab: notebook version (here the Python version, rename *.py). ATTENTION: the run takes about 10 minutes the first time, then is somewhat faster (just change the prompt text).