Applied Deep Learning 2024, Lecture 9: Preprocessing, Augmentation, Regularization, Visualization
Video: http://youtube.com/watch?v=wn96pyxzqAg
In this lecture, we discuss how preprocessing can help our networks learn better, or even enable efficient processing in the first place. Data augmentation is another very important technique: it helps us deal with situations where we do not have a lot of data, or where we want our models to become more robust to small variations. Regularization is a further important trick in our toolbox for preventing overfitting, and finally we look at how we can visualize what our models are actually learning.

Complete Playlist: Applied Deep Learning 2024 - TU Wien

== Literature ==
1. Chatzimichailidis et al. GradVis: Visualization and Second Order Analysis of Optimization Surfaces during the Training of Deep Neural Networks. 2019
2. Nikolenko. Synthetic Data for Deep Learning. 2019
3. Pramerdorfer. Deep Learning for Visual Computing. 2016
4. Zoph et al. Learning Data Augmentation Strategies for Object Detection. 2019
5. Wu et al. Making an Invisibility Cloak: Real World Adversarial Attacks on Object Detectors. 2019
6. Goodfellow et al. Explaining and Harnessing Adversarial Examples. 2015
7. Journet et al. DocCreator: A New Software for Creating Synthetic Ground-Truthed Document Images. 2017
8. Ma. Data Augmentation for Audio. 2019
9. Park et al. SpecAugment: A Simple Data Augmentation Method for Automatic Speech Recognition. 2019
10. Zach. State of the Art Audio Data Augmentation with SpecAugment and PyTorch. 2019
11. Kanburoğlu. Audio Data Augmentation. 2018
12. Srivastava et al. Dropout: A Simple Way to Prevent Neural Networks from Overfitting. 2014
13. Wei et al. EDA: Easy Data Augmentation Techniques for Boosting Performance on Text Classification Tasks. 2019
14. George Washington Papers. https://www.loc.gov/resource/mgw1b.48...
15. Prince. Computer Vision Models. 2012
16. Xie et al. Unsupervised Data Augmentation for Consistency Training. 2019
17. DeVries et al. Improved Regularization of Convolutional Neural Networks with Cutout. 2017
18. Goodfellow et al. Deep Learning. 2016
19. Shorten et al. A Survey on Image Data Augmentation for Deep Learning. 2019
20. Calvo-Zaragoza et al. Camera-PrIMuS: Neural End-To-End Optical Music Recognition on Realistic Monophonic Scores. 2018
21. Ioffe et al. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. 2015
22. Santurkar et al. How Does Batch Normalization Help Optimization? 2019
23. Barnett. Rethinking Batch Normalization. 2019
24. He et al. Identity Mappings in Deep Residual Networks. 2016
25. Klambauer et al. Self-Normalizing Neural Networks. 2017
26. Zhou et al. Learning Deep Features for Discriminative Localization. 2015
27. Rajpurkar et al. CheXNet: Radiologist-Level Pneumonia Detection on Chest X-Rays with Deep Learning. 2017
28. Su et al. One Pixel Attack for Fooling Deep Neural Networks. 2017
29. Prija. Build Better Deep Learning Models with Batch and Layer Normalization.
30. Izmailov et al. Averaging Weights Leads to Wider Optima and Better Generalization. 2018
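To make the data augmentation topic concrete, here is a minimal NumPy sketch of two classic image augmentations, random horizontal flip and random crop with padding. This is a hypothetical illustration, not code from the lecture; the function name `augment` and the 2-pixel padding are assumptions (the padding recipe is the one commonly used for CIFAR-10-style training).

```python
import numpy as np

def augment(image, rng):
    """Random horizontal flip + random padded crop (hypothetical helper)."""
    # Flip left-right with probability 0.5.
    if rng.random() < 0.5:
        image = image[:, ::-1]
    # Pad by 2 pixels on each side, then crop back to the original
    # size at a random offset, so the object shifts slightly.
    h, w = image.shape[:2]
    padded = np.pad(image, ((2, 2), (2, 2)), mode="constant")
    top = rng.integers(0, 5)    # offsets 0..4 inclusive
    left = rng.integers(0, 5)
    return padded[top:top + h, left:left + w]

rng = np.random.default_rng(0)
img = np.arange(16, dtype=np.float32).reshape(4, 4)
out = augment(img, rng)
print(out.shape)  # the augmented image keeps the original shape
```

Because both operations are label-preserving, they can be applied on the fly during training to effectively enlarge a small dataset.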
#############################
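On the regularization side, the dropout technique of Srivastava et al. (reference 12) can be sketched in a few lines of NumPy. This is the "inverted dropout" formulation commonly used in practice; the function name and the toy activations are assumptions for illustration only.

```python
import numpy as np

def dropout(x, p, rng, train=True):
    """Inverted dropout: during training, zero each activation with
    probability p and rescale survivors by 1/(1-p) so the expected
    activation is unchanged; at test time the layer is the identity."""
    if not train or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p      # keep each unit with prob. 1 - p
    return x * mask / (1.0 - p)

rng = np.random.default_rng(0)
acts = np.ones(1000)
dropped = dropout(acts, p=0.5, rng=rng)
# Roughly half the activations are zeroed, the rest are scaled to 2.0,
# so the mean stays close to the original 1.0.
print(dropped.mean())
```

The rescaling is what lets us use the same network weights at test time without any correction, which is why this variant is the default in most frameworks.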
