02 Making a Neural Network in GMS2: Backward Pass











Video: http://youtube.com/watch?v=DyPBi-UAVx4

Coding starts at the 4:00 mark.

- -

In this video series we will create a Feed-Forward Neural Network in GMS2. In this second video we create the backward pass, which is used to train the network with examples. This is supervised learning, since training uses examples consisting of input/output pairs.

Today we create a gradient structure for the layers and the network. For this we create new "taped" network and layer types, so the gradient structure stays encapsulated, as it is not always needed.
The gradient structure allows us to backpropagate the error signal.
We create Mean Squared Error as the cost function, which is used to get the error signal.
With the gradients we can use Gradient Descent to update the learnable parameters.
I made some changes to the visualization compared to the previous video.
In the next video we will do a Genetic Algorithm as a way to do unsupervised learning.

- -

GitHub for the project file:
https://github.com/HannulaTero/Buildi...

- -

00:00 Introduction
00:17 Backpropagation
02:45 Gradient Descent
04:00 Activation Derivatives
05:17 Changes to Layers
05:37 Taped Input
06:37 Taped Dense
10:54 Cost Function
12:47 Taped Network
16:24 Update Builder
17:09 General Functions
18:00 Update Visualization
19:18 Example of Use
21:55 Final Words
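The video builds all of this in GMS2's GML, which can't be shown runnable here, so below is a minimal language-neutral sketch in Python of the same ideas: a "taped" dense layer that stores its forward-pass values, an MSE cost whose derivative gives the initial error signal, backpropagation through the layers, and a plain gradient descent update. All names (Dense, mse, update, and the XOR toy data) are illustrative assumptions, not taken from the project.

```python
import math
import random

random.seed(42)

class Dense:
    """Fully connected layer with tanh activation that 'tapes' its
    forward-pass input and output for reuse in the backward pass."""

    def __init__(self, n_in, n_out):
        self.w = [[random.uniform(-1.0, 1.0) for _ in range(n_in)]
                  for _ in range(n_out)]
        self.b = [0.0] * n_out

    def forward(self, x):
        self.x = x  # taped input
        self.y = [math.tanh(sum(wij * xj for wij, xj in zip(wi, x)) + bi)
                  for wi, bi in zip(self.w, self.b)]
        return self.y  # taped output

    def backward(self, grad_out):
        # activation derivative via the taped output: tanh'(x) = 1 - tanh(x)^2
        delta = [g * (1.0 - yi * yi) for g, yi in zip(grad_out, self.y)]
        # gradients for the learnable parameters
        self.grad_w = [[d * xj for xj in self.x] for d in delta]
        self.grad_b = delta
        # error signal propagated to the previous layer
        return [sum(delta[i] * self.w[i][j] for i in range(len(delta)))
                for j in range(len(self.x))]

    def update(self, lr):
        # gradient descent: parameter -= learning_rate * gradient
        for i, (gw_row, gb) in enumerate(zip(self.grad_w, self.grad_b)):
            self.w[i] = [w - lr * g for w, g in zip(self.w[i], gw_row)]
            self.b[i] -= lr * gb

def mse(pred, target):
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def mse_grad(pred, target):
    # derivative of MSE w.r.t. each prediction: the initial error signal
    return [2.0 * (p - t) / len(pred) for p, t in zip(pred, target)]

# Toy supervised dataset (input/output pairs): XOR.
data = [([0.0, 0.0], [0.0]), ([0.0, 1.0], [1.0]),
        ([1.0, 0.0], [1.0]), ([1.0, 1.0], [0.0])]
layers = [Dense(2, 4), Dense(4, 1)]

def total_loss():
    loss = 0.0
    for x, target in data:
        out = x
        for layer in layers:
            out = layer.forward(out)
        loss += mse(out, target)
    return loss

before = total_loss()
for _ in range(3000):                      # training loop
    for x, target in data:
        out = x
        for layer in layers:               # forward pass (tapes values)
            out = layer.forward(out)
        grad = mse_grad(out, target)       # error signal from cost function
        for layer in reversed(layers):     # backward pass
            grad = layer.backward(grad)
        for layer in layers:               # gradient descent step
            layer.update(0.1)
after = total_loss()
print(f"loss before: {before:.4f}, after: {after:.4f}")
```

The taping mirrors the video's split between plain and "taped" layers: only the taped variant stores inputs, outputs, and gradients, so the extra memory cost is paid only when training.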









