Summary

Dr. Chirag Shah, PhD, Associate Professor in the Information School, University of Washington, discusses the theory behind neural networks for deep learning: basic concepts, the perceptron, hidden layers, defining a cost function, how learning works in neural networks, and the implementation challenges of learning rate, batch processing, and overfitting.

Chapters
Chapter 1: Chirag Shah Introduces and Defines Deep Learning and Neural Networks
Chapter 2: Chirag Shah Discusses the Concepts Behind Deep Learning
Chapter 3: Chirag Shah Discusses a Simple Neural Network Perceptron
Chapter 4: Chirag Shah Discusses the Hidden Layers of Neural Networks
Chapter 5: Chirag Shah Discusses Defining a Cost Function in a Neural Network
Chapter 6: Chirag Shah Discusses How Learning Works in a Neural Network
Chapter 7: Chirag Shah Discusses Learning Rate and Batch Processing Challenges for the Implementation of a Neural Network
Chapter 8: Chirag Shah Discusses Overfitting Data Challenges for the Implementation of a Neural Network