Lecture 1-Unit 3.3
1. **Generalized Delta Rule and Updating of Hidden Layer and Output Layer:**
   - The generalized delta rule extends the delta rule to multilayer networks used in backpropagation. Each weight is adjusted in proportion to the gradient of the error with respect to that weight. During training, weights in both the hidden and output layers are updated with this rule.
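The update described above can be sketched for a single training example. This is a minimal illustration, assuming a tiny 2-2-1 network with sigmoid activations, squared error, and an arbitrary learning rate; the sizes and variable names are not from the text.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = np.array([0.5, -0.2])        # input vector (illustrative)
t = np.array([1.0])              # target output (illustrative)
W1 = rng.normal(size=(2, 2))     # input -> hidden weights
W2 = rng.normal(size=(1, 2))     # hidden -> output weights
lr = 0.1                         # learning rate (assumed)

# Forward pass
h = sigmoid(W1 @ x)              # hidden activations
y = sigmoid(W2 @ h)              # output activation

# Output-layer delta: error times derivative of the sigmoid
delta_out = (y - t) * y * (1 - y)
# Hidden-layer delta: propagate the output delta back through W2
delta_hid = (W2.T @ delta_out) * h * (1 - h)

# Gradient-descent weight updates for both layers
W2 -= lr * np.outer(delta_out, h)
W1 -= lr * np.outer(delta_hid, x)
```

A single step like this should reduce the squared error on that example; repeating it over many examples is the core of backpropagation training.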
19. **Self-Organization:**
   - Self-organization is the ability of a neural network to discover structure in its inputs and adapt its weights without labeled targets or explicit programming. It is a key feature of unsupervised learning.
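One simple form of self-organization is competitive (winner-take-all) learning: the unit whose weight vector is closest to the input moves toward that input, with no labels involved. The sketch below is illustrative; the number of units, input dimension, and learning rate are all assumptions, not details from the text.

```python
import numpy as np

rng = np.random.default_rng(1)
weights = rng.random((4, 2))   # 4 competing units, 2-D inputs (assumed sizes)
lr = 0.5                       # learning rate (assumed)

def train_step(x, weights, lr):
    # Find the best-matching unit (smallest Euclidean distance to x)
    winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    # Only the winner adapts: move its weights toward the input
    weights[winner] += lr * (x - weights[winner])
    return winner

x = np.array([0.9, 0.1])
winner = train_step(x, weights, lr)
```

After repeated presentations, each unit's weight vector drifts toward a cluster of inputs, so the network organizes itself around the input distribution.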
The ReLU activation function introduces non-linearity in hidden layers by outputting the input for positive values and zero for negative values. Generalized variants, such as Leaky ReLU, allow a small non-zero slope for negative inputs.
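The two functions just described can be written in a few lines. The negative-region slope of 0.01 for Leaky ReLU is a common default, not a value fixed by the text.

```python
import numpy as np

def relu(x):
    # Pass positive values through; clamp negatives to zero
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but negatives keep a small slope alpha
    return np.where(x > 0, x, alpha * x)

z = np.array([-2.0, 0.0, 3.0])
```

Here `relu(z)` zeros out the negative entry, while `leaky_relu(z)` scales it by `alpha` instead, which helps avoid "dead" units whose gradient is always zero.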