In the dropout method of regularization, we randomly delete half of the hidden neurons, leaving the input and output layers the same.
In a theoretical sense, wouldn't the same effect occur if we just randomly assigned half of the hidden neurons weights of 0, since this would effectively null out those neurons?
Best Answer
If you set all the output weights of certain neurons to 0, yes, the effect is the same. But just to clarify, dropout doesn't actually delete neurons; it just deactivates them temporarily, at random, for each input. If instead you deactivate a portion of the weights without regard to specific neurons (so you may cut only some of the weights belonging to a given neuron), you end up with DropConnect.
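To make the distinction concrete, here is a minimal NumPy sketch of a single hidden layer (the layer sizes, drop probability, and variable names are made up for illustration): masking whole neurons corresponds to dropout, while masking individual weights corresponds to DropConnect.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layer: 4 inputs -> 3 hidden neurons (sizes chosen only for illustration)
W = rng.normal(size=(3, 4))   # hidden-layer weight matrix
x = rng.normal(size=4)        # one input vector
p = 0.5                       # drop probability

# Dropout: zero out whole hidden neurons by masking their activations.
neuron_mask = rng.random(3) > p          # keep each neuron with probability 1 - p
h_dropout = (W @ x) * neuron_mask        # a dropped neuron contributes nothing downstream

# DropConnect: zero out individual weights, regardless of which neuron they belong to.
weight_mask = rng.random((3, 4)) > p     # keep each weight independently
h_dropconnect = (weight_mask * W) @ x    # a neuron may lose only some of its inputs

print(h_dropout)
print(h_dropconnect)
```

Note that practical dropout implementations also rescale the surviving activations (e.g. by 1/(1 - p) during training, the "inverted dropout" convention) so that the expected activation matches test time; that scaling is omitted above to keep the masking comparison clear.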
Similar Posts:
- Solved – Neural Network, questions on DropOut process
- Solved – Value of the keep probability when calculating loss with dropout
- Solved – What happens when many neurons have same weights
- Solved – Neural networks – how can I interpret what a hidden layer is doing to the data