ML · Medium · Backpropagation, gradients · ~15 min
Gradient Flow Through Batch Normalization
Problem
Explain how batch normalization affects gradient flow during backpropagation and why it helps train deeper networks more stably than networks without it.
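For intuition before attempting the question, the effect it asks about can be demonstrated numerically: backpropagate a fixed upstream gradient through a stack of linear layers, with and without per-layer batch normalization, and compare the gradient magnitude that reaches the input. The sketch below is illustrative only (layer count, width, and the deliberately mis-scaled initialization are assumptions, not part of the graded problem); it uses the standard closed-form batch-norm backward pass with gamma fixed at 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_norm_through_chain(depth, use_bn, batch=64, dim=32):
    """Backprop a unit-scale gradient through `depth` linear layers,
    optionally batch-normalizing each activation, and return the norm
    of the gradient that reaches the input."""
    x = rng.normal(size=(batch, dim))
    caches = []
    for _ in range(depth):
        # Deliberately mis-scaled init: gradients grow ~1.5x per layer.
        W = rng.normal(size=(dim, dim)) * 1.5 / np.sqrt(dim)
        h = x @ W
        if use_bn:
            mu, var = h.mean(axis=0), h.var(axis=0)
            inv_std = 1.0 / np.sqrt(var + 1e-5)
            xhat = (h - mu) * inv_std
            caches.append((W, xhat, inv_std))
            x = xhat
        else:
            caches.append((W, None, None))
            x = h

    # Backward pass, starting from a fixed unit-scale upstream gradient.
    g = np.ones_like(x) / x.size
    N = batch
    for W, xhat, inv_std in reversed(caches):
        if xhat is not None:
            # Batch-norm backward (gamma = 1): the mean and projection
            # terms recenter the gradient, and inv_std rescales it,
            # counteracting the layer's scale.
            g = (inv_std / N) * (N * g - g.sum(axis=0)
                                 - xhat * (g * xhat).sum(axis=0))
        g = g @ W.T
    return np.linalg.norm(g)

plain = grad_norm_through_chain(30, use_bn=False)
bn = grad_norm_through_chain(30, use_bn=True)
print(f"no BN  : {plain:.3e}")  # grows roughly 1.5x per layer at this init
print(f"with BN: {bn:.3e}")     # stays moderate regardless of init scale
```

Because batch normalization rescales each layer's activations to unit variance, its backward pass divides the gradient by the layer's activation scale, so the per-layer amplification that the un-normalized chain suffers is largely cancelled.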
Ready to solve it?
Start a session on Mockbit #78. You'll be graded, with specific critique, when you submit.
mockbit.io/q/78