Mockbit #80
ML · medium · Regularization · ~15 min

Dropout vs Batch Normalization Interaction

Problem

Explain why using dropout and batch normalization together in the same layer can cause training instability, and describe the recommended ordering when both are used.
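Before attempting an answer, it can help to see the core effect numerically. The sketch below (an illustration, not the reference solution) uses inverted dropout on standard-normal activations to show that dropout preserves the mean but inflates the variance at train time, while at test time it is the identity. Any batch-norm running statistics estimated downstream of the dropout layer therefore mismatch the test-time distribution. The variable names and the choice of p = 0.5 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5                                # dropout probability (illustrative)
x = rng.normal(size=100_000)           # stand-in for pre-dropout activations

# Inverted dropout at train time: zero units with prob p,
# scale survivors by 1/(1-p) so the expected value is unchanged.
mask = rng.random(x.shape) >= p
train_out = x * mask / (1 - p)

# At test time dropout is the identity.
test_out = x

# Mean is preserved, but variance grows by roughly 1/(1-p):
# here ~2x for p = 0.5. BN running stats fit to train_out would
# therefore be stale when applied to test_out.
ratio = train_out.var() / test_out.var()
print(round(float(ratio), 2))
```

This train/test variance shift is the usual explanation for the instability, and it motivates the common recommendation to place dropout after all batch-norm layers (or to avoid combining them in the same block).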

Reference solution

The reference solution becomes available after you attempt the question.

Ready to solve it?

Start a session on Mockbit #80. When you submit, you'll receive a grade with specific critique.
