Z-Score Standardization
You are preprocessing a feature matrix for a machine learning model. Raw features have different scales — age ranges from 0 to 100, salary from 0 to 200,000. Before training, you must standardize each feature column to have mean 0 and standard deviation 1 using z-score normalization.
Given a 2D matrix where each row is a sample and each column is a feature, standardize each column using z = (x - mean) / std. Use population standard deviation (divide by N, not N-1). If a column has std=0 (all values identical), set its output to 0.0. Round each value to 6 decimal places. Return as a list of lists.
Example 1
Input: [[10, -5], [10, 5]]
Output: [[0.0, -1.0], [0.0, 1.0]]
Explanation: Column 0 has std=0 (all 10s) → output 0.0. Column 1: mean=0, std=5, z-scores: -5/5 = -1.0, 5/5 = 1.0.
Example 2
Input: [[10], [20], [30]]
Output: [[-1.224745], [0.0], [1.224745]]
Explanation: Mean=20, std≈8.164966. z(10) = (10-20)/8.164966 = -1.224745, z(20) = 0.0, z(30) = 1.224745.
Example 3
Input: [[-100], [-200]]
Output: [[1.0], [-1.0]]
Explanation: Mean=-150, std=50. z(-100) = (-100-(-150))/50 = 1.0. The more negative value gets the negative z-score.
Constraints:
- 2 <= rows <= 100
- 1 <= cols <= 100
- -1000 <= matrix[i][j] <= 1000
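A minimal reference sketch of the column-wise procedure described above — population std (divide by N), a 0.0 fallback for constant columns, and rounding to 6 decimal places. The function name `standardize` is illustrative, not mandated by the problem:

```python
import math

def standardize(matrix):
    """Z-score standardize each column of a 2D matrix (rows = samples)."""
    n = len(matrix)
    cols = len(matrix[0])
    result = [[0.0] * cols for _ in range(n)]
    for j in range(cols):
        col = [row[j] for row in matrix]
        mean = sum(col) / n
        # Population standard deviation: divide by N, not N-1.
        std = math.sqrt(sum((x - mean) ** 2 for x in col) / n)
        for i in range(n):
            # Constant column (std == 0) maps to 0.0 per the spec.
            result[i][j] = round((col[i] - mean) / std, 6) if std else 0.0
    return result
```

Checking it against Example 1, `standardize([[10, -5], [10, 5]])` returns `[[0.0, -1.0], [0.0, 1.0]]`.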