Matrix Product
You are implementing a forward pass in a neural network. At the core of every dense layer is a matrix multiplication — the input activations matrix multiplied by the weight matrix produces the layer output.
Given two 2D matrices A (shape m×n) and B (shape n×p), return their matrix product A @ B as a list of lists.
The result will have shape (m, p) where result[i][j] = sum of A[i][k] * B[k][j] for k in range(n).
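The definition above translates directly into the classic triple loop. This is only a sketch of one straightforward approach (the `mat_mul` name is mine), not the site's reference solution:

```python
def mat_mul(A, B):
    m, n, p = len(A), len(B), len(B[0])
    # result[i][j] accumulates sum of A[i][k] * B[k][j] for k in range(n)
    result = [[0] * p for _ in range(m)]
    for i in range(m):
        for k in range(n):
            a = A[i][k]  # hoisted so the inner loop walks row k of B in order
            for j in range(p):
                result[i][j] += a * B[k][j]
    return result
```

Looping in i-k-j order (rather than i-j-k) keeps the inner loop scanning a single row of B left to right, which tends to be friendlier to memory access; both orders compute the same result.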
Example 1
Input: A = [[1, 2, 3], [4, 5, 6]], B = [[1, 0], [0, 1], [1, 1]]
Output: [[4, 5], [10, 11]]
Explanation: 2×3 @ 3×2 = 2×2. Row 0: [1·1 + 2·0 + 3·1, 1·0 + 2·1 + 3·1] = [4, 5]. Row 1: [4·1 + 5·0 + 6·1, 4·0 + 5·1 + 6·1] = [10, 11].
Example 2
Input: A = [[1], [2], [3]], B = [[4, 5, 6]]
Output: [[4, 5, 6], [8, 10, 12], [12, 15, 18]]
Explanation: 3×1 @ 1×3 = 3×3 outer product. Row i of the result is B's single row scaled by A[i][0].
Example 3
Input: A = [[1, 1], [0, 1]], B = [[1, 0], [1, 1]]
Output: [[2, 1], [1, 1]]
Explanation: Order matters: A@B ≠ B@A. B@A would give [[1, 1], [1, 2]].
Constraints
- 1 <= m, n, p <= 50
- -100 <= A[i][j], B[i][j] <= 100
- A has shape (m, n), B has shape (n, p)
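Within these bounds (at most 50×50 matrices), a pure-Python O(m·n·p) loop is easily fast enough. As a sanity check, any hand-rolled result can be compared against NumPy's `@` operator, assuming NumPy is available in your environment:

```python
import numpy as np

# Example 3 from above: order matters under matrix multiplication
A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]
print((np.array(A) @ np.array(B)).tolist())  # [[2, 1], [1, 1]]
print((np.array(B) @ np.array(A)).tolist())  # [[1, 1], [1, 2]]
```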
Reference solution available after you attempt the question.
Ready to solve it?
Start a session on Mockbit #68. Write your code, run it against hidden tests, and get graded with specific critique on each axis.