The Row With Maximum Ones interview question gives you a binary matrix (containing only 0s and 1s) and asks you to return the index of the row that contains the most 1s, along with the count of 1s in that row. If multiple rows have the same maximum count, return the one with the smallest index.
Amazon, Google, and Bloomberg ask this problem as a beginner-level matrix traversal exercise. It validates that candidates can iterate over a 2D array, compute row-level statistics (sum of 1s), and track a running maximum with its associated index. The tiebreak rule (smallest index) tests whether candidates handle comparison logic correctly.
The pattern is row-wise summation with argmax tracking. Initialize max_ones = -1 and best_row = 0. For each row index i, compute row_sum = sum(matrix[i]). If row_sum > max_ones, update max_ones = row_sum and best_row = i. Return [best_row, max_ones]. Only update when strictly greater (not ≥) to preserve the smallest index on ties.
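The loop described above can be sketched as follows. This is a minimal Python sketch; the function name `row_with_max_ones` is my own, not from any specific library.

```python
def row_with_max_ones(matrix):
    # Start below the minimum possible row sum (0) so the first row
    # always triggers an update.
    max_ones = -1
    best_row = 0
    for i, row in enumerate(matrix):
        row_sum = sum(row)  # count of 1s in a binary row
        # Strict > (not >=) preserves the smallest index on ties.
        if row_sum > max_ones:
            max_ones = row_sum
            best_row = i
    return [best_row, max_ones]
```

This runs in O(m * n) time for an m-by-n matrix and O(1) extra space.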
Matrix:
[[0, 0, 1],
 [1, 1, 0],
 [1, 1, 1],
 [0, 1, 1]]
Row sums: 1, 2, 3, 2. Maximum: 3 at row index 2.
Result: [2, 3].
Tie example:
[[1, 1],
 [1, 1]]
Row sums: 2, 2. Tie — return smallest index → [0, 2].
(Using strict > ensures first occurrence wins on tie.)
Two common mistakes: using >= instead of > in the update condition, which replaces the best row with later tied rows and violates the smallest-index rule; and failing to initialize max_ones to a value below 0 (the minimum possible row sum is 0, so initialize to -1).

For the Row With Maximum Ones coding problem, the array and matrix interview pattern is straightforward. In Python, sum(row) gives the count of 1s in a binary row without a separate inner loop. Google and Bloomberg interviewers use this as a warm-up: solve it in under 2 minutes, then offer to discuss the sorted-row binary search optimization (if the matrix rows are sorted, use bisect to find the first 1 in O(log n) per row).
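The sorted-row follow-up mentioned above could be sketched like this. It assumes each row is sorted in non-decreasing order (all 0s before all 1s), which is a stronger precondition than the base problem; the function name is my own.

```python
from bisect import bisect_left

def row_with_max_ones_sorted(matrix):
    # Precondition: every row is sorted, i.e. all 0s precede all 1s.
    max_ones = -1
    best_row = 0
    for i, row in enumerate(matrix):
        # bisect_left finds the index of the first 1 in O(log n);
        # everything from there to the end of the row is a 1.
        ones = len(row) - bisect_left(row, 1)
        if ones > max_ones:  # strict > keeps the smallest index on ties
            max_ones = ones
            best_row = i
    return [best_row, max_ones]
```

This lowers the per-row cost from O(n) to O(log n), for O(m log n) overall.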