Find Row With Maximum 1's in a Sorted 2D Matrix
You are provided with a 2D matrix containing only the integers 0 or 1. The matrix has dimensions N x M, and each row is sorted in non-decreasing order. Your objective is to identify the 0-based index of the first row that contains the maximum number of 1's.
Input:
The first line contains an integer 'T' denoting the number of test cases. Each test case consists of the following:
- The first line contains two integers 'N' and 'M', representing the number of rows and columns, respectively.
- N subsequent lines contain M space-separated integers that define the matrix.
Output:
For each test case, output the index of the row that contains the maximum number of 1's.
Example:
Input:
2
3 3
0 1 1
1 1 1
0 0 1
4 4
0 0 0 1
0 1 1 1
1 1 1 1
0 0 0 0
Output:
1
2
Explanation:
- In Test Case 1: The maximum number of 1's is in row 1 with 3 ones.
- In Test Case 2: The maximum number of 1's is in row 2 with 4 ones.
Constraints:
1 ≤ T ≤ 50
1 ≤ N, M ≤ 100
0 ≤ ARR[i][j] ≤ 1
where ARR[i][j] represents the matrix elements.
- Execution time should not exceed 1 second.
Note:
If multiple rows have the same number of 1's, return the row with the smallest index.

AnswerBot
Find the row with the maximum number of 1's in a sorted 2D matrix.
Iterate through each row of the matrix and count the number of 1's in that row.
Keep track of the row index with the maximum number of 1's seen so far, preferring the smaller index on ties, and return it at the end.
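The steps above can be sketched as follows. Because each row is sorted in non-decreasing order, all 1's sit at the end of the row, so instead of counting element by element we can binary-search for the first 1 in each row (O(N log M) overall). The function name `row_with_max_ones` is illustrative, not specified by the problem.

```python
from bisect import bisect_left

def row_with_max_ones(matrix):
    """Return the 0-based index of the first row with the most 1's.

    Each row is sorted in non-decreasing order, so bisect_left(row, 1)
    locates the first 1; everything after it is also a 1.
    """
    best_row, best_count = 0, -1
    for i, row in enumerate(matrix):
        count = len(row) - bisect_left(row, 1)  # number of 1's in this row
        if count > best_count:  # strict '>' keeps the smallest index on ties
            best_row, best_count = i, count
    return best_row

# Sample cases from the problem statement
print(row_with_max_ones([[0, 1, 1],
                         [1, 1, 1],
                         [0, 0, 1]]))  # 1

print(row_with_max_ones([[0, 0, 0, 1],
                         [0, 1, 1, 1],
                         [1, 1, 1, 1],
                         [0, 0, 0, 0]]))  # 2
```

A plain per-row count (summing each row) also fits comfortably within the 1-second limit at N, M ≤ 100; the binary search simply exploits the sortedness the problem guarantees.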