Find Row With Maximum 1's in a Sorted 2D Matrix

You are provided with a 2D matrix containing only the integers 0 or 1. The matrix has dimensions N x M, and each row is sorted in non-decreasing order. Your objective is to identify the 0-based index of the first row that contains the maximum number of 1's.

Input:

The first line contains an integer 'T' denoting the number of test cases. Each test case consists of the following:
- The first line contains two integers 'N' and 'M', representing the number of rows and columns, respectively.
- N subsequent lines contain M space-separated integers that define the matrix.

Output:

For each test case, output the index of the row that contains the maximum number of 1's.

Example:

Input:
2
3 3
0 1 1
1 1 1
0 0 1
4 4
0 0 0 1
0 1 1 1
1 1 1 1
0 0 0 0
Output:
1
2
Explanation:

- In Test Case 1: The maximum number of 1's is in row 1 with 3 ones.

- In Test Case 2: The maximum number of 1's is in row 2 with 4 ones.

Constraints:

  • 1 ≤ T ≤ 50
  • 1 ≤ N, M ≤ 100
  • 0 ≤ ARR[i][j] ≤ 1 where ARR[i][j] represents the matrix elements.
  • Execution time should not exceed 1 second.

Note:

If multiple rows have the same number of 1's, return the row with the smallest index.
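Because each row is sorted in non-decreasing order, all of a row's 1's sit at its right end, which allows a staircase search from the top-right corner in O(N + M) time instead of counting every cell. A minimal sketch of that approach (the function name is mine, not part of the problem statement):

```python
def row_with_max_ones(matrix):
    """Staircase search: start at the top-right corner, walk left while
    we see 1's, and drop down a row otherwise.  The column pointer only
    ever moves left, so the whole scan is O(N + M)."""
    n, m = len(matrix), len(matrix[0])
    best_row, col = -1, m - 1
    for row in range(n):
        # Walk left while this row still has a 1 at the current column.
        while col >= 0 and matrix[row][col] == 1:
            col -= 1
            best_row = row  # this row has strictly more 1's than any earlier row
    # If no row contains a 1, every row ties at zero, so return index 0.
    return best_row if best_row != -1 else 0
```

Since `best_row` is updated only when a row has strictly more 1's than any row before it, ties are automatically resolved in favor of the smallest index, as the note requires.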
Cisco Software Engineer Interview Questions