
You are given a matrix mat with integers 0 and 1, where each row is sorted in non-decreasing order. Your task is to find the row with the maximum number of 1’s. If there are multiple such rows, return the one with the smallest index.
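
For example, given mat = [[0, 0, 1], [0, 1, 1], [0, 0, 0]], rows 0, 1, and 2 contain one, two, and zero 1’s respectively, so the answer is row index 1.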

Clarifying Questions

  1. Input Specifications:
    • What are the dimensions of the matrix (number of rows and columns)?
    • Are the elements strictly binary (0s and 1s)?
  2. Output Specifications:
    • What should be returned if there are multiple rows with the same maximum number of 1’s?
    • Should there be any specific formatting for the output?

Assuming the matrix contains only 0s and 1s, every row is sorted in non-decreasing order (so all 1’s in a row are grouped at the end), and ties for the maximum count are broken by returning the smallest row index.

Strategy

  1. Initialization:
    • Define two variables: max_ones to track the maximum number of 1’s found so far, and row_index to store the index of the row where that maximum was found. Initializing max_ones to -1 ensures that even an all-zero matrix returns row 0.
  2. Iterate Through Each Row:
    • For each row, count the number of 1’s.
    • Compare this count with max_ones. If the current row has more 1’s, update max_ones and row_index.
  3. Binary Search Optimization (Optional):
    • Since each row is sorted, you can use binary search to find the index of the first 1 in a row; the count of 1’s is then the row length minus that index, reducing the per-row work from O(m) to O(log m), where m is the number of columns. A sketch of this variant appears after the main code below.
  4. Result:
    • After iterating through all rows, return row_index.

Code

def rowWithMaxOnes(mat):
    # Track the best count seen so far and the row that produced it.
    # Starting max_ones at -1 guarantees an all-zero matrix still returns row 0.
    max_ones = -1
    row_index = -1

    for idx, row in enumerate(mat):
        # With only 0s and 1s in a row, summing its entries counts the 1's.
        count_of_ones = sum(row)
        # Strict comparison keeps the earliest row when counts tie.
        if count_of_ones > max_ones:
            max_ones = count_of_ones
            row_index = idx

    return row_index
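
For the optional binary-search optimization from the strategy, a minimal sketch is given below. It assumes every row is sorted in non-decreasing order (so all 1’s are grouped at the end) and uses the standard library’s bisect_left to locate the first 1; the name rowWithMaxOnesBinarySearch is illustrative, not part of the original solution.

from bisect import bisect_left

def rowWithMaxOnesBinarySearch(mat):
    # Sketch of the optional optimization: in a non-decreasing row of 0s and 1s,
    # the count of 1's equals the row length minus the index of the first 1.
    max_ones = -1
    row_index = -1

    for idx, row in enumerate(mat):
        first_one = bisect_left(row, 1)  # index of the first 1, or len(row) if there is none
        count_of_ones = len(row) - first_one
        if count_of_ones > max_ones:
            max_ones = count_of_ones
            row_index = idx

    return row_index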

Time Complexity

Counting the 1’s in each row with sum(row) costs O(n · m) time for n rows and m columns. The binary-search variant reduces this to O(n log m), since each of the n rows is searched in O(log m). Both approaches use O(1) extra space.

Testing

Make sure to test with varied cases, including the following (a few example calls are sketched after the list):

  1. All 0s matrix.
  2. All 1s matrix.
  3. Matrices of different sizes (varying numbers of rows and columns) to ensure all input dimensions are handled correctly.
  4. Rows that tie for the maximum number of 1’s, to confirm the smallest index is returned.
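
As a quick sanity check, the calls below exercise rowWithMaxOnes on small hand-built matrices; the expected results follow directly from the problem statement.

if __name__ == "__main__":
    assert rowWithMaxOnes([[0, 0], [0, 0]]) == 0                   # all 0s: first row wins
    assert rowWithMaxOnes([[1, 1], [1, 1]]) == 0                   # all 1s: tie broken by smallest index
    assert rowWithMaxOnes([[0, 0, 1], [0, 1, 1], [0, 0, 0]]) == 1  # row 1 has the most 1's
    print("All tests passed.")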

Try our interview co-pilot at AlgoAdvance.com