Minimum Cost to Connect All Points

Problem Statement
Given an array COORDINATES representing the integer coordinates of some points on a 2D plane, determine the minimum cost required to connect all points. The cost to connect two points, (x1, y1) and (x2, y2), is defined by their Manhattan distance: |x1 - x2| + |y1 - y2|.
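As a small illustration (the helper name manhattan below is our own choice, not part of the problem statement), the cost of a single connection follows directly from the formula:

def manhattan(p, q):
    # |x1 - x2| + |y1 - y2| for p = (x1, y1) and q = (x2, y2)
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

print(manhattan((0, 0), (2, 2)))   # 4
print(manhattan((2, 2), (3, 10)))  # 9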
Input:
The first line of input contains an integer 'T' representing the number of test cases. The first line of each test case contains an integer 'N' representing the number of points in the 'COORDINATES' array. The next 'N' lines of each test case contain two space-separated integers representing the 'X' and 'Y' coordinates of a point.
Output:
For each test case, output a single integer denoting the minimum cost to connect all points. Each test case's result should be printed on a new line.
Example:
Consider the input:
2
3
0 0
2 2
3 10
4
0 0
10 10
20 20
5 5
Expected output:
13
40
In the first test case, connecting (0, 0) to (2, 2) costs 4 and (2, 2) to (3, 10) costs 9, for a total of 13. In the second, connecting (0, 0) to (5, 5), (5, 5) to (10, 10), and (10, 10) to (20, 20) costs 10 + 10 + 20 = 40.
Constraints:
1 ≤ T ≤ 5
1 ≤ N ≤ 1000
-10^6 ≤ X, Y ≤ 10^6
All points are distinct.
Note:
You do not need to implement input/output handling. The focus should be on the function logic to compute the minimum cost.
All points must be connected such that there is exactly one simple path between any two of them; in graph-theory terms, the chosen connections form a spanning tree.
The problem reduces to finding a minimum spanning tree (MST) of the complete graph on the points, with each edge weighted by the Manhattan distance between its endpoints. A standard approach is to write a function that calculates the Manhattan distance between two points, then run an MST algorithm such as Prim's or Kruskal's over those weights, as in the sketch below.
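The following is a minimal sketch of that approach, assuming Prim's algorithm with an O(N^2) selection loop; the function name minimum_cost_to_connect is our own, not mandated by the problem:

def minimum_cost_to_connect(coordinates):
    # Total Manhattan-distance cost of a minimum spanning tree (Prim's algorithm).
    n = len(coordinates)
    if n <= 1:
        return 0
    INF = float('inf')
    min_dist = [INF] * n   # cheapest known edge from the growing tree to each point
    in_tree = [False] * n
    min_dist[0] = 0        # start the tree at point 0
    total = 0
    for _ in range(n):
        # Pick the cheapest point not yet in the tree.
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: min_dist[i])
        in_tree[u] = True
        total += min_dist[u]
        ux, uy = coordinates[u]
        # Relax edges from u to every point still outside the tree.
        for v in range(n):
            if not in_tree[v]:
                d = abs(ux - coordinates[v][0]) + abs(uy - coordinates[v][1])
                if d < min_dist[v]:
                    min_dist[v] = d
    return total

print(minimum_cost_to_connect([(0, 0), (2, 2), (3, 10)]))             # 13
print(minimum_cost_to_connect([(0, 0), (10, 10), (20, 20), (5, 5)]))  # 40

With N ≤ 1000 the complete graph has at most about 500,000 edges, so this quadratic selection loop is fast enough and avoids the bookkeeping of a heap.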