You are given an array points, where each element points[i] = [xi, yi] represents the integer coordinates of a point in a 2D plane. The distance between any two points is the Manhattan distance: for two points (x1, y1) and (x2, y2), it is |x1 - x2| + |y1 - y2|.
Your task is to determine and return the smallest possible value of the maximum distance between any two of the remaining points after removing exactly one point from the array.
Constraints:
3 ≤ points.length ≤ 10^3
points[i].length == 2
1 ≤ points[i][0], points[i][1] ≤ 10^3
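A minimal sketch of one possible approach is shown below, assuming the standard rotation of Manhattan distance into Chebyshev coordinates (u = x + y, v = x - y), under which the largest pairwise distance equals the larger of the spreads of u and v. The function name min_max_distance_after_removal and the sample input are illustrative, not part of the problem statement.

```python
from heapq import nlargest, nsmallest

def min_max_distance_after_removal(points):
    # Illustrative sketch, not a reference solution.
    # Rotate coordinates: Manhattan distance becomes Chebyshev distance on (u, v).
    u = [x + y for x, y in points]
    v = [x - y for x, y in points]

    # Keep the two largest and two smallest values of each rotated coordinate,
    # so the extremes can be recomputed after excluding any single point.
    max_u, min_u = nlargest(2, u), nsmallest(2, u)
    max_v, min_v = nlargest(2, v), nsmallest(2, v)

    def spread(two_max, two_min, value):
        # Largest spread of this coordinate once one point holding `value` is removed.
        hi = two_max[1] if value == two_max[0] else two_max[0]
        lo = two_min[1] if value == two_min[0] else two_min[0]
        return hi - lo

    best = float("inf")
    for ui, vi in zip(u, v):
        # Maximum pairwise distance among the remaining points if this point is removed.
        d = max(spread(max_u, min_u, ui), spread(max_v, min_v, vi))
        best = min(best, d)
    return best

# Hypothetical usage: for points = [[3, 10], [5, 15], [10, 2], [4, 4]],
# removing [10, 2] leaves a maximum pairwise distance of 12.
print(min_max_distance_after_removal([[3, 10], [5, 15], [10, 2], [4, 4]]))  # 12
```

Tracking only the top-two extremes of each rotated coordinate keeps the whole pass linear in the number of points, which is comfortably within the stated constraints.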