I have a point cloud C, where each point has an associated value. Let's say the points are in 2-D space, so each point can be represented with the triplet (x, y, v).
I'd like to find the subset of points which are local maxima. That is, for some radius R, I would like to find the subset of points S in C such that for any point Pi (with value vi) in S, there is no point Pj in C within distance R of Pi whose value vj is greater than vi.
I see how I could do this in O(N^2) time, but that seems wasteful. Is there an efficient way to do this?
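For reference, here's the kind of O(N^2) brute force I have in mind (just a sketch; `points` is an (N, 2) array and `values` an (N,) array, and the function name is mine):

```python
import numpy as np

def local_maxima_bruteforce(points, values, radius):
    """O(N^2): compare each point's value against every point within `radius`."""
    n = len(points)
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        # Euclidean distances from point i to all points (including itself)
        dists = np.linalg.norm(points - points[i], axis=1)
        within = dists <= radius
        # Point i is a local maximum if nothing in its neighborhood beats it
        mask[i] = not np.any(values[within] > values[i])
    return mask
```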
Side Notes:
- The source of this problem is that I'm trying to find local maxima in a sparse matrix, so in my case x, y are ordered integer indices - if this simplifies the problem, let me know!
- I'm perfectly happy if the solution only works for a Manhattan distance or similar.
- I'm in Python, so if there's some kind of nice vectorized numpy way to do this, that's just great.
Following up on Yves' suggestion, here's an answer that uses scipy's KDTree:
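Something along these lines (a sketch: the function name is mine, and I'm using `query_ball_point` to gather each point's neighborhood; pass `p=1` to the query if you want Manhattan distance instead of Euclidean):

```python
import numpy as np
from scipy.spatial import KDTree

def local_maxima_kdtree(points, values, radius):
    """Return a boolean mask marking points that are local maxima within `radius`."""
    tree = KDTree(points)
    # Indices of all points within `radius` of each point
    # (each neighborhood includes the point itself).
    neighborhoods = tree.query_ball_point(points, r=radius)
    # A point is a local maximum if no neighbor has a strictly greater value.
    return np.array(
        [values[i] >= values[nbrs].max() for i, nbrs in enumerate(neighborhoods)]
    )
```

Usage looks like:

```python
rng = np.random.default_rng(0)
pts = rng.random((1000, 2))
vals = rng.random(1000)
maxima = pts[local_maxima_kdtree(pts, vals, radius=0.05)]
```

Since `query_ball_point` does one range query per point, this should be roughly O(N log N + N*k) for an average of k neighbors per point, which beats O(N^2) as long as R is small relative to the extent of the cloud.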