1D interpolation optimization on 3D data cube


I would like to perform a 1D interpolation on a 3D data cube in Python. I have 18 images taken at fiducial wavelengths, and I want to interpolate each pixel along the wavelength axis to produce new images at 185 given wavelengths. My problem is one of optimization. I've already seen tricks like scipy.ndimage.map_coordinates, but I don't think it applies to my problem or would yield an optimized solution. The real issue is the size of my data: I'm working with an (18x1024x1024) data cube and want an output data cube of size (185x1024x1024). Is there a way to optimize the following code?

import numpy as np
from scipy.interpolate import interp1d

cube_software = np.zeros((18, 1024, 1024))   # placeholder for the 18 images

lambda_instrument = np.linspace(2, 13, 185)  # 185 target wavelengths
lambda_software   = np.linspace(2, 13, 18)   # 18 fiducial wavelengths

cube_new = np.zeros((185, 1024, 1024))

# Interpolate each pixel independently along the wavelength axis
for r in range(np.shape(cube_software)[1]):
    # print('Number %i over %i' % (r, np.shape(cube_software)[1]))
    for u in range(np.shape(cube_software)[2]):
        pixel_inter = interp1d(lambda_software, cube_software[:, r, u])
        cube_new[:, r, u] = pixel_inter(lambda_instrument)
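
Note that interp1d also accepts a multidimensional array together with an axis argument, so a single call can interpolate every pixel along the wavelength axis and both loops disappear. A minimal sketch of that variant, reusing the names above (whether it is fast enough on a cube this size still needs to be checked):

import numpy as np
from scipy.interpolate import interp1d

cube_software = np.zeros((18, 1024, 1024))   # placeholder for the 18 images
lambda_instrument = np.linspace(2, 13, 185)  # 185 target wavelengths
lambda_software   = np.linspace(2, 13, 18)   # 18 fiducial wavelengths

# One interpolator over the wavelength axis (axis 0) handles all pixels at once
pixel_inter = interp1d(lambda_software, cube_software, axis=0, kind='linear')
cube_new = pixel_inter(lambda_instrument)    # shape (185, 1024, 1024)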

1 Answer

Julien Drevon (BEST ANSWER)

I've found a way to avoid the two for loops: I used the linear-interpolation formula directly and adapted it to operate on whole matrices.

For example, suppose my raw data at 18 different wavelengths is stored in a cube called data_cube of size 18x1024x1024, and I want to estimate the image at the k-th target wavelength W_k, which lies between the wavelength at index_before = 15, called W_i, and the one at index_after = 16, called W_j, in my original data cube. Then:

new_cube[k,:,:] = data_cube[index_before,:,:] + (data_cube[index_after,:,:] - data_cube[index_before,:,:]) / (W_j - W_i) * (W_k - W_i)

The computation is essentially instantaneous. I hope this can help.
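
For completeness, here is a sketch of how the same formula might be applied to all 185 output wavelengths at once; the use of np.searchsorted to find index_before/index_after and the broadcasting over the wavelength axis are my additions, not the answer's exact code:

import numpy as np

data_cube = np.zeros((18, 1024, 1024))        # placeholder for the raw cube
lambda_software = np.linspace(2, 13, 18)      # wavelengths of the 18 input images
lambda_instrument = np.linspace(2, 13, 185)   # 185 target wavelengths

# Bracketing indices for every target wavelength
index_after = np.clip(np.searchsorted(lambda_software, lambda_instrument), 1, len(lambda_software) - 1)
index_before = index_after - 1

W_i = lambda_software[index_before]           # wavelength below each target
W_j = lambda_software[index_after]            # wavelength above each target
W_k = lambda_instrument                       # target wavelengths

# Same linear-interpolation formula, broadcast over all wavelengths and pixels
weight = ((W_k - W_i) / (W_j - W_i))[:, None, None]
new_cube = data_cube[index_before] + (data_cube[index_after] - data_cube[index_before]) * weight
# new_cube has shape (185, 1024, 1024)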