I have a 3D image of a brain (let's call it flash) that is currently 263 x 256 x 185. I want to resize it to the dimensions of another image (call it whole_brain_bravo), 256 x 256 x 176, and (hopefully) use Lanczos interpolation for the resampling (as with PIL's Image.ANTIALIAS). My (failed) attempt:
from scipy import ndimage as nd
import nibabel as nib
import numpy as np
a = nib.load('flash.hdr') # nib is what I use to load the images
b = nib.load('whole_brain_bravo.hdr')
flash = a.get_data() # Access data as array (in this case memmap)
whole = b.get_data()
downed = nd.interpolation.zoom(flash, zoom=b.shape) # This obviously doesn't work
Have you guys ever done this sort of thing on a 3D image?
From the docstring for scipy.ndimage.interpolation.zoom: the zoom argument can be a single float, applied to every axis, or a sequence of floats, one per axis.

What is the scale factor between the two images? Is it constant across all axes (i.e. are you scaling isometrically)? In that case zoom should be a single float value; otherwise it should be a sequence of floats, one per axis.

For example, if the physical dimensions of whole and flash can be assumed to be equal, then you could do something like this:
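A minimal sketch, reusing the flash and whole arrays loaded in the question (dsfactor is just an illustrative name):

# Per-axis zoom factor = target size / source size,
# here (256/263, 256/256, 176/185)
dsfactor = [w / float(f) for w, f in zip(whole.shape, flash.shape)]
downed = nd.zoom(flash, zoom=dsfactor)  # same function as nd.interpolation.zoom
print(downed.shape)  # (256, 256, 176)

Note that scipy.ndimage has no Lanczos kernel; zoom resamples with a spline of whatever order you pass (order=3, i.e. cubic, by default).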