Solution for high RAM consumption when loading a large NPZ file?


I am working with a video dataset and have preprocessed my training data into an .npz file of roughly 20 GB. I then try to load the .npz file like this:

import numpy as np
import tensorflow as tf

with np.load("/content/drive/MyDrive/NPZ/train.npz") as f:
    keys = list(f.keys())
    # Each f[key] access reads the whole array into RAM, and
    # from_tensor_slices then copies it again into TensorFlow tensors.
    train_dataset = tf.data.Dataset.from_tensor_slices((f[keys[0]], f[keys[1]]))
    # Rebinding keys/f to None afterwards does not release that memory.

Loading it this way exceeds the Colab Pro RAM limit (51 GB). How can I resolve this? Is there another way to load the data?
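
One workaround I am considering (a minimal sketch, not tested at this scale): re-save each array from the archive once as an individual .npy file with np.save, then memory-map those files and stream samples through a generator instead of materializing everything up front. The paths train_x.npy / train_y.npy and the shapes/dtypes below are placeholders for my data:

import numpy as np
import tensorflow as tf

# Hypothetical paths: the two arrays from train.npz, re-saved once via np.save.
X_PATH = "/content/drive/MyDrive/NPZ/train_x.npy"
Y_PATH = "/content/drive/MyDrive/NPZ/train_y.npy"

def sample_generator():
    # mmap_mode="r" keeps the arrays on disk; only the slices that are
    # actually read get paged into RAM.
    x = np.load(X_PATH, mmap_mode="r")
    y = np.load(Y_PATH, mmap_mode="r")
    for i in range(len(x)):
        yield x[i], y[i]

train_dataset = tf.data.Dataset.from_generator(
    sample_generator,
    output_signature=(
        # Adjust shapes/dtypes to the actual clip and label arrays.
        tf.TensorSpec(shape=(None, None, None, None), dtype=tf.float32),
        tf.TensorSpec(shape=(), dtype=tf.int64),
    ),
).batch(32).prefetch(tf.data.AUTOTUNE)

Would this avoid the extra in-memory copy that from_tensor_slices makes, or is there a more standard approach for datasets of this size?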

