How to efficiently write data to HDF5 storage?


I am trying to save a 2-dimensional array of size 20000 x 30000, but my computer can't handle the process and dies when I run the script. I have an 8 GB RAM Windows 7 machine with Python 2.7 64-bit (Anaconda install). Can I do some optimization here?

import numpy as np
import h5py

ROWS = 20000
COLUMNS = 30000
VALUES = [1, 5, 10]
db_file = 'db.h5'

f = h5py.File(db_file, 'w')

# np.full with an integer fill value and no explicit dtype allocates
# 8-byte elements here: 20000 x 30000 x 8 bytes is ~4.8 GB of RAM.
val = np.full((ROWS, COLUMNS), 0)

for t in ["t1", "t5", "t10"]:
    group = f.create_group(t)
    group.create_dataset("mydata", data=val)
del val
f.close()
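One way to keep memory use flat, sketched below as an illustrative example rather than a definitive fix: let h5py pre-allocate each dataset on disk and write the data in small row blocks instead of building the full 20000 x 30000 array in RAM. The dtype 'i1' and the block size of 1000 rows are assumptions; pick the smallest dtype that fits your values (1, 5 and 10 all fit in one byte) and any block size that fits comfortably in memory.

import numpy as np
import h5py

ROWS = 20000
COLUMNS = 30000
db_file = 'db.h5'

with h5py.File(db_file, 'w') as f:
    for t in ["t1", "t5", "t10"]:
        group = f.create_group(t)
        # Pre-allocate the dataset on disk; nothing of the full
        # 20000 x 30000 size is ever held in memory.
        dset = group.create_dataset("mydata", shape=(ROWS, COLUMNS),
                                    dtype='i1', chunks=True,
                                    compression='gzip')
        # Write row blocks; only this small buffer (~30 MB) lives in RAM.
        block = 1000
        buf = np.zeros((block, COLUMNS), dtype='i1')
        for start in range(0, ROWS, block):
            stop = min(start + block, ROWS)
            dset[start:stop, :] = buf[:stop - start, :]

As a side note, if the datasets really are all zeros, a chunked HDF5 dataset's default fill value is already 0, so creating the dataset without writing to it at all would give the same logical contents with almost no I/O.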
