How to efficiently calculate a 160146 by 160146 matrix inverse in Python?


My research is in structural dynamics, and I am dealing with large symmetric sparse matrix calculations. Recently, I have had to calculate the inverse of a stiffness matrix (160146 by 160146) with 4813762 non-zero elements. I did calculate the inverse of a smaller 15000 by 15000 stiffness matrix, and it came out almost fully dense. Initially I tried almost all of the scipy.sparse.linalg functions to calculate the inverse in the Ax=b form. Currently I am using SuperLU to compute the L and U factors, and from those I calculate the inverse column by column with solve(). Since the inverse is dense and cannot be stored in RAM, I opted for PyTables.
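For context, the Ax=b idea boils down to factoring the matrix once and then solving against one unit vector per column of the inverse. A minimal sketch on a small made-up matrix (the size and density here are purely illustrative) looks like this:

import numpy
import scipy.sparse
import scipy.sparse.linalg

n = 2000
# toy symmetric sparse matrix standing in for the stiffness matrix
A = scipy.sparse.rand(n, n, density=1e-3, format='csc')
K = (A + A.T + n * scipy.sparse.identity(n, format='csc')).tocsc()

LU = scipy.sparse.linalg.splu(K)   # factor once, reuse for every column
e0 = numpy.zeros(n)
e0[0] = 1.0
col0 = LU.solve(e0)                # column 0 of K^-1, obtained from K x = e_0

# even though K is sparse, the columns of its inverse are essentially dense,
# which is why the full inverse cannot be held in RAM for n = 160146
print(numpy.count_nonzero(col0), "non-zeros out of", n)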

Unfortunately, writing one column of the inverse matrix takes about 16 minutes (timings for each step are shown below, after the code), and the stiffness matrix has 160146 columns in total. I would like to know how I can boost the writing speed so that the whole inverse can be finished in a couple of days. The code is as follows:

import os
import datetime

import numpy
import tables
import scipy.sparse.linalg

# factor the sparse stiffness matrix once; every column of the inverse is
# then one triangular solve against a unit vector
LU = scipy.sparse.linalg.splu(interior_stiff)
interior_dof_row_ptr = 160146

#---PyTables creation code for interior_stiff_inverse begins---#
# open in append mode if the file already exists so the dataset can be reused
if os.path.isfile("HDF5_Interior.h5"):
    f = tables.open_file("HDF5_Interior.h5", 'a')
else:
    f = tables.open_file("HDF5_Interior.h5", 'w')

# compression level and compression library
filters = tables.Filters(complevel=0, complib='blosc')

# f.root -> the default (root) group of the HDF5 file
# "DS_interior_stiff_inverse" -> name of the dataset
# tables.Float32Atom() -> atomic data type of each element
if "/DS_interior_stiff_inverse" not in f:
    print("DS_interior_stiff_inverse DOESN'T EXIST!!!!!")
    out = f.create_carray(f.root, "DS_interior_stiff_inverse", tables.Float32Atom(),
                          shape=(interior_dof_row_ptr, interior_dof_row_ptr),
                          filters=filters)
    #out = f.create_earray(f.root, "DS_interior_stiff_inverse", tables.Float32Atom(),
    #                      shape=(interior_dof_row_ptr, 0), filters=filters,
    #                      expectedrows=interior_dof_row_ptr)
else:
    print("DS_interior_stiff_inverse EXISTS!!!!!")
    out = f.get_node("/", "DS_interior_stiff_inverse")

#interior_stiff_inverse = numpy.zeros((interior_dof_row_ptr, interior_dof_row_ptr))
for i in range(interior_dof_row_ptr):
    # unit vector e_i as the right-hand side gives column i of the inverse
    I = numpy.zeros((interior_dof_row_ptr, 1))
    I[i, 0] = 1
    #-- Commented by Libni - interior_stiff_inverse[:, i] = LU.solve(I[:, 0])
    #   (only the PyTables dataset needs to hold the inverse, not a NumPy array)
    print("starting solve() calculation for inverse: ", datetime.datetime.now())
    tmpResult = LU.solve(I[:, 0])
    print("solve() calculation for inverse DONE: ", datetime.datetime.now())
    out[:, i] = tmpResult
    print("Written to hdf5 (pytable) :", datetime.datetime.now())
    #out.append(LU.solve(I[:, 0]))
    print(str(i) + "th iteration of " + str(interior_dof_row_ptr) + " Interior Inv done")
    f.flush()
    print("After FLUSH line: ", datetime.datetime.now())
f.close()

#--***PyTables creation code for interior_stiff_inverse ends***--#

The time taken for the solve() calculation and for writing to hdf5 is as follows:

starting solve() calculation for inverse: 2017-08-26 01:04:20.424447
solve() calculation for inverse DONE: 2017-08-26 01:04:20.596045
Written to hdf5 (pytable) :2017-08-26 01:20:57.228322
After FLUSH line: 01:20:57.555922

This clearly indicates that writing one column of the inverse matrix to hdf5 takes about 16 minutes. At that rate, calculating the entire inverse would take about 1779 days. I am sure the writing time can be reduced, but I don't know how to achieve it. Please help me boost the writing speed to hdf5 so that the matrix inverse run can finish within a couple of days.
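One thing that may explain the slow column writes: with the default chunk layout of the CArray, a single out[:, i] = ... assignment touches a chunk in many different row blocks, so each column write rewrites a large part of the file. A possible workaround, sketched below under the assumption that a dense block of a few hundred float64 columns (roughly 300 MB for 256 columns of length 160146) fits in RAM, is to solve for a block of right-hand sides at once and write the block with a single slice assignment. Here LU, out, f and interior_dof_row_ptr are the objects from the code above, and ncols_per_batch is an untested, illustrative value:

ncols_per_batch = 256   # assumption: tune so the dense block fits in RAM
n = interior_dof_row_ptr

for start in range(0, n, ncols_per_batch):
    stop = min(start + ncols_per_batch, n)
    # columns start..stop-1 of the identity matrix as a (n, k) right-hand side
    E = numpy.zeros((n, stop - start))
    E[numpy.arange(start, stop), numpy.arange(stop - start)] = 1.0
    # SuperLU.solve accepts a 2-D right-hand side, so one call yields k columns
    block = LU.solve(E)
    # one contiguous block write instead of k separate single-column writes
    out[:, start:stop] = block.astype(numpy.float32)
    f.flush()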

I have used compression level 0 when creating the hdf5 file, thinking that this would help with fast reading and writing. My computer has an i7 with 4 cores and 16 GB of RAM.
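It may also be worth creating the CArray with a chunk shape that matches the column-wise access pattern, so that writing one column only touches one chunk. The chunkshape and complevel below are guesses to illustrate the idea, not tuned values:

# each HDF5 chunk holds exactly one float32 column (~0.6 MB), so a column
# write maps onto a single chunk; blosc at a low level is usually cheap
filters = tables.Filters(complevel=5, complib='blosc')
out = f.create_carray(f.root, "DS_interior_stiff_inverse",
                      tables.Float32Atom(),
                      shape=(interior_dof_row_ptr, interior_dof_row_ptr),
                      chunkshape=(interior_dof_row_ptr, 1),
                      filters=filters)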

Any help will be appreciated.

Thank You, Paul Thomas
