I am trying to solve the sparse matrix equation
u_derivative_1 = A * u
(A is the sparse matrix)
But I'm getting the following error:

    IndexError                                Traceback (most recent call last)
    <ipython-input-24-f4af80e4ae52> in <cell line: 1>()
    ----> 1 trial1 = discretise_delta_u_v4(1000, 'implicit')

    <ipython-input-23-731d13e4ddf7> in discretise_delta_u_v4(N, method)
         61         for i in range (1 , N-1):
         62             for j in range (1 , N-1):
    ---> 63                 A[i,j] = (u[i-1,j] + u[i+1,j] + u[i,j-1] + u[i,j+1] - (4*u[i,j]))/(h**2)
         64

    IndexError: index 2 is out of bounds for axis 0 with size 1
I'm confused about why I'm getting this error and how to resolve it. This is my code:
import numpy as np
import scipy
import scipy.sparse
from scipy.sparse import csr_matrix
from scipy.sparse import coo_matrix
def discretise_delta_u_v4(N, method):
    i = np.arange(0,N)
    j = np.arange(0,N)
    h = 2/N
    A = csr_matrix((N, N), dtype = np.float64).toarray()
    u = np.array([[(i*h), (j*h)]])
    #u[ih,jh] =
    u[:,0] = 5 #Boundary
    u[:,-1] = 0 #Boundary
    u[0,:] = 0 #Boundary
    u[-1,:] = 0 #Boundary
    #Implicit
    if (method=='implicit') :
        A[0,:] = 0
        A[-1,:] = 0
        A[:,0] = 0
        A[:,-1] = 0
        for i in range (1 , N-1):
            for j in range (1 , N-1):
                A[i,j] = (u[i-1,j] + u[i+1,j] + u[i,j-1] + u[i,j+1] - (4*u[i,j]))/(h**2)
        # u_der_1 = A * u
        for i in range (0 , N):
            for j in range (0 , N):
                u_der_1 = scipy.sparse.linalg.spsolve(A,u)
trial1 = discretise_delta_u_v4(1000, 'implicit')
The error you're seeing is caused by the way you're constructing the array u. Doing

    u = np.array([[(i*h), (j*h)]])

results in u being an array of shape (1, 2, N) — for example (1, 2, 10) when N = 10 — rather than the (N, N) grid you intended. Therefore, when you index u[i+1,j], the only valid value of i+1 along the first axis is 0; everything else is out of bounds.

Given the problem you are trying to solve, there are a few more issues with the approach.
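To see the shape problem concretely, here's a small standalone check (using N = 10; the variable names match your function but this is just a demonstration, not your full code):

```python
import numpy as np

N = 10
h = 2 / N
i = np.arange(0, N)
j = np.arange(0, N)

# np.array([[a, b]]) stacks the two length-N vectors and wraps them in an
# extra outer dimension, producing shape (1, 2, N) instead of an (N, N) grid.
u = np.array([[(i * h), (j * h)]])
print(u.shape)  # (1, 2, 10)
```

Any index other than 0 on the first axis, such as u[2, j], raises exactly the IndexError you saw.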
You want A to be a sparse matrix, but you're creating it as a dense matrix by converting it to a dense array, i.e.

    A = csr_matrix((N, N), dtype = np.float64).toarray()

You should remove the call to .toarray().

You're also initialising u incorrectly: for an N-by-N grid you want a 1-dimensional right-hand-side vector of N**2 elements, but you're initialising a small 2-D array instead.

Finally, you have

    u_der_1 = scipy.sparse.linalg.spsolve(A, u)

inside a double loop, which is also not correct. You need to build the matrix A first and then solve the system once.
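Putting those fixes together, here is a minimal sketch of one way the function could be restructured. Assumptions to flag: it builds the standard 5-point discrete Laplacian as an N**2-by-N**2 sparse matrix via a Kronecker sum, and it folds your u = 5 boundary column into the right-hand side in an illustrative way (the exact boundary handling and scaling depend on your problem, so treat this as a template, not a drop-in replacement):

```python
import numpy as np
import scipy.sparse
import scipy.sparse.linalg


def discretise_delta_u_v4(N, method):
    h = 2 / N
    # 1-D second-difference operator (tridiagonal, N x N).
    main = -2.0 * np.ones(N)
    off = np.ones(N - 1)
    D = scipy.sparse.diags([off, main, off], [-1, 0, 1]) / h**2
    I = scipy.sparse.identity(N)
    # 2-D Laplacian on the N*N grid as a Kronecker sum; stays sparse
    # (N**2 x N**2) instead of being densified with .toarray().
    A = (scipy.sparse.kron(I, D) + scipy.sparse.kron(D, I)).tocsr()

    # Right-hand side: a flat vector of N**2 elements, one per grid point.
    b = np.zeros(N * N)
    # Illustrative only: interior points next to the u = 5 boundary column
    # pick up the known boundary value, moved to the right-hand side.
    b[::N] = -5.0 / h**2

    if method == 'implicit':
        u = scipy.sparse.linalg.spsolve(A, b)  # solve the system once
    else:
        raise ValueError("only 'implicit' is sketched here")
    return u.reshape(N, N)
```

Note that A is built completely before the single spsolve call, which replaces the double loop around the solver in your original code.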