I'm trying to figure out the time complexity of computing the largest eigenvector (i.e. the eigenvector of the largest eigenvalue) for a whole bunch of small matrices.
Each matrix is the adjacency matrix of the 1-step neighborhood of a node in a weighted, undirected graph, so all values are non-negative and the matrix is symmetric. E.g.
0 2 1 1
2 0 1 0
1 1 0 0
1 0 0 0
I've found the power iteration method, which is supposed to be O(n^2) per iteration for a dense n x n matrix.
So does that mean the complexity of finding the largest eigenvector of the 1-step neighborhood of every node in the graph is O(n * p^2), where n is the number of nodes and p is the average degree of the graph (i.e. number of edges / number of nodes)?
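For concreteness, this is roughly the computation I have in mind (a minimal NumPy sketch; the fixed iteration count is arbitrary and the matrix is the example from above):

    import numpy as np

    def power_iteration(A, num_iters=100):
        """Approximate the dominant eigenvector of a symmetric matrix A."""
        v = np.random.rand(A.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(num_iters):
            w = A @ v                     # the O(p^2) step for a dense p x p matrix
            v = w / np.linalg.norm(w)
        return v

    # the example neighborhood matrix from above
    A = np.array([[0, 2, 1, 1],
                  [2, 0, 1, 0],
                  [1, 1, 0, 0],
                  [1, 0, 0, 0]], dtype=float)

    print(power_iteration(A))

Repeating that once per node, with each neighborhood matrix being roughly p x p, is where my O(n * p^2) per iteration estimate comes from.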
Off the top of my head I would say your best bet is power iteration, an iterative algorithm usually started from a random vector. It converges geometrically to the true largest eigenvalue, with the rate governed by the ratio of the second-largest eigenvalue to the largest (in absolute value). So if your two largest eigenvalues are equal in magnitude, do not use this method; otherwise it works quite nicely. You get both the largest eigenvalue and the corresponding eigenvector.
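A rough sketch of that in NumPy (the tolerance, iteration cap, and Rayleigh-quotient stopping rule are my own choices, not part of the textbook definition):

    import numpy as np

    def power_iteration(A, tol=1e-10, max_iters=1000):
        """Return (eigenvalue, eigenvector) estimates for a symmetric matrix A."""
        v = np.random.rand(A.shape[0])
        v /= np.linalg.norm(v)
        lam = v @ A @ v                   # Rayleigh quotient estimate of the eigenvalue
        for _ in range(max_iters):
            v = A @ v
            v /= np.linalg.norm(v)
            new_lam = v @ A @ v
            # the error shrinks roughly by a factor |lambda_2 / lambda_1| per iteration,
            # so convergence stalls when the two largest eigenvalues have equal magnitude
            if abs(new_lam - lam) < tol:
                break
            lam = new_lam
        return lam, v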
However, if your matrices are very small you might be just as well off computing a full eigendecomposition directly (as PCA implementations do), because it will not be expensive at that size. I am not aware of any hard threshold for when to switch between the two. It also depends on whether you are willing to accept small inaccuracies or need the exact value.
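For matrices as small as the one in your example, a dense symmetric solver is a perfectly reasonable baseline, e.g. with numpy.linalg.eigh:

    import numpy as np

    A = np.array([[0, 2, 1, 1],
                  [2, 0, 1, 0],
                  [1, 1, 0, 0],
                  [1, 0, 0, 0]], dtype=float)

    # eigh is for symmetric matrices and returns eigenvalues in ascending order,
    # so the dominant pair is the last entry / last column
    vals, vecs = np.linalg.eigh(A)
    largest_eigenvalue = vals[-1]
    dominant_eigenvector = vecs[:, -1]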