While I was defending my thesis proposal, one of my professors asked me why we have to specify the number of iterations in SOM. He said there should be a convergence criterion that tells us when to stop training.
However, I understand that we do not have a target vector, so there is no explicit cost that we can minimize.
My question is twofold: first, why is there a need for MAX_ITERATIONS, and second, what assures us that the number of iterations we choose will give the optimal map? :(
P.S. Based on experience: I tried 1000 iterations and 10000 iterations on the color dataset. It seems that 10000 iterations does not give a better visualization than 1000. :(
You do implicitly have a target cost function to minimize in SOM. SOM is akin to multi-dimensional scaling (MDS): the purpose is to maintain the topological relationships. Each iteration of SOM therefore works to minimize the error between "the distance of any two points in the source space" and "the distance of the same two points in the target space", except that in SOM similar points are represented by a single neuron in the target space. That is also why SOM can be used for clustering.
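To make that concrete, here is a minimal NumPy sketch of a single SOM iteration. The function name `som_update`, the grid layout, and the Gaussian neighbourhood parameters are my own choices for illustration, not a specific library's API: the best-matching unit and its grid neighbours are pulled toward the input, which is what implicitly drives down the mapping error described above.

```python
import numpy as np

def som_update(weights, grid, x, lr, sigma):
    """One SOM iteration: pull the best-matching unit (BMU) and its grid
    neighbours toward the input x. weights has shape (n_neurons, dim);
    grid holds each neuron's 2-D map coordinates, shape (n_neurons, 2)."""
    # BMU = neuron whose weight vector is closest to x in data space
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
    # Gaussian neighbourhood measured in *map* (grid) space
    grid_dist = np.linalg.norm(grid - grid[bmu], axis=1)
    h = np.exp(-grid_dist**2 / (2 * sigma**2))
    # Neighbours of the BMU are dragged along with it; this coupling is
    # what preserves topology (nearby inputs end up on nearby neurons)
    weights += lr * h[:, None] * (x - weights)
    return weights
```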
The iterative process can be regarded as a form of gradient descent. As when minimizing a quadratic error cost function, it is prone to getting trapped in local minima. That also explains why a SOM can still show "kinks" even after a large number of iterations.
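If you want a stopping rule instead of a fixed MAX_ITERATIONS, one heuristic (not part of the standard SOM formulation, and reusing the hypothetical `som_update` sketch above, so take the parameter names and schedules as assumptions) is to monitor the mean quantization error and stop once it plateaus:

```python
def quantization_error(weights, data):
    """Mean distance from each sample to its BMU -- the quantity the
    iterations implicitly drive down."""
    d = np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2)
    return d.min(axis=1).mean()

def train_som(weights, grid, data, max_iter=10000, tol=1e-5,
              lr0=0.5, sigma0=3.0, seed=0):
    """Online SOM training that stops early once the quantization error
    stops improving, instead of always running max_iter iterations."""
    rng = np.random.default_rng(seed)
    prev_qe = np.inf
    for t in range(max_iter):
        # decay learning rate and neighbourhood radius over time
        lr = lr0 * np.exp(-t / max_iter)
        sigma = sigma0 * np.exp(-t / max_iter)
        x = data[rng.integers(len(data))]
        weights = som_update(weights, grid, x, lr, sigma)
        # convergence check every 100 steps: stop when the error plateaus
        if t % 100 == 0:
            qe = quantization_error(weights, data)
            if abs(prev_qe - qe) < tol:
                break
            prev_qe = qe
    return weights
```

This is also consistent with what you observed on the color dataset: once the quantization error has plateaued, running 10000 iterations instead of 1000 will not noticeably improve the visualization, and because the descent can get stuck in a local minimum, more iterations cannot guarantee the optimal map either.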