I am reading the notes on Markov Cluster Algorithm by Kathy Macropol (http://www.cs.ucsb.edu/~xyan/classes/CS595D-2009winter/MCL_Presentation2.pdf)
On slide 14/46 the author talks about inflation, and on slide 12/46 the author works through an example of MCL. I am not sure how the author arrived at the matrix shown on the right.
The author mentions that this is squaring and normalizing. I understand the idea, but I am having trouble getting started. Any help in understanding how this matrix was derived would be much appreciated. Here is my attempt in R:
C = matrix( c(   0, .25, .33, .33,   0,   0,   0,
               .33,   0, .33, .33, .33,   0,   0,
               .33, .25,   0, .33,   0,   0,   0,
               .33, .25, .33,   0,   0,   0,   0,
                 0, .25,   0,   0,   0,  .5,  .5,
                 0,   0,   0,   0, .33,   0,  .5,
                 0,   0,   0,   0, .33,  .5,   0), nrow=7, ncol=7)
C = t(C)
C

# square each column element-wise and divide by the squared column sum
c1s = (sum(C[,1])^2); C11 = C[,1]^2; C1N = C11/c1s
c2s = (sum(C[,2])^2); C22 = C[,2]^2; C2N = C22/c2s
c3s = (sum(C[,3])^2); C33 = C[,3]^2; C3N = C33/c3s
c4s = (sum(C[,4])^2); C44 = C[,4]^2; C4N = C44/c4s
c5s = (sum(C[,5])^2); C55 = C[,5]^2; C5N = C55/c5s
c6s = (sum(C[,6])^2); C66 = C[,6]^2; C6N = C66/c6s
c7s = (sum(C[,7])^2); C77 = C[,7]^2; C7N = C77/c7s

# recombine the normalized columns and take the matrix product of the result with itself
Ct = cbind( C1N, C2N, C3N, C4N, C5N, C6N, C7N )
Ct
CN = Ct %*% Ct
CN
C = CN
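For reference, here is my reading of what a single "square and normalize" step would look like if squaring means a true matrix product rather than squaring each entry (the names M and M2 are just my own); even this single step does not seem to give the matrix on the right, which is where I am stuck:

# start again from the original matrix, before C is overwritten above
M = t(matrix( c(   0, .25, .33, .33,   0,   0,   0,
                 .33,   0, .33, .33, .33,   0,   0,
                 .33, .25,   0, .33,   0,   0,   0,
                 .33, .25, .33,   0,   0,   0,   0,
                   0, .25,   0,   0,   0,  .5,  .5,
                   0,   0,   0,   0, .33,   0,  .5,
                   0,   0,   0,   0, .33,  .5,   0), nrow=7, ncol=7))
M2 = M %*% M                          # expansion: matrix squaring, not element-wise squaring
M2 = sweep(M2, 2, colSums(M2), "/")   # rescale every column so that it sums to 1
M2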
Best Answer
The example on slide 12 is similar to the one on slide 8.
The matrix on the right is not the result of a single step of squaring and normalising, but of several iterations.
You can repeat the matrix squaring and normalising steps, just as on slide 8, to arrive at the final matrix. I wrote a small R script to check this, and it reproduces the result:
a = matrix(0, 7, 7)
a[1,] = c(  0, .25, .33, .33,   0,   0,   0)
a[2,] = c(.33,   0, .33, .33, .33,   0,   0)
a[3,] = c(.33, .25,   0, .33,   0,   0,   0)
a[4,] = c(.33, .25, .33,   0,   0,   0,   0)
a[5,] = c(  0, .25,   0,   0,   0,  .5,  .5)
a[6,] = c(  0,   0,   0,   0, .33,   0,  .5)
a[7,] = c(  0,   0,   0,   0, .33,  .5,   0)

# rescale every column so that it sums to 1
normalize <- function(mat) {
  for (i in 1:7) {
    mat[, i] = mat[, i] / sum(mat[1:7, i])
  }
  return(mat)
}

# repeatedly square the matrix (true matrix product) and renormalize the columns
iterate <- function(mat) {
  bm = mat
  for (i in 1:10) {
    bm <- bm %*% bm
    bm <- normalize(bm)
    print(paste("Iteration", i))
    print(bm)
    print("-----")
  }
  return(bm)
}

iterate(a)
Observe that it converges after just 4 iterations.
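If you prefer not to hard-code 10 iterations, here is a rough sketch of how the loop could stop on its own. It reuses the normalize() function and the matrix a from the script above; the tolerance of 1e-6, the cap of 100 iterations, and the name iterate_until_stable are just choices I made for illustration, not anything from the slides:

# keep squaring and renormalizing until successive iterates stop changing
iterate_until_stable <- function(mat, tol = 1e-6, max_iter = 100) {
  bm <- normalize(mat)
  for (i in 1:max_iter) {
    nxt <- normalize(bm %*% bm)
    if (max(abs(nxt - bm)) < tol) {   # converged: a further step changes almost nothing
      print(paste("Converged at iteration", i))
      return(nxt)
    }
    bm <- nxt
  }
  warning("No convergence within max_iter iterations")
  return(bm)
}

iterate_until_stable(a)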
Thanks