The Second Eigenvalue of the Google Matrix
Taher H. Haveliwala and Sepandar D. Kamvar
Stanford University
{taherh, sdkamvar}@cs.stanford.edu

Abstract. We determine analytically the modulus of the second eigenvalue for the web hyperlink matrix used by Google for computing PageRank. Specifically, we prove the following statement: "For any matrix $A = [cP + (1-c)E]^T$, where $P$ is an $n \times n$ row-stochastic matrix, $E$ is a nonnegative $n \times n$ rank-one row-stochastic matrix, and $0 \le c \le 1$, the second eigenvalue of $A$ has modulus $|\lambda_2| \le c$. Furthermore, if $P$ has at least two irreducible closed subsets, the second eigenvalue $\lambda_2 = c$." This statement has implications for the convergence rate of the standard PageRank algorithm as the web scales, for the stability of PageRank to perturbations to the link structure of the web, for the detection of Google spammers, and for the design of algorithms to speed up PageRank.

1 Theorems

Theorem 1. Let $P$ be an $n \times n$ row-stochastic matrix. Let $c$ be a real number such that $0 \le c \le 1$. Let $E$ be the $n \times n$ rank-one row-stochastic matrix $E = ev^T$, where $e$ is the $n$-vector whose elements are all $e_i = 1$, and $v$ is an $n$-vector that represents a probability distribution (i.e., a vector whose elements are nonnegative and whose $L_1$ norm is 1). Define the matrix $A = [cP + (1-c)E]^T$. Its second eigenvalue satisfies $|\lambda_2| \le c$.

Theorem 2. Further, if $P$ has at least two irreducible closed subsets (which is the case for the web hyperlink matrix), then the second eigenvalue of $A$ is given by $\lambda_2 = c$.

2 Notation and Preliminaries

$P$ is an $n \times n$ row-stochastic matrix. $E$ is the $n \times n$ rank-one row-stochastic matrix $E = ev^T$, where $e$ is the $n$-vector whose elements are all $e_i = 1$, and $v$ is an $n$-vector that represents a probability distribution. $A$ is the $n \times n$ column-stochastic matrix

    $A = [cP + (1-c)E]^T.$    (1)

We denote the $i$th eigenvalue of $A$ as $\lambda_i$, and the corresponding eigenvector as $x_i$:

    $A x_i = \lambda_i x_i.$    (2)
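A small NumPy sketch (an illustrative check, not part of the paper; the 4-state matrix $P$ below is a hypothetical example) shows equation (1) in action and confirms Theorem 2 numerically: $P$ has two irreducible closed subsets, so the second eigenvalue of $A$ comes out to exactly $c$.

```python
import numpy as np

# Hypothetical 4-state chain: states {0,1} and {2,3} form two irreducible
# closed subsets of P, so Theorem 2 predicts |lambda_2| = c exactly.
P = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.5, 0.5],
])
n, c = 4, 0.85
v = np.full(n, 1.0 / n)              # uniform probability distribution
E = np.outer(np.ones(n), v)          # E = e v^T, rank-one row-stochastic
A = (c * P + (1 - c) * E).T          # equation (1)

moduli = sorted(np.abs(np.linalg.eigvals(A)), reverse=True)
print(moduli[0], moduli[1])          # lambda_1 = 1, |lambda_2| = c = 0.85
```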

By convention, we choose eigenvectors $x_i$ such that $\|x_i\|_1 = 1$. Since $A$ is column-stochastic, $\lambda_1 = 1 \ge |\lambda_2| \ge \cdots \ge |\lambda_n|$.

We denote the $i$th eigenvalue of $P^T$ as $\gamma_i$, and its corresponding eigenvector as $y_i$: $P^T y_i = \gamma_i y_i$. Since $P^T$ is column-stochastic, $\gamma_1 = 1 \ge |\gamma_2| \ge \cdots \ge |\gamma_n|$.

We denote the $i$th eigenvalue of $E^T$ as $\mu_i$, and its corresponding eigenvector as $z_i$: $E^T z_i = \mu_i z_i$. Since $E^T$ is rank-one and column-stochastic, $\mu_1 = 1$ and $\mu_2 = \cdots = \mu_n = 0$.

An $n \times n$ row-stochastic matrix $M$ can be viewed as the transition matrix for a Markov chain with $n$ states. For any row-stochastic matrix $M$, $Me = e$.

A set of states $S$ is a closed subset of the Markov chain corresponding to $M$ if and only if $i \in S$ and $j \notin S$ implies that $M_{ij} = 0$. A set of states $S$ is an irreducible closed subset of the Markov chain corresponding to $M$ if and only if $S$ is a closed subset, and no proper subset of $S$ is a closed subset. Intuitively speaking, each irreducible closed subset of a Markov chain corresponds to a leaf node in the strongly connected component (SCC) graph of the directed graph induced by the nonzero transitions in the chain.

Note that $E$, $P$, and $A^T$ are row-stochastic, and can thus be viewed as transition matrices of Markov chains.

3 Proof of Theorem 1

We first show that Theorem 1 is true for $c = 0$ and $c = 1$.

Case 1: $c = 0$. If $c = 0$, then, from equation (1), $A = E^T$. Since $E$ is a rank-one matrix, $\lambda_2 = 0$. Thus, Theorem 1 is proved for $c = 0$.

Case 2: $c = 1$. If $c = 1$, then, from equation (1), $A = P^T$. Since $P^T$ is a column-stochastic matrix, $|\lambda_2| \le 1$. Thus, Theorem 1 is proved for $c = 1$.

Case 3: $0 < c < 1$. We prove this case via a series of lemmas.

Lemma 1. The second eigenvalue of $A$ has modulus $|\lambda_2| < 1$.

Proof. Consider the Markov chain corresponding to $A^T$. If the Markov chain corresponding to $A^T$ has only one irreducible closed subchain $S$, and if $S$ is aperiodic, then the chain corresponding to $A^T$ must have a unique eigenvector with eigenvalue 1, by the Ergodic Theorem [3]. So we simply must show that the Markov chain corresponding to $A^T$ has a single irreducible closed subchain $S$, and that this subchain is aperiodic.
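The correspondence between irreducible closed subsets and leaf nodes of the SCC graph can be made concrete. The sketch below (an editorial illustration, not from the paper) finds the SCCs of the nonzero-transition graph with Kosaraju's algorithm and keeps those with no outgoing edges:

```python
import numpy as np

def irreducible_closed_subsets(M, tol=1e-12):
    """Leaf SCCs of the directed graph induced by the nonzero transitions
    of row-stochastic M; these are exactly the irreducible closed subsets.
    A rough sketch, not an optimized implementation."""
    n = M.shape[0]
    adj = [[j for j in range(n) if M[i, j] > tol] for i in range(n)]

    # Pass 1 of Kosaraju: record vertices in order of DFS finish time.
    order, seen = [], [False] * n
    def dfs(u):
        seen[u] = True
        stack = [(u, iter(adj[u]))]
        while stack:
            node, it = stack[-1]
            for w in it:
                if not seen[w]:
                    seen[w] = True
                    stack.append((w, iter(adj[w])))
                    break
            else:
                order.append(node)
                stack.pop()
    for u in range(n):
        if not seen[u]:
            dfs(u)

    # Pass 2: sweep the reversed graph in reverse finish order.
    radj = [[] for _ in range(n)]
    for i in range(n):
        for j in adj[i]:
            radj[j].append(i)
    comp = [-1] * n
    for root in reversed(order):
        if comp[root] == -1:
            comp[root] = root
            stack = [root]
            while stack:
                u = stack.pop()
                for w in radj[u]:
                    if comp[w] == -1:
                        comp[w] = root
                        stack.append(w)

    sccs = {}
    for i in range(n):
        sccs.setdefault(comp[i], []).append(i)
    # A leaf SCC has no edge leaving it, i.e., it is a closed subset.
    return sorted(sorted(s) for s in sccs.values()
                  if all(comp[j] == comp[s[0]] for i in s for j in adj[i]))
```

For example, a chain where state 0 feeds two absorbing states 1 and 2 has two irreducible closed subsets, {1} and {2}.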

Lemma 1.1 shows that $A^T$ has a single irreducible closed subchain, and Lemma 1.2 shows that this subchain is aperiodic.

Lemma 1.1. There exists a unique irreducible closed subset $S$ of the Markov chain corresponding to $A^T$.

Proof. We split this proof into a proof of existence and a proof of uniqueness.

Existence. Let the set $U$ be the states with nonzero components in $v$. Let $S$ consist of the set of all states reachable from $U$ along nonzero transitions in the chain. $S$ trivially forms a closed subset. Further, since every state has a transition to $U$, no proper subset of $S$ can be closed. Therefore, $S$ forms an irreducible closed subset.

Uniqueness. Every closed subset must contain $U$, and every closed subset containing $U$ must contain $S$. Therefore, $S$ must be the unique irreducible closed subset of the chain.

Lemma 1.2. The unique irreducible closed subset $S$ is an aperiodic subchain.

Proof. From Theorem 5 in the Appendix, all members in an irreducible closed subset have the same period. Therefore, if at least one state in $S$ has a self-transition, then the subset $S$ is aperiodic. Let $u$ be any state in $U$. By construction, there exists a self-transition from $u$ to itself. Therefore, $S$ must be aperiodic.

From Lemmas 1.1 and 1.2, and the Ergodic Theorem, $|\lambda_2| < 1$, and Lemma 1 is proved.

Lemma 2. The second eigenvector $x_2$ of $A$ is orthogonal to $e$: $e^T x_2 = 0$.

Proof. Since $|\lambda_2| < |\lambda_1|$ (by Lemma 1), the second eigenvector $x_2$ of $A$ is orthogonal to the first eigenvector of $A^T$ by Theorem 4 in the Appendix. From Section 2, the first eigenvector of $A^T$ is $e$. Therefore, $x_2$ is orthogonal to $e$.

Lemma 3. $E^T x_2 = 0$.

Proof. By definition, $E = ev^T$, and so $E^T = ve^T$. Thus, $E^T x_2 = v e^T x_2$. From Lemma 2, $e^T x_2 = 0$. Therefore, $E^T x_2 = 0$.

Lemma 4. The second eigenvector $x_2$ of $A$ must be an eigenvector $y_i$ of $P^T$, and the corresponding eigenvalue is $\gamma_i = \lambda_2 / c$.

Proof. From equations (1) and (2):

    $cP^T x_2 + (1-c)E^T x_2 = \lambda_2 x_2.$    (3)

From Lemma 3 and equation (3), we have:

    $cP^T x_2 = \lambda_2 x_2.$    (4)

We can divide through by $c$ to get:

    $P^T x_2 = \frac{\lambda_2}{c} x_2.$    (5)
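Lemmas 2 and 3 can be checked numerically (an illustrative sketch; the 4-state $P$ below is a hypothetical example): the eigenvector of $A$ for the second-largest eigenvalue modulus is orthogonal to $e$, and $E^T$ annihilates it.

```python
import numpy as np

# For a hypothetical chain with two irreducible closed subsets, verify
# Lemma 2 (e^T x_2 = 0) and Lemma 3 (E^T x_2 = 0) numerically.
P = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.5, 0.5],
])
n, c = 4, 0.85
v = np.full(n, 1.0 / n)
E = np.outer(np.ones(n), v)
A = (c * P + (1 - c) * E).T

vals, vecs = np.linalg.eig(A)
order = np.argsort(-np.abs(vals))
x2 = np.real(vecs[:, order[1]])      # eigenvector for |lambda_2|
x2 /= np.abs(x2).sum()               # L1-normalize, as in Section 2

print(abs(np.ones(n) @ x2))          # e^T x_2 ~ 0   (Lemma 2)
print(np.abs(E.T @ x2).max())        # E^T x_2 ~ 0   (Lemma 3)
```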

If we let $y_i = x_2$ and $\gamma_i = \lambda_2 / c$, we can rewrite equation (5):

    $P^T y_i = \gamma_i y_i.$    (6)

Therefore, $x_2$ is also an eigenvector of $P^T$, and the relationship between the eigenvalues of $A$ and $P^T$ that correspond to $x_2$ is given by:

    $\lambda_2 = c\gamma_i.$    (7)

Lemma 5. $|\lambda_2| \le c$.

Proof. We know from Lemma 4 that $\lambda_2 = c\gamma_i$. Because $P$ is stochastic, we have that $|\gamma_i| \le 1$. Therefore, $|\lambda_2| \le c$, and Theorem 1 is proved.

4 Proof of Theorem 2

Recall that Theorem 2 states: If $P$ has at least two irreducible closed subsets, $\lambda_2 = c$.

Proof.

Case 1: $c = 0$. This is proven in Case 1 of Section 3.

Case 2: $c = 1$. This is proven trivially from Theorem 3 in the Appendix.

Case 3: $0 < c < 1$. We prove this as follows. We assume $P$ has at least two irreducible closed subsets. We then construct a vector $x_i$ that is an eigenvector of $A$ and whose corresponding eigenvalue is $\lambda_i = c$. Therefore, $|\lambda_2| \ge c$, since there exists a $\lambda_i = c$. From Theorem 1, $|\lambda_2| \le c$. Therefore, if $P$ has at least two irreducible closed subsets, $\lambda_2 = c$.

Lemma 6. Any eigenvector $y_i$ of $P^T$ that is orthogonal to $e$ is an eigenvector $x_i$ of $A$. The relationship between the eigenvalues is $\lambda_i = c\gamma_i$.

Proof. It is given that $e^T y_i = 0$. By definition,

    $E^T y_i = v e^T y_i = 0.$    (8)

It is also given that $y_i$ is an eigenvector of $P^T$:

    $P^T y_i = \gamma_i y_i.$    (9)

Therefore, from equations (1), (8), and (9),

    $A y_i = cP^T y_i + (1-c)E^T y_i = cP^T y_i = c\gamma_i y_i.$    (10)
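Lemma 6 can be checked directly on a small example (an illustrative sketch; the two-block $P$ is hypothetical): the vector $y = (1, 1, -1, -1)^T$ is an eigenvector of $P^T$ with $\gamma = 1$ and $e^T y = 0$, so $A y$ should equal $c\,y$.

```python
import numpy as np

P = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.5, 0.5],
])
n, c = 4, 0.85
v = np.full(n, 1.0 / n)
A = (c * P + (1 - c) * np.outer(np.ones(n), v)).T

y = np.array([1.0, 1.0, -1.0, -1.0])   # eigenvector of P^T with gamma = 1
assert np.allclose(P.T @ y, y)         # equation (9)
assert np.ones(n) @ y == 0.0           # orthogonal to e
print(np.allclose(A @ y, c * y))       # True: A y = c y, equation (10)
```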

Therefore, $A y_i = c\gamma_i y_i$, and Lemma 6 is proved.

Lemma 7. There exists a $\lambda_i = c$.

Proof. We construct a vector $x_i$ that is an eigenvector of $P^T$ corresponding to the eigenvalue 1 and that is orthogonal to $e$. From Theorem 3 in the Appendix, the multiplicity of the eigenvalue 1 for $P^T$ is equal to the number of irreducible closed subsets of $P$. Thus we can find two linearly independent eigenvectors $y_1$ and $y_2$ of $P^T$ corresponding to the dominant eigenvalue 1. Let

    $k_1 = e^T y_1,$    (11)
    $k_2 = e^T y_2.$    (12)

If $k_1 = 0$, let $x_i = y_1$; else if $k_2 = 0$, let $x_i = y_2$. If $k_1, k_2 \ne 0$, let $x_i = y_1 / k_1 - y_2 / k_2$. Note that $x_i$ is an eigenvector of $P^T$ with eigenvalue exactly 1, and that $x_i$ is orthogonal to $e$. From Lemma 6, $x_i$ is an eigenvector of $A$ corresponding to the eigenvalue $\lambda_i = c \cdot 1 = c$. Therefore, $|\lambda_2| \ge c$, since there exists a $\lambda_i = c$. However, from Theorem 1, $|\lambda_2| \le c$. Therefore, $\lambda_2 = c$, and Theorem 2 is proved.

5 Implications

The matrix $A$ is used by Google to compute PageRank, an estimate of web-page importance used for ranking search results [11]. PageRank is defined as the stationary distribution of the Markov chain corresponding to the $n \times n$ stochastic transition matrix $A^T$. The matrix $P$ corresponds to the web link graph; in making $P$ stochastic, there are standard techniques for dealing with web pages with no outgoing links [6]. Furthermore, the web graph has been empirically shown to contain many irreducible closed subsets [1], so that Theorem 2 holds for the matrix $A$ used by Google.

Theorem 1 has implications for the rate of convergence of PageRank, for the stability of PageRank to perturbations to the link structure, and for the design of algorithms to speed up PageRank computations. Furthermore, it has broader implications in areas ranging from graph partitioning to reputation schemes in peer-to-peer networks. We briefly discuss these implications in this section.

Convergence of PageRank. The PageRank algorithm uses the power method to compute the principal eigenvector of $A$. The rate of convergence of the power method is given by $|\lambda_2 / \lambda_1|$ [13, 2]. For
PageRank, the typical value of $c$ has been given as 0.85; for this value of $c$, Theorem 2 thus implies that the convergence rate $|\lambda_2 / \lambda_1|$ of the power method, for any web link matrix $A$, is 0.85. Therefore, the convergence rate of PageRank will be fast, even as the web scales. (Note that there may be additional eigenvalues with modulus $c$, such as $-c$.)
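The power-method iteration behind this claim can be sketched as follows (an illustrative example; the random $P$ is a stand-in for a web graph). Since each step shrinks the error by a factor of at most $|\lambda_2| \le c$, roughly $\log(\mathrm{tol}) / \log(c)$ iterations suffice, independent of $n$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, c = 50, 0.85
P = rng.random((n, n))
P /= P.sum(axis=1, keepdims=True)        # make P row-stochastic
v = np.full(n, 1.0 / n)
A = (c * P + (1 - c) * np.outer(np.ones(n), v)).T

x = np.full(n, 1.0 / n)                  # start from the uniform vector
for _ in range(200):
    x_next = A @ x                       # one power-method step
    if np.abs(x_next - x).sum() < 1e-12: # L1 change below tolerance
        break
    x = x_next

# x is now (numerically) the PageRank vector: A x = x, sum(x) = 1.
residual = np.abs(A @ x - x).sum()
```

Because $A$ is column-stochastic and $x$ starts with unit $L_1$ mass, no renormalization is needed between steps.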

Stability of PageRank to Perturbations in the Link Structure. The modulus of the nonprincipal eigenvalues also determines whether the corresponding Markov chain is well-conditioned. As shown by Meyer in [9], the greater the eigengap $|\lambda_1| - |\lambda_2|$, the more stable the stationary distribution is to perturbations in the Markov chain. Our analysis provides an alternate explanation for the stability of PageRank shown by Ng et al. [10].

Accelerating PageRank Computations. Previous work on accelerating PageRank computations assumed $\lambda_2$ was unknown [6]. By directly using the equality $\lambda_2 = c$, improved extrapolation techniques may be developed, as in [6].

Spam Detection. The eigenvectors corresponding to the second eigenvalue $\lambda_2 = c$ are an artifact of certain structures in the web graph. In particular, each pair of leaf nodes in the SCC graph for the chain $P$ corresponds to an eigenvector of $A$ with eigenvalue $c$. These leaf nodes in the SCC graph are those subgraphs in the web link graph which may have incoming edges, but have no edges to other components. Link spammers often generate such structures in attempts to hoard rank. Analysis of the nonprincipal eigenvectors of $A$ may lead to strategies for combating link spam.

Broader Implications. This proof has implications for spectral methods beyond web search. For example, in the field of peer-to-peer networks, the EigenTrust reputation algorithm given in [7] computes the principal eigenvector of a matrix of the form defined in equation (1). This result shows that EigenTrust will converge quickly, minimizing network overhead. In the field of image segmentation, Perona and Freeman [12] present an algorithm that segments an image by thresholding the first eigenvector of the affinity matrix of the image. One may normalize the affinity matrix to be stochastic as in [8] and introduce a regularization parameter as in [11] to define a matrix of the form given in equation (1). The benefit of this is that one can choose the regularization parameter $c$ so
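The stability point can be illustrated numerically (an editorial sketch with a hypothetical random chain, not an experiment from the paper): perturbing one state's outgoing transitions moves the stationary distribution less when $c$ is small, i.e., when the eigengap $1 - c$ is large. The closed form $\pi = (1-c)(I - cP^T)^{-1} v$ (normalized) is used instead of iteration.

```python
import numpy as np

def pagerank(P, c, v):
    # Exact stationary distribution of [cP + (1-c)ev^T]^T for c < 1.
    n = P.shape[0]
    x = np.linalg.solve(np.eye(n) - c * P.T, (1 - c) * v)
    return x / x.sum()

rng = np.random.default_rng(1)
n = 30
P = rng.random((n, n))
P /= P.sum(axis=1, keepdims=True)
Q = P.copy()
Q[0] = rng.random(n)
Q[0] /= Q[0].sum()                       # perturb one state's outlinks
v = np.full(n, 1.0 / n)

# L1 shift of the stationary distribution under the same perturbation,
# for a large eigengap (c = 0.5) vs. a small one (c = 0.99).
shift_small_c = np.abs(pagerank(P, 0.50, v) - pagerank(Q, 0.50, v)).sum()
shift_large_c = np.abs(pagerank(P, 0.99, v) - pagerank(Q, 0.99, v)).sum()
print(shift_small_c < shift_large_c)
```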
that the computation of the dominant eigenvector is very fast, allowing the Perona-Freeman algorithm to work for very large scale images.

Acknowledgments

We would like to thank Gene Golub and Rajeev Motwani for useful conversations.

References

1. A. Broder, R. Kumar, F. Maghoul, P. Raghavan, S. Rajagopalan, R. Stata, A. Tomkins, and J. Wiener. Graph structure in the web. In Proceedings of the Ninth International World Wide Web Conference, 2000.
2. G. H. Golub and C. F. Van Loan. Matrix Computations. The Johns Hopkins University Press, Baltimore, 1996.
3. G. Grimmett and D. Stirzaker. Probability and Random Processes. Oxford University Press, 1989.
4. M. Iosifescu. Finite Markov Processes and Their Applications. John Wiley and Sons, Inc., 1980.

5. D. L. Isaacson and R. W. Madsen. Markov Chains: Theory and Applications, chapter IV, pages 126-127. John Wiley and Sons, Inc., New York, 1976.
6. S. D. Kamvar, T. H. Haveliwala, C. D. Manning, and G. H. Golub. Extrapolation methods for accelerating PageRank computations. In Proceedings of the Twelfth International World Wide Web Conference, 2003.
7. S. D. Kamvar, M. T. Schlosser, and H. Garcia-Molina. The EigenTrust algorithm for reputation management in P2P networks. In Proceedings of the Twelfth International World Wide Web Conference, 2003.
8. M. Meila and J. Shi. A random walks view of spectral segmentation. In AI and Statistics (AISTATS), 2001.
9. C. D. Meyer. Sensitivity of the stationary distribution of a Markov chain. SIAM Journal on Matrix Analysis and Applications, 15(3):715-728, 1994.
10. A. Y. Ng, A. X. Zheng, and M. I. Jordan. Link analysis, eigenvectors and stability. In IJCAI, pages 903-910, 2001.
11. L. Page, S. Brin, R. Motwani, and T. Winograd. The PageRank citation ranking: Bringing order to the web. Stanford Digital Libraries Working Paper, 1998.
12. P. Perona and W. Freeman. A factorization approach to grouping. In Proc. ECCV, 1998.
13. J. H. Wilkinson. The Algebraic Eigenvalue Problem. Oxford University Press, Oxford, 1965.

Appendix

This appendix contains theorems that are proven elsewhere and are used in proving Theorems 1 and 2 of this paper.

Theorem 3 (from page 126 of [5]). If $P$ is the transition matrix for a finite Markov chain, then the multiplicity of the eigenvalue 1 is equal to the number of irreducible closed subsets of the chain.

Theorem 4 (from [13]). If $x_i$ is an eigenvector of $A$ corresponding to the eigenvalue $\lambda_i$, and $y_j$ is an eigenvector of $A^T$ corresponding to the eigenvalue $\lambda_j$, then $x_i^T y_j = 0$ (if $\lambda_i \ne \lambda_j$).

Theorem 5 (from page 82 of [4]). Two distinct states belonging to the same class (irreducible closed subset) have the same period. In other words, the property of having period $d$ is a class property.
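Theorem 3 can be observed numerically (an illustrative sketch; the two-block $P$ is a hypothetical example): a chain with two irreducible closed subsets has eigenvalue 1 with multiplicity two.

```python
import numpy as np

# Two irreducible closed subsets ({0,1} and {2,3}), so by Theorem 3 the
# eigenvalue 1 of P (equivalently of P^T) should have multiplicity 2.
P = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.5, 0.5],
])
vals = np.linalg.eigvals(P.T)
multiplicity = int(np.sum(np.abs(vals - 1.0) < 1e-9))
print(multiplicity)   # 2
```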