Cloud and Big Data Summer School, Stockholm, Aug., 2015 Jeffrey D. Ullman


2 Intuition: solve the recursive equation "a page is important if important pages link to it." Technically, importance = the principal eigenvector of the transition matrix of the Web, with a few fixups needed.

3 Number the pages 1, 2, …. Page i corresponds to row and column i. M[i, j] = 1/n if page j links to n pages, one of which is page i; M[i, j] = 0 if j does not link to i. Thus M[i, j] is the probability we'll next be at page i if we are now at page j.
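As a concrete sketch (mine, not the slides'; the three-page link graph is made up), the definition of M translates directly into Python:

```python
# Hypothetical 3-page Web: links[j] is the set of pages that page j links to.
links = {0: {1, 2}, 1: {0}, 2: {0, 1}}
n = 3

# M[i][j] = 1/k if page j has k out-links, one of which is page i; else 0.
M = [[0.0] * n for _ in range(n)]
for j, outs in links.items():
    for i in outs:
        M[i][j] = 1.0 / len(outs)

# With no dead ends, every column of M sums to 1: column j is the
# probability distribution of where a walker at page j goes next.
for j in range(n):
    assert abs(sum(M[i][j] for i in range(n)) - 1.0) < 1e-12
```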

4 Suppose page j links to 3 pages, including i but not x. Then column j of M has 1/3 in row i and 0 in row x.

5 Suppose v is a vector whose ith component is the probability that a random walker is at page i at a certain time. If the walker then follows a link from i chosen at random, the probability distribution for walkers is given by the vector Mv.

6 Starting from any vector v, the limit M(M(⋯M(Mv)⋯)) = lim M^k v is the long-term distribution of the walkers. Intuition: pages are important in proportion to how likely a walker is to be there. The math: the limiting distribution is the principal eigenvector of M, and that eigenvector is PageRank.

7 Example Web: Yahoo links to itself and to Amazon; Amazon links to Yahoo and to M'soft; M'soft links only to Amazon. The transition matrix is

         y    a    m
   y    1/2  1/2   0
   a    1/2   0    1
   m     0   1/2   0

8 Because there are no constant terms, the equations v = Mv do not have a unique solution. In Web-sized examples we cannot solve by Gaussian elimination anyway; we need to use relaxation (= iterative solution). It works if you start with a fixed v.

9 Start with the vector v = [1, 1, …, 1], representing the idea that each Web page is given one unit of importance. Repeatedly apply the matrix M to v, allowing the importance to flow like a random walk. About 50 iterations is sufficient to estimate the limiting solution.

10 Equations v = Mv:
  y = y/2 + a/2
  a = y/2 + m
  m = a/2
(Note: "=" is really assignment.) Successive iterates of [y, a, m]:
  [1, 1, 1] → [1, 3/2, 1/2] → [5/4, 1, 3/4] → [9/8, 11/8, 1/2] → … → [6/5, 6/5, 3/5]
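The iteration can be checked with a short Python sketch (mine, not from the slides), using the matrix of slide 7:

```python
# Transition matrix of slide 7, pages ordered (y, a, m).
M = [[0.5, 0.5, 0.0],
     [0.5, 0.0, 1.0],
     [0.0, 0.5, 0.0]]

def step(M, v):
    """One relaxation step: v' = Mv."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

v = [1.0, 1.0, 1.0]      # one unit of importance per page
for _ in range(50):      # about 50 iterations suffice
    v = step(M, v)

print([round(x, 3) for x in v])   # [1.2, 1.2, 0.6], i.e., [6/5, 6/5, 3/5]
```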

[Slides 11–15: successive snapshots of importance flowing among Yahoo, Amazon, and M'soft.]

16 Some pages are dead ends (they have no links out). Such a page causes importance to leak out of the system. Other groups of pages are spider traps (all out-links stay within the group). Eventually a spider trap absorbs all the importance.

17 The same Web, except M'soft is now a dead end: it has no out-links. The transition matrix is

         y    a    m
   y    1/2  1/2   0
   a    1/2   0    0
   m     0   1/2   0

18 Equations v = Mv:
  y = y/2 + a/2
  a = y/2
  m = a/2
Successive iterates of [y, a, m]:
  [1, 1, 1] → [1, 1/2, 1/2] → [3/4, 1/2, 1/4] → [5/8, 3/8, 1/4] → … → [0, 0, 0]
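A quick sketch (mine) confirms the leak: iterating the dead-end matrix of slide 17 drives every component to zero.

```python
# Transition matrix of slide 17: column m is all zeros (M'soft is a dead end).
M = [[0.5, 0.5, 0.0],
     [0.5, 0.0, 0.0],
     [0.0, 0.5, 0.0]]

def step(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

v = [1.0, 1.0, 1.0]
for _ in range(100):
    v = step(M, v)

# Importance leaks out through the dead end; nothing is left.
print(max(v))
```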

[Slides 19–23: snapshots showing the importance leaking away through the dead end.]

24 Now suppose M'soft links only to itself, forming a spider trap. The transition matrix is

         y    a    m
   y    1/2  1/2   0
   a    1/2   0    0
   m     0   1/2   1

25 Equations v = Mv:
  y = y/2 + a/2
  a = y/2
  m = a/2 + m
Successive iterates of [y, a, m]:
  [1, 1, 1] → [1, 1/2, 3/2] → [3/4, 1/2, 7/4] → [5/8, 3/8, 2] → … → [0, 0, 3]

[Slides 26–29: snapshots showing the spider trap absorbing all the importance.]

30 Tax each page a fixed percentage at each iteration, and add a fixed constant to all pages. Good idea: distribute the tax, plus whatever is lost in dead ends, equally to all pages. This models a random walk with a fixed probability of leaving the system, and a fixed number of new walkers injected into the system at each step.

31 Equations v = 0.8(Mv) + 0.2:
  y = 0.8(y/2 + a/2) + 0.2
  a = 0.8(y/2) + 0.2
  m = 0.8(a/2 + m) + 0.2
Successive iterates of [y, a, m]:
  [1, 1, 1] → [1.00, 0.60, 1.40] → [0.84, 0.60, 1.56] → [0.776, 0.536, 1.688] → … → [7/11, 5/11, 21/11]
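These equations are easy to run as a Python sketch (mine); the matrix is the spider-trap example of slide 24:

```python
# Taxed iteration v = 0.8(Mv) + 0.2 on the spider-trap Web of slide 24.
M = [[0.5, 0.5, 0.0],
     [0.5, 0.0, 0.0],
     [0.0, 0.5, 1.0]]

def step(M, v):
    return [0.8 * sum(M[i][j] * v[j] for j in range(3)) + 0.2
            for i in range(3)]

v = [1.0, 1.0, 1.0]
for _ in range(100):
    v = step(M, v)

print([round(x, 3) for x in v])   # [7/11, 5/11, 21/11] ≈ [0.636, 0.455, 1.909]
```

The tax keeps the trap from absorbing everything: M'soft ends up with the most importance, but Yahoo and Amazon keep a share.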

32 Goal: evaluate Web pages not just according to their popularity, but by how relevant they are to a particular topic, e.g., sports or history. This allows search queries to be answered based on the interests of the user. Example: the search query [SAS] wants different pages depending on whether you are interested in travel or technology.

33 Assume each walker has a small probability of teleporting at any tick. The teleport can go to: 1. Any page with equal probability (as in the taxation scheme). 2. A set of relevant pages, the teleport set (for topic-specific PageRank).

34 Example: only Microsoft is in the teleport set. Assume a 20% tax, i.e., the probability of a teleport is 20%.
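One way to render this in code (my sketch; the fixed-point values are my own computation, not stated on the slides): the 20% of taxed importance all lands on M'soft, using the original matrix of slide 7.

```python
# v = 0.8(Mv) + 0.2*e_S, teleport set S = {M'soft}, 20% teleport probability.
M = [[0.5, 0.5, 0.0],
     [0.5, 0.0, 1.0],
     [0.0, 0.5, 0.0]]
teleport = [0.0, 0.0, 1.0]    # every teleport lands on M'soft

def step(M, v):
    return [0.8 * sum(M[i][j] * v[j] for j in range(3)) + 0.2 * teleport[i]
            for i in range(3)]

v = [1/3, 1/3, 1/3]           # one unit of walker mass in total
for _ in range(100):
    v = step(M, v)

# Total mass stays 1; the fixed point works out to [8/31, 12/31, 11/31].
print([round(x, 3) for x in v])
```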

[Slide 35: the Yahoo/Amazon/M'soft diagram, with Dr. Who's phone booth depicting the teleport.]

[Slides 36–41: snapshots of the walk with teleportation to M'soft.]

42 Two ways to pick the teleport set: 1. Choose the pages belonging to the topic in the Open Directory. 2. Learn from examples the typical words in pages belonging to the topic; use pages heavy in those words as the teleport set.

43 Spam farmers create networks of millions of pages designed to focus PageRank on a few undeserving pages. We'll discuss this technology shortly. To minimize their influence, use a teleport set consisting of trusted pages only. Example: home pages of universities.

44 Mutually recursive definition: a hub links to many authorities; an authority is linked to by many hubs. Authorities turn out to be places where information can be found (example: course home pages). Hubs tell where the authorities are (example: a departmental course-listing page).

45 HITS uses a matrix A, with A[i, j] = 1 if page i links to page j, and 0 if not. A^T, the transpose of A, is similar to the PageRank matrix M, but A^T has 1's where M has fractions.

46 Example link matrix for Yahoo, Amazon, and M'soft:

           y  a  m
   A = y [ 1  1  1 ]
       a [ 1  0  1 ]
       m [ 0  1  0 ]

47 Powers of A and A^T have elements of exponential size, so we need scale factors. Let h and a be vectors measuring the hubbiness and authority of each page. Equations: h = λAa; a = μA^T h. Hubbiness = scaled sum of the authorities of successor pages (out-links). Authority = scaled sum of the hubbiness of predecessor pages (in-links).

48 From h = λAa and a = μA^T h we can derive:
  h = λμ A A^T h
  a = λμ A^T A a
Compute h and a by iteration, assuming initially each page has one unit of hubbiness and one unit of authority. Pick an appropriate value of λμ.

49 For the example:

   A =   [ 1 1 1 ]        A^T =  [ 1 1 0 ]
         [ 1 0 1 ]               [ 1 0 1 ]
         [ 0 1 0 ]               [ 1 1 0 ]

   A A^T = [ 3 2 1 ]      A^T A = [ 2 1 2 ]
           [ 2 2 0 ]              [ 1 2 1 ]
           [ 1 0 1 ]              [ 2 1 2 ]

Authority iterates [a(Yahoo), a(Amazon), a(M'soft)]:
  [1, 1, 1] → [5, 4, 5] → [24, 18, 24] → [114, 84, 114] → … → proportional to [1+√3, 2, 1+√3]

Hubbiness iterates [h(Yahoo), h(Amazon), h(M'soft)]:
  [1, 1, 1] → [6, 4, 2] → [28, 20, 8] → [132, 96, 36] → … → proportional to [1.000, 0.732, 0.268]

50 Iterate as for PageRank; don't try to solve the equations. But keep the components within bounds; for example, scale so that the largest component of the vector is 1. Trick: start with h = [1, 1, …, 1]; multiply by A^T to get the first a; scale, then multiply by A to get the next h; and so on.
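Slide 50's recipe can be sketched in Python (mine, not the slides'), using the matrix A from slide 46:

```python
# HITS iteration with scaling: the largest component is held at 1.
A = [[1, 1, 1],   # Yahoo's out-links
     [1, 0, 1],   # Amazon's out-links
     [0, 1, 0]]   # M'soft's out-links

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

At = [list(col) for col in zip(*A)]   # transpose of A

h = [1.0, 1.0, 1.0]                   # start with unit hubbiness everywhere
for _ in range(50):
    a = matvec(At, h)                 # authority = A^T h
    a = [x / max(a) for x in a]       # scale largest component to 1
    h = matvec(A, a)                  # hubbiness = A a
    h = [x / max(h) for x in h]

print([round(x, 3) for x in h])       # [1.0, 0.732, 0.268]
```

The scaling after every step plays the role of λ and μ, and keeps the components from growing exponentially.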

51 You may be tempted to compute A A^T and A^T A first, then iterate these matrices as for PageRank. Bad idea: these matrices are not nearly as sparse as A and A^T.

52 PageRank prevents spammers from using term spam (faking the content of their page by adding invisible words) to fool a search engine. Spammers now attempt to fool PageRank with link spam: creating structures on the Web, called spam farms, that increase the PageRank of undeserving pages.

53 Three kinds of Web pages from a spammer's point of view: 1. Own pages: completely controlled by the spammer. 2. Accessible pages: e.g., Web-log comment pages, where the spammer can post links to his pages. 3. Inaccessible pages.

54 The spammer's goal: maximize the PageRank of a target page t. Technique: 1. Get as many links as possible from accessible pages to the target page t. 2. Construct a link farm to get a PageRank multiplier effect.

55 [Diagram: inaccessible and accessible pages link to the target page t; t links to M own farm pages 1, 2, …, M, and each farm page links back to t.] Goal: boost the PageRank of page t. This is one of the most common and effective organizations for a spam farm.

56 Suppose the rank contributed to t by the accessible pages is x, and let y be the (unknown) PageRank of the target page t. Let 1-b be the taxation rate, M the number of farm pages, and N the size of the Web. Then the rank of each farm page is by/M + (1-b)/N: the first term is its share of t's rank, and the second is its share of the tax.

57 Write the PageRank of t, using the farm-page rank by/M + (1-b)/N from the previous slide:
  y = x + bM[by/M + (1-b)/N] + (1-b)/N
The final (1-b)/N is t's own tax share; it is very small, so ignore it. Then:
  y = x + b^2 y + b(1-b)M/N
  y = x/(1-b^2) + cM/N, where c = b/(1+b)

58 So y = x/(1-b^2) + cM/N, where c = b/(1+b). For b = 0.85, 1/(1-b^2) = 3.6: a multiplier effect for the acquired PageRank. And by making M large, we can make y as large as we want.
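The arithmetic on this slide is easy to verify (Python sketch, mine; the farm size and Web size below are made-up numbers):

```python
b = 0.85                       # taxation rate is 1 - b = 0.15
amplifier = 1 / (1 - b**2)     # multiplier on the externally acquired rank x
c = b / (1 + b)

print(round(amplifier, 1))     # 3.6

# Hypothetical sizes: a million farm pages in a billion-page Web
# contribute c*M/N to y even if x = 0.
M_farm, N = 1_000_000, 1_000_000_000
print(c * M_farm / N)
```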

59 Topic-specific PageRank with a set of trusted pages as the teleport set is called TrustRank. Spam mass = (PageRank - TrustRank)/PageRank. A high spam mass means most of your PageRank comes from untrusted sources: you may be link spam.
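The spam-mass formula as a one-line Python function (my sketch; the two score pairs below are hypothetical):

```python
def spam_mass(pagerank, trustrank):
    """Fraction of a page's PageRank that does not come from trusted pages."""
    return (pagerank - trustrank) / pagerank

print(spam_mass(0.004, 0.0038))   # ~0.05: mostly trusted, probably fine
print(spam_mass(0.004, 0.0002))   # ~0.95: mostly untrusted, likely link spam
```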

60 Two conflicting considerations: a human has to inspect each seed page, so the seed set must be as small as possible; but we must ensure every good page gets adequate TrustRank, so all good pages should be reachable from the trusted set by short paths.

61 1. Pick the top k pages by PageRank: it is almost impossible to get a spam page to the very top of the PageRank order. 2. Pick the home pages of universities: domains like .edu are controlled.