The Geometry of Polynomial Division and Elimination




The Geometry of Polynomial Division and Elimination

Kim Batselier, Philippe Dreesen, Bart De Moor
Katholieke Universiteit Leuven, Department of Electrical Engineering, ESAT/SCD/SISTA/SMC
May 2012

Outline
1. Introduction
2. Multivariate Polynomial Division
3. Elimination
4. Conclusions

Symbolic Methods

Computational Algebraic Geometry
- Emphasis on symbolic methods
- Computer algebra
- Huge body of literature in Algebraic Geometry

Wolfgang Gröbner (1899-1980), Bruno Buchberger

Changing the Point of View

Richard Feynman

Seeing things from a Linear Algebra perspective:
- Is it possible to use Linear Algebra instead?
- New insights/interpretations?
- New methods?

Numerical Algebraic Geometry

Research on Three Levels

Conceptual/Geometric Level
- Polynomial system solving is an eigenvalue problem!
- Row and column spaces: ideal/variety; row space/kernel of M; ranks and dimensions; null spaces and orthogonality
- Geometrical: intersection of subspaces, angles between subspaces, Grassmann's theorem, ...

Numerical Linear Algebra Level
- Eigenvalue decompositions, SVDs, ...
- Solving systems of equations (consistency, number of solutions)
- QR decomposition and the Gram-Schmidt algorithm

Numerical Algorithms Level
- Modified Gram-Schmidt (numerical stability), Gram-Schmidt from back to front
- Exploiting sparsity and Toeplitz structure (computational complexity O(n^2) vs O(n^3)), FFT-like computations and convolutions, ...
- Power method to find the smallest eigenvalue (= the minimizer of a polynomial optimization problem)

Polynomials as Vectors: Graded Xel Ordering

Let a, b ∈ N_0^n. We say a >_grxel b if

    |a| = Σ_{i=1}^n a_i > |b| = Σ_{i=1}^n b_i,   or   |a| = |b| and a >_xel b,

where a >_xel b if, in the vector difference a − b ∈ Z^n, the leftmost nonzero entry is negative.

Examples:
- (2,0,0) >_grxel (0,0,1) because |(2,0,0)| = 2 > 1 = |(0,0,1)|, which implies x_1^2 >_grxel x_3
- (0,1,1) >_grxel (2,0,0) because |(0,1,1)| = |(2,0,0)| and (0,1,1) >_xel (2,0,0), which implies x_2 x_3 >_grxel x_1^2
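The ordering above can be sketched as a small comparison function (the function name is my own; exponent vectors are tuples):

```python
# Sketch of the graded xel comparison on exponent vectors a, b in N^n.
def grxel_greater(a, b):
    """Return True if a >_grxel b."""
    if sum(a) != sum(b):           # compare total degrees first (graded part)
        return sum(a) > sum(b)
    # equal total degree: xel tie-break -- a >_xel b iff the leftmost
    # nonzero entry of a - b is negative
    for ai, bi in zip(a, b):
        if ai != bi:
            return ai - bi < 0
    return False                    # a == b

# The two examples from the slide:
print(grxel_greater((2, 0, 0), (0, 0, 1)))   # x1^2 >_grxel x3     -> True
print(grxel_greater((0, 1, 1), (2, 0, 0)))   # x2*x3 >_grxel x1^2  -> True
```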

Polynomials as Vectors: Vector Representation

- Defining a monomial ordering allows a vector representation of a polynomial
- Each entry of the vector corresponds with a monomial, graded xel ordered and ascending from left to right
- LM(p): leading monomial of the polynomial p according to the monomial ordering
- Example: the polynomial 2 + 3x_1 − 4x_2 + x_1x_2 − 7x_2^2 is represented by

        1   x_1   x_2   x_1^2   x_1x_2   x_2^2
      ( 2    3    −4      0       1       −7  )

- C^n_d: the vector space of all polynomials in n indeterminates with complex coefficients, up to degree d
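This representation is easy to reproduce. The sketch below (helper names and the dict-based polynomial encoding are my own conventions) enumerates all monomials up to degree d in ascending grxel order and places coefficients accordingly; within one degree block, ascending grxel order coincides with descending lexicographic order on exponent tuples:

```python
from itertools import combinations_with_replacement

def monomials(n, d):
    """Exponent vectors of all monomials of degree <= d, ascending in grxel order."""
    out = []
    for deg in range(d + 1):                     # degree by degree (graded part)
        block = set()
        for combo in combinations_with_replacement(range(n), deg):
            e = [0] * n
            for var in combo:
                e[var] += 1
            block.add(tuple(e))
        # within one degree, ascending grxel = descending lexicographic
        out.extend(sorted(block, reverse=True))
    return out

def coeff_vector(poly, n, d):
    """poly: dict mapping exponent tuple -> coefficient."""
    return [poly.get(m, 0) for m in monomials(n, d)]

# The slide's example: 2 + 3*x1 - 4*x2 + x1*x2 - 7*x2^2, with n = 2, d = 2
p = {(0, 0): 2, (1, 0): 3, (0, 1): -4, (1, 1): 1, (0, 2): -7}
print(coeff_vector(p, 2, 2))   # -> [2, 3, -4, 0, 1, -7]
```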

Outline
1. Introduction
2. Multivariate Polynomial Division
3. Elimination
4. Conclusions

Definition: Division

Fix any monomial order > on C^n_d and let F = (f_1, ..., f_s) be an s-tuple of polynomials in C^n_d. Then every p ∈ C^n_d can be written as

    p = h_1 f_1 + ... + h_s f_s + r,

where h_i, r ∈ C^n_d. For each i, h_i f_i = 0 or LM(p) ≥ LM(h_i f_i), and either r = 0 or r is a linear combination of monomials, none of which is divisible by any of LM(f_1), ..., LM(f_s).

Divisor Matrix

Given a set of polynomials f_1, ..., f_s ∈ C^n_d, each of degree d_i (i = 1, ..., s), and a polynomial p ∈ C^n_d of degree d, the divisor matrix D is given by

        [ f_1           ]
        [ x_1 f_1       ]
        [ x_2 f_1       ]
        [ ...           ]
    D = [ x_n^{k_1} f_1 ]
        [ f_2           ]
        [ x_1 f_2       ]
        [ ...           ]
        [ x_n^{k_s} f_s ]

where each row is the coefficient vector of the indicated polynomial, and each polynomial f_i is multiplied with all monomials x^{α_i} from degree 0 up to degree k_i = deg(p) − deg(f_i) such that x^{α_i} LM(f_i) ≤ LM(p).

Divisor Matrix: Example

Let p = 4 + 5x_1 − 3x_2 − 9x_1^2 + 7x_1x_2 and F = {2 + x_1 + x_2, 3 − x_1}. The divisor matrix is then

                   1    x_1   x_2   x_1^2   x_1x_2
    f_1         [  2     1     1      0       0   ]
    x_1 f_1     [  0     2     0      1       1   ]
    f_2         [  3    −1     0      0       0   ]
    x_1 f_2     [  0     3     0     −1       0   ]
    x_2 f_2     [  0     0     3      0      −1   ]
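This matrix can be reconstructed mechanically. The numpy sketch below is my own rendering of the worked example (the exponent-dict encoding and helper names are assumptions, and the minus signs, which the transcription appears to have dropped, are restored by requiring each row to be the coefficient vector of the stated shifted polynomial):

```python
import numpy as np

# Build the divisor matrix D for p = 4 + 5x1 - 3x2 - 9x1^2 + 7x1x2 and
# F = {f1, f2} = {2 + x1 + x2, 3 - x1}.  Polynomials are dicts mapping
# exponent pairs (a1, a2) -> coefficient.
f1 = {(0, 0): 2, (1, 0): 1, (0, 1): 1}
f2 = {(0, 0): 3, (1, 0): -1}

def shift(f, mono):
    """Multiply polynomial f by the monomial x^mono (shift all exponents)."""
    return {(a + mono[0], b + mono[1]): c for (a, b), c in f.items()}

# Columns: the monomials 1, x1, x2, x1^2, x1x2 (grxel-ascending, up to LM(p))
cols = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1)]
rows = [f1, shift(f1, (1, 0)),                      # f1, x1*f1
        f2, shift(f2, (1, 0)), shift(f2, (0, 1))]   # f2, x1*f2, x2*f2
D = np.array([[r.get(m, 0) for m in cols] for r in rows])
print(D)
```

Note that x_2 f_1 is excluded: its leading monomial x_2^2 exceeds LM(p) = x_1x_2.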


Divisor Matrix

- D := row space of D: all polynomials Σ_i h_i f_i such that LM(p) ≥ LM(h_i f_i)
- dim(D) = rank(D)
- [p]_D = {r ∈ C^n_d : p − r ∈ D}
- The set of all these equivalence classes (remainders) is denoted by C_d/D
- dim(C_d/D) = nullity(D)
- Any monomial basis of a vector space R such that R ≅ C_d/D and D ⊕ R = C^n_d is called a normal set

Divisor Matrix

[Figure: the decomposition p = Σ_i h_i f_i + r, with Σ_i h_i f_i in D and r in R]

Division Algorithm

Algorithm: Multivariate Polynomial Division
Input: polynomials f_1, ..., f_s, p ∈ C^n_d
Output: h_1, ..., h_s, r

    D   ← divisor matrix for p
    D   ← span of the linearly independent rows of D
    col ← indices of the linearly dependent columns of D
    R   ← canonical basis of monomials corresponding with col
    q = Σ_i h_i f_i ← project p along R onto D
    r   ← p − q
    h = (h_1, ..., h_s) ← solve h D = q

Division Algorithm: Oblique Projection

p = h_1 f_1 + ... + h_s f_s + r, with h_i f_i ∈ D and r ∈ R.

Σ_i h_i f_i is found by projecting p obliquely along R onto D:

    Σ_{i=1}^s h_i f_i = p/R⊥ [D/R⊥]† D

where p/R⊥ and D/R⊥ denote the components of p and of the rows of D orthogonal to R, respectively. The remainder is then found as r = p − Σ_i h_i f_i.
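The projection step can be checked numerically on the worked example. The sketch below is my own kernel-based reformulation, not the authors' implementation: since the orthogonal complement of the row space of D (inside the 5-dimensional coefficient space) is ker(D), the component of p along the normal-set space R can be found from a kernel vector, after which h solves h D = q:

```python
import numpy as np

# Split p into q + r with q in the row space of D and r in the normal-set
# space R, for the example p = 4 + 5x1 - 3x2 - 9x1^2 + 7x1x2 (signs restored).
D = np.array([[2,  1, 1,  0,  0],    # f1 = 2 + x1 + x2
              [0,  2, 0,  1,  1],    # x1*f1
              [3, -1, 0,  0,  0],    # f2 = 3 - x1
              [0,  3, 0, -1,  0],    # x1*f2
              [0,  0, 3,  0, -1]])   # x2*f2
p = np.array([4, 5, -3, -9, 7.0])

# ker(D) is the orthogonal complement of the row space of D
_, s, Vt = np.linalg.svd(D)
assert np.sum(s > 1e-10) == 4        # rank(D) = 4, so dim ker(D) = 1
v = Vt[-1]                           # spans ker(D)

# Here R is spanned by the monomial x1x2 (last coordinate, the dependent
# column of D); choose r = alpha * e5 so that q = p - r lies in the row space:
e5 = np.array([0, 0, 0, 0, 1.0])
alpha = (v @ p) / (v @ e5)
q = p - alpha * e5
h, *_ = np.linalg.lstsq(D.T, q, rcond=None)   # solve h D = q
print(np.round(q - h @ D, 10))                # -> zeros: q is in the row space
```

Because D is rank-deficient, h returned by least squares is only one of infinitely many quotients, which illustrates the non-uniqueness discussed on the next slide.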

Non-uniqueness of Quotients
- General case: D is not of full row rank
- The linearly independent rows of D form a basis of D
- The definition does not provide extra constraints to pick out a particular basis

Non-uniqueness of Remainders
- General case: D is not of full column rank
- The linearly dependent columns of D determine a monomial basis of R
- The definition does provide an extra constraint, but the remainder is still not unique

Implementation
- Determine rank(D), a basis for D, and the kernel
- From the kernel, determine the monomial basis for R
- Compute the oblique projection (exploiting the structure)
- Sparse multifrontal multithreaded rank-revealing QR decomposition

Outline
1. Introduction
2. Multivariate Polynomial Division
3. Elimination
4. Conclusions

Macaulay Matrix

Given a set of multivariate polynomials f_1, ..., f_s ∈ C^n_d, each of degree d_i (i = 1, ..., s), the Macaulay matrix of degree d is given by

           [ f_1             ]
           [ x_1 f_1         ]
           [ ...             ]
    M(d) = [ x_n^{d−d_1} f_1 ]
           [ f_2             ]
           [ x_1 f_2         ]
           [ ...             ]
           [ x_n^{d−d_s} f_s ]

where each polynomial f_i is multiplied with all monomials up to degree d − d_i, for all i = 1, ..., s.
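A minimal constructor for M(d) can be sketched as follows (my own helper names and dict-based polynomial encoding, not the authors' code; columns are all monomials up to degree d in grxel-ascending order):

```python
import numpy as np
from itertools import combinations_with_replacement

def monomials(n, d):
    """Exponent vectors of all monomials of degree <= d, ascending in grxel order."""
    out = []
    for deg in range(d + 1):
        block = set()
        for combo in combinations_with_replacement(range(n), deg):
            e = [0] * n
            for var in combo:
                e[var] += 1
            block.add(tuple(e))
        out.extend(sorted(block, reverse=True))  # grxel-ascending within a degree
    return out

def macaulay(polys, n, d):
    """Stack the coefficient vectors of all shifts x^a * f_i with deg(x^a) <= d - deg(f_i)."""
    cols = monomials(n, d)
    rows = []
    for f in polys:
        df = max(sum(m) for m in f)              # deg(f_i)
        for sh in monomials(n, d - df):          # all shift monomials
            shifted = {tuple(a + b for a, b in zip(m, sh)): c for m, c in f.items()}
            rows.append([shifted.get(m, 0) for m in cols])
    return np.array(rows)

# Example with f1 = 2 + x1 + x2, f2 = 3 - x1, and d = 2
f1 = {(0, 0): 2, (1, 0): 1, (0, 1): 1}
f2 = {(0, 0): 3, (1, 0): -1}
M = macaulay([f1, f2], 2, 2)
print(M.shape)   # -> (6, 6): three shifts of each f_i, six monomials up to degree 2
```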

Elimination Problem

Given a set of multivariate polynomials f_1, ..., f_s ∈ C^n_d and a subset x_e ⊂ {x_1, ..., x_n}, find a polynomial g = Σ_i h_i f_i in which all monomials in x_e are eliminated.

Solution: g lies in the intersection of two vector spaces:
- M_d = row space of M(d)
- E_d = vector space spanned by the monomials in {x_1, ..., x_n} \ x_e, containing polynomials up to degree d

Elimination

[Figure: g lies in the intersection of the subspaces E_d and M_d]

Elimination Algorithm
Input: polynomials f_1, ..., f_s ∈ C^n_d, monomial set x_e
Output: g ∈ M_d ∩ E_d

    d ← max(deg(f_1), deg(f_2), ..., deg(f_s))
    g ← [ ]
    while g = [ ] do
        E(d) ← canonical basis for E_d
        M(d) ← Macaulay matrix of degree d
        if M_d ∩ E_d ≠ ∅ then
            g ← element from the intersection
        else
            d ← d + 1
        end if
    end while

Elimination: Implementation
- Use canonical angles between vector spaces to determine the intersection
- Q_m, Q_e: orthogonal bases for M_d and E_d
- Q_m^T Q_e = Y C Z^T with C = diag(cos θ_1, ..., cos θ_k)
- Link with the Cosine-Sine (CS) decomposition
- Need an orthogonal basis for M_d: sparse rank-revealing QR
- Implicitly Restarted Arnoldi iterations to determine the canonical angles and g
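The intersection test can be illustrated with the standard dense computation of canonical angles (this is textbook numerical linear algebra, not the authors' sparse implementation; the two random subspaces are my own toy data): the singular values of Q_m^T Q_e are the cosines of the canonical angles, and a (numerically) zero angle signals a nonzero intersection.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two 3-dimensional subspaces of R^6 sharing one common direction
common = rng.standard_normal((6, 1))
A = np.hstack([common, rng.standard_normal((6, 2))])
B = np.hstack([common, rng.standard_normal((6, 2))])

Qm, _ = np.linalg.qr(A)                 # orthogonal basis for span(A)
Qe, _ = np.linalg.qr(B)                 # orthogonal basis for span(B)
Y, c, Zt = np.linalg.svd(Qm.T @ Qe)     # c = (cos θ_1, ..., cos θ_k), descending
angles = np.arccos(np.clip(c, -1, 1))
print(angles[0] < 1e-6)                 # -> True: the shared direction is detected

# A unit vector spanning the (here one-dimensional) intersection:
g = Qm @ Y[:, 0]
```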

Conclusions
- Polynomial division: vector decomposition
- Elimination: intersection of vector spaces
- Oblique projections
- Principal angles and the CS decomposition
- Sparse structured matrices
- Applicable to many other problems: approximate GCD, polynomial system solving, the ideal membership problem, ...

Thank You