COLLINEATIONS OF PROJECTIVE PLANES
TIMOTHY VIS

1. Fixed Structures

The study of geometry has three main streams:
(1) Incidence Geometry: the study of which points are incident with which lines, and what can be derived about a geometry from this information.
(2) Analytic Geometry: the study of coordinates of a geometry and the algebra derived from them, as well as geometric properties involving this algebra.
(3) Transformational Geometry: the study of collineations of a geometry and the ways structures can be moved around.

All three areas have been covered so far in class. Incidence Geometry was used to give axiomatic descriptions of projective planes and projective spaces, and to prove results such as the fact that all lines in a finite projective space are incident with a constant number of points. Analytic Geometry was used in our study of field planes and in the development of coordinates when we coordinatized an arbitrary plane. Transformational Geometry was used when we developed the Fundamental Theorem of Field Planes. We are now going to study Transformational Geometry in an arbitrary plane and see what we can say about collineations in general.

Definition 1. A collineation of a projective plane π is a bijective map σ on the points such that the images of collinear points are collinear. That is, P, Q, R are collinear if and only if P^σ, Q^σ, R^σ are collinear. Given a line l and points P, Q on l, we define l^σ to be the line P^σ Q^σ.

One of the most fundamental properties studied about collineations is which points and lines are mapped to themselves.

Definition 2. If σ is a collineation of a projective plane π,
(1) A point P is a fixed point if and only if P^σ = P.
(2) A line l is a fixed line if and only if l^σ = l.
(3) A line l is fixed pointwise if and only if P^σ = P for all points P ∈ l.
(4) A point P is fixed linewise if and only if l^σ = l for all lines l containing P.
Obviously any line that is fixed pointwise is a fixed line, and any point that is fixed linewise is a fixed point, but the converses are not necessarily true.

Example 1. Consider the collineation of the Fano plane shown in Figure 1. Notice that the points A, F, and G are fixed and that the lines ABC, ADE, and AFG are fixed. Since all points on the line AFG are fixed, AFG is fixed pointwise, and since all lines through A are fixed, A is fixed linewise.

Date: April 23, 2009.
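The fixed structures of Example 1 can be checked mechanically. Here is a small sketch in Python; since Figure 1 itself is not reproduced in this transcription, the labelling of the Fano plane and the particular permutation (swapping B with C and D with E) are assumptions, chosen to match the fixed points and fixed lines stated above.

```python
# One standard model of the Fano plane; this labelling and the
# permutation sigma below are assumed, chosen to match Example 1.
points = set("ABCDEFG")
lines = [frozenset(s) for s in
         ("ABC", "ADE", "AFG", "BDF", "BEG", "CDG", "CEF")]

sigma = {"A": "A", "B": "C", "C": "B", "D": "E", "E": "D",
         "F": "F", "G": "G"}

def image(line):
    """Image of a line under sigma (as a set of points)."""
    return frozenset(sigma[p] for p in line)

# sigma is a collineation: it maps every line to a line.
assert all(image(l) in lines for l in lines)

fixed_points = sorted(p for p in points if sigma[p] == p)
fixed_lines = sorted("".join(sorted(l)) for l in lines if image(l) == l)
pointwise = sorted("".join(sorted(l)) for l in lines
                   if all(sigma[p] == p for p in l))
linewise = sorted(p for p in fixed_points
                  if all(image(l) == l for l in lines if p in l))

print(fixed_points)  # ['A', 'F', 'G']
print(fixed_lines)   # ['ABC', 'ADE', 'AFG']
print(pointwise)     # ['AFG']  (fixed pointwise)
print(linewise)      # ['A']    (fixed linewise)
```

Note that ABC and ADE are fixed lines without being fixed pointwise, exactly the gap between Definition 2(2) and 2(3).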
[Figure 1: a collineation of the Fano plane]

We'll focus somewhat on the applications of these ideas to field planes; however, they apply equally well to arbitrary projective planes. Recall the Fundamental Theorem of Field Planes, which states that every collineation of a field plane is the product of an automorphic collineation and a homography. That is, if σ is a collineation, we have

σ : X ↦ AX^α,

where α is an automorphism of the underlying field. For lines we have

σ : L ↦ L^α A^{-1}.

Suppose that α is the identity automorphism. We then have X^σ = AX and L^σ = LA^{-1}, or (L^σ)^T = (A^{-1})^T L^T. If a point X is fixed by σ, we then have AX = λX, while if a line L is fixed by σ, we have (A^{-1})^T L^T = µL^T. Thus the eigenvectors of A are the fixed points of σ, while the eigenvectors of (A^{-1})^T are the fixed lines of σ.

Example 2. Consider the homography given by

A = [a 0 0]
    [0 1 0]
    [0 0 1]

where a ∈ F \ {0, 1}. The fixed points are the eigenvectors of A, while the fixed lines are the eigenvectors of

(A^{-1})^T = [a^{-1} 0 0]
             [0      1 0]
             [0      0 1].

To find the eigenvectors of A, we determine the characteristic polynomial of A:

det(λI - A) = (λ - a)(λ - 1)^2.
Thus A has eigenvalues a and 1. Corresponding to a are eigenvectors determined as vectors in the null space of aI - A. That is,

[0  0      0    ] [x]   [0       ]
[0  a - 1  0    ] [y] = [(a - 1)y] = 0.
[0  0      a - 1] [z]   [(a - 1)z]

Now since a ≠ 1, we must have both y = 0 and z = 0. Thus, the only fixed point corresponding to the eigenvalue a is (1, 0, 0). Corresponding to 1 are eigenvectors determined as vectors in the null space of I - A. That is,

[1 - a  0  0] [x]   [(1 - a)x]
[0      0  0] [y] = [0       ] = 0.
[0      0  0] [z]   [0       ]

Again, since a ≠ 1, we must have x = 0. Thus, the fixed points corresponding to the eigenvalue 1 are all points of the form (0, y, z).

On the other hand, notice that A^{-1}, being diagonal, is its own transpose. So (A^{-1})^T has the same eigenvectors as A^{-1}, which are precisely the eigenvectors of A. Thus, the fixed lines have the same coordinates as the fixed points: [1, 0, 0] and [0, y, z]. Now notice that every line through (1, 0, 0) is of the form [0, y, z], and each of these lines is fixed, so that (1, 0, 0) is fixed linewise. Similarly, every point on [1, 0, 0] is of the form (0, y, z), and each of these points is fixed, so that [1, 0, 0] is fixed pointwise.

We now prove some results regarding what possibilities exist for fixed points and lines. We shall assume the following easily verified facts (which you prove in your homework):

Proposition 3. Given a collineation σ,
(1) If P and Q are distinct fixed points of σ, then PQ is a fixed line of σ.
(2) If l and m are distinct fixed lines of σ, then l ∩ m is a fixed point of σ.

These results allow us to prove several other results.

Proposition 4. Let σ be a collineation of π fixing l pointwise. If distinct points P and Q in π \ l are fixed by σ, then σ is the identity collineation (σ fixes all points of π).

Proof. Suppose R is some point of π that does not lie on either l or PQ. Then PR and QR are distinct lines through R meeting l in points P' and Q' respectively. But PR = PP', and since both P and P' are fixed, PR is fixed. Similarly, QR is fixed. But then PR ∩ QR = R is fixed. Now let S be a point of PQ other than P, Q, and PQ ∩ l.
Let R be any point on neither l nor PQ. Then P, R, S are non-collinear points, and both P and R are fixed. By the previous argument (applied with the fixed points P and R), S is also fixed. So every point of π is fixed, and σ is the identity collineation. ∎

Collineations that fix a line pointwise or a point linewise play a central role in the study of projective planes. One of the most fundamental results regarding such collineations is the following theorem, which states that any collineation fixing a line pointwise fixes a point linewise. The converse of this statement is, of course, also true by the principle of duality.
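In a field plane, Proposition 3(1) can be seen concretely: joins of points are cross products in homogeneous coordinates, and lines transform by L ↦ LA^{-1}. A quick numerical sketch over the real projective plane (the matrix A and the points P, Q here are arbitrary illustrative choices, not taken from the text):

```python
import numpy as np

def proportional(u, v):
    """Homogeneous equality: u and v span the same projective element."""
    return np.linalg.matrix_rank(np.vstack([u, v]), tol=1e-9) == 1

# An illustrative homography X -> AX on PG(2, R).
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
P = np.array([1.0, 0.0, 0.0])   # AP = 2P: a fixed point
Q = np.array([0.0, 1.0, 1.0])   # AQ = Q:  a fixed point

# Proposition 3(1): the join PQ (a cross product in homogeneous
# coordinates) is a fixed line, i.e. (PQ) A^{-1} is proportional to PQ.
join = np.cross(P, Q)
assert proportional(join @ np.linalg.inv(A), join)

# The underlying reason: for any invertible B, the join of the images
# is proportional to the image of the join, (BR) x (BS) ~ (R x S) B^{-1}.
rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))          # generic, hence invertible
R, S = rng.standard_normal(3), rng.standard_normal(3)
assert proportional(np.cross(B @ R, B @ S),
                    np.cross(R, S) @ np.linalg.inv(B))
```

The second assertion is the coordinate form of the statement l^σ = P^σ Q^σ from Definition 1.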
Theorem 5. If σ is a collineation of π fixing l pointwise, then σ fixes some point V linewise.

Proof. If σ fixes a point V not on l, let m be a line through V and let m ∩ l = W. Then m = VW, and since both V and W are fixed, m is also fixed. So V is fixed linewise.

Suppose then that no point of π \ l is fixed by σ. Let P be a point of π \ l and consider the line PP^σ. This line intersects l in a point V. So PV = P^σ V = (PV)^σ and is fixed by σ. Let Q be any point of π \ (l ∪ PV). In the same manner as for P, QQ^σ is a fixed line. So PP^σ ∩ QQ^σ is a fixed point and must then lie on l. So QQ^σ ∩ l = V as well. Now consider any line m through V and let R be any point of m not on l. Then RR^σ is a fixed line through V, so that RR^σ = m and m is fixed. So V is fixed linewise. ∎

Corollary 6. If σ is a collineation of π fixing V linewise, then σ fixes some line l pointwise.

Proof. This is the dual of Theorem 5. ∎

With these results in hand, we are ready to make several definitions.

Definition 7. A collineation σ fixing a point V linewise and a line l pointwise is called a (V, l)-perspectivity or central collineation. The point V is called the center of σ and the line l is called the axis of σ. If V ∈ l, σ is called an elation, while if V ∉ l, σ is called a homology.

Example 3. Two of the most familiar motions in the Euclidean plane give us examples of central collineations of the Extended Euclidean Plane. Consider a translation of the Euclidean plane. No points of the Euclidean plane are fixed, but a translation fixes the slopes of all lines in the Euclidean plane. These slopes correspond exactly to the points at infinity, so that every point at infinity is fixed by a translation, and thus l_∞ is the axis of every translation. Furthermore, the lines parallel to the direction of the translation stay in place, so that these lines are fixed. But then every line through the point at infinity corresponding to this slope is fixed, so that this point is the center of the translation.
Since this point is on l_∞, a translation is an elation with axis l_∞.

Now consider a reflection over a line l of the Euclidean plane. Every point on l is fixed by this reflection, so l is the axis of the reflection. The only other lines fixed by the reflection, however, are the lines perpendicular to l, so the point at infinity corresponding to the slope perpendicular to l is the center of the reflection. Since this center does not lie on the axis, a reflection is a homology of the Extended Euclidean Plane.

Example 4. The homography determined by the matrix

A = [a 0 0]
    [0 1 0]
    [0 0 1]

was shown to have the point (1, 0, 0) fixed linewise and the line [1, 0, 0] fixed pointwise. Since (1, 0, 0) does not lie on the line [1, 0, 0], this homography defines a homology.
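Example 3 can be verified directly in homogeneous coordinates (x, y, w) for the Extended Euclidean Plane: finite points have w = 1, points at infinity have w = 0, and l_∞ = [0, 0, 1]. In this sketch the translation vector (3, 4) is an arbitrary sample choice:

```python
import numpy as np

t1, t2 = 3.0, 4.0                 # sample translation vector (assumed)
A = np.array([[1.0, 0.0, t1],     # (x, y, 1) -> (x + t1, y + t2, 1)
              [0.0, 1.0, t2],
              [0.0, 0.0, 1.0]])

# Axis: every point at infinity (x, y, 0) is fixed, so l_inf = [0, 0, 1]
# is fixed pointwise.
X = np.array([5.0, -2.0, 0.0])
assert np.allclose(A @ X, X)

# No finite point is fixed (for a nonzero translation): A@Y and Y are
# not proportional, so their cross product is nonzero.
Y = np.array([5.0, -2.0, 1.0])
assert not np.allclose(np.cross(A @ Y, Y), 0)

# Center: lines transform by L -> L A^{-1}; a line [t2, -t1, c] parallel
# to the translation direction is fixed, and all such lines pass through
# the point at infinity (t1, t2, 0).  That point lies on the axis
# [0, 0, 1], so a translation is an elation.
L = np.array([t2, -t1, 7.0])
assert np.allclose(L @ np.linalg.inv(A), L)
center = np.array([t1, t2, 0.0])
assert np.isclose(L @ center, 0.0)    # the fixed line passes through it
```

The same computation with a reflection matrix would exhibit a center off the axis, i.e. a homology.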
Example 5. Consider the homography determined by the matrix

A = [1 0  1]
    [0 1 -1]
    [0 0  1],

with

(A^{-1})^T = [ 1 0 0]
             [ 0 1 0]
             [-1 1 1].

The characteristic polynomial of both matrices is (λ - 1)^3, so the only eigenvalue of either is 1. A has eigenvectors (x, y, 0), and (A^{-1})^T has eigenvectors [x, x, z]. Thus, the fixed points are the points of the form (x, y, 0), while the fixed lines are the lines of the form [x, x, z]. Notice that the fixed points all lie on the line [0, 0, 1], so that this line is fixed pointwise. Similarly, the fixed lines all contain the point (1, -1, 0). Since (1, -1, 0) ∈ [0, 0, 1], this homography defines an elation.

A significant property of central collineations is that they are very easily determined.

Theorem 8. A central collineation is completely and uniquely determined by its axis l, its center V, and the image P^σ of any one point P distinct from V and not lying on l.

Proof. Let R be any point of π \ (l ∪ PV), and let RP ∩ l = W and RV = m. Then R = PW ∩ m, so that

R^σ = (PW ∩ m)^σ = P^σ W^σ ∩ m^σ = P^σ W ∩ m.

Thus, R^σ is completely and uniquely determined. Now let S ∈ PV and let R ∈ π \ (l ∪ PV). Then S ∈ π \ (l ∪ RV), and since R^σ is completely and uniquely determined, S^σ is completely and uniquely determined using V, l, and the image R^σ of R, exactly as in the determination of R^σ. ∎

A word of warning here: this theorem applies only when a central collineation with the appropriate properties actually exists. There may not be any collineation (other than the identity) with a particular axis, with a particular center, with a particular (axis, center) pair, or mapping a given point to another given point. This theorem does not state that such a collineation exists; it only states that when one exists, it is unique. In order to show that one exists, we need particular properties of a plane. For example, in a field plane every possible central collineation exists, although we will not prove this.

Proposition 9.
If σ is a (V, l)-perspectivity and τ a collineation, then τ^{-1}στ is a (V^τ, l^τ)-perspectivity.

Proof. Let m be a line through V^τ. Then m^{τ^{-1}} necessarily contains V and is fixed by σ. Thus

m^{τ^{-1}στ} = ((m^{τ^{-1}})^σ)^τ = (m^{τ^{-1}})^τ = m.
So m is fixed by τ^{-1}στ, and V^τ is a center of τ^{-1}στ. Similarly, if P is a point on l^τ, then P^{τ^{-1}} lies on l and is fixed by σ. Thus

P^{τ^{-1}στ} = ((P^{τ^{-1}})^σ)^τ = (P^{τ^{-1}})^τ = P.

So P is fixed by τ^{-1}στ, and l^τ is an axis of τ^{-1}στ. So τ^{-1}στ is a (V^τ, l^τ)-perspectivity. ∎

Example 6. Suppose in a field plane there is a ((1,0,0), [1,0,0])-perspectivity σ such that (1,1,1)^σ = (1,b,b). What is the equation of this perspectivity? We know that as a collineation, σ must be the product of an automorphic collineation and a homography. We also know that every automorphic collineation fixes every point of the fundamental quadrangle. Notice now that the points of the triangle of reference are all fixed by σ, and thus by the associated homography A ((1,0,0) is the center, and both (0,1,0) and (0,0,1) are on the axis). Thus,

[a_00 a_01 a_02]   [ρ_x 0   0  ]
[a_10 a_11 a_12] = [0   ρ_y 0  ]
[a_20 a_21 a_22]   [0   0   ρ_z].

Since (1,1,1) is fixed by automorphic collineations, we must have (up to scalar multiples) ρ_x = 1 and ρ_y = ρ_z = b. So the homography of this collineation is

A = [1 0 0]
    [0 b 0]
    [0 0 b].

Now notice that for any point X on [1,0,0], X = (0,y,z). But then AX = bX, so that X is fixed by A. Further, notice that for any line L through (1,0,0), L = [0,y,z] and LA^{-1} = (1/b)L, so that L is fixed by A. So A is the desired ((1,0,0), [1,0,0])-perspectivity.

In the last example, we did not need any automorphic collineation to obtain the appropriate perspectivity. In fact, in a finite field plane, every (V, l)-perspectivity is a homography; that is, every (V, l)-perspectivity has the identity as its associated automorphic collineation.

Theorem 10. Every (V, l)-perspectivity of a finite field plane is a homography. That is, every (V, l)-perspectivity of a finite field plane has the identity as its associated automorphic collineation.

Proof. By Proposition 9 and the transitivity of homographies on ordered quadrangles in PG(2, q), we need only show the result holds for all perspectivities with a given axis.
Other axes are then obtained by conjugation by an appropriate homography. Suppose then that l = [1,0,0]. Let α be the associated automorphic collineation, let A = (a_ij) be the associated homography, and let σ be the collineation in question. Now since both (0,1,0) and (0,0,1) are fixed both by σ and by α, we know that a_01 = a_02 = a_12 = a_21 = 0. Let a_11 = ρ_y and a_22 = ρ_z. Now consider (0, 1, u)^σ.
Since this point lies on the axis, it must be fixed. But

(0, 1, u)^σ = A(0, 1, u^α)^T = (0, ρ_y, ρ_z u^α).

It follows that u = (ρ_z/ρ_y) u^α for all values of u in GF(q). Write q = p^h; since α is an automorphism of GF(q), u^α = u^{p^i} for some 0 ≤ i ≤ h - 1. Then the polynomial (ρ_z/ρ_y) x^{p^i} - x has all p^h elements of GF(q) as zeroes, but its degree is at most p^{h-1} < p^h, a contradiction unless it is the zero polynomial. This forces i = 0, that is, α = 1 (and ρ_y = ρ_z). So σ is a homography. ∎
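Example 6 can be double-checked over a small finite field. In this sketch the prime p = 5 and the value b = 2 are arbitrary sample choices:

```python
p, b = 5, 2                      # sample prime and sample b (assumed)
A = [[1, 0, 0],
     [0, b, 0],
     [0, 0, b]]

def apply(M, X):
    """Compute M X^T over GF(p)."""
    return tuple(sum(M[i][j] * X[j] for j in range(3)) % p
                 for i in range(3))

def proportional(X, Y):
    """Projective equality over GF(p): X = lam * Y for some lam != 0."""
    return any(all(X[i] == (lam * Y[i]) % p for i in range(3))
               for lam in range(1, p))

# (1, 1, 1) maps to (1, b, b), as required.
assert proportional(apply(A, (1, 1, 1)), (1, b, b))

# Every point (0, y, z) of the axis [1, 0, 0] is fixed.
axis = [(0, y, z) for y in range(p) for z in range(p) if (y, z) != (0, 0)]
assert all(proportional(apply(A, X), X) for X in axis)

# Every line [0, m, n] through the center (1, 0, 0) is fixed.  (Lines
# transform by L -> L A^{-1}; since A is diagonal it suffices to check
# that L A is proportional to L.)
thru_center = [(0, m, n) for m in range(p) for n in range(p)
               if (m, n) != (0, 0)]
assert all(proportional(apply(A, L), L) for L in thru_center)
```

So over GF(5) the matrix diag(1, b, b) is indeed a ((1,0,0), [1,0,0])-perspectivity, and no automorphic collineation is needed, as Theorem 10 guarantees.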
More informationLEARNING OBJECTIVES FOR THIS CHAPTER
CHAPTER 2 American mathematician Paul Halmos (1916 2006), who in 1942 published the first modern linear algebra book. The title of Halmos s book was the same as the title of this chapter. Finite-Dimensional
More informationMath 115A - Week 1 Textbook sections: 1.1-1.6 Topics covered: What is a vector? What is a vector space? Span, linear dependence, linear independence
Math 115A - Week 1 Textbook sections: 1.1-1.6 Topics covered: What is Linear algebra? Overview of course What is a vector? What is a vector space? Examples of vector spaces Vector subspaces Span, linear
More informationLINEAR ALGEBRA. September 23, 2010
LINEAR ALGEBRA September 3, 00 Contents 0. LU-decomposition.................................... 0. Inverses and Transposes................................. 0.3 Column Spaces and NullSpaces.............................
More information12.5 Equations of Lines and Planes
Instructor: Longfei Li Math 43 Lecture Notes.5 Equations of Lines and Planes What do we need to determine a line? D: a point on the line: P 0 (x 0, y 0 ) direction (slope): k 3D: a point on the line: P
More informationMATH1231 Algebra, 2015 Chapter 7: Linear maps
MATH1231 Algebra, 2015 Chapter 7: Linear maps A/Prof. Daniel Chan School of Mathematics and Statistics University of New South Wales danielc@unsw.edu.au Daniel Chan (UNSW) MATH1231 Algebra 1 / 43 Chapter
More informationProjective Geometry: A Short Introduction. Lecture Notes Edmond Boyer
Projective Geometry: A Short Introduction Lecture Notes Edmond Boyer Contents 1 Introduction 2 11 Objective 2 12 Historical Background 3 13 Bibliography 4 2 Projective Spaces 5 21 Definitions 5 22 Properties
More informationIntroduction to Matrix Algebra
Psychology 7291: Multivariate Statistics (Carey) 8/27/98 Matrix Algebra - 1 Introduction to Matrix Algebra Definitions: A matrix is a collection of numbers ordered by rows and columns. It is customary
More informationVisualizing Triangle Centers Using Geogebra
Visualizing Triangle Centers Using Geogebra Sanjay Gulati Shri Shankaracharya Vidyalaya, Hudco, Bhilai India http://mathematicsbhilai.blogspot.com/ sanjaybhil@gmail.com ABSTRACT. In this paper, we will
More informationGeometry of Vectors. 1 Cartesian Coordinates. Carlo Tomasi
Geometry of Vectors Carlo Tomasi This note explores the geometric meaning of norm, inner product, orthogonality, and projection for vectors. For vectors in three-dimensional space, we also examine the
More informationNumerical Analysis Lecture Notes
Numerical Analysis Lecture Notes Peter J. Olver 5. Inner Products and Norms The norm of a vector is a measure of its size. Besides the familiar Euclidean norm based on the dot product, there are a number
More informationOctober 3rd, 2012. Linear Algebra & Properties of the Covariance Matrix
Linear Algebra & Properties of the Covariance Matrix October 3rd, 2012 Estimation of r and C Let rn 1, rn, t..., rn T be the historical return rates on the n th asset. rn 1 rṇ 2 r n =. r T n n = 1, 2,...,
More informationNumerical Analysis Lecture Notes
Numerical Analysis Lecture Notes Peter J. Olver 6. Eigenvalues and Singular Values In this section, we collect together the basic facts about eigenvalues and eigenvectors. From a geometrical viewpoint,
More informationFOUNDATIONS OF ALGEBRAIC GEOMETRY CLASS 22
FOUNDATIONS OF ALGEBRAIC GEOMETRY CLASS 22 RAVI VAKIL CONTENTS 1. Discrete valuation rings: Dimension 1 Noetherian regular local rings 1 Last day, we discussed the Zariski tangent space, and saw that it
More informationSection 6.1 - Inner Products and Norms
Section 6.1 - Inner Products and Norms Definition. Let V be a vector space over F {R, C}. An inner product on V is a function that assigns, to every ordered pair of vectors x and y in V, a scalar in F,
More informationProjective Geometry - Part 2
Projective Geometry - Part 2 Alexander Remorov alexanderrem@gmail.com Review Four collinear points A, B, C, D form a harmonic bundle (A, C; B, D) when CA : DA CB DB = 1. A pencil P (A, B, C, D) is the
More informationFactorization Theorems
Chapter 7 Factorization Theorems This chapter highlights a few of the many factorization theorems for matrices While some factorization results are relatively direct, others are iterative While some factorization
More informationArrangements And Duality
Arrangements And Duality 3.1 Introduction 3 Point configurations are tbe most basic structure we study in computational geometry. But what about configurations of more complicated shapes? For example,
More informationMATH10040 Chapter 2: Prime and relatively prime numbers
MATH10040 Chapter 2: Prime and relatively prime numbers Recall the basic definition: 1. Prime numbers Definition 1.1. Recall that a positive integer is said to be prime if it has precisely two positive
More informationMath 241, Exam 1 Information.
Math 241, Exam 1 Information. 9/24/12, LC 310, 11:15-12:05. Exam 1 will be based on: Sections 12.1-12.5, 14.1-14.3. The corresponding assigned homework problems (see http://www.math.sc.edu/ boylan/sccourses/241fa12/241.html)
More informationT ( a i x i ) = a i T (x i ).
Chapter 2 Defn 1. (p. 65) Let V and W be vector spaces (over F ). We call a function T : V W a linear transformation form V to W if, for all x, y V and c F, we have (a) T (x + y) = T (x) + T (y) and (b)
More informationCONTROLLABILITY. Chapter 2. 2.1 Reachable Set and Controllability. Suppose we have a linear system described by the state equation
Chapter 2 CONTROLLABILITY 2 Reachable Set and Controllability Suppose we have a linear system described by the state equation ẋ Ax + Bu (2) x() x Consider the following problem For a given vector x in
More informationLinearly Independent Sets and Linearly Dependent Sets
These notes closely follow the presentation of the material given in David C. Lay s textbook Linear Algebra and its Applications (3rd edition). These notes are intended primarily for in-class presentation
More information( ) which must be a vector
MATH 37 Linear Transformations from Rn to Rm Dr. Neal, WKU Let T : R n R m be a function which maps vectors from R n to R m. Then T is called a linear transformation if the following two properties are
More informationVector Math Computer Graphics Scott D. Anderson
Vector Math Computer Graphics Scott D. Anderson 1 Dot Product The notation v w means the dot product or scalar product or inner product of two vectors, v and w. In abstract mathematics, we can talk about
More informationClassification of Cartan matrices
Chapter 7 Classification of Cartan matrices In this chapter we describe a classification of generalised Cartan matrices This classification can be compared as the rough classification of varieties in terms
More information