Normal Estimation for Point Clouds: A Comparison Study for a Voronoi Based Method
Eurographics Symposium on Point-Based Graphics (2005)
M. Pauly, M. Zwicker (Editors)

Normal Estimation for Point Clouds: A Comparison Study for a Voronoi Based Method

Tamal K. Dey    Gang Li    Jian Sun
The Ohio State University, Columbus, OH, USA

Abstract
Many applications that process point cloud data benefit from a reliable normal estimation step. Given a point cloud presumably sampled from an unknown surface, the problem is to estimate the normals of the surface at the data points. Two approaches are known for this problem: one based on numerical optimization and another based on Voronoi diagrams. Variations of the numerical approach work well even when the point cloud is contaminated with noise. Recently, a variation of the Voronoi based method has been proposed for noisy point clouds. The centrality of the normal estimation step in point cloud processing calls for a thorough study of the two approaches so that one knows which approach is appropriate under which circumstances. This paper presents such a study.

Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Line and Curve Generation

1. Introduction

In many problems dealing with point cloud data, a normal estimation step precedes the main task. For example, in surface reconstruction, the quality of the approximation of the output surface depends on how well the estimated normals approximate the true normals of the sampled surface; see [AK04, ABCO*01, BC00, DS05] for examples. A similar correlation exists between estimated normals and point-based rendering of surfaces [AA03]. Being so central to point cloud processing, the normal estimation step deserves special attention in its own right. The problem is compounded by the fact that the input point cloud can be noisy. Therefore, we need a normal estimation method that remains robust against noise.

There are two dominant approaches for estimating normals from point clouds: one is numerical, applying some optimization technique; the other is mostly combinatorial, applying some Delaunay/Voronoi property. The numerical optimization based approach is known to work well under noise. However, it has not been established how well the Voronoi based approach scales with noise. The original algorithm of Amenta and Bern [AB99], which uses the poles of the Voronoi diagram, works well when the data are not noisy, but it breaks down both in principle and in practice when the data become noisy. Recently, starting with the work of [DG04], Dey and his co-authors [DGS05, DS05] have suggested a Voronoi/Delaunay based method for estimating normals from noisy point cloud data. The centrality of the normal estimation step in point cloud processing calls for a comparison of this technique with the competing numerical approaches. The purpose of this paper is to make this comparison.

In the numerical approach, a widely used technique is to find a proper set of points in the local neighborhood of a point p and then compute a plane that best fits these chosen points. The normal of the plane is taken as the estimated normal at p. This basic plane fitting method has been made more effective through sophisticated modifications. We consider two such variations, one by Pauly, Keiser, Kobbelt and Gross [PKKG03] and the other by Mitra, Nguyen and Guibas [MNG04], both of which have been shown to work well in practice.

2. Plane fitting methods
Hoppe et al. [HDD*92] proposed an algorithm in which the normal at each point is estimated as the normal of the plane obtained by applying the total least squares method to the k nearest neighbors of the point in the point cloud. Specifically, for a point p and its k nearest neighbors $\{p_i\}_{i=1}^{k}$, they find the fitting plane $n^T x = c$ for p by minimizing the error term $e(n,c) = \sum_{i=1}^{k} (n^T p_i - c)^2$ under the constraint $n^T n = 1$.
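For concreteness, the following Python sketch (not the authors' code) carries out this total least squares fit: the minimizing n is the eigenvector of the neighborhood covariance matrix with the smallest eigenvalue, and $c = n^T \bar{p}$ where $\bar{p}$ is the centroid of the neighbors.

```python
import numpy as np
from scipy.spatial import cKDTree

def fit_plane_normal(neighbors):
    """Total least squares plane fit: the normal is the eigenvector of the
    neighborhood covariance matrix with the smallest eigenvalue."""
    centered = neighbors - neighbors.mean(axis=0)
    cov = centered.T @ centered
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, 0]                     # unoriented normal estimate

def estimate_normals_knn(points, k=15):
    """Unoriented normal at every point from its k nearest neighbors."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k + 1)     # first neighbor is the point itself
    return np.array([fit_plane_normal(points[i[1:]]) for i in idx])
```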
Notice that the normals computed by fitting planes are unoriented; Hoppe et al. also proposed an algorithm to orient the normals consistently. Pauly et al. [PKKG03] and Mitra et al. [MNG04] improved the basic method in two different ways.

Pauly et al. observed that the fitting plane for a point p should respect the nearby points more than the distant points in the point cloud. Hence the neighboring points are assigned different weights based on their distances to p: the smaller the distance of a sample from p, the larger its weight. In other words, they redefined the error term as $e(n,c) = \sum_{i=1}^{k} (n^T p_i - c)^2\,\theta(\|p_i - p\|)$, where $\theta(\cdot)$ is a weighting function. In their implementation [PKKG03], the weighting function is a Gaussian, $\theta(\|p_i - p\|) = e^{-\|p_i - p\|^2 / h^2}$, where $h^2$ is chosen to be one third of the squared distance between p and its k-th nearest neighbor. We call this method the weighted plane fitting method, or WPF for short.
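A minimal sketch of this weighted fit, under the assumption that the weighted total least squares problem is solved exactly (this is not the PointShop3D implementation):

```python
import numpy as np
from scipy.spatial import cKDTree

def wpf_normal(points, tree, i, k=40):
    """Gaussian-weighted plane fit at points[i]; h^2 is one third of the
    squared distance to the k-th nearest neighbor, as in [PKKG03]."""
    d, idx = tree.query(points[i], k=k + 1)
    nbrs, dist = points[idx[1:]], d[1:]          # drop the query point itself
    h2 = dist[-1] ** 2 / 3.0
    w = np.exp(-dist ** 2 / h2)
    mu = np.average(nbrs, axis=0, weights=w)     # weighted centroid
    X = (nbrs - mu) * np.sqrt(w)[:, None]        # rows scaled by sqrt of the weights
    return np.linalg.svd(X, full_matrices=False)[2][-1]   # smallest right singular vector

# usage: tree = cKDTree(points); n = wpf_normal(points, tree, 0)
```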
Mitra, Nguyen and Guibas noticed that a proper selection of the value of k is crucial for a good normal estimation. Using the same value of k at all points, as in [HDD*92], can give a biased fit, especially at places where the samples are arbitrarily dense. Hence, instead of using the k nearest neighbors of the point, they consider the samples within a ball of a certain radius r. Under the assumption that the noise has zero mean and standard deviation $\sigma_n$, they derive a bound on the angle between the estimated normal and the true normal that holds with probability close to one. An optimal radius r is obtained by minimizing this bound; in the three-dimensional case, for probability $1-\epsilon$, it has the expression

$$r = \left( \frac{1}{\kappa}\left( \frac{c_1 \sigma_n}{\sqrt{\epsilon \rho}} + c_2 \sigma_n^2 \right) \right)^{1/3} \qquad (2.1)$$

where $\rho$ is the local sampling density, $\kappa$ is the local curvature, and $c_1$ and $c_2$ are constants. The actual algorithm takes $\sigma_n$ as user input and evaluates r iteratively. Initially, $\rho$ and $\kappa$ are estimated from the k (= 15) nearest neighbors and the radius r is obtained from equation (2.1). Once the neighborhood size r is available, $\rho$ and $\kappa$ are re-evaluated from the samples within this neighborhood to get better estimates of $\rho$, $\kappa$ and r. They claim that three iterations are in general enough to obtain good estimates of all quantities. We call this method the adaptive plane fitting method, or APF for short.
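The iteration can be sketched as follows. The constants c1 and c2 and the crude density and curvature estimators are placeholders of ours, not the ones derived in [MNG04]; the sketch only illustrates the structure of the radius update.

```python
import numpy as np
from scipy.spatial import cKDTree

def apf_radius(points, sigma_n, eps=0.05, c1=1.0, c2=1.0, k0=15, iters=3):
    """Sketch of the APF radius iteration around equation (2.1).
    Assumes sigma_n > 0; the density/curvature estimators are rough proxies."""
    tree = cKDTree(points)
    d, _ = tree.query(points, k=k0 + 1)
    r = d[:, -1].copy()                          # start: distance to the k0-th neighbor
    for _ in range(iters):
        for i, p in enumerate(points):
            nbr = points[tree.query_ball_point(p, r[i])]
            if len(nbr) < 4:
                continue
            rho = len(nbr) / (np.pi * r[i] ** 2)                  # samples per unit area
            lam = np.linalg.eigvalsh(np.cov((nbr - nbr.mean(0)).T))
            kappa = max(lam[0] / (lam.sum() * r[i]), 1e-9)        # rough curvature proxy
            r_new = ((c1 * sigma_n / np.sqrt(eps * rho)
                      + c2 * sigma_n ** 2) / kappa) ** (1.0 / 3.0)
            r[i] = max(r_new, d[i, 1])           # never shrink below the nearest neighbor
    return r
```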
3. Big Delaunay ball method

For a "noise-free" point set, Amenta and Bern [AB99] proposed a Voronoi based method for estimating normals. For a given set of points $P \subset \mathbb{R}^3$, let Vor P and Del P denote the Voronoi diagram of P and its dual Delaunay triangulation, respectively, and denote the Voronoi cell of a point p by $V_p$. Amenta and Bern showed that the line through p and the farthest Voronoi vertex in $V_p$, called its pole, approximates the normal at p up to orientation. However, this property does not hold for noisy samples.

Dey and Goswami [DG04] extended the idea of poles to noisy samples. Call a ball Delaunay if its boundary circumscribes a Delaunay tetrahedron, or equivalently if it has its center at a Voronoi vertex v and radius $\|v - p\|$ where $v \in V_p$. By definition, the Delaunay balls are maximally empty. The Delaunay balls with poles at their centers are called polar balls. The observation of Amenta and Bern can be interpreted in terms of the polar balls as follows: if p is a sample point on the boundary of a polar ball B, the segment joining p and the center of B estimates the normal direction at p. Dey and Goswami observed that, under a reasonable noise model, certain Delaunay balls remain relatively big and can play the role of polar balls. This suggests an algorithm for estimating normals for noisy point clouds. Redefine the pole of a point $p \in P$ as the farthest vertex of its Voronoi cell whose dual Delaunay ball is big. As in the "noise-free" case, the normal line at p can be approximated by the line through p and its pole. We call this algorithm the Big Delaunay Ball algorithm, or BDB for short.

3.1 Algorithm

The key to the BDB algorithm is identifying the big Delaunay balls. They are identified by comparing their radii with the nearest neighbor distances of the incident samples. Specifically, for a point $p \in P$, let $\lambda_p$ denote the average distance from p to its five nearest neighbors in P. We call a Delaunay ball big if its radius is larger than $c\lambda_p$ for at least one of its incident points $p \in P$, where c is a user-defined parameter. A small value of c makes the algorithm sensitive to noise, since small Delaunay balls get identified as big. On the other hand, a large value of c causes fewer Delaunay balls to be marked as big; as a result, more points have no big Delaunay ball incident on them and hence no normal can be estimated at these points. We fix c = 2.5 in our experiments, which yields the best normal estimation for all the models. After we obtain the estimated normal lines, we adopt the same method as Hoppe et al. [HDD*92] to orient them consistently. The entire algorithm is described in Figure 1.

    BDB(P, c)
      compute Del P
      for each point p in P
        compute lambda_p
        for each Delaunay ball incident on p
          if its radius r > c * lambda_p then mark it as big
        endfor
      endfor
      for each point p incident to a big Delaunay ball
        find the farthest such Voronoi vertex v in V_p
        report the line through p and v as the normal line
      endfor
      orient the normals consistently

Figure 1: Algorithm BDB.
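The following Python sketch mirrors the steps of Figure 1 using SciPy's Delaunay triangulation (the circumcenters of the tetrahedra are the Voronoi vertices). It is an illustration under simplifying assumptions, not the CGAL-based implementation used for the experiments, and it omits the orientation step.

```python
import numpy as np
from scipy.spatial import Delaunay, cKDTree

def circumcenters(points, tets):
    """Circumcenter of each Delaunay tetrahedron (its dual Voronoi vertex).
    Assumes the tetrahedra are non-degenerate."""
    a, b, c, d = (points[tets[:, j]] for j in range(4))
    A = np.stack([b - a, c - a, d - a], axis=1)               # (m, 3, 3)
    rhs = 0.5 * np.stack([(b**2 - a**2).sum(1),
                          (c**2 - a**2).sum(1),
                          (d**2 - a**2).sum(1)], axis=1)
    return np.linalg.solve(A, rhs[:, :, None])[:, :, 0]

def bdb_normal_lines(points, c=2.5, k=5):
    """Unoriented normal directions via big Delaunay balls (a sketch of BDB)."""
    tree = cKDTree(points)
    lam = tree.query(points, k=k + 1)[0][:, 1:].mean(axis=1)  # avg dist to k neighbors
    tets = Delaunay(points).simplices
    centers = circumcenters(points, tets)
    radii = np.linalg.norm(points[tets[:, 0]] - centers, axis=1)
    big = radii > c * lam[tets].min(axis=1)     # big for at least one incident point
    normals = {}
    for tet, ctr, r, is_big in zip(tets, centers, radii, big):
        if not is_big:
            continue
        for p in tet:                           # keep the farthest big ball per point
            if p not in normals or r > normals[p][1]:
                d = ctr - points[p]
                normals[p] = (d / np.linalg.norm(d), r)
    return {p: n for p, (n, _) in normals.items()}
```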
3.2 Justification

The justification of the BDB algorithm is given by a claim in [DS05]. To understand the claim, one needs the definition of the local feature size lfs() for a smooth surface Σ: for any point $x \in \Sigma$, lfs(x) is the distance of x to the medial axis of Σ [AB99]. Assume that P is a noisy sample of Σ satisfying the locally uniform ε-sampling conditions; we refer the reader to [DS05] for the precise definition. Roughly, this sampling condition means that each point of Σ has a sample point within a small factor (given by ε) of the local feature size, and also that the sample points cannot cluster together arbitrarily.

Claim 1 ([DS05]) Let $p \in P$ be incident to a Delaunay ball with center c and radius r, and let $r > \frac{1}{5}\,\mathrm{lfs}(\tilde{p})$ where $\tilde{p}$ is the point of Σ closest to p. Then, the acute angle between the normal line to Σ at $\tilde{p}$ and the line through p and c is O(ε) for sufficiently small ε > 0.

Notice that some sample points may have no big Delaunay ball incident on them, and hence no normal can be estimated at these points. However, Claim 2, proved in [DG04], shows that there are sufficiently many big Delaunay balls, and hence sufficiently many sample points, to estimate the normals of the surface almost everywhere.

Claim 2 ([DG04]) For each point $x \in \Sigma$, there is a Delaunay ball containing a medial axis point inside and a sample point on its boundary within O(ε) distance from x.

Claim 1 and Claim 2 together justify the BDB method.

4. Comparison

In this section, we compare the big Delaunay ball (BDB) method with the WPF and APF methods. Since the BDB method does not estimate a normal at every point of the point cloud, we compare the estimated normals only at those points where the BDB method produces a normal. Notice that normal estimation at only a subset of the input points is not a serious restriction for the BDB method, as the normals can be interpolated, for example with Gaussian-weighted interpolation [AK04, DGS05], to obtain normals at the other points.

4.1 Experimental setup

For comparing the methods, ideally we would measure the deviation of the estimated normals from the "true" surface normals. But for point cloud data we often do not know the sampled surface. We compensate for this shortcoming by computing a set of referential normals as described below. For experiments with noise, we obtain noisy data by adding noise to the original point cloud data. The x, y and z components of the noise are independent and uniformly distributed; their amplitudes (noise levels) are controlled by a factor as described later. For the referential normals, we first compute a surface from the original data (presumably without noise) with a surface reconstruction software called COCONE [COC]. Then the average of the normals of the triangles incident to a vertex p in this surface is taken as the referential normal at p. The other surfaces we consider are parametric algebraic surfaces; the true normals of these surfaces are computed numerically, the normal at each point being the cross product of two tangent vectors. We define the error of an estimated normal at a point p as the angle (in radians) between the referential normal and the estimated normal. Obviously, the smaller the error, the better the normal estimation.
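For reference, the error measure can be computed as below; we take the acute angle between the two normal lines so that a consistent but flipped orientation does not dominate the error (an assumption on our part).

```python
import numpy as np

def normal_error(n_est, n_ref):
    """Angle in radians between an estimated and a referential normal,
    treating both as unoriented normal lines."""
    n_est = n_est / np.linalg.norm(n_est)
    n_ref = n_ref / np.linalg.norm(n_ref)
    return np.arccos(np.clip(abs(np.dot(n_est, n_ref)), 0.0, 1.0))
```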
4.2 Noisy data

In our experiments, we choose the noise level with respect to both a global and a local scale. For the global scale, we take the amplitude of the noise to be a factor of the largest side of an axis-parallel bounding box of the point cloud. Since this global yardstick is large, the factor needs to be small so that the point cloud remains reasonable for reconstruction after perturbation; the factors we experiment with are 0, 0.005, 0.01 and one larger value. The global scale for perturbations is perhaps closer to reality. For the local scale, we use the average distance of a point p to its five nearest neighbors, and p is perturbed by a factor of this distance; the four factors 0, 0.5, 1 and 2 are considered in the experiments. In the APF method, we increase the value of the parameter $\sigma_n$ as the noise level increases.

We use three data sets, TORUS, BIGHAND and MAX-PLANCK, for the perturbations with noise. To give an idea of the perturbations, see the rendered point clouds of BIGHAND in Figure 4 at different noise levels.
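A sketch of how such perturbations can be generated, assuming that "amplitude" means the half-width of the uniform distribution (the paper does not spell this out):

```python
import numpy as np
from scipy.spatial import cKDTree

def perturb(points, factor, scale="global", seed=0):
    """Add component-wise uniform noise at a global or local scale (sketch).
    global: amplitude = factor * longest side of the axis-parallel bounding box
    local : amplitude = factor * average distance to the 5 nearest neighbors"""
    rng = np.random.default_rng(seed)
    if scale == "global":
        amp = np.full(len(points), factor * np.ptp(points, axis=0).max())
    else:
        d, _ = cKDTree(points).query(points, k=6)       # self plus 5 neighbors
        amp = factor * d[:, 1:].mean(axis=1)
    return points + rng.uniform(-1.0, 1.0, points.shape) * amp[:, None]
```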
Table 1 and Table 2 report the errors of the three normal estimation methods on these three point clouds at the different noise levels; they list the mean error (in radians), the standard deviation and the timing (in seconds) for BDB, WPF and APF on TORUS, BIGHAND and MAX-PLANCK.

Table 1: Normal estimation errors (in radians) under noise levels determined with respect to the global scale.

Table 2: Normal estimation errors (in radians) under noise levels determined with respect to the local scale.

We make several observations from these experimental data; the average errors are also plotted in Figure 2. First of all, when the noise level is low, all three methods estimate normals well, as indicated by small mean errors and small standard deviations. The WPF and BDB methods perform almost comparably; in general, WPF gives the best estimation when the noise level is low. As the noise level increases, the BDB method gives relatively better performance. In general, both WPF and BDB perform better than the APF method, the difference being more pronounced at larger noise levels.

Figure 2: Normal estimation comparison under noise levels with respect to the global scale (upper row) and the local scale (bottom row).

4.3 Special cases

Besides the noisy point clouds, we experiment with the normal estimation methods on a couple of special point clouds. In one case the point cloud samples the surface very unevenly, and in the other the point cloud samples a "thin" surface, i.e., a surface with high curvature in some areas. These special point clouds are obtained by sampling some parametric surfaces in a particular way. We obtain the first special point cloud, TORUS, by sampling a torus with dense samples along the equator line, as the leftmost picture in the first row of Figure 3 shows. We also sample a part of a single-sheet hyperboloid so that the points line up along some curves, as shown in Figure 3. This point cloud, HYPERBOL, also serves as an example of a surface with boundary.
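One way to produce such a skewed sampling, in the spirit of the TORUS data set (the exact construction used in the paper is not specified):

```python
import numpy as np

def skewed_torus(n=20000, R=1.0, r=0.3, spread=0.5, seed=0):
    """Samples of a torus concentrated near its equator line (poloidal angle ~ 0);
    a hypothetical construction mimicking the TORUS data set."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(0.0, 2.0 * np.pi, n)        # angle around the symmetry axis
    v = rng.normal(0.0, spread, n)              # poloidal angle, clustered near 0
    x = (R + r * np.cos(v)) * np.cos(u)
    y = (R + r * np.cos(v)) * np.sin(u)
    z = r * np.sin(v)
    return np.column_stack([x, y, z])
```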
For both TORUS and HYPERBOL, the BDB method works better than the other two methods, as Figure 3 shows. The reason can be explained as follows. All three methods use some k nearest neighbors, for some value of k, to estimate the normals. While WPF and APF use these neighbors to fit a plane, BDB uses them only to estimate the local sampling density. When points line up along a curve on the surface, APF and WPF try to fit a plane through the points along that curve, since the k nearest neighbors lie along it; consequently, the fitted plane deviates considerably from the true tangent plane. Of course, if k is chosen large enough the problem goes away, but it then becomes a serious issue for the user to supply an appropriate k, and a single k may not be appropriate everywhere on the surface. The bad estimation of normals along the equator line of TORUS and almost everywhere on HYPERBOL by APF and WPF is a result of this pronounced dependency on k. We kept k = 40 (the default value in PointShop3D) for the WPF method and k = 15 (the value suggested in [MNG04]) for the APF method. The BDB method, on the other hand, does not depend so sensitively on the choice of k; it only needs the average distance to some k local neighbors, and we kept k = 5 in the experiments. The arrangement of points along specific directions, which is not rare in scanning processes, is therefore not so harmful for the BDB method. In summary, BDB does not rely on local information as much as APF and WPF do.

In another special class of point clouds, we sample a very thin ellipsoid; see the leftmost picture in the third row of Figure 3. For a point in the high curvature region, the neighboring points in the point cloud do not lie close to a plane, and hence the fitting plane computed by the WPF or APF method cannot approximate the tangent plane properly at those points. Of course, this problem can be attributed to the poor sampling density in the high curvature regions, which again is not uncommon in scanning processes. However, the BDB method is not so sensitive to this relatively poor sampling, as Figure 3 shows.

In the above examples we exaggerate the special structure of the point clouds for illustration purposes; however, these special cases do occur in real data to a certain degree.

Figure 3: The first column shows the point clouds of TORUS, HYPERBOL and ELLIPSOID. We take the normal estimation error as a grey-scale color for each point and render the surfaces with Gouraud shading; the darker the surface, the better the normal estimation. Columns two, three and four show the normal estimation results of the BDB, WPF and APF methods respectively.

4.4 Timings

Tables 1 and 2 show the timings for the three methods. Clearly, the BDB method is the slowest. The main reason is that it applies a Delaunay triangulation procedure to the entire point cloud, while the other two methods can operate very locally. The very globality that makes BDB work better than the other two methods in the special cases also makes it slower. However, we must mention that we used CGAL 2.3 [CGA] for computing the Delaunay triangulations; recent versions of CGAL are faster, and we plan to test the timings on these versions in the future. On reasonable state-of-the-art PCs, the Delaunay triangulation takes time on the order of minutes for point clouds with several hundred thousand points and hours for point clouds with millions of points [DGH01]. Therefore, timings do not become a prohibitive issue for point clouds up to this range.

4.5 Summary

We summarize our observations from the experiments as follows. When the noise level is low and the point cloud samples the surface more or less evenly, all three methods perform almost equally well, though WPF gives the best results. When the noise level is relatively high, or the sampling is skewed along some curves or is not dense enough at thin parts, BDB works best. In general, if the size of the point cloud is no more than a few million points (about 5 million), BDB is the safer choice when one does not have specific knowledge about the quality of the input; otherwise, WPF or APF should be preferred.

5. Applications

In earlier work, normal estimation with numerical techniques has been used for several applications [AA03, PKKG03]. In this section we show some example applications where the BDB method can be used effectively. The first is point cloud rendering: we feed the point cloud together with the estimated normals to PointShop3D and use the so-called "OpenGL preview" technique to render the point cloud directly. Figure 4 shows the rendering results for BIGHAND at different noise levels.

Dey and Sun [DS05] define a smooth MLS (moving least squares) surface, called the adaptive MLS (AMLS) surface, based on a set of points with normals that may contain noise. All sample points can be projected onto an approximation of this surface using a Newton projection method. Once all sample points are projected, one can use any existing reconstruction algorithm to reconstruct the surface; here we use the COCONE software [COC], which can reconstruct surfaces with or without boundary. In Figure 5, we show the results for three different point clouds: MAX-PLANCK, HYPERSHEET and LUDWIQ. The left column shows the reconstruction results before the point cloud is smoothed and the right column shows the reconstruction results after smoothing.
Figure 4: The first hand from the left is the rendering result for the original smooth point cloud. The second, third and fourth hands (from left to right) are the rendering results for the point clouds with noise levels (local scale) 0.5, 1 and 2 respectively.

Figure 5: Smooth reconstruction from noisy point clouds.

Acknowledgements. We acknowledge the support of the Army Research Office, USA (grant DAAD) and the NSF, USA (grants DMS and CCR).

References

[AA03] ADAMSON A., ALEXA M.: Ray tracing point set surfaces. In Proceedings of Shape Modeling International (2003).

[AB99] AMENTA N., BERN M.: Surface reconstruction by Voronoi filtering. Discr. Comput. Geom. 22 (1999).

[ABCO*01] ALEXA M., BEHR J., COHEN-OR D., FLEISHMAN S., LEVIN D., SILVA C.: Point set surfaces. In Proc. IEEE Visualization (2001).

[AK04] AMENTA N., KIL Y. J.: Defining point-set surfaces. In Proceedings of ACM SIGGRAPH 2004 (Aug. 2004), ACM Press.

[BC00] BOISSONNAT J. D., CAZALS F.: Smooth surface reconstruction via natural neighbor interpolation of distance functions. In Proc. 16th Annu. Sympos. Comput. Geom. (2000).

[CGA] CGAL: CGAL library.

[COC] COCONE: tamaldey. The Ohio State University.

[DG04] DEY T. K., GOSWAMI S.: Provable surface reconstruction from noisy samples. In Proc. 20th Annu. Sympos. Comput. Geom. (2004).

[DGH01] DEY T. K., GIESEN J., HUDSON J.: Delaunay based shape reconstruction from large data. In Proc. IEEE Sympos. Parallel and Large Data Visualization and Graphics (2001).

[DGS05] DEY T. K., GOSWAMI S., SUN J.: Extremal surface based projections converge and reconstruct with isotopy. Technical Report OSU-CISRC-05-TR25 (April 2005); also available from the authors' web pages.

[DS05] DEY T. K., SUN J.: An adaptive MLS surface for reconstruction with guarantees. Technical Report OSU-CISRC-05-TR26 (April 2005); also available from the authors' web pages.
[HDD*92] HOPPE H., DEROSE T., DUCHAMP T., MCDONALD J., STUETZLE W.: Surface reconstruction from unorganized points. In Proceedings of ACM SIGGRAPH 1992 (1992), vol. 26.

[MNG04] MITRA N. J., NGUYEN A., GUIBAS L.: Estimating surface normals in noisy point cloud data. Internat. J. Comput. Geom. & Applications (2004), to appear.

[PKKG03] PAULY M., KEISER R., KOBBELT L., GROSS M.: Shape modeling with point-sampled geometry. In Proceedings of ACM SIGGRAPH 2003 (2003), ACM Press.