Fall 2011
Andrew U. Frank, October 23, 2011
TU Wien, Department of Geoinformation and Cartography, Gusshausstrasse 27-29/E127.1, A-1040 Vienna, Austria

Part I
Introduction

1 The data producer's perspective

Current approaches to geo data quality are pushed by the producers of geo data, primarily the National Mapping Agencies (NMA), to communicate the standards they maintain [nsds, morrison] and to coordinate necessary specifications with other agencies, which are potential users of their data [fcdic]. Some NMAs make their data available against substantial fees and use data quality arguments to try to convince potential users that the data are of high quality and that the high prices are therefore justified [some german/austrian publication?].

Data producers, like producers of other goods, claim that their product is of high quality. Unlike material goods, which are produced for a particular use, data can be used for many purposes - this is what GIS, previously called the multipurpose cadastre [], is all about [wisconsin paper]. For a physical good, e.g. a pair of scissors for cutting hair, it is relatively clear what requirements a user has and what 'high quality' means; but note that a pair of scissors for cutting paper is not of high quality if one intends to cut hair (or the reverse!). I would interpret 'high quality' as: fulfills the user's requirements for the intended purpose. But what does 'high quality' mean for a good whose use is not yet defined and thus not yet known? High quality for what? Here the problem of geo data quality starts!

The producer knows the production method, the instruments used, etc., and understands their effect on the quality of the data collected. A surveyor can describe the statistical deviations from true values for the coordinates of the points determined, assuming a normal distribution and indicating the mean and standard deviation of the results. But precision of location is not the only aspect of geo data quality: geo data quality has multiple aspects. Besides the precision of locations, it obviously matters when the data was collected, which themes are included, how they are coded, etc.

In the mid 1980s tentative lists of the data quality aspects that need to be included were published [chrisman, frank 1986]. They differentiated between precision and resolution for the three components of geo data: geometry, thematic data, and time [sinton]. As a cop-out aspect, lineage was added, under which data producers should describe where the data originated and how it was produced and treated. These views, more or less reworked, are the state of the art in today's data quality standards [refs].

<<insert a matrix: precision resolution // geometry, time, theme>>

Only later was it discovered that these aspects are not orthogonal to each other [frank?]. For example, spatial and temporal precision are hard to separate completely - an uncertain location with a sharp time stamp cannot be differentiated from a certain location with an uncertain time stamp (fig.). This is just an application of Sinton's generic description of geographic data with three aspects (location, time, theme), of which one is fixed, one varies as the independent variable, and the last is the dependent variable.

Practical progress in reporting data quality for data sets is slow, despite the publication of standards. Hunter started a systematic investigation [ref], which revealed not only many missing indications but often uninformative values. What should a user do with a description of geometric precision as 'varying'? Not much more is learned from 'precision between 2 m and 10 km'.
I take this as indicating that the practitioners among the data producers know that users hardly ever consult the metadata, and that the data quality values in the metadata hardly ever help the user decide whether to use a dataset or not. This is confirmed by studies of user behavior [ann boin], which reveal what other information users actually rely on when deciding whether to use a dataset.

The separation of the producer's point of view from the perspective of the user, introduced by Timpf [], helped research out of this impasse and stagnation. It posed a number of new questions for research: how to describe the user's requirements, and how to connect the producer's descriptions of data quality with the user's requirements. These questions are the major driving force and provide the guideline for the presentation in this course. The practical goal of geo data quality research should be an operational connection between the data quality description from the producer's perspective - which we know how to do - and the user's decision whether a geo data set is useful for him and should be acquired and used.
2 The user's perspective

If we consider the user's perspective on data quality, we have to ask why a user would acquire a dataset and how data quality affects this decision. It is obvious that data which is not useful for the user will not be acquired - but what does 'not useful' mean in this context? To answer the question why a potential user would acquire some data, we have to look into the user's situation.

2.1 Data serve only in decision situations

When does a user need data? The only use of data is to improve decisions - this is the only use of data! A user will therefore consider acquiring data only when he needs it to make a decision, i.e. in a specific situation, not out of some generic need to know. The modern, highly distributed methods of decision making in corporations and public administration produce many situations where potential decision-makers ask for data, but the request is always related to some possible decision situation. The decision not to act is a decision as well; decision-makers typically ask for information to help them first decide whether an action on their part is necessary, and often no further action is observable - meaning that the decision was not to act.

2.2 Model of decision making

A model of a decision is required for a formal analysis: a decision is a choice between different alternative actions, represented as a_1, a_2, ..., a_n. A person decides between the alternatives such that the outcome of the selected action promises to be the best, most advantageous outcome for him. In his mind, the outcome of each action a_i is the transformation of the current state s_0 into a new state s_i; the states s_i are evaluated by a valuation function v, which produces for each state s_i the corresponding value v_i. The action which corresponds to the highest value v_i is the most advantageous and is therefore selected.
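The decision model above can be sketched in a few lines of code. This is a minimal illustration, not part of the original text: the actions, the outcome mapping, and the valuation function are invented example values.

```python
# Sketch of the decision model: each action a_i transforms the current
# state s_0 into a state s_i, a valuation function v scores each state,
# and the action with the highest value v_i is selected. All names and
# numbers are invented for illustration.

def choose_action(actions, outcome, valuation):
    """Return the action whose predicted outcome has the highest value."""
    return max(actions, key=lambda a: valuation(outcome(a)))

# Example: three ways to travel; the state s_i is the arrival time in minutes.
outcomes = {"highway": 25, "city": 40, "bike": 55}   # imagined states s_i
valuation = lambda minutes: -minutes                  # earlier is better

best = choose_action(outcomes.keys(), outcomes.get, valuation)
print(best)  # -> highway
```

The argmax over v_i is all the model requires; bounded rationality (discussed next) only weakens the assumption that `outcomes` and `valuation` are known exactly.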
<<figure>>

Note that we do not assume the user knows exactly what state follows from an action and how he will value that state after execution. The concept of bounded rationality introduced by Herb Simon [] posits only that the decision maker has some idea of what the outcome will be and some image of the value of this outcome. From experience we all know that we are sometimes very limited in what we know: we select actions because we erroneously imagine an outcome that never materializes, and we are disappointed when we recognize our error in expecting a specific outcome, or our error in valuing an outcome we had imagined much nicer than what is actually achieved.
Figure 1: Model for decision making; without information, the maximum of the v_i (i = 1, 2, or 3) is chosen. Information is acquired if v_c, the expected value achieved with information, is larger than the v_i.

2.3 Role of information in decision making

Assume a decision maker with the alternatives a_1, a_2, ..., a_n as before, but with the additional choice of acquiring some data d which contains information relevant to the decision (Fig. 2). When should the data d be acquired? Let us label the alternatives executed after acquiring the data d with primes: a'_1, a'_2, ..., a'_n, to which outcomes s'_i with valuations v'_i belong (Fig. 2). Given the additional information the user has, neither the outcomes nor the valuations are necessarily the same as the ones he would expect without the acquired information.

A rational decision maker will again select the action among a_1, a_2, ..., a_n and a'_1, a'_2, ..., a'_n which gives the best value. The apparent value of the information is its contribution to improving the decision, i.e. the difference between the maximum of the values v'_i and the maximum of the values v_i. The acquisition of the data was worthwhile if the maximum of the values v'_i, say v'_m, is larger than the maximum of the values v_i, say v_m; a rational user should be ready to pay the difference between v'_m and v_m.

With the assumption of bounded rationality, one must actually include an additional compound decision a_c, which is the action of acquiring the data and then selecting the best decision; the value v_c expected for this initially, before acquiring the data, enters the assessment of the willingness to pay for acquiring data as v_c - v_m. The real value of the data is revealed only after the fact, when the actions are carried out and the real outcome of the decisions becomes known. The effect of acquiring data is often (only) a reduction of risk in a decision, which must be counted as a positive contribution.
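The value-of-information argument above can be made concrete with a small numerical sketch; all the values v_i and v'_i below are invented.

```python
# Sketch of the value of information: without the data d the decision
# maker expects values v_i for the alternatives; with the data, the
# primed alternatives have values v'_i. The apparent value of the data
# is v'_m - v_m, the improvement of the best achievable value.
# All numbers are invented for illustration.

values_without = {"a1": 10.0, "a2": 14.0, "a3": 9.0}      # v_i
values_with = {"a1'": 12.0, "a2'": 21.0, "a3'": 11.0}     # v'_i

v_m = max(values_without.values())        # best value without the data
v_m_prime = max(values_with.values())     # best value with the data

# A rational user should pay at most this much for the data.
willingness_to_pay = v_m_prime - v_m
print(willingness_to_pay)  # -> 7.0
```

Note that this is the value apparent before acting; the real value of the data is only revealed once the chosen action is carried out.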
Figure 2: Model for decision making; after the acquisition of information, the improvement of the decision through the information can be evaluated.

3 Model of data quality from a user perspective

3.1 When is data correct (from a user perspective)?

Correctness of data is the pinnacle of data quality. When is data correct? Data producers have discussed this much, and standards state what deviation from values obtained by re-measurement of higher quality is acceptable - often quite arbitrary rules, dictated by practicability, the available resources of an agency, etc. If we take the perspective of the user, the answer is relatively easy: data is correct if it leads to decisions for actions which can be carried out and have the expected results. Some simplistic examples: a railway timetable entry is correct if it leads us to catch the desired train, i.e. if we arrive at the station before the indicated time we are able to catch the respective train; navigation instructions are correct if they can be followed (i.e. they do not lead to actions prohibited by the driving rules established by law) and lead to the desired goal, i.e. we reach the destination.

This definition of correctness of data from a user perspective does not require that the data give a true description of reality, as is sometimes demanded, but only that the effect of deviations from a true description does not influence the decision substantially - meaning that no other decision would be better if the data were better. This leads to an understanding of the value of data, and indirectly of the quality of data, always related to a specific decision situation. It hints at a reduced need for quality in the data: a lack of correctness in the data only affects a decision if another decision would be better than the one selected based on the erroneous data; given that for a decision we seldom have many options, only data good enough to help us avoid selecting the wrong alternative is necessary. This means that approximate data and heuristic methods for decision making are sufficient to select among the few alternatives one has in reality. It is meaningless to ask for data quality from a user perspective without considering a specific decision situation.

3.2 Quality of a decision

Assume a decision situation where the optimal decision is ã_m and the decision with the available information is a_m; the value of the information is the improvement of the decision, and the degraded available information is thus just the difference in value between ã_m and a_m less valuable than the perfect one. Consider decision making as a function d from some input data values d_i to a decision (a_i, v_i):

(a_i, v_i) = d(d_i)

Applying ideas from adjustment computations to this decision function, one posits that the optimal decision ã_m results from correct values for each input data element. In consequence, the contribution of the deviation of each data element from its correct value can be computed - assuming that the deviations are not large, linearization of the function d is permitted.

The data quality of a data element is then derived from the contribution it makes to the correctness of the decision. We can compare the decision made with information d_i to the decision we would make with no particular information d_0 (the absence of additional information is just a particular case of erroneous information). Comparing the corresponding values indicates what contribution this data makes to the decision and what a rational decision maker would be willing to pay for it. [my paper]

4 Summary

Data quality is not unlike the quality of other products: producers claim 'high quality', meaning that the data are produced with high quality inputs and carefully arranged operations under permanent control, and finally checked against exacting standards.
What sounds very similar to material production is complicated by the fact that defining the dimensions on which to measure data - quantity as well as quality - is considerably more complex than for material goods. Measuring the quantity of data you receive from a source is far more complicated, and no widely accepted consensus on how to do it exists - it is definitely not as easy as weighing a bag of potatoes. Measuring the quality is equally difficult, and not comparable to the non-trivial but standardized measurement of the starch content of said bag of potatoes (some industries pay for potatoes according to their starch content, which I consider here a quality attribute of potatoes). We have also seen differences between material goods and data - e.g. data are non-rival, multipurpose, experience goods - which affect how quality for data differs from quality descriptions for material goods.

Considering decision making as a function from data to outcomes shows how the effect of data and data quality on a decision can be analyzed; given that the deviations from correct values are small, linearization of the function is possible. The quality of the decision can then be calculated from the quality of the input data by applying Gauss' law of error propagation. This formula for deriving decision quality is in principle the desired method to translate the data quality description of the producer into the data quality relevant for the user. The restriction 'in principle' indicates that the assumption of normally distributed deviations, i.e. that the deviations from perfect quality can be described statistically with standard deviations, is not justified for all data quality aspects. Completeness, for example - technically described by omission and commission rates - needs other statistical methods. To gain some insight, we start with an ontological approach next.
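The linearization and error-propagation argument above can be sketched numerically. The decision-value function and all standard deviations below are invented for illustration, and the sketch assumes small, independent, normally distributed deviations of the inputs.

```python
import math

# Sketch of the argument: linearize the decision value v = f(d_1,...,d_n)
# around the (assumed) correct inputs with finite-difference partial
# derivatives, then apply Gauss' law of error propagation,
#     sigma_v = sqrt(sum_i (df/dd_i)^2 * sigma_i^2),
# to the standard deviations of the inputs. Function and numbers invented.

def decision_value(d):
    # hypothetical valuation of the decision made with inputs d[0], d[1]
    return 100.0 - 2.0 * d[0] ** 2 - 0.5 * d[1]

def partials(f, d, eps=1e-6):
    """Finite-difference partial derivatives of f at the point d."""
    grads = []
    for i in range(len(d)):
        bumped = list(d)
        bumped[i] += eps
        grads.append((f(bumped) - f(d)) / eps)
    return grads

def propagate(grads, sigmas):
    """Gauss error propagation for independent input deviations."""
    return math.sqrt(sum((g * s) ** 2 for g, s in zip(grads, sigmas)))

correct = [1.0, 4.0]   # assumed correct input values
sigmas = [0.2, 0.5]    # assumed standard deviations of the inputs

grads = partials(decision_value, correct)
sigma_v = propagate(grads, sigmas)
print(round(sigma_v, 3))  # -> 0.838
```

The partial derivatives play the role of the linearized decision function; exactly this step fails for quality aspects such as completeness, whose omission and commission rates are not captured by a standard deviation.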
Miscellaneous. Simone Freschi [email protected] Tommaso Gabriellini [email protected]. Università di Siena
Miscellaneous Simone Freschi [email protected] Tommaso Gabriellini [email protected] Head of Global MPS Capital Services SpA - MPS Group Università di Siena ini A.A. 2014-2015 1 / 1 A
Introduction. Background Knowledge. The Task. The Support the Work Setting Should Provide
Introduction Onboarding new employees is always a challenge. Everyone has a learning curve a gradual progression from initial level of understanding to mastery of a new role. Succeeding in a new workplace
PS engine. Execution
A Model-Based Approach to the Verication of Program Supervision Systems Mar Marcos 1 y, Sabine Moisan z and Angel P. del Pobil y y Universitat Jaume I, Dept. of Computer Science Campus de Penyeta Roja,
Chapter 4 Multi-Stage Interconnection Networks The general concept of the multi-stage interconnection network, together with its routing properties, have been used in the preceding chapter to describe
Overview. Essential Questions. Precalculus, Quarter 4, Unit 4.5 Build Arithmetic and Geometric Sequences and Series
Sequences and Series Overview Number of instruction days: 4 6 (1 day = 53 minutes) Content to Be Learned Write arithmetic and geometric sequences both recursively and with an explicit formula, use them
Oracle Database 10g: Building GIS Applications Using the Oracle Spatial Network Data Model. An Oracle Technical White Paper May 2005
Oracle Database 10g: Building GIS Applications Using the Oracle Spatial Network Data Model An Oracle Technical White Paper May 2005 Building GIS Applications Using the Oracle Spatial Network Data Model
Virtual Landmarks for the Internet
Virtual Landmarks for the Internet Liying Tang Mark Crovella Boston University Computer Science Internet Distance Matters! Useful for configuring Content delivery networks Peer to peer applications Multiuser
