Lecture 5,6 Linear Methods for Classification. Summary


Lecture 5,6: Linear Methods for Classification
Rice ELEC 697, Farnaz Koushanfar, Fall 2006

Summary
- Bayes classifiers
- Linear classifiers
- Linear regression of an indicator matrix
- Linear discriminant analysis (LDA)
- Logistic regression
- Separating hyperplanes
Reading: Chapter 4, ESL

Bayes Classifier
The marginal distribution of G is specified as a PMF p_G(g), g = 1, 2, ..., K. f_{X|G}(x | G=g) is the conditional distribution of X given G=g. The training set (x_i, g_i), i = 1, ..., N, consists of independent samples from the joint distribution f_{X,G}(x, g), where f_{X,G}(x, g) = p_G(g) f_{X|G}(x | G=g). The loss of predicting G* in place of G is L(G*, G). Classification goal: minimize the expected loss E_{X,G} L(G(X), G) = E_X ( E_{G|X} L(G(X), G) ).

Bayes Classifier (cont'd)
It suffices to minimize E_{G|X} L(G(X), G) pointwise for each X, so the optimal classifier is G(x) = argmin_g E_{G|X=x} L(g, G). Under 0-1 loss, this Bayes rule is also known as the rule of maximum a posteriori probability: G(x) = argmax_g Pr(G=g | X=x). Many classification algorithms estimate Pr(G=g | X=x) and then apply this Bayes classification rule.
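The Bayes rule above can be sketched directly once the priors and class-conditional densities are available. A minimal illustration (not from the lecture; the Gaussian densities and the `bayes_classify` helper are assumptions made for this example):

```python
import numpy as np
from scipy.stats import multivariate_normal

def bayes_classify(x, priors, means, covs):
    """Return argmax_g Pr(G=g | X=x) when each class density is Gaussian."""
    # The posterior is proportional to prior times class-conditional density.
    scores = [p * multivariate_normal.pdf(x, mean=m, cov=c)
              for p, m, c in zip(priors, means, covs)]
    return int(np.argmax(scores))

# Two classes in 2D with equal priors; the query point is closer to the class-1 mean.
priors = [0.5, 0.5]
means = [np.zeros(2), np.array([2.0, 2.0])]
covs = [np.eye(2), np.eye(2)]
print(bayes_classify(np.array([1.8, 1.5]), priors, means, covs))  # -> 1
```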

More About Linear Classification
Since the predictor G(x) takes values in a discrete set G, we can divide the input space into a collection of regions labeled according to the classification. For K classes (1, 2, ..., K), the fitted linear model for the k-th indicator response variable is f̂_k(x) = β̂_k0 + β̂_k^T x. The decision boundary between classes k and l is the set where f̂_k(x) = f̂_l(x), an affine set or hyperplane: {x : (β̂_k0 - β̂_l0) + (β̂_k - β̂_l)^T x = 0}. More generally, model a discriminant function δ_k(x) for each class, then classify x to the class with the largest value of δ_k(x).

Linear Decision Boundary
We require that some monotone transformation of δ_k, or of Pr(G=k | X=x), be linear in x; the decision boundaries are then the set of points with log-odds equal to zero. With probability π for class 1 and 1-π for class 2, apply the logit transformation: log[π/(1-π)] = β_0 + β^T x. Two popular methods that use log-odds in this way are linear discriminant analysis and linear logistic regression. Alternatively, explicitly model the boundary between the two classes as linear: for a two-class problem with a p-dimensional input space, this means modeling the decision boundary as a hyperplane. Two methods that use separating hyperplanes are the perceptron (Rosenblatt) and optimally separating hyperplanes (Vapnik).

Generalizing Linear Decision Boundaries
Expand the variable set X_1, ..., X_p by including squares and cross products, adding up to p(p+1)/2 additional variables. Linear decision boundaries in this augmented space correspond to quadratic boundaries in the original space.

Linear Regression of an Indicator Matrix
For K classes, define K indicators Y_k, k = 1, ..., K, with Y_k = 1 if G = k and Y_k = 0 otherwise, and collect them into an indicator response matrix.
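As an illustration of the expansion (a minimal sketch, not from the lecture; the `quadratic_expand` helper is hypothetical):

```python
import numpy as np
from itertools import combinations_with_replacement

def quadratic_expand(X):
    """Append the squares and cross products x_i * x_j (i <= j) as extra columns."""
    idx = combinations_with_replacement(range(X.shape[1]), 2)
    extra = [X[:, i] * X[:, j] for i, j in idx]
    return np.column_stack([X] + extra)

X = np.random.randn(5, 3)          # p = 3 original inputs
print(quadratic_expand(X).shape)   # (5, 9): p(p+1)/2 = 6 columns added
```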

Linear Regression of an Indicator Matrix (cont'd)
For N training points, form the N x K indicator response matrix Y, a matrix of 0's and 1's, and fit Ŷ = X(X^T X)^{-1} X^T Y. A new observation is classified as follows: compute the fitted output (a K-vector) f̂(x)^T = (1, x^T) B̂, identify the largest component, and classify accordingly: Ĝ(x) = argmax_{k in G} f̂_k(x). But how good is the fit? One can verify that Σ_{k in G} f̂_k(x) = 1 for any x, but the individual f̂_k(x) can be negative or larger than 1. We can also apply linear regression to a basis expansion h(x); as the size of the training set increases, we can adaptively add more basis functions.

Linear Regression - Drawback
Classes can be masked by others for K ≥ 3, especially for large K.
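A minimal sketch of indicator-matrix regression followed by the argmax rule (illustrative code, not the lecture's; labels are assumed to be integers 0, ..., K-1):

```python
import numpy as np

def fit_indicator_regression(X, g, K):
    """X: N x p inputs; g: length-N integer labels in {0, ..., K-1}."""
    X1 = np.column_stack([np.ones(len(X)), X])       # prepend the intercept column
    Y = np.eye(K)[g]                                 # N x K indicator response matrix
    B_hat, *_ = np.linalg.lstsq(X1, Y, rcond=None)   # (p+1) x K coefficient matrix
    return B_hat

def classify(B_hat, X):
    X1 = np.column_stack([np.ones(len(X)), X])
    F = X1 @ B_hat                                   # fitted K-vectors f_hat(x)
    return np.argmax(F, axis=1)                      # G_hat(x) = argmax_k f_hat_k(x)
```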

Linear Regression - Drawback
For large K and small p, masking can naturally occur. Example: the vowel recognition data viewed in a 2D subspace, with K = 11 classes and p = 10 dimensions.

Linear Regression and Projection *
A linear regression function (here in 2D) projects each point x = [x_1 x_2]^T onto a line parallel to w_1. We can study how well the projected points {z_1, z_2, ..., z_n}, viewed as functions of w_1, are separated across the classes.
* Slides courtesy of Tommi S. Jaakkola, MIT CSAIL

Projection and Classification
By varying w_1 we get different levels of separation between the projected points.

Optimizing the Projection
We would like to find the w_1 that maximizes the separation of the projected points across the classes. We can quantify the separation (or overlap) in terms of the means and variances of the resulting 1-D class distributions.

Fisher Linear Discriminant: Preliminaries
Class descriptions in R^d:
- Class 0: n_0 samples, mean μ_0, covariance Σ_0
- Class 1: n_1 samples, mean μ_1, covariance Σ_1
Projected class descriptions in R:
- Class 0: n_0 samples, mean μ_0^T w_1, variance w_1^T Σ_0 w_1
- Class 1: n_1 samples, mean μ_1^T w_1, variance w_1^T Σ_1 w_1

Fisher Linear Discriminant
Estimation criterion: find the w_1 that maximizes the Fisher criterion, the ratio of the squared difference between the projected class means to the total projected within-class variance. The resulting class separation is decision-theoretically optimal for two normal populations with equal covariances (Σ_1 = Σ_0).

Linear Discriminant Analysis (LDA)
Let π_k be the class prior Pr(G=k) and f_k(x) the density of X in class G=k. Bayes theorem gives Pr(G=k | X=x) = f_k(x) π_k / Σ_{l=1}^K f_l(x) π_l. Different choices for f_k lead to LDA, QDA, MDA (mixture DA), kernel DA, and naive Bayes. Suppose we model each class density as a multivariate Gaussian, f_k(x) = (2π)^{-p/2} |Σ_k|^{-1/2} exp( -(1/2)(x - μ_k)^T Σ_k^{-1} (x - μ_k) ). LDA is the case in which we assume the classes share a common covariance matrix, Σ_k = Σ for all k; it is then sufficient to look at the log-odds.
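A minimal sketch of the Fisher direction for two classes (illustrative, assuming unweighted within-class scatter; the helper name is hypothetical):

```python
import numpy as np

def fisher_direction(X0, X1):
    """X0, X1: samples from class 0 and class 1, each of shape (n_k, d), d >= 2."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    S_within = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(S_within, mu1 - mu0)   # direction maximizing the Fisher ratio
    return w / np.linalg.norm(w)

# The projections z_i = w^T x_i can then be compared across the two classes.
```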

LDA
With a common covariance matrix, the log-odds function is
log[ Pr(G=k | X=x) / Pr(G=l | X=x) ] = log(π_k/π_l) - (1/2)(μ_k + μ_l)^T Σ^{-1} (μ_k - μ_l) + x^T Σ^{-1} (μ_k - μ_l),
which is linear in x. The implied decision boundary between classes k and l, the set where Pr(G=k | X=x) = Pr(G=l | X=x), is therefore linear in x; in p dimensions it is a hyperplane. Example: three classes and p = 2.

LDA (cont'd)
In practice we do not know the parameters of the Gaussian distributions; we estimate them from the training set, where N_k is the number of class-k observations:
π̂_k = N_k / N
μ̂_k = Σ_{g_i = k} x_i / N_k
Σ̂ = Σ_{k=1}^K Σ_{g_i = k} (x_i - μ̂_k)(x_i - μ̂_k)^T / (N - K)
For two classes, this is much like linear regression.

QDA
If the Σ_k are not assumed equal, the quadratic terms in x remain and we get quadratic discriminant functions (QDA).
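A minimal sketch of the plug-in LDA estimates and the resulting linear discriminant functions δ_k(x) = x^T Σ̂^{-1} μ̂_k - (1/2) μ̂_k^T Σ̂^{-1} μ̂_k + log π̂_k (illustrative code, not from the lecture; labels are assumed to be 0, ..., K-1):

```python
import numpy as np

def fit_lda(X, g, K):
    """Estimate the priors, class means, and pooled covariance."""
    N = len(X)
    priors = np.array([(g == k).mean() for k in range(K)])          # pi_hat_k = N_k / N
    means = np.array([X[g == k].mean(axis=0) for k in range(K)])    # mu_hat_k
    Sigma = sum((X[g == k] - means[k]).T @ (X[g == k] - means[k])
                for k in range(K)) / (N - K)                        # pooled covariance
    return priors, means, Sigma

def lda_predict(X, priors, means, Sigma):
    Sinv = np.linalg.inv(Sigma)
    # delta_k(x) = x^T Sinv mu_k - 0.5 mu_k^T Sinv mu_k + log(pi_k), for all k at once
    delta = X @ Sinv @ means.T - 0.5 * np.sum(means @ Sinv * means, axis=1) + np.log(priors)
    return np.argmax(delta, axis=1)
```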

QDA (cont'd)
The estimates are similar to LDA, but each class has its own covariance matrix. For large p this is a dramatic increase in parameters: LDA has (K-1)(p+1) parameters, while QDA has (K-1){1 + p(p+3)/2}. LDA and QDA both work remarkably well in practice. This is not because the data are Gaussian; rather, for simple decision boundaries the Gaussian estimates are stable (a bias-variance trade-off).

Regularized Discriminant Analysis
A compromise between LDA and QDA: shrink the separate covariances of QDA towards a common covariance (similar to ridge regression).
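A minimal sketch of the shrinkage step, written as the convex combination Σ̂_k(α) = α Σ̂_k + (1-α) Σ̂ used in ESL (illustrative; α = 1 recovers QDA and α = 0 recovers LDA):

```python
import numpy as np

def rda_covariances(class_covs, pooled_cov, alpha):
    """class_covs: list of K (p x p) covariance estimates; pooled_cov: (p x p); alpha in [0, 1]."""
    return [alpha * S_k + (1.0 - alpha) * pooled_cov for S_k in class_covs]
```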

Example - RDA

Computations for LDA
Suppose we compute the eigendecomposition of each Σ̂_k, i.e. Σ̂_k = U_k D_k U_k^T, where U_k is p x p orthonormal and D_k is a diagonal matrix of positive eigenvalues d_kl. Then
(x - μ̂_k)^T Σ̂_k^{-1} (x - μ̂_k) = [U_k^T (x - μ̂_k)]^T D_k^{-1} [U_k^T (x - μ̂_k)]
log|Σ̂_k| = Σ_l log d_kl
The LDA classifier can be implemented by sphering the data with respect to the common covariance Σ̂ = U D U^T: X* ← D^{-1/2} U^T X. The common covariance estimate of X* is then the identity. Classify to the closest class centroid in the transformed space, modulo the effect of the class prior probabilities π_k.
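A minimal sketch of sphering followed by prior-adjusted nearest-centroid classification (illustrative; the helper names are hypothetical):

```python
import numpy as np

def sphere(X, Sigma_hat):
    """Return X* whose rows are D^{-1/2} U^T x, where Sigma_hat = U D U^T."""
    d, U = np.linalg.eigh(Sigma_hat)
    return X @ (U / np.sqrt(d))                 # columns of U scaled by d_l^{-1/2}

def nearest_centroid_with_priors(X_star, centroids_star, priors):
    """Inputs already live in the sphered space; centroids_star has shape (K, p)."""
    d2 = ((X_star[:, None, :] - centroids_star[None, :, :]) ** 2).sum(axis=2)
    return np.argmin(0.5 * d2 - np.log(priors), axis=1)
```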

Background: Simple Decision Theory *
Suppose we know the class-conditional densities p(x|y) for y = 0, 1 as well as the overall class frequencies P(y). How do we decide which class a new example x belongs to so as to minimize the overall probability of error?
* Courtesy of Tommi S. Jaakkola, MIT CSAIL

2-Class Logistic Regression
The optimal decisions are based on the posterior class probabilities P(y|x). For binary classification problems, we can write these decisions as ŷ = sign( log[ P(y=1|x) / P(y=0|x) ] ). We generally don't know P(y|x), but we can parameterize the possible decisions according to ŷ = sign(β_0 + β^T x).

2-Class Logistic Regression (cont'd)
Our log-odds model, log[ P(y=1|x) / P(y=0|x) ] = β_0 + β^T x, gives rise to a specific form for the conditional probability over the labels (the logistic model): P(y=1|x) = σ(β_0 + β^T x), where σ(z) = 1/(1 + e^{-z}) is a logistic squashing function that turns linear predictions into probabilities.
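A minimal sketch of the squashing function and the resulting posterior model (illustrative code only):

```python
import numpy as np

def sigmoid(z):
    """Logistic squashing function mapping linear scores to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def posterior_class1(X, beta0, beta):
    """P(y=1|x) = sigma(beta_0 + beta^T x), evaluated row-wise."""
    return sigmoid(beta0 + X @ beta)

# Predict class 1 whenever the posterior exceeds 1/2, i.e. the linear score is positive.
```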

2-Class Logistic Regression: Decisions
Logistic regression models imply a linear decision boundary.

K-Class Logistic Regression
The model is specified in terms of K-1 log-odds or logit transformations (reflecting the constraint that the probabilities sum to one). The choice of denominator class is arbitrary, typically the last class:
log[ Pr(G=1 | X=x) / Pr(G=K | X=x) ] = β_10 + β_1^T x
log[ Pr(G=2 | X=x) / Pr(G=K | X=x) ] = β_20 + β_2^T x
...
log[ Pr(G=K-1 | X=x) / Pr(G=K | X=x) ] = β_(K-1)0 + β_(K-1)^T x

K-Class Logistic Regression (cont'd)
A simple calculation shows that
Pr(G=k | X=x) = exp(β_k0 + β_k^T x) / (1 + Σ_{l=1}^{K-1} exp(β_l0 + β_l^T x)), k = 1, ..., K-1,
Pr(G=K | X=x) = 1 / (1 + Σ_{l=1}^{K-1} exp(β_l0 + β_l^T x)).
To emphasize the dependence on the entire parameter set θ = {β_10, β_1, ..., β_(K-1)0, β_(K-1)}, we denote the probabilities as Pr(G=k | X=x) = p_k(x; θ).

Fitting Logistic Regression Models
For two classes, with x including the constant term,
logit P(x) = log[ P(x) / (1 - P(x)) ] = η(x) = β^T x
log-likelihood = Σ_{i=1}^N { y_i log p_i + (1 - y_i) log(1 - p_i) } = Σ_{i=1}^N { y_i β^T x_i - log(1 + e^{β^T x_i}) }
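A minimal sketch of recovering the K class probabilities from the K-1 logit parameterization (illustrative; class K is the reference class):

```python
import numpy as np

def kclass_probs(x, B0, B):
    """B0: (K-1,) intercepts; B: (K-1, p) coefficients. Returns the length-K vector p_k(x; theta)."""
    scores = B0 + B @ x                              # beta_k0 + beta_k^T x, k = 1..K-1
    expo = np.exp(scores)
    denom = 1.0 + expo.sum()
    return np.append(expo / denom, 1.0 / denom)      # last entry is Pr(G=K | X=x)
```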

Fitting Logistic Regression Models (cont'd)
Iteratively reweighted least squares (IRLS) is equivalent to the Newton-Raphson procedure:
1. Initialize β.
2. Form the linearized response z_i = x_i^T β + (y_i - p_i) / (p_i (1 - p_i)).
3. Form the weights w_i = p_i (1 - p_i).
4. Update β by weighted least squares of z on x with weights w_i.
Steps 2-4 are repeated until convergence.
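A minimal sketch of the IRLS loop for two classes (illustrative code, assuming the design matrix already contains the intercept column and y takes values in {0, 1}):

```python
import numpy as np

def fit_logistic_irls(X, y, n_iter=25, tol=1e-8):
    """X: N x (p+1) design matrix with a leading column of ones; y: 0/1 responses."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        p = 1.0 / (1.0 + np.exp(-eta))                # current fitted probabilities
        w = p * (1.0 - p)                             # IRLS weights
        z = eta + (y - p) / np.maximum(w, 1e-12)      # linearized (working) response
        beta_new = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * z))  # weighted LS of z on x
        step = np.max(np.abs(beta_new - beta))
        beta = beta_new
        if step < tol:                                # stop once the update is negligible
            break
    return beta
```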

Example - Logistic Regression
South African Heart Disease: coronary risk-factor study (CORIS) baseline survey, carried out in three rural areas; white males between 15 and 64. Response: presence or absence of myocardial infarction. Maximum likelihood fit (coefficient table omitted).

Logistic Regression or LDA?
For LDA, the log-posterior odds between classes k and K are linear in x; this linearity is a consequence of the Gaussian assumption for the class densities together with the assumption of a common covariance matrix. The logistic model assumes linear log-odds by construction, so the two models use the same form for the logit function.

Logistic Regression or LDA? (cont'd)
Discriminative vs. informative learning: logistic regression uses the conditional distribution of Y given x to estimate the parameters, while LDA uses the full joint distribution (assuming normality). If normality holds, LDA is up to 30% more efficient; otherwise, logistic regression can be more robust. In practice the two methods behave similarly.

Separating Hyperplanes
Perceptrons compute a linear combination of the input features and return the sign. For the hyperplane L = {x : f(x) = β_0 + β^T x = 0}:
- For x_1, x_2 in L, β^T (x_1 - x_2) = 0, so β* = β/||β|| is the unit normal to the surface L.
- For any x_0 in L, β^T x_0 = -β_0.
- The signed distance of any point x to L is β*^T (x - x_0) = (1/||β||)(β^T x + β_0) = f(x)/||f'(x)||.
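A minimal sketch of the signed-distance computation (illustrative code only):

```python
import numpy as np

def signed_distance(X, beta0, beta):
    """Signed distance of each row of X to the hyperplane {x : beta_0 + beta^T x = 0}."""
    return (beta0 + X @ beta) / np.linalg.norm(beta)
```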

Rosenblatt's Perceptron Learning Algorithm
Finds a separating hyperplane by minimizing the distance of misclassified points to the decision boundary. If a response y_i = 1 is misclassified, then x_i^T β + β_0 < 0, and the opposite holds for a misclassified point with y_i = -1. The goal is therefore to minimize D(β, β_0) = -Σ_{i in M} y_i (x_i^T β + β_0), where M indexes the misclassified points.

Rosenblatt's Perceptron Learning Algorithm (cont'd)
Stochastic gradient descent: the misclassified observations are visited in some sequence and the parameters updated as (β, β_0) ← (β, β_0) + ρ (y_i x_i, y_i). Here ρ is the learning rate, which can be taken to be 1 without loss of generality. It can be shown that, when the classes are linearly separable, the algorithm converges to a separating hyperplane in a finite number of steps.
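A minimal sketch of the perceptron updates (illustrative; labels are assumed to be in {-1, +1} and ρ = 1):

```python
import numpy as np

def perceptron(X, y, n_epochs=100, rho=1.0):
    """Cycle through the data, updating on misclassified points until none remain."""
    beta = np.zeros(X.shape[1])
    beta0 = 0.0
    for _ in range(n_epochs):
        updated = False
        for xi, yi in zip(X, y):
            if yi * (xi @ beta + beta0) <= 0:   # misclassified (or on the boundary)
                beta = beta + rho * yi * xi     # gradient step on -y_i (x_i^T beta + beta_0)
                beta0 = beta0 + rho * yi
                updated = True
        if not updated:                         # every point correctly classified
            break
    return beta, beta0
```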

Optimal Separating Hyperplanes
Problem: among all hyperplanes that separate the two classes, find the one that maximizes the margin to the closest training points from either class.

Example - Optimal Separating Hyperplanes
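For reference, a standard way to write this optimization problem (this formulation follows ESL; the slide itself does not reproduce it) is:

```latex
\max_{\beta,\,\beta_0,\,\|\beta\| = 1} \; M
\quad \text{subject to} \quad
y_i \left( x_i^{T}\beta + \beta_0 \right) \ \ge \ M, \qquad i = 1, \dots, N .
```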
