Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering


Yoshua Bengio, Jean-François Paiement, Pascal Vincent, Olivier Delalleau, Nicolas Le Roux and Marie Ouimet
Département d'Informatique et Recherche Opérationnelle
Université de Montréal
Montréal, Québec, Canada, H3C 3J7

Abstract

Several unsupervised learning algorithms based on an eigendecomposition provide either an embedding or a clustering only for given training points, with no straightforward extension for out-of-sample examples short of recomputing eigenvectors. This paper provides a unified framework for extending Local Linear Embedding (LLE), Isomap, Laplacian Eigenmaps, Multi-Dimensional Scaling (for dimensionality reduction) as well as for Spectral Clustering. This framework is based on seeing these algorithms as learning eigenfunctions of a data-dependent kernel. Numerical experiments show that the generalizations performed have a level of error comparable to the variability of the embedding algorithms due to the choice of training data.

1 Introduction

Many unsupervised learning algorithms have recently been proposed, all using an eigendecomposition to obtain a lower-dimensional embedding of data lying on a non-linear manifold: Local Linear Embedding (LLE) (Roweis and Saul, 2000), Isomap (Tenenbaum, de Silva and Langford, 2000) and Laplacian Eigenmaps (Belkin and Niyogi, 2003). There are also many variants of Spectral Clustering (Weiss, 1999; Ng, Jordan and Weiss, 2002), in which such an embedding is an intermediate step before obtaining a clustering of the data that can capture flat, elongated and even curved clusters. The two tasks (manifold learning and clustering) are linked because the clusters found by spectral clustering can be arbitrary curved manifolds (as long as there is enough data to locally capture their curvature).

2 Common Framework

In this paper we consider five types of unsupervised learning algorithms that can be cast in the same framework, based on the computation of an embedding for the training points obtained from the principal eigenvectors of a symmetric matrix.

Algorithm 1

1. Start from a data set D = {x_1, ..., x_n} with n points in R^d. Construct an n x n neighborhood or similarity matrix M. Let us denote K_D(., .) (or K for shorthand) the data-dependent function which produces M by M_ij = K_D(x_i, x_j).

2. Optionally transform M, yielding a normalized matrix M~. Equivalently, this corresponds to generating M~ from a K~_D by M~_ij = K~_D(x_i, x_j).

3. Compute the m largest positive eigenvalues lambda_k and eigenvectors v_k of M~.

4. The embedding of each example x_i is the vector y_i with y_ik the i-th element of the k-th principal eigenvector v_k of M~. Alternatively (MDS and Isomap), the embedding is e_i, with e_ik = sqrt(lambda_k) y_ik. If the first m eigenvalues are positive, then e_i . e_j is the best approximation of M~_ij using only m coordinates, in the squared error sense.

In the following, we consider the specializations of Algorithm 1 for the different unsupervised learning algorithms. Let S_i be the i-th row sum of the affinity matrix M:

    S_i = \sum_j M_{ij}.    (1)

We say that two points (a, b) are k-nearest-neighbors of each other if a is among the k nearest neighbors of b in D union {a} or vice-versa. We denote by x_ij the j-th coordinate of the vector x_i.

2.1 Multi-Dimensional Scaling

Multi-Dimensional Scaling (MDS) starts from a notion of distance or affinity K that is computed between each pair of training examples. We consider here metric MDS (Cox and Cox, 1994). For the normalization step 2 in Algorithm 1, these distances are converted to equivalent dot products using the double-centering formula:

    \tilde{M}_{ij} = -\frac{1}{2}\left( M_{ij} - \frac{1}{n} S_i - \frac{1}{n} S_j + \frac{1}{n^2} \sum_k S_k \right).    (2)

The embedding e_ik of example x_i is given by sqrt(lambda_k) v_ik.

2.2 Spectral Clustering

Spectral clustering (Weiss, 1999) can yield impressively good results where traditional clustering looking for "round blobs" in the data, such as K-means, would fail miserably. It is based on two main steps: first embedding the data points in a space in which clusters are more obvious (using the eigenvectors of a Gram matrix), and then applying a classical clustering algorithm such as K-means, e.g. as in (Ng, Jordan and Weiss, 2002). The affinity matrix M is formed using a kernel such as the Gaussian kernel. Several normalization steps have been proposed. Among the most successful ones, as advocated in (Weiss, 1999; Ng, Jordan and Weiss, 2002), is the following:

    \tilde{M}_{ij} = \frac{M_{ij}}{\sqrt{S_i S_j}}.    (3)

To obtain m clusters, the first m principal eigenvectors of M~ are computed and K-means is applied on the unit-norm coordinates, obtained from the embedding y_ik = v_ik.
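As an illustration, steps 2-4 of Algorithm 1 in the MDS case (double-centering of eq. (2), then scaled principal eigenvectors) can be sketched in a few lines of NumPy; the function name and the toy data below are ours, not from the paper:

```python
import numpy as np

def mds_embedding(D2, m=2):
    """Metric MDS: double-center the squared-distance matrix (eq. 2),
    then scale the m principal eigenvectors by sqrt(lambda_k)."""
    n = D2.shape[0]
    S = D2.sum(axis=1)                               # row sums S_i
    Mt = -0.5 * (D2 - S[:, None] / n - S[None, :] / n + S.sum() / n**2)
    lam, V = np.linalg.eigh(Mt)                      # ascending eigenvalues
    top = np.argsort(lam)[::-1][:m]                  # m largest (assumed positive)
    return V[:, top] * np.sqrt(lam[top])             # e_ik = sqrt(lambda_k) v_ik

# toy check: three collinear points are recovered up to sign/shift
X = np.array([[0.0], [1.0], [3.0]])
D2 = (X - X.T) ** 2                                  # squared Euclidean distances
E = mds_embedding(D2, m=1)
```

For Euclidean input distances the double-centered matrix is the Gram matrix of the centered points, so this one-dimensional embedding reproduces the original pairwise distances exactly.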
2.3 Laplacian Eigenmaps

Laplacian Eigenmaps is a recently proposed dimensionality reduction procedure (Belkin and Niyogi, 2003) that has also been proposed for semi-supervised learning. The authors use an approximation of the Laplacian operator such as the Gaussian kernel or the matrix whose element (i, j) is 1 if x_i and x_j are k-nearest-neighbors and 0 otherwise. Instead of an ordinary eigenproblem, the following generalized eigenproblem is solved:

    (S - M) v_j = \lambda_j S v_j    (4)

with eigenvalues lambda_j, eigenvectors v_j and S the diagonal matrix with entries S_i given by eq. (1). The smallest eigenvalue is left out and the eigenvectors corresponding to the other small eigenvalues are used for the embedding. This is the same embedding that is computed with the spectral clustering algorithm from (Shi and Malik, 1997). As noted in (Weiss, 1999) (Normalization Lemma 1), an equivalent result (up to a componentwise scaling of the embedding) can be obtained by considering the principal eigenvectors of the normalized matrix defined in eq. (3).
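A minimal sketch of the shared embedding of sections 2.2-2.3 (Gaussian affinities, the normalization of eq. (3), principal eigenvectors, unit-norm rows before K-means); the function name, data and parameter values are illustrative assumptions:

```python
import numpy as np

def spectral_embedding(X, sigma=1.0, m=2):
    """Gaussian affinities, the normalization of eq. (3), then the m
    principal eigenvectors; rows are renormalized to unit norm before
    K-means, as in (Ng, Jordan and Weiss, 2002)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    M = np.exp(-sq / (2.0 * sigma ** 2))             # Gaussian kernel affinities
    S = M.sum(axis=1)                                # row sums, eq. (1)
    Mt = M / np.sqrt(S[:, None] * S[None, :])        # eq. (3)
    lam, V = np.linalg.eigh(Mt)
    Y = V[:, np.argsort(lam)[::-1][:m]]              # y_ik = v_ik
    return Y / np.linalg.norm(Y, axis=1, keepdims=True)

# two well-separated blobs: embedded rows agree within a cluster and
# are (nearly) orthogonal across clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (5, 2)), rng.normal(10.0, 0.1, (5, 2))])
Y = spectral_embedding(X, sigma=1.0, m=2)
```

With well-separated clusters the normalized affinity matrix is nearly block diagonal, so the principal eigenvectors act as (smoothed) cluster indicators, which is what makes the final K-means step easy.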

2.4 Isomap

Isomap (Tenenbaum, de Silva and Langford, 2000) generalizes MDS to non-linear manifolds. It is based on replacing the Euclidean distance by an approximation of the geodesic distance on the manifold. We define the geodesic distance with respect to a data set D, a distance d(u, v) and a neighborhood k as follows:

    D(a, b) = \min_p \sum_i d(p_i, p_{i+1})    (5)

where p is a sequence of points of length l >= 2 with p_1 = a, p_l = b, p_i in D for i in {2, ..., l - 1} and (p_i, p_{i+1}) are k-nearest-neighbors. The length l is free in the minimization. The Isomap algorithm obtains the normalized matrix M~ from which the embedding is derived by transforming the raw pairwise distances matrix as follows: first compute the matrix M_ij = D^2(x_i, x_j) of squared geodesic distances with respect to the data D, then apply to this matrix the distance-to-dot-product transformation (eq. (2)), as for MDS. As in MDS, the embedding is e_ik = sqrt(lambda_k) v_ik rather than y_ik = v_ik.

2.5 LLE

The Local Linear Embedding (LLE) algorithm (Roweis and Saul, 2000) looks for an embedding that preserves the local geometry in the neighborhood of each data point. First, a sparse matrix of local predictive weights W_ij is computed, such that sum_j W_ij = 1, W_ij = 0 if x_j is not a k-nearest-neighbor of x_i, and ||sum_j W_ij x_j - x_i||^2 is minimized. Then the matrix

    M = (I - W)^\top (I - W)    (6)

is formed. The embedding is obtained from the lowest eigenvectors of M, except for the eigenvector with the smallest eigenvalue, which is uninteresting because it is (1, 1, ..., 1), with eigenvalue 0. To fit Algorithm 1, note that the lowest eigenvectors of M are the largest eigenvectors of M~_mu = mu I - M (the use of mu > 0 will be discussed in section 4.4). The embedding is given by y_ik = v_ik, and is constant with respect to mu.

3 From Eigenvectors to Eigenfunctions

To obtain an embedding for a new data point, we propose to use the Nystrom formula (eq. (9)) (Baker, 1977), which has been used successfully to speed up kernel methods computations by focussing the heavier computations (the eigendecomposition) on a subset of examples.
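The graph distance of eq. (5) can be sketched as a k-nearest-neighbor graph followed by all-pairs shortest paths; a plain Floyd-Warshall pass is used here for brevity (the choice of shortest-path algorithm and the toy data are ours, not the paper's):

```python
import numpy as np

def geodesic_distances(X, k=2):
    """Eq. (5): connect k-nearest-neighbors with Euclidean edge
    lengths, then take all-pairs shortest paths through the data
    points (Floyd-Warshall relaxation)."""
    n = len(X)
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    G = np.full((n, n), np.inf)
    np.fill_diagonal(G, 0.0)
    for i in range(n):
        for j in np.argsort(d[i])[1:k + 1]:          # k nearest neighbors of x_i
            G[i, j] = G[j, i] = d[i, j]              # "or vice-versa": symmetrize
    for p in range(n):                               # relax paths through point p
        G = np.minimum(G, G[:, p:p + 1] + G[p:p + 1, :])
    return G

# four points on a quarter circle: the graph distance between the two
# endpoints follows the arc (a sum of chords), not the straight chord
theta = np.deg2rad([0.0, 30.0, 60.0, 90.0])
X = np.c_[np.cos(theta), np.sin(theta)]
G = geodesic_distances(X, k=2)
```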
The use of this formula can be justified by considering the convergence of eigenvectors and eigenvalues as the number of examples increases (Baker, 1977; Williams and Seeger, 2000; Koltchinskii and Giné, 2000; Shawe-Taylor and Williams, 2003). Intuitively, the extensions to obtain the embedding for a new example require specifying a new column of the Gram matrix M, through a training-set-dependent kernel function K_D, in which one of the arguments may be required to be in the training set. If we start from a data set D, obtain an embedding for its elements, and add more and more data, the embedding for the points in D converges (for eigenvalues that are unique). (Shawe-Taylor and Williams, 2003) give bounds on the convergence error (in the case of kernel PCA). In the limit, we expect each eigenvector to converge to an eigenfunction of the linear operator defined below, in the sense that the i-th element of the k-th eigenvector converges to the application of the k-th eigenfunction to x_i (up to a normalization factor).

Consider a Hilbert space H_p of functions with inner product <f, g>_p = integral of f(x) g(x) p(x) dx, with a density function p(x). Associate with kernel K a linear operator K_p in H_p:

    (K_p f)(x) = \int K(x, y) f(y) p(y) \, dy.    (7)

We don't know the true density p, but we can approximate the above inner product and linear operator (and its eigenfunctions) using the empirical distribution p-hat. An empirical Hilbert space is thus defined using p-hat instead of p. Note that the proposition below can be applied even if the kernel is not positive semi-definite, although the embedding algorithms we have studied are restricted to using the principal coordinates associated with positive eigenvalues. For a more rigorous mathematical analysis, see (Bengio et al., 2003).

Proposition 1 Let K(a, b) be a kernel function, not necessarily positive semi-definite, that gives rise to a symmetric matrix M with entries M_ij = K(x_i, x_j) upon a data set D = {x_1, ..., x_n}. Let (v_k, lambda_k) be an (eigenvector, eigenvalue) pair that solves M v_k = lambda_k v_k. Let (f_k, lambda'_k) be an (eigenfunction, eigenvalue) pair that solves the empirical eigenproblem (K_{p-hat} f_k)(x) = lambda'_k f_k(x) for any x, with p-hat the empirical distribution over D. Let e_k(x) = y_k(x) sqrt(lambda_k) or y_k(x) denote the embedding associated with a new point x. Then

    \lambda'_k = \frac{1}{n} \lambda_k    (8)

    f_k(x) = \frac{\sqrt{n}}{\lambda_k} \sum_{i=1}^n v_{ik} K(x, x_i)    (9)

    f_k(x_i) = \sqrt{n} \, v_{ik}    (10)

    y_k(x) = \frac{f_k(x)}{\sqrt{n}} = \frac{1}{\lambda_k} \sum_{i=1}^n v_{ik} K(x, x_i)    (11)

    y_k(x_i) = y_{ik}, \qquad e_k(x_i) = e_{ik}.    (12)

See (Bengio et al., 2003) for a proof and further justifications of the above formulae. The generalized embedding for Isomap and MDS is e_k(x) = sqrt(lambda_k) y_k(x), whereas the one for spectral clustering, Laplacian eigenmaps and LLE is y_k(x).

Proposition 2 In addition, if the data-dependent kernel K_D is positive semi-definite, then

    f_k(x) = \frac{\sqrt{n}}{\sqrt{\lambda_k}} \pi_k(x)

where pi_k(x) is the k-th component of the kernel PCA projection of x obtained from the kernel K_D (up to centering).

This relation with kernel PCA (Schölkopf, Smola and Müller, 1998), already pointed out in (Williams and Seeger, 2000), is further discussed in (Bengio et al., 2003).

4 Extending to new Points

Using Proposition 1, one obtains a natural extension of all the unsupervised learning algorithms mapped to Algorithm 1, provided we can write down a kernel function K~ that gives rise to the matrix M~ on D, and can be used in eq. (11) to generalize the embedding. We consider each of them in turn below. In addition to the convergence properties discussed in section 3, another justification for using equation (9) is given by the following proposition:

Proposition 3 If we define the f_k(x_i) by eq. (10) and take a new point x, the value of f_k(x) that minimizes

    \sum_{i=1}^n \left( K(x, x_i) - \sum_{t=1}^m \lambda'_t f_t(x) f_t(x_i) \right)^2    (13)

is given by eq. (9), for m >= 1 and any k <= m.

The proof is a direct consequence of the orthogonality of the eigenvectors v_k. This proposition links equations (9) and (10). Indeed, we can obtain eq. (10) when trying to approximate K at the data points by minimizing the cost

    \sum_{i,j=1}^n \left( K(x_i, x_j) - \sum_{t=1}^m \lambda'_t f_t(x_i) f_t(x_j) \right)^2

for m = 1, 2, ... When we add a new point x, it is thus natural to use the same cost to approximate the K(x, x_i), which yields (13). Note that by doing so, we do not seek to approximate K(x, x). Future work should investigate embeddings which minimize the empirical reconstruction error of K but ignore the diagonal contributions.

4.1 Extending MDS

For MDS, a normalized kernel can be defined as follows, using a continuous version of the double-centering eq. (2):

    \tilde{K}(a, b) = -\frac{1}{2}\left( d^2(a, b) - E_x[d^2(x, b)] - E_{x'}[d^2(a, x')] + E_{x,x'}[d^2(x, x')] \right)    (14)

where d(a, b) is the original distance and the expectations are taken over the empirical data D. An extension of metric MDS to new points has already been proposed in (Gower, 1968), solving exactly for the embedding of x to be consistent with its distances to training points, which in general requires adding a new dimension.

4.2 Extending Spectral Clustering and Laplacian Eigenmaps

Both the version of Spectral Clustering and Laplacian Eigenmaps described above are based on an initial kernel K, such as the Gaussian or nearest-neighbor kernel. An equivalent normalized kernel is:

    \tilde{K}(a, b) = \frac{1}{n} \frac{K(a, b)}{\sqrt{E_x[K(a, x)] \, E_{x'}[K(b, x')]}}

where the expectations are taken over the empirical data D.

4.3 Extending Isomap

To extend Isomap, the test point is not used in computing the geodesic distance between training points, otherwise we would have to recompute all the geodesic distances. A reasonable solution is to use the definition of D(a, b) in eq. (5), which uses only the training points as intermediate points on the path from a to b. We obtain a normalized kernel by applying the continuous double-centering of eq. (14) with d = D. A formula has already been proposed (de Silva and Tenenbaum, 2003) to approximate Isomap using only a subset of the examples (the "landmark" points) to compute the eigenvectors. Using our notations, this formula is

    e'_k(x) = \frac{1}{2\sqrt{\lambda_k}} \sum_i v_{ik} \left( E_{x'}[D^2(x', x_i)] - D^2(x_i, x) \right)    (15)

where E_{x'} is an average over the data set. The formula is applied to obtain an embedding for the non-landmark examples.

Corollary 1 The embedding proposed in Proposition 1 for Isomap (e_k(x)) is equal to formula (15) (Landmark Isomap) when K~(x, y) is defined as in eq. (14) with d = D.

Proof: the proof relies on a property of the Gram matrix for Isomap: sum_j M~_ij = 0, by construction. Therefore (1, 1, ..., 1) is an eigenvector with eigenvalue 0, and all the other eigenvectors v_k have the property sum_i v_ik = 0 because of the orthogonality with (1, 1, ..., 1). Writing E_{x'}[D^2(x', x_i)] - D^2(x_i, x) = 2 K~(x, x_i) + E_{x',x''}[D^2(x', x'')] - E_{x'}[D^2(x, x')] yields

    e'_k(x) = \frac{2}{2\sqrt{\lambda_k}} \sum_i v_{ik} \tilde{K}(x, x_i) + \frac{1}{2\sqrt{\lambda_k}} \left( E_{x',x''}[D^2(x', x'')] - E_{x'}[D^2(x, x')] \right) \sum_i v_{ik} = e_k(x),

since the last sum is 0.
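The Nyström extension of eq. (11), common to sections 4.1-4.3 once the appropriate normalized kernel K~ is chosen, can be sketched as follows; the check at the end illustrates eq. (12), namely that feeding a training point back in reproduces its in-sample coordinates (function and variable names are ours):

```python
import numpy as np

def nystrom_embedding(lam, V, K_new):
    """Eq. (11): y_k(x) = (1/lambda_k) sum_i v_ik K~(x, x_i), where
    (lam, V) are the kept eigenvalues/eigenvectors of the normalized
    training Gram matrix and K_new holds K~(x, x_i) for the new x's."""
    return (K_new @ V) / lam                          # one row per new point

# consistency check of eq. (12) on a toy PSD Gram matrix
rng = np.random.default_rng(0)
A = rng.normal(size=(6, 3))
K = A @ A.T                                           # positive semi-definite Gram
lam_all, V_all = np.linalg.eigh(K)
top = np.argsort(lam_all)[::-1][:2]                   # m = 2 principal pairs
lam, V = lam_all[top], V_all[:, top]
Y_out = nystrom_embedding(lam, V, K)                  # row i plays K~(x_i, .)
```

Because K v_k = lambda_k v_k, applying the formula to the training rows of K returns exactly the in-sample embedding y_ik = v_ik.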

4.4 Extending LLE

The extension of LLE is the most challenging one because it does not fit as well into the framework of Algorithm 1: the M matrix for LLE does not have a clear interpretation in terms of distance or dot product. An extension has been proposed in (Saul and Roweis, 2002), but unfortunately it cannot be cast directly into the framework of Proposition 1. Their embedding of a new point x is given by

    y_k(x) = \sum_{i=1}^n y_k(x_i) w(x, x_i)    (16)

where w(x, x_i) is the weight of x_i in the reconstruction of x by its k-nearest-neighbors in the training set (if x = x_j in D, w(x, x_i) = delta_ij). This is very close to eq. (11), but lacks the normalization by lambda_k. However, we can see this embedding as a limit case of Proposition 1, as shown below.

We first need to define a kernel K~_mu such that

    \tilde{K}_\mu(x_i, x_j) = \tilde{M}_{\mu,ij} = (\mu - 1)\delta_{ij} + W_{ij} + W_{ji} - \sum_k W_{ki} W_{kj}    (17)

for x_i, x_j in D. Let us define a kernel K' by K'(x_i, x) = K'(x, x_i) = w(x, x_i), and K'(x, y) = 0 when neither x nor y is in the training set D. Let K'' be defined by K''(x_i, x_j) = W_ij + W_ji - sum_k W_ki W_kj, and K''(x, y) = 0 when either x or y isn't in D. Then, by construction, the kernel K~_mu = (mu - 1) K' + K'' verifies eq. (17). Thus, we can apply eq. (11) to obtain an embedding of a new point x, which yields

    y_{\mu,k}(x) = \frac{1}{\lambda_k} \sum_i y_{ik} \left( (\mu - 1) K'(x, x_i) + K''(x, x_i) \right)

with lambda_k = (mu - lambda-hat_k), and lambda-hat_k being the k-th lowest eigenvalue of M. This rewrites into

    y_{\mu,k}(x) = \frac{\mu - 1}{\mu - \hat{\lambda}_k} \sum_i y_{ik} \, w(x, x_i) + \frac{1}{\mu - \hat{\lambda}_k} \sum_i y_{ik} \, K''(x, x_i).

Then, as mu goes to infinity, y_{mu,k}(x) tends to y_k(x) as defined by eq. (16). Since the choice of mu is free, we can thus consider eq. (16) as approximating the use of the kernel K~_mu with a large mu in Proposition 1. This is what we have done in the experiments described in the next section. Note however that we can find smoother kernels K~_mu verifying eq. (17), giving other extensions of LLE from Proposition 1.
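A sketch of the limiting extension of eq. (16): reconstruction weights w(x, x_i) computed as in (Saul and Roweis, 2002), then a weighted combination of the training embedding. The regularization constant, the function names and the stand-in "embedding" in the toy check are all illustrative assumptions:

```python
import numpy as np

def reconstruction_weights(x, neighbors, reg=1e-3):
    """Weights w(x, x_i): minimize ||x - sum_j w_j n_j||^2 subject to
    sum_j w_j = 1; `reg` is an assumed small regularizer that keeps
    the local Gram matrix invertible."""
    Z = neighbors - x                                # center neighborhood on x
    C = Z @ Z.T                                      # local Gram matrix
    C = C + reg * np.trace(C) * np.eye(len(C))
    w = np.linalg.solve(C, np.ones(len(C)))
    return w / w.sum()                               # enforce sum-to-one

def lle_extend(x, X_train, Y_train, k=3):
    """Eq. (16): y_k(x) = sum_i y_k(x_i) w(x, x_i), with the weights
    supported on the k nearest training neighbors of x."""
    nn = np.argsort(((X_train - x) ** 2).sum(axis=1))[:k]
    return reconstruction_weights(x, X_train[nn]) @ Y_train[nn]

# toy check: with a linear stand-in "embedding", an in-triangle point
# maps to (approximately) the same affine combination of its
# neighbors' images
X_train = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Y_train = X_train @ np.array([[2.0, 0.0], [0.0, 1.0]])
y_new = lle_extend(np.array([0.25, 0.25]), X_train, Y_train, k=3)
```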
It is beyond the scope of this paper to study which kernel is best for generalization, but it seems desirable to use a smooth kernel that would take into account not only the reconstruction of x by its neighbors x_i, but also the reconstruction of the x_i by their neighbors including the new point x.

5 Experiments

We want to evaluate whether the precision of the generalizations suggested in the previous section is comparable to the intrinsic perturbations of the embedding algorithms. The perturbation analysis will be achieved by considering splits of the data into three sets, D = F union R_1 union R_2, and training either with F union R_1 or F union R_2, comparing the embeddings on F. For each algorithm described in section 2, we apply the following procedure:

Figure 1: Training set variability minus out-of-sample error, with respect to the proportion of training samples substituted. Top left: MDS. Top right: spectral clustering or Laplacian eigenmaps. Bottom left: Isomap. Bottom right: LLE. Error bars are 95% confidence intervals.

1. We choose F subset of D with m = |F| samples. The remaining n - m samples in D \ F are split into two equal-size subsets R_1 and R_2. We train (obtain the eigenvectors) over F union R_1 and F union R_2. When eigenvalues are close, the estimated eigenvectors are unstable and can rotate in the subspace they span. Thus we estimate an affine alignment between the two embeddings using the points in F, and we calculate the Euclidean distance between the aligned embeddings obtained for each s_i in F.

2. For each sample s_i in F, we also train over {F union R_1} \ {s_i}. We apply the extension to out-of-sample points to find the predicted embedding of s_i and calculate the Euclidean distance between this embedding and the one obtained when training with F union R_1, i.e. with s_i in the training set.

3. We calculate the mean difference (and its standard error, shown in the figure) between the distance obtained in step 1 and the one obtained in step 2 for each sample s_i in F, and we repeat this experiment for various sizes of F.

The results obtained for MDS, Isomap, spectral clustering and LLE are shown in figure 1 for different values of m. Experiments are done over a database of 698 synthetic face images described by 4096 components. Qualitatively similar results have been obtained over other databases such as Ionosphere (mlearn/MLSummary.html) and swissroll (roweis/lle/). Each algorithm generates a two-dimensional embedding of the images, following the experiments reported for Isomap. The number of neighbors is 10 for Isomap and LLE, and a Gaussian kernel with a standard deviation of 0.01 is used for spectral clustering / Laplacian eigenmaps. 95% confidence intervals are drawn beside each mean difference of error on the figure.

As expected, the mean difference between the two distances is almost monotonically increasing as the fraction of substituted examples grows (x-axis in the figure). In most cases, the out-of-sample error is less than or comparable to the training set embedding stability: it corresponds to substituting a fraction of between 1 and 4% of the training examples.

6 Conclusions

In this paper we have presented an extension to five unsupervised learning algorithms based on a spectral embedding of the data: MDS, spectral clustering, Laplacian eigenmaps, Isomap and LLE. This extension allows one to apply a trained model to out-of-sample points without having to recompute eigenvectors. It introduces a notion of function induction and generalization error for these algorithms. The experiments on real high-dimensional data show that the average distance between the out-of-sample and in-sample embeddings is comparable to or lower than the variation in the in-sample embedding due to replacing a few points in the training set.

References

Baker, C. (1977). The numerical treatment of integral equations. Clarendon Press, Oxford.

Belkin, M. and Niyogi, P. (2003). Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6).

Bengio, Y., Vincent, P., Paiement, J., Delalleau, O., Ouimet, M., and Le Roux, N. (2003). Spectral clustering and kernel PCA are learning eigenfunctions. Technical report, Département d'informatique et recherche opérationnelle, Université de Montréal.

Cox, T. and Cox, M. (1994). Multidimensional Scaling. Chapman & Hall, London.

de Silva, V. and Tenenbaum, J. (2003). Global versus local methods in nonlinear dimensionality reduction. In Becker, S., Thrun, S., and Obermayer, K., editors, Advances in Neural Information Processing Systems, volume 15, Cambridge, MA. The MIT Press.

Gower, J. (1968). Adding a point to vector diagrams in multivariate analysis. Biometrika, 55(3).

Koltchinskii, V. and Giné, E. (2000). Random matrix approximation of spectra of integral operators. Bernoulli, 6(1).

Ng, A. Y., Jordan, M. I., and Weiss, Y. (2002). On spectral clustering: analysis and an algorithm. In Dietterich, T. G., Becker, S., and Ghahramani, Z., editors, Advances in Neural Information Processing Systems 14, Cambridge, MA. MIT Press.

Roweis, S. and Saul, L. (2000). Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500).

Saul, L. and Roweis, S. (2002). Think globally, fit locally: unsupervised learning of low dimensional manifolds. Journal of Machine Learning Research, 4.

Schölkopf, B., Smola, A., and Müller, K.-R. (1998). Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10.

Shawe-Taylor, J. and Williams, C. (2003). The stability of kernel principal components analysis and its relation to the process eigenspectrum. In Becker, S., Thrun, S., and Obermayer, K., editors, Advances in Neural Information Processing Systems, volume 15. The MIT Press.

Shi, J. and Malik, J. (1997). Normalized cuts and image segmentation. In Proc. IEEE Conf. Computer Vision and Pattern Recognition.

Tenenbaum, J., de Silva, V., and Langford, J. (2000). A global geometric framework for nonlinear dimensionality reduction. Science, 290(5500).

Weiss, Y. (1999). Segmentation using eigenvectors: a unifying view. In Proceedings IEEE International Conference on Computer Vision.

Williams, C. and Seeger, M. (2000). The effect of the input density distribution on kernel-based classifiers. In Proceedings of the Seventeenth International Conference on Machine Learning. Morgan Kaufmann.


More information

+ + + - - This circuit than can be reduced to a planar circuit

+ + + - - This circuit than can be reduced to a planar circuit MeshCurrent Method The meshcurrent s analog of the nodeoltage method. We sole for a new set of arables, mesh currents, that automatcally satsfy KCLs. As such, meshcurrent method reduces crcut soluton to

More information

How Sets of Coherent Probabilities May Serve as Models for Degrees of Incoherence

How Sets of Coherent Probabilities May Serve as Models for Degrees of Incoherence 1 st Internatonal Symposum on Imprecse Probabltes and Ther Applcatons, Ghent, Belgum, 29 June 2 July 1999 How Sets of Coherent Probabltes May Serve as Models for Degrees of Incoherence Mar J. Schervsh

More information

benefit is 2, paid if the policyholder dies within the year, and probability of death within the year is ).

benefit is 2, paid if the policyholder dies within the year, and probability of death within the year is ). REVIEW OF RISK MANAGEMENT CONCEPTS LOSS DISTRIBUTIONS AND INSURANCE Loss and nsurance: When someone s subject to the rsk of ncurrng a fnancal loss, the loss s generally modeled usng a random varable or

More information

Project Networks With Mixed-Time Constraints

Project Networks With Mixed-Time Constraints Project Networs Wth Mxed-Tme Constrants L Caccetta and B Wattananon Western Australan Centre of Excellence n Industral Optmsaton (WACEIO) Curtn Unversty of Technology GPO Box U1987 Perth Western Australa

More information

PSYCHOLOGICAL RESEARCH (PYC 304-C) Lecture 12

PSYCHOLOGICAL RESEARCH (PYC 304-C) Lecture 12 14 The Ch-squared dstrbuton PSYCHOLOGICAL RESEARCH (PYC 304-C) Lecture 1 If a normal varable X, havng mean µ and varance σ, s standardsed, the new varable Z has a mean 0 and varance 1. When ths standardsed

More information

Descriptive Models. Cluster Analysis. Example. General Applications of Clustering. Examples of Clustering Applications

Descriptive Models. Cluster Analysis. Example. General Applications of Clustering. Examples of Clustering Applications CMSC828G Prncples of Data Mnng Lecture #9 Today s Readng: HMS, chapter 9 Today s Lecture: Descrptve Modelng Clusterng Algorthms Descrptve Models model presents the man features of the data, a global summary

More information

Module 2 LOSSLESS IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur

Module 2 LOSSLESS IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur Module LOSSLESS IMAGE COMPRESSION SYSTEMS Lesson 3 Lossless Compresson: Huffman Codng Instructonal Objectves At the end of ths lesson, the students should be able to:. Defne and measure source entropy..

More information

Logistic Regression. Steve Kroon

Logistic Regression. Steve Kroon Logstc Regresson Steve Kroon Course notes sectons: 24.3-24.4 Dsclamer: these notes do not explctly ndcate whether values are vectors or scalars, but expects the reader to dscern ths from the context. Scenaro

More information

NPAR TESTS. One-Sample Chi-Square Test. Cell Specification. Observed Frequencies 1O i 6. Expected Frequencies 1EXP i 6

NPAR TESTS. One-Sample Chi-Square Test. Cell Specification. Observed Frequencies 1O i 6. Expected Frequencies 1EXP i 6 PAR TESTS If a WEIGHT varable s specfed, t s used to replcate a case as many tmes as ndcated by the weght value rounded to the nearest nteger. If the workspace requrements are exceeded and samplng has

More information

An Enhanced Super-Resolution System with Improved Image Registration, Automatic Image Selection, and Image Enhancement

An Enhanced Super-Resolution System with Improved Image Registration, Automatic Image Selection, and Image Enhancement An Enhanced Super-Resoluton System wth Improved Image Regstraton, Automatc Image Selecton, and Image Enhancement Yu-Chuan Kuo ( ), Chen-Yu Chen ( ), and Chou-Shann Fuh ( ) Department of Computer Scence

More information

SVM Tutorial: Classification, Regression, and Ranking

SVM Tutorial: Classification, Regression, and Ranking SVM Tutoral: Classfcaton, Regresson, and Rankng Hwanjo Yu and Sungchul Km 1 Introducton Support Vector Machnes(SVMs) have been extensvely researched n the data mnng and machne learnng communtes for the

More information

How To Understand The Results Of The German Meris Cloud And Water Vapour Product

How To Understand The Results Of The German Meris Cloud And Water Vapour Product Ttel: Project: Doc. No.: MERIS level 3 cloud and water vapour products MAPP MAPP-ATBD-ClWVL3 Issue: 1 Revson: 0 Date: 9.12.1998 Functon Name Organsaton Sgnature Date Author: Bennartz FUB Preusker FUB Schüller

More information

Design of Output Codes for Fast Covering Learning using Basic Decomposition Techniques

Design of Output Codes for Fast Covering Learning using Basic Decomposition Techniques Journal of Computer Scence (7): 565-57, 6 ISSN 59-66 6 Scence Publcatons Desgn of Output Codes for Fast Coverng Learnng usng Basc Decomposton Technques Aruna Twar and Narendra S. Chaudhar, Faculty of Computer

More information

An Alternative Way to Measure Private Equity Performance

An Alternative Way to Measure Private Equity Performance An Alternatve Way to Measure Prvate Equty Performance Peter Todd Parlux Investment Technology LLC Summary Internal Rate of Return (IRR) s probably the most common way to measure the performance of prvate

More information

DEFINING %COMPLETE IN MICROSOFT PROJECT

DEFINING %COMPLETE IN MICROSOFT PROJECT CelersSystems DEFINING %COMPLETE IN MICROSOFT PROJECT PREPARED BY James E Aksel, PMP, PMI-SP, MVP For Addtonal Informaton about Earned Value Management Systems and reportng, please contact: CelersSystems,

More information

320 The Internatonal Arab Journal of Informaton Technology, Vol. 5, No. 3, July 2008 Comparsons Between Data Clusterng Algorthms Osama Abu Abbas Computer Scence Department, Yarmouk Unversty, Jordan Abstract:

More information

How To Assemble The Tangent Spaces Of A Manfold Nto A Coherent Whole

How To Assemble The Tangent Spaces Of A Manfold Nto A Coherent Whole CHAPTER 7 VECTOR BUNDLES We next begn addressng the queston: how do we assemble the tangent spaces at varous ponts of a manfold nto a coherent whole? In order to gude the decson, consder the case of U

More information

When Network Effect Meets Congestion Effect: Leveraging Social Services for Wireless Services

When Network Effect Meets Congestion Effect: Leveraging Social Services for Wireless Services When Network Effect Meets Congeston Effect: Leveragng Socal Servces for Wreless Servces aowen Gong School of Electrcal, Computer and Energy Engeerng Arzona State Unversty Tempe, AZ 8587, USA xgong9@asuedu

More information

AN EFFECTIVE MATRIX GEOMETRIC MEAN SATISFYING THE ANDO LI MATHIAS PROPERTIES

AN EFFECTIVE MATRIX GEOMETRIC MEAN SATISFYING THE ANDO LI MATHIAS PROPERTIES MATHEMATICS OF COMPUTATION Volume, Number, Pages S 5-578(XX)- AN EFFECTIVE MATRIX GEOMETRIC MEAN SATISFYING THE ANDO LI MATHIAS PROPERTIES DARIO A. BINI, BEATRICE MEINI AND FEDERICO POLONI Abstract. We

More information

Solution: Let i = 10% and d = 5%. By definition, the respective forces of interest on funds A and B are. i 1 + it. S A (t) = d (1 dt) 2 1. = d 1 dt.

Solution: Let i = 10% and d = 5%. By definition, the respective forces of interest on funds A and B are. i 1 + it. S A (t) = d (1 dt) 2 1. = d 1 dt. Chapter 9 Revew problems 9.1 Interest rate measurement Example 9.1. Fund A accumulates at a smple nterest rate of 10%. Fund B accumulates at a smple dscount rate of 5%. Fnd the pont n tme at whch the forces

More information

Joe Pimbley, unpublished, 2005. Yield Curve Calculations

Joe Pimbley, unpublished, 2005. Yield Curve Calculations Joe Pmbley, unpublshed, 005. Yeld Curve Calculatons Background: Everythng s dscount factors Yeld curve calculatons nclude valuaton of forward rate agreements (FRAs), swaps, nterest rate optons, and forward

More information

We are now ready to answer the question: What are the possible cardinalities for finite fields?

We are now ready to answer the question: What are the possible cardinalities for finite fields? Chapter 3 Fnte felds We have seen, n the prevous chapters, some examples of fnte felds. For example, the resdue class rng Z/pZ (when p s a prme) forms a feld wth p elements whch may be dentfed wth the

More information

How To Calculate The Accountng Perod Of Nequalty

How To Calculate The Accountng Perod Of Nequalty Inequalty and The Accountng Perod Quentn Wodon and Shlomo Ytzha World Ban and Hebrew Unversty September Abstract Income nequalty typcally declnes wth the length of tme taen nto account for measurement.

More information

Quantization Effects in Digital Filters

Quantization Effects in Digital Filters Quantzaton Effects n Dgtal Flters Dstrbuton of Truncaton Errors In two's complement representaton an exact number would have nfntely many bts (n general). When we lmt the number of bts to some fnte value

More information

Brigid Mullany, Ph.D University of North Carolina, Charlotte

Brigid Mullany, Ph.D University of North Carolina, Charlotte Evaluaton And Comparson Of The Dfferent Standards Used To Defne The Postonal Accuracy And Repeatablty Of Numercally Controlled Machnng Center Axes Brgd Mullany, Ph.D Unversty of North Carolna, Charlotte

More information

Financial Mathemetics

Financial Mathemetics Fnancal Mathemetcs 15 Mathematcs Grade 12 Teacher Gude Fnancal Maths Seres Overvew In ths seres we am to show how Mathematcs can be used to support personal fnancal decsons. In ths seres we jon Tebogo,

More information

Using Series to Analyze Financial Situations: Present Value

Using Series to Analyze Financial Situations: Present Value 2.8 Usng Seres to Analyze Fnancal Stuatons: Present Value In the prevous secton, you learned how to calculate the amount, or future value, of an ordnary smple annuty. The amount s the sum of the accumulated

More information

THE DISTRIBUTION OF LOAN PORTFOLIO VALUE * Oldrich Alfons Vasicek

THE DISTRIBUTION OF LOAN PORTFOLIO VALUE * Oldrich Alfons Vasicek HE DISRIBUION OF LOAN PORFOLIO VALUE * Oldrch Alfons Vascek he amount of captal necessary to support a portfolo of debt securtes depends on the probablty dstrbuton of the portfolo loss. Consder a portfolo

More information

Institute of Informatics, Faculty of Business and Management, Brno University of Technology,Czech Republic

Institute of Informatics, Faculty of Business and Management, Brno University of Technology,Czech Republic Lagrange Multplers as Quanttatve Indcators n Economcs Ivan Mezník Insttute of Informatcs, Faculty of Busness and Management, Brno Unversty of TechnologCzech Republc Abstract The quanttatve role of Lagrange

More information

Nonbinary Quantum Error-Correcting Codes from Algebraic Curves

Nonbinary Quantum Error-Correcting Codes from Algebraic Curves Nonbnary Quantum Error-Correctng Codes from Algebrac Curves Jon-Lark Km and Judy Walker Department of Mathematcs Unversty of Nebraska-Lncoln, Lncoln, NE 68588-0130 USA e-mal: {jlkm, jwalker}@math.unl.edu

More information

Answer: A). There is a flatter IS curve in the high MPC economy. Original LM LM after increase in M. IS curve for low MPC economy

Answer: A). There is a flatter IS curve in the high MPC economy. Original LM LM after increase in M. IS curve for low MPC economy 4.02 Quz Solutons Fall 2004 Multple-Choce Questons (30/00 ponts) Please, crcle the correct answer for each of the followng 0 multple-choce questons. For each queston, only one of the answers s correct.

More information

where the coordinates are related to those in the old frame as follows.

where the coordinates are related to those in the old frame as follows. Chapter 2 - Cartesan Vectors and Tensors: Ther Algebra Defnton of a vector Examples of vectors Scalar multplcaton Addton of vectors coplanar vectors Unt vectors A bass of non-coplanar vectors Scalar product

More information

Optimal resource capacity management for stochastic networks

Optimal resource capacity management for stochastic networks Submtted for publcaton. Optmal resource capacty management for stochastc networks A.B. Deker H. Mlton Stewart School of ISyE, Georga Insttute of Technology, Atlanta, GA 30332, ton.deker@sye.gatech.edu

More information

NMT EE 589 & UNM ME 482/582 ROBOT ENGINEERING. Dr. Stephen Bruder NMT EE 589 & UNM ME 482/582

NMT EE 589 & UNM ME 482/582 ROBOT ENGINEERING. Dr. Stephen Bruder NMT EE 589 & UNM ME 482/582 NMT EE 589 & UNM ME 482/582 ROBOT ENGINEERING Dr. Stephen Bruder NMT EE 589 & UNM ME 482/582 7. Root Dynamcs 7.2 Intro to Root Dynamcs We now look at the forces requred to cause moton of the root.e. dynamcs!!

More information

The Application of Fractional Brownian Motion in Option Pricing

The Application of Fractional Brownian Motion in Option Pricing Vol. 0, No. (05), pp. 73-8 http://dx.do.org/0.457/jmue.05.0..6 The Applcaton of Fractonal Brownan Moton n Opton Prcng Qng-xn Zhou School of Basc Scence,arbn Unversty of Commerce,arbn zhouqngxn98@6.com

More information

Single and multiple stage classifiers implementing logistic discrimination

Single and multiple stage classifiers implementing logistic discrimination Sngle and multple stage classfers mplementng logstc dscrmnaton Hélo Radke Bttencourt 1 Dens Alter de Olvera Moraes 2 Vctor Haertel 2 1 Pontfíca Unversdade Católca do Ro Grande do Sul - PUCRS Av. Ipranga,

More information

21 Vectors: The Cross Product & Torque

21 Vectors: The Cross Product & Torque 21 Vectors: The Cross Product & Torque Do not use our left hand when applng ether the rght-hand rule for the cross product of two vectors dscussed n ths chapter or the rght-hand rule for somethng curl

More information

Product-Form Stationary Distributions for Deficiency Zero Chemical Reaction Networks

Product-Form Stationary Distributions for Deficiency Zero Chemical Reaction Networks Bulletn of Mathematcal Bology (21 DOI 1.17/s11538-1-9517-4 ORIGINAL ARTICLE Product-Form Statonary Dstrbutons for Defcency Zero Chemcal Reacton Networks Davd F. Anderson, Gheorghe Cracun, Thomas G. Kurtz

More information

Mean Value Coordinates for Closed Triangular Meshes

Mean Value Coordinates for Closed Triangular Meshes Mean Value Coordnates for Closed Trangular Meshes Tao Ju, Scott Schaefer, Joe Warren Rce Unversty (a) (b) (c) (d) Fgure : Orgnal horse model wth enclosng trangle control mesh shown n black (a). Several

More information

New Approaches to Support Vector Ordinal Regression

New Approaches to Support Vector Ordinal Regression New Approaches to Support Vector Ordnal Regresson We Chu chuwe@gatsby.ucl.ac.uk Gatsby Computatonal Neuroscence Unt, Unversty College London, London, WCN 3AR, UK S. Sathya Keerth selvarak@yahoo-nc.com

More information

n + d + q = 24 and.05n +.1d +.25q = 2 { n + d + q = 24 (3) n + 2d + 5q = 40 (2)

n + d + q = 24 and.05n +.1d +.25q = 2 { n + d + q = 24 (3) n + 2d + 5q = 40 (2) MATH 16T Exam 1 : Part I (In-Class) Solutons 1. (0 pts) A pggy bank contans 4 cons, all of whch are nckels (5 ), dmes (10 ) or quarters (5 ). The pggy bank also contans a con of each denomnaton. The total

More information

Regression Models for a Binary Response Using EXCEL and JMP

Regression Models for a Binary Response Using EXCEL and JMP SEMATECH 997 Statstcal Methods Symposum Austn Regresson Models for a Bnary Response Usng EXCEL and JMP Davd C. Trndade, Ph.D. STAT-TECH Consultng and Tranng n Appled Statstcs San Jose, CA Topcs Practcal

More information

An Evaluation of the Extended Logistic, Simple Logistic, and Gompertz Models for Forecasting Short Lifecycle Products and Services

An Evaluation of the Extended Logistic, Simple Logistic, and Gompertz Models for Forecasting Short Lifecycle Products and Services An Evaluaton of the Extended Logstc, Smple Logstc, and Gompertz Models for Forecastng Short Lfecycle Products and Servces Charles V. Trappey a,1, Hsn-yng Wu b a Professor (Management Scence), Natonal Chao

More information

PERRON FROBENIUS THEOREM

PERRON FROBENIUS THEOREM PERRON FROBENIUS THEOREM R. CLARK ROBINSON Defnton. A n n matrx M wth real entres m, s called a stochastc matrx provded () all the entres m satsfy 0 m, () each of the columns sum to one, m = for all, ()

More information

HÜCKEL MOLECULAR ORBITAL THEORY

HÜCKEL MOLECULAR ORBITAL THEORY 1 HÜCKEL MOLECULAR ORBITAL THEORY In general, the vast maorty polyatomc molecules can be thought of as consstng of a collecton of two electron bonds between pars of atoms. So the qualtatve pcture of σ

More information

On Lockett pairs and Lockett conjecture for π-soluble Fitting classes

On Lockett pairs and Lockett conjecture for π-soluble Fitting classes On Lockett pars and Lockett conjecture for π-soluble Fttng classes Lujn Zhu Department of Mathematcs, Yangzhou Unversty, Yangzhou 225002, P.R. Chna E-mal: ljzhu@yzu.edu.cn Nanyng Yang School of Mathematcs

More information

A Novel Methodology of Working Capital Management for Large. Public Constructions by Using Fuzzy S-curve Regression

A Novel Methodology of Working Capital Management for Large. Public Constructions by Using Fuzzy S-curve Regression Novel Methodology of Workng Captal Management for Large Publc Constructons by Usng Fuzzy S-curve Regresson Cheng-Wu Chen, Morrs H. L. Wang and Tng-Ya Hseh Department of Cvl Engneerng, Natonal Central Unversty,

More information

ANALYZING THE RELATIONSHIPS BETWEEN QUALITY, TIME, AND COST IN PROJECT MANAGEMENT DECISION MAKING

ANALYZING THE RELATIONSHIPS BETWEEN QUALITY, TIME, AND COST IN PROJECT MANAGEMENT DECISION MAKING ANALYZING THE RELATIONSHIPS BETWEEN QUALITY, TIME, AND COST IN PROJECT MANAGEMENT DECISION MAKING Matthew J. Lberatore, Department of Management and Operatons, Vllanova Unversty, Vllanova, PA 19085, 610-519-4390,

More information

REGULAR MULTILINEAR OPERATORS ON C(K) SPACES

REGULAR MULTILINEAR OPERATORS ON C(K) SPACES REGULAR MULTILINEAR OPERATORS ON C(K) SPACES FERNANDO BOMBAL AND IGNACIO VILLANUEVA Abstract. The purpose of ths paper s to characterze the class of regular contnuous multlnear operators on a product of

More information

An Algorithm for Data-Driven Bandwidth Selection

An Algorithm for Data-Driven Bandwidth Selection IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, VOL. 25, NO. 2, FEBRUARY 2003 An Algorthm for Data-Drven Bandwdth Selecton Dorn Comancu, Member, IEEE Abstract The analyss of a feature space

More information

THE METHOD OF LEAST SQUARES THE METHOD OF LEAST SQUARES

THE METHOD OF LEAST SQUARES THE METHOD OF LEAST SQUARES The goal: to measure (determne) an unknown quantty x (the value of a RV X) Realsaton: n results: y 1, y 2,..., y j,..., y n, (the measured values of Y 1, Y 2,..., Y j,..., Y n ) every result s encumbered

More information

A study on the ability of Support Vector Regression and Neural Networks to Forecast Basic Time Series Patterns

A study on the ability of Support Vector Regression and Neural Networks to Forecast Basic Time Series Patterns A study on the ablty of Support Vector Regresson and Neural Networks to Forecast Basc Tme Seres Patterns Sven F. Crone, Jose Guajardo 2, and Rchard Weber 2 Lancaster Unversty, Department of Management

More information

Least 1-Norm SVMs: a New SVM Variant between Standard and LS-SVMs

Least 1-Norm SVMs: a New SVM Variant between Standard and LS-SVMs ESANN proceedngs, European Smposum on Artfcal Neural Networks - Computatonal Intellgence and Machne Learnng. Bruges (Belgum), 8-3 Aprl, d-sde publ., ISBN -9337--. Least -Norm SVMs: a New SVM Varant between

More information

A frequency decomposition time domain model of broadband frequency-dependent absorption: Model II

A frequency decomposition time domain model of broadband frequency-dependent absorption: Model II A frequenc decomposton tme doman model of broadband frequenc-dependent absorpton: Model II W. Chen Smula Research Laborator, P. O. Box. 134, 135 Lsaker, Norwa (1 Aprl ) (Proect collaborators: A. Bounam,

More information

CHAPTER 14 MORE ABOUT REGRESSION

CHAPTER 14 MORE ABOUT REGRESSION CHAPTER 14 MORE ABOUT REGRESSION We learned n Chapter 5 that often a straght lne descrbes the pattern of a relatonshp between two quanttatve varables. For nstance, n Example 5.1 we explored the relatonshp

More information

Least Squares Fitting of Data

Least Squares Fitting of Data Least Squares Fttng of Data Davd Eberly Geoetrc Tools, LLC http://www.geoetrctools.co/ Copyrght c 1998-2016. All Rghts Reserved. Created: July 15, 1999 Last Modfed: January 5, 2015 Contents 1 Lnear Fttng

More information

Stability, observer design and control of networks using Lyapunov methods

Stability, observer design and control of networks using Lyapunov methods Stablty, observer desgn and control of networks usng Lyapunov methods von Lars Naujok Dssertaton zur Erlangung des Grades enes Doktors der Naturwssenschaften - Dr. rer. nat. - Vorgelegt m Fachberech 3

More information

Visualization of high-dimensional data with relational perspective map

Visualization of high-dimensional data with relational perspective map (2004) 3, 49 59 & 2004 Palgrave Macmllan Ltd. All rghts reserved 1473-8716 $25.00 www.palgrave-journals.com/vs Vsualzaton of hgh-dmensonal data wth relatonal perspectve map James Xnzh L 1 1 Edgehll Dr.

More information

Sngle Snk Buy at Bulk Problem and the Access Network

Sngle Snk Buy at Bulk Problem and the Access Network A Constant Factor Approxmaton for the Sngle Snk Edge Installaton Problem Sudpto Guha Adam Meyerson Kamesh Munagala Abstract We present the frst constant approxmaton to the sngle snk buy-at-bulk network

More information

PRACTICE 1: MUTUAL FUNDS EVALUATION USING MATLAB.

PRACTICE 1: MUTUAL FUNDS EVALUATION USING MATLAB. PRACTICE 1: MUTUAL FUNDS EVALUATION USING MATLAB. INDEX 1. Load data usng the Edtor wndow and m-fle 2. Learnng to save results from the Edtor wndow. 3. Computng the Sharpe Rato 4. Obtanng the Treynor Rato

More information

Lecture 2: Single Layer Perceptrons Kevin Swingler

Lecture 2: Single Layer Perceptrons Kevin Swingler Lecture 2: Sngle Layer Perceptrons Kevn Sngler kms@cs.str.ac.uk Recap: McCulloch-Ptts Neuron Ths vastly smplfed model of real neurons s also knon as a Threshold Logc Unt: W 2 A Y 3 n W n. A set of synapses

More information

Lecture 3: Force of Interest, Real Interest Rate, Annuity

Lecture 3: Force of Interest, Real Interest Rate, Annuity Lecture 3: Force of Interest, Real Interest Rate, Annuty Goals: Study contnuous compoundng and force of nterest Dscuss real nterest rate Learn annuty-mmedate, and ts present value Study annuty-due, and

More information

ECE544NA Final Project: Robust Machine Learning Hardware via Classifier Ensemble

ECE544NA Final Project: Robust Machine Learning Hardware via Classifier Ensemble 1 ECE544NA Fnal Project: Robust Machne Learnng Hardware va Classfer Ensemble Sa Zhang, szhang12@llnos.edu Dept. of Electr. & Comput. Eng., Unv. of Illnos at Urbana-Champagn, Urbana, IL, USA Abstract In

More information