Person Re-identification by Probabilistic Relative Distance Comparison


Person Re-identification by Probabilistic Relative Distance Comparison

Wei-Shi Zheng 1,2, Shaogang Gong 2, and Tao Xiang 2
1 School of Information Science and Technology, Sun Yat-sen University, China
2 School of Electronic Engineering and Computer Science, Queen Mary University of London, UK

Abstract

Matching people across non-overlapping camera views, known as person re-identification, is challenging due to the lack of spatial and temporal constraints and the large visual appearance changes caused by variations in view angle, lighting, background clutter and occlusion. To address these challenges, most previous approaches aim to extract visual features that are both distinctive and stable under appearance changes. However, most visual features and their combinations under realistic conditions are neither stable nor distinctive, and thus should not be used indiscriminately. In this paper, we propose to formulate person re-identification as a distance learning problem, which aims to learn the optimal distance that maximises matching accuracy regardless of the choice of representation. To that end, we introduce a novel Probabilistic Relative Distance Comparison (PRDC) model, which differs from most existing distance learning methods in that, rather than minimising intra-class variation whilst maximising inter-class variation, it aims to maximise the probability of a true match pair having a smaller distance than a wrong match pair. This makes our model more tolerant to appearance changes and less susceptible to model overfitting. Extensive experiments are carried out to demonstrate that 1) by formulating the person re-identification problem as a distance learning problem, notable improvement in matching accuracy can be obtained over conventional person re-identification techniques, which is particularly significant when the training sample size is small; and 2) our PRDC outperforms not only existing distance learning methods but also alternative learning methods based on boosting and learning to rank.
1. Introduction

There has been an increasing interest in matching people across disjoint camera views in a multi-camera system, known as the person re-identification problem [10, 7, 14, 8, 3]. (Most of this work was done when the first author was at QMUL.)

Figure 1. Typical examples of appearance changes caused by cross-view variations in view angle, lighting, background clutter and occlusion. Each column shows two images of the same person from two different camera views.

For understanding the behaviour of people in a large area of public space covered by multiple non-overlapping cameras, it is critical that when a target disappears from one view, he/she can be identified in another view among a crowd of people. Despite the best efforts of computer vision researchers in the past five years, the person re-identification problem remains largely unsolved. Specifically, in a busy uncontrolled environment monitored by cameras from a distance, person verification relying upon biometrics such as face and gait is infeasible or unreliable. Without accurate temporal and spatial constraints, given the typically large gaps between camera views, visual appearance features alone, extracted mainly from clothing, are intrinsically weak for matching people (e.g. most people in winter wear dark clothes). In addition, a person's appearance often undergoes large variations across different camera views due to significant changes in view angle, lighting, background clutter and occlusion (see Fig. 1), resulting in different people appearing more alike than images of the same person across different camera views (see Figs. 4 and 5).

Most existing studies have tried to address the above problems by seeking a more distinctive and stable feature representation of people's appearance, ranging widely from color histograms [10, 7], graph models [4], spatial co-occurrence representation models [14], principal axis histograms [8] and rectangle region histograms [2], to combinations of multiple features [7, 3]. After feature extraction, existing methods simply choose a standard distance measure such as the l1 norm [14], an l2-norm based distance [8], or the Bhattacharyya distance [7]. However, under severe viewing condition changes that can cause significant intra-object appearance variation (e.g. view angle, lighting, occlusion), computing a set of features that are both distinctive and stable under all condition changes is extremely hard, if not impossible, under realistic conditions. Moreover, given that certain features could be more reliable than others under a certain condition, applying a standard distance measure is undesirable, as it essentially treats all features equally without discarding bad features selectively in each individual matching circumstance.

In this paper, we propose to formulate person re-identification as a distance learning problem which aims to learn the optimal distance metric that maximises matching accuracy regardless of the choice of representation. To that end, a novel Probabilistic Relative Distance Comparison (PRDC) model is proposed. The objective function used by PRDC aims to maximise the probability of a pair of true matches (i.e. two true images of person A) having a smaller distance than that of a related wrong match pair (i.e. one image each of persons A and B). This is in contrast with a conventional distance learning approach, which aims to minimise intra-class variation in an absolute sense (i.e. making all images of person A more similar) whilst maximising inter-class variation (i.e. making all images of persons A and B more dissimilar).

Our approach is motivated by the nature of our problem. Specifically, the person re-identification problem has three characteristics: 1) the intra-class variation can be large and, importantly, can vary significantly across classes, as it is caused by different condition changes (see Fig. 1); 2) the inter-class variation also varies drastically across different pairs of classes; and 3) annotating matched people across camera views is tedious, and typically only a limited number of classes (people) are available for training, with each class containing only a handful of images of a person from different camera views (i.e.
under-sampling for building a representative class distribution). By exploring a relative distance comparison model probabilistically, our model is more tolerant to the large intra-/inter-class variation and the severe overlapping of different classes in a multi-dimensional feature space. Furthermore, due to the third characteristic of under-sampling, a model could easily be overfitted if it is learned by minimising intra-class distance and maximising inter-class distance simultaneously by brute force. In contrast, our approach is able to learn a distance with much reduced complexity, thus alleviating the overfitting problem, as validated by our extensive experiments.

Related work. Although it has not been exploited for person re-identification, distance learning is a well-studied problem with a large number of methods reported in the literature [16, 5, 17, 7, 12, 15, 9, 1]. However, most of them suffer from the overfitting problem as explained above. Recently, a few approaches have attempted to alleviate this problem by incorporating the idea of relative distance comparison, as in our PRDC model [12, 15, 9]. However, in these works the relative distance comparison is not quantified probabilistically and, importantly, is used as an optimisation constraint rather than as the objective function. Therefore these approaches, either implicitly [12, 9] or explicitly [15], still aim to learn a distance by which each class becomes more compact whilst being more separable from the others in an absolute sense. We demonstrate through experiments that they remain susceptible to overfitting for person re-identification.

There have been a couple of feature selection based methods proposed specifically for person re-identification [7, 11]. Gray et al. [7] proposed to use boosting to select a subset of optimal features for matching people. However, in a boosting framework, good features are only selected sequentially and independently in the original feature space, where different classes can be heavily overlapped. Such selection may not be globally optimal.
Rather than selecting features individually and independently (local selection), we aim to learn an optimal distance measure over all features jointly via distance learning (global selection). An alternative global selection approach was developed based on RankSVM [11]. By formulating person re-identification as a ranking problem, the RankSVM approach shares the spirit of relative comparison in our model. Nevertheless, our approach is more principled and tractable than RankSVM in that: 1) PRDC is a second-order feature selection approach whereas RankSVM is a first-order one, which is not able to exploit correlations between different features; 2) although RankSVM alleviates the overfitting problem by fusing a ranking error function with a large margin function in its objective, the probabilistic formulation of our objective function makes PRDC more tolerant to large intra- and inter-class variations and data sparsity; and 3) tuning the critical free parameter of RankSVM, which determines the weight between the margin function and the ranking error function, is computationally costly and can be suboptimal given limited data, whereas our PRDC model does not have such a problem. We demonstrate the advantage of our approach over both the boosting [7] and RankSVM [11] based methods through experiments.

The main contributions of this work are twofold. 1) We formulate the person re-identification problem as a distance learning problem, which leads to noteworthy improvement in re-identification accuracy; to the best of our knowledge, this has not been investigated before. 2) We propose a probabilistic relative distance comparison based method that overcomes the limitations of existing distance learning methods when applied to person re-identification.
2. Probabilistic Relative Distance Comparison for Person Re-identification

Let us formally cast the person re-identification problem as the following distance learning problem. For an image z of person A, we wish to learn a re-identification model that successfully identifies another image z' of the same person captured elsewhere in space and time. This is achieved by learning a distance function f(·,·) such that f(z, z') < f(z, z''), where z'' is an image of any person other than A. To that end, given a training set {(z_i, y_i)}, where z_i ∈ Z is a multi-dimensional feature vector representing the appearance of a person in one view and y_i is its class label (person ID), we define a pairwise set O = {O_i = (x_i^p, x_i^n)}, where each element of a pairwise datum O_i is itself computed from a pair of sample feature vectors. More specifically, x_i^p is a difference vector computed between a pair of relevant samples (of the same class/person), and x_i^n is a difference vector from a pair of related irrelevant samples, i.e. one of the samples used for computing x_i^n is one of the two relevant samples used for computing x_i^p, and the other is a mismatch from another class. The difference vector x between any two samples z and z' is computed by

x = d(z, z'),  z, z' ∈ Z,   (1)

where d is an entrywise difference function that outputs a difference vector between z and z'. The specific form of the function d will be described in Sec. 2.3.

Given the pairwise set O, a distance function f can be learned based on relative distance comparison, so that the distance between a relevant sample pair (f(x_i^p)) is smaller than that between a related irrelevant pair (f(x_i^n)). That is, f(x_i^p) < f(x_i^n) for each pairwise datum O_i. To this end, we measure the probability of the distance between a relevant pair being smaller than that of a related irrelevant pair as

P( f(x_i^p) < f(x_i^n) ) = ( 1 + exp{ f(x_i^p) - f(x_i^n) } )^(-1).   (2)

We assume the events of distance comparison between a relevant pair and an irrelevant pair, i.e. f(x_i^p) < f(x_i^n), are independent.^1
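A minimal NumPy sketch of the pairwise probability in Eqn. (2), together with the negative log-likelihood it induces over the pairwise set O, may help fix the notation. The function and variable names here are illustrative only; the distance is parameterised through a projection W so that f(x) = ||W^T x||^2, matching the decomposition M = WW^T used below:

```python
import numpy as np

def f(W, x):
    # Squared distance in the subspace induced by W: f(x) = ||W^T x||^2
    return np.sum((W.T @ x) ** 2)

def prob_correct_order(W, xp, xn):
    # Eqn. (2): P(f(x^p) < f(x^n)) = (1 + exp{f(x^p) - f(x^n)})^(-1)
    return 1.0 / (1.0 + np.exp(f(W, xp) - f(W, xn)))

def nll(W, pairs):
    # Negative log-likelihood over O: the quantity r(f, O) to be minimised
    return -sum(np.log(prob_correct_order(W, xp, xn)) for xp, xn in pairs)
```

Maximising the likelihood is thus equivalent to minimising `nll`, which penalises any pairwise datum whose wrong-match distance is not clearly larger than its true-match distance.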
Then, based on the maximum likelihood principle, the optimal function f can be learned as follows:

f* = argmin_f r(f, O),   r(f, O) = - sum_{O_i ∈ O} log( P( f(x_i^p) < f(x_i^n) ) ).   (3)

The distance function f is parameterised as a Mahalanobis (quadratic) distance function:

f(x) = x^T M x,  M ⪰ 0,   (4)

where M is a positive semidefinite matrix. The distance learning problem thus becomes learning M using Eqn. (3). Directly learning M using semidefinite programming techniques is computationally expensive for high-dimensional data [15]. In particular, we found in our experiments that given a dimensionality in the thousands, typical for visual object representation, a distance learning method based on learning M directly becomes intractable.

^1 Note that we do not assume the data are independent.

To overcome this problem, we perform an eigenvalue decomposition of M:

M = A Λ A^T = W W^T,  W = A Λ^{1/2},   (5)

where the columns of A are the orthonormal eigenvectors of M and the diagonal entries of Λ are the corresponding eigenvalues. Note that the columns of W are orthogonal. Therefore, learning the function f is equivalent to learning an orthogonal matrix W = (w_1, ..., w_l, ..., w_L) such that

W* = argmin_W r(W, O),  s.t.  w_i^T w_j = 0, i ≠ j,
r(W, O) = sum_{O_i ∈ O} log( 1 + exp{ ||W^T x_i^p||^2 - ||W^T x_i^n||^2 } ).   (6)

2.1. An Iterative Optimisation Algorithm

It is important to point out that our optimisation criterion (6) may not be a convex optimisation problem under the orthogonality constraint, due to the relative comparison modelling. This means that deriving a global solution by directly optimising W is not straightforward. In this work we formulate an iterative optimisation algorithm to learn an optimal W, which also aims to seek a low-rank (non-trivial) solution automatically. This is critical for reducing the model complexity and thus overcoming the overfitting problem given sparse data. Starting from an empty matrix, at iteration l a new estimated column w_l is added to W. The algorithm terminates after L iterations, when a stopping criterion is met. Each iteration consists of two steps, as follows:

Step 1. Assume that after l iterations, a total of l orthogonal vectors w_1, ..., w_l have been learned.
To learn the next orthogonal vector w_{l+1}, let

a_i^{l+1} = exp{ sum_{j=0}^{l} ( ||w_j^T x_i^{p,j}||^2 - ||w_j^T x_i^{n,j}||^2 ) },   (7)

where we define w_0 = 0, and x_i^{p,l} and x_i^{n,l} are the difference vectors at the l-th iteration, defined as

x_i^{s,l} = x_i^{s,l-1} - w̄_{l-1} w̄_{l-1}^T x_i^{s,l-1},  s ∈ {p, n},  i = 1, ..., |O|,  l ≥ 1,   (8)

where w̄_{l-1} = w_{l-1} / ||w_{l-1}||. Note that we define x_i^{s,0} = x_i^s, s ∈ {p, n}, and w_0 = 0.

Step 2. Obtain x_i^{p,l+1}, x_i^{n,l+1} by Eqn. (8), and let O^{l+1} = { O_i^{l+1} = (x_i^{p,l+1}, x_i^{n,l+1}) }. Then, learn a new optimal projection w_{l+1} on O^{l+1} as follows:

w_{l+1} = argmin_w r_{l+1}(w, O^{l+1}),  where
r_{l+1}(w, O^{l+1}) = sum_{O_i^{l+1} ∈ O^{l+1}} log( 1 + a_i^{l+1} exp{ ||w^T x_i^{p,l+1}||^2 - ||w^T x_i^{n,l+1}||^2 } ).   (9)

We seek an optimal solution by a gradient descent method:

w_{l+1} ← w_{l+1} - λ ∂r_{l+1}/∂w_{l+1},  λ ≥ 0,   (10)
where

∂r_{l+1}/∂w_{l+1} = sum_{O_i^{l+1} ∈ O^{l+1}} [ 2 a_i^{l+1} exp{ ||w_{l+1}^T x_i^{p,l+1}||^2 - ||w_{l+1}^T x_i^{n,l+1}||^2 } / ( 1 + a_i^{l+1} exp{ ||w_{l+1}^T x_i^{p,l+1}||^2 - ||w_{l+1}^T x_i^{n,l+1}||^2 } ) ] ( x_i^{p,l+1} (x_i^{p,l+1})^T - x_i^{n,l+1} (x_i^{n,l+1})^T ) w_{l+1},

and λ is a step length automatically determined at each gradient update step. For the descent direction in Eqn. (10), the initial value of w_{l+1} for the gradient descent method is set to

w_{l+1} = (1/|O^{l+1}|) sum_{O_i^{l+1} ∈ O^{l+1}} ( x_i^{n,l+1} - x_i^{p,l+1} ).   (11)

Note that the update in Eqn. (8) deducts from each sample x_i^{s,l-1} the information captured by w_{l-1}, since w̄_{l-1}^T x_i^{s,l} = 0, so that the next learned vector w_l will only quantify the part of the data left over from the last step, i.e. x_i^{s,l}. In addition, a_i^{l+1} indicates the trend in the change of the distance measures for x_i^p and x_i^n over previous iterations and serves as a prior weight for learning w_{l+1}. The iteration of the algorithm (for l > 1) is terminated when the following criterion is met:

r_l(w_l, O^l) - r_{l+1}(w_{l+1}, O^{l+1}) < ε,   (12)

where ε is a small tolerance value, set to 10^{-6} in this work. The algorithm is summarised in Algorithm 1.

Algorithm 1: Learning the PRDC model
Data: O = {O_i = (x_i^p, x_i^n)}, ε > 0
begin
  w_0 ← 0, w̄_0 ← 0;
  x_i^{s,0} ← x_i^s, s ∈ {p, n}; O^0 ← O; l ← 0;
  while 1 do
    Compute a_i^{l+1} by Eqn. (7);
    Compute x_i^{s,l+1}, s ∈ {p, n}, by Eqn. (8);
    O^{l+1} ← { O_i^{l+1} = (x_i^{p,l+1}, x_i^{n,l+1}) };
    Estimate w_{l+1} using Eqn. (9);
    w̄_{l+1} ← w_{l+1} / ||w_{l+1}||;
    if (l > 1) & ( r_l(w_l, O^l) - r_{l+1}(w_{l+1}, O^{l+1}) < ε ) then
      break;
    end
    l ← l + 1;
  end
end
Output: W = [ w_1, ..., w_l ]

2.2. Theoretical Validation

The following two theorems validate that the proposed iterative optimisation algorithm learns a set of orthogonal projections {w_l} that iteratively decrease the objective function in Criterion (6).

Theorem 1. The learned vectors w_l, l = 1, ..., L, are orthogonal to each other.

Proof. Assume that l - 1 orthogonal vectors {w_j}_{j=1}^{l-1} have been learned, and let w_l be the optimal solution of Criterion (9) at the l-th iteration. First, we know that w_l lies in the span^2 of {x_i^{p,l}} ∪ {x_i^{n,l}} according to Eqns. (10) and (11), i.e. w_l ∈ span{ x_i^{s,l}, i = 1, ..., |O|, s ∈ {p, n} }. Second, according to Eqn. (8), we have

w_j^T x_i^{s,j+1} = 0,  s ∈ {p, n},  j = 1, ..., l - 1,
span{x_i^{s,l}} ⊆ span{x_i^{s,l-1}} ⊆ ... ⊆ span{x_i^{s,0}}.   (13)

Hence, w_l is orthogonal to w_j, j = 1, ..., l - 1.

^2 This can be shown via the Lagrangian of Eqn. (9) for a non-zero w_l.

Theorem 2. r(W_{l+1}, O) ≤ r(W_l, O), where W_l = (w_1, ..., w_l), l ≥ 1. That is, the algorithm iteratively decreases the objective function value.

Proof. Let w_{l+1} be the optimal solution of Eqn. (9). By Theorem 1, it is easy to show that for any j ≥ 1, w_j^T x_i^{s,j} = w_j^T x_i^{s,0} = w_j^T x_i^s, s ∈ {p, n}. Hence we have

r_{l+1}(w_{l+1}, O^{l+1}) = sum_{O_i^{l+1}} log( 1 + a_i^{l+1} exp{ ||w_{l+1}^T x_i^{p,l+1}||^2 - ||w_{l+1}^T x_i^{n,l+1}||^2 } ) = r(W_{l+1}, O).

Also, r_{l+1}(0, O^{l+1}) = r(W_l, O). Since w_{l+1} is the minimal solution, we have r_{l+1}(w_{l+1}, O^{l+1}) ≤ r_{l+1}(0, O^{l+1}), and therefore r(W_{l+1}, O) ≤ r(W_l, O).

Since Criterion (9) may not be convex, a local optimum could be obtained in each iteration of our algorithm. However, even if the computation is trapped in a local minimum of Eqn. (9) at the (l+1)-th iteration, Theorem 2 is still valid if r_{l+1}(w_{l+1}, O^{l+1}) ≤ r_l(w_l, O^l); otherwise the algorithm will be terminated by the stopping criterion (12). To alleviate the local optimum problem at each iteration, multiple initialisations could also be deployed in practice.

2.3. Learning in an Absolute Data Difference Space

To compute the data difference vector x defined in Eqn. (1), most existing distance learning methods use the entrywise difference function

x = d(z, z') = z - z'   (14)

to learn M = W W^T in the normal data difference space, denoted by DZ = { x_{ij} = z_i - z_j | z_i, z_j ∈ Z }. The learned distance function is thus written as

f(x_{ij}) = (z_i - z_j)^T M (z_i - z_j) = ||W^T x_{ij}||^2.   (15)

In this work, we instead compute the difference vector by the entrywise absolute difference function

x̃ = d̃(z, z') = |z - z'|,  x̃(k) = |z(k) - z'(k)|,   (16)
where z(k) is the k-th element of the sample feature vector. M is thus learned in an absolute data difference space, denoted by D̃Z = { x̃_{ij} = |z_i - z_j| | z_i, z_j ∈ Z }, and our distance function becomes

f(x̃_{ij}) = |z_i - z_j|^T M |z_i - z_j| = ||W^T x̃_{ij}||^2.   (17)

We now explain why learning in an absolute data difference space is more suitable for our relative comparison model. First, we note that, for each entry k,

| |z_i(k) - z_j(k)| - |z_i(k) - z_{j'}(k)| | ≤ | (z_i(k) - z_j(k)) - (z_i(k) - z_{j'}(k)) |,   (18)

hence |x̃_{ij} - x̃_{ij'}| ≤ |x_{ij} - x_{ij'}| entrywise, where |·| denotes the entrywise absolute value. We can thus show that

|| x̃_{ij} - x̃_{ij'} || ≤ || x_{ij} - x_{ij'} ||.   (19)

This suggests that the variation of x̃_{ij} over the same sample space Z is never greater than that of x_{ij}. Specifically, if z_i, z_j, z_{j'} are from the same class, the intra-class variation is smaller in D̃Z than in DZ. On the other hand, if z_j and z_{j'} belong to a different class from z_i, the variation of inter-class differences is also more compact in the absolute data difference space. Since the variations of both the relevant and irrelevant sample differences x^p and x^n are smaller, the distance function learned using Eqn. (6) yields more consistent distance comparison results, thereby benefitting our PRDC model. Specifically, for the same semidefinite matrix M, the Cauchy inequality suggests that upper( ||W^T (x̃_{ij} - x̃_{ij'})|| ) ≤ upper( ||W^T (x_{ij} - x_{ij'})|| ), where upper(·) is the upper bound operation. This indicates that in the latent subspace induced by W, the maximum variation of x̃_{ij}^T M x̃_{ij} is lower than that of x_{ij}^T M x_{ij}. We show the notable benefit of learning PRDC in an absolute data difference space in our experiments.

2.4. Feature Representation

Our PRDC model can be applied regardless of the choice of appearance feature representation of people. However, in order to benefit from the different and complementary information captured by different features, we start with a mixture of colour and texture histogram features similar to those used in [7] and let our model automatically discover an optimal feature distance. Specifically, we divided a person image into six horizontal stripes.
For each stripe, the RGB, YCbCr and HSV color features and two types of texture features, extracted by Schmid and Gabor filters, were computed and represented as histograms. In total, 29 feature channels were constructed for each stripe, and each feature channel was represented by a 16-dimensional histogram vector. Each person image was thus represented by a feature vector in a 2784-dimensional feature space Z. Since the features computed for this representation include low-level features widely used by existing person re-identification techniques, this representation is considered generic and representative.

3. Experiments

Datasets and settings. Two publicly available person re-identification datasets, the i-LIDS Multiple-Camera Tracking Scenario (MCTS) [18, 13] and VIPeR [6], were used for evaluation. The i-LIDS MCTS dataset, captured indoors at a busy airport arrival hall, contains 119 people with a total of 476 person images captured by multiple non-overlapping cameras, with an average of 4 images per person. Many of these images undergo large illumination changes and are subject to occlusions (see Fig. 4). The VIPeR dataset is the largest person re-identification dataset available, consisting of 632 people captured outdoors with two images per person. Viewpoint change is the most significant cause of appearance change, with most of the matched image pairs containing one front/back view and one side view (see Fig. 5).

In our experiments, for each dataset, we randomly selected all images of p people (classes) to set up the test set, and the rest were used for training. Each test set was composed of a gallery set and a probe set. The gallery set consisted of one image per person, and the remaining images were used as the probe set. This procedure was repeated 10 times. During training, a pair of images of each person formed a relevant pair, and one image of him/her together with one image of another person in the training set formed a related irrelevant pair; together they form the pairwise set O defined in Sec. 2.
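Given such a pairwise set O, the iterative optimisation of Algorithm 1 (Sec. 2.1) can be sketched as below. This is an illustrative skeleton only, not the authors' implementation: a fixed learning rate stands in for the automatically determined step length λ, and all names are chosen for readability:

```python
import numpy as np

def prdc_learn(O, eps=1e-6, max_iters=50, gd_steps=200, lr=0.01):
    """Illustrative skeleton of Algorithm 1: learn orthogonal columns of W."""
    Xp = np.array([xp for xp, _ in O], dtype=float)  # |O| x d relevant diffs
    Xn = np.array([xn for _, xn in O], dtype=float)  # |O| x d irrelevant diffs
    a = np.ones(len(O))        # Eqn. (7) with w_0 = 0: a_i^1 = exp(0) = 1
    W, prev_r = [], None
    for _ in range(max_iters):
        # Eqn. (11): initialise w as the mean of (x^n - x^p)
        w = (Xn - Xp).mean(axis=0)
        for _ in range(gd_steps):
            # Eqn. (10), with a fixed step size instead of an adaptive one
            e = a * np.exp((Xp @ w) ** 2 - (Xn @ w) ** 2)
            coef = 2.0 * e / (1.0 + e)
            grad = (coef * (Xp @ w)) @ Xp - (coef * (Xn @ w)) @ Xn
            w = w - lr * grad
        if np.linalg.norm(w) < 1e-12:
            break              # degenerate direction: nothing left to learn
        # Eqn. (9): objective value under the current prior weights a_i
        r = np.sum(np.log1p(a * np.exp((Xp @ w) ** 2 - (Xn @ w) ** 2)))
        if prev_r is not None and prev_r - r < eps:  # stopping rule, Eqn. (12)
            break
        prev_r = r
        W.append(w)
        # Eqn. (7): fold the new projection into the prior weights
        a = a * np.exp((Xp @ w) ** 2 - (Xn @ w) ** 2)
        # Eqn. (8): deflate the data so the next direction is orthogonal
        u = w / np.linalg.norm(w)
        Xp = Xp - np.outer(Xp @ u, u)
        Xn = Xn - np.outer(Xn @ u, u)
    return np.array(W).T       # d x L matrix of learned projections
```

Each learned column is orthogonal to the previous ones because both the initialisation (Eqn. (11)) and the gradient (Eqn. (10)) lie in the span of the deflated difference vectors, mirroring Theorem 1.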
For evaluation, we use the average cumulative match characteristic (CMC) curves [6] over the 10 trials to show the ranked matching rates. A rank-r matching rate indicates the percentage of probe images whose correct match is found in the top r ranks against the p gallery images; the rank-1 matching rate is thus the correct matching/recognition rate. Note that in practice, although a high rank-1 matching rate is critical, the top-r ranked matching rate for a small r is also important, because the top matched images will normally be verified by a human operator [6].

PRDC vs. Non-Learning based Distances. We first compared our PRDC with the non-learning based l1-norm distance and Bhattacharyya distance, which were used by most existing person re-identification work. Our results (Figs. 2 and 3, Tables 1 and 2) show clearly that with the proposed PRDC, the matching performance on both datasets is improved notably, more so when the number of people in the test pool increases (i.e. the training set size decreases). The improvement is particularly dramatic on the VIPeR dataset; in particular, Table 2 shows that a 4-fold increase in correct matching rate (r = 1) is obtained against both the l1-norm and Bhattacharyya distances when p = 316. The results validate the importance of performing distance learning. Examples of matching people using PRDC on the two datasets are shown in Figs. 4 and 5 respectively.
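The rank-r matching rates underlying these CMC curves can be computed from a probe-gallery distance matrix, as in the following sketch (argument names are illustrative; `dist[i, j]` is assumed to hold the learned distance between probe i and gallery image j):

```python
import numpy as np

def cmc(dist, probe_ids, gallery_ids):
    # For each probe, rank the gallery by distance and record the position
    # of the correct match; cmc[r-1] = fraction matched within the top r.
    n_probe, n_gallery = dist.shape
    hits = np.zeros(n_gallery)
    for i in range(n_probe):
        order = np.argsort(dist[i])
        rank = np.where(gallery_ids[order] == probe_ids[i])[0][0]
        hits[rank:] += 1
    return hits / n_probe
```

Here `cmc(...)[0]` is the rank-1 matching rate, i.e. the correct recognition rate; averaging the curve over the 10 random trials gives the reported CMC curves.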
Figure 2. Performance comparison using CMC curves on the i-LIDS MCTS dataset: (a) p = 50, (b) p = 80 (methods compared: PRDC, Adaboost, ITM, LMNN, MCC, Xing's method, L1 norm).

Table 1. Top ranked matching rate (%) on i-LIDS MCTS at ranks r = 1, 5, 10, 20 for p = 30, 50, 80, comparing PRDC, Adaboost, LMNN, ITM, MCC, Xing's method, the L1 norm and the Bhattacharyya distance. p is the size of the gallery set (a larger p means a smaller training set) and r is the rank.

PRDC vs. Alternative Learning Methods. We also compared PRDC with 5 alternative discriminative learning based approaches. These include 4 popular distance learning methods, namely Xing's method [16], LMNN [15], ITM [1] and MCC [5], and a method specifically designed for person re-identification based on Adaboost [7]. Among the 4 distance learning methods, only LMNN exploits relative distance comparison; but, as mentioned in Sec. 1, it is used as an optimisation constraint rather than as the main objective function, and it is not formulated probabilistically. MCC is similar to PRDC in that a probabilistic model is used, but it is not a relative distance comparison based method. Note that since MCC needs to select the best dimension for matching, we performed cross-validation by selecting its value in {[1 : 1 : 10], d}, where d is the maximum rank MCC can learn. Among the 5 methods, the only one that learns in an absolute data difference space is Adaboost.

Our results (Figs. 2 and 3, Tables 1 and 2) show clearly that our model yields the best rank-1 matching rate and overall much superior performance compared to the alternative models. The advantage of PRDC is particularly apparent when the training set is small (learning becomes more difficult) and the test set is large, as indicated by the value of p (matching becomes harder). Table 2 shows that on VIPeR, when 100 people are used for learning and 532 people for testing (p = 532), the correct matching rate for PRDC (and MCC) is more than doubled against any alternative distance learning method.
In particular, benefitting from being a probabilistic model, MCC gives the results most comparable to PRDC when the training set is large. However, its performance degrades dramatically when the size of the training data decreases (see the columns under p = 80 in Table 1 and p = 532 in Table 2). This suggests that overfitting to limited training data is the main reason for the inferior performance of the compared alternative learning approaches.

PRDC vs. RankSVM. Different from PRDC, RankSVM has a free parameter which determines the relative weight between the margin function and the ranking error function [11]. In our experiments, we cross-validated this parameter over {0.0001, 0.005, 0.001, 0.05, 0.1, 0.5, 1, 10, 100, 1000}. As shown in Tables 3 and 4, both methods perform very well against the other compared algorithms, and our PRDC yields overall better performance, especially at lower-rank matching rates and given less training data. The better performance of PRDC is due to its probabilistic modelling and to second-order rather than first-order feature selection. It is also noted that tuning the free parameter of RankSVM is not a trivial task, and the performance can be sensitive to the tuning, especially given sparse data, whereas PRDC does not have this problem. In addition, RankSVM is computationally more expensive (see details later).

Figure 3. Performance comparison using CMC curves on the VIPeR dataset: (a) p = 316, (b) p = 532 (methods compared: PRDC, Adaboost, ITM, LMNN, MCC, Xing's method, L1 norm).

Table 2. Top ranked matching rate (%) on VIPeR at ranks r = 1, 5, 10, 20 for p = 316, 432, 532, comparing PRDC, Adaboost, LMNN, ITM, MCC, Xing's method, the L1 norm and the Bhattacharyya distance. p is the number of classes in the testing set; r is the rank.

Table 3. PRDC vs. RankSVM (%) on i-LIDS, at ranks r = 1, 5, 10, 20.

Table 4. PRDC vs. RankSVM (%) on VIPeR, at ranks r = 1, 5, 10, 20.

Table 5. Effect of learning in an absolute data difference space: PRDC, PRDC_raw, ITM_abs and MCC_abs on i-LIDS (p = 50) and VIPeR (p = 316).

Table 6. Average rank of W learned by PRDC on i-LIDS MCTS (p = 30, 50, 80) and VIPeR (p = 316, 432, 532).

Effect of Learning in an Absolute Data Difference Space. We showed in Sec. 2.3 that, in theory, our relative distance comparison learning method can benefit from learning in an absolute data difference space. To validate this experimentally, we compare PRDC with PRDC_raw, which learns in the normal data difference space DZ (see Sec. 2.3). The results in Table 5 indicate that learning in an absolute data difference space does improve the matching performance. Note that most existing distance learning models are based on learning in the normal data difference space DZ. It is possible to reformulate some of them in order to learn in an absolute data difference space. In Table 5 we show that when ITM and MCC are learned in the absolute data difference space D̃Z, termed ITM_abs and MCC_abs respectively, their performance becomes worse compared to their results in Tables 1 and 2. This indicates that the absolute difference space is more suitable for our relative comparison distance learning.

Computational cost. Though PRDC is iterative, it has relatively low cost in practice. In our experiments, for VIPeR with p = 316, it took around 15 minutes on an Intel dual-core 2.93GHz CPU with 48GB RAM to learn PRDC for each trial. We observed that the low cost of PRDC is partially due to its ability to seek a suitably low rank for W (i.e. it converges within very few iterations), as shown in Table 6. For comparison, among the other compared methods, Adaboost is the most costly and took over 7 hours for each trial.
For the 4 compared distance learning methods, PCA dimensionality reduction must be performed, as they otherwise become intractable given the high-dimensional feature space. For the RankSVM method, each trial took around 2.5 hours due to parameter tuning.

4. Conclusion

We have proposed a new approach for person re-identification based on probabilistic relative distance comparison, which aims to learn a suitable optimal distance measure given large intra- and inter-class appearance variations and sparse data. Our experiments demonstrate that 1) by formulating person re-identification as a distance learning problem, a clear improvement in matching performance can be obtained, and the improvement is more significant when the training sample size is small; and 2) our PRDC outperforms not only existing distance learning methods but also alternative learning methods based on boosting and learning to rank.

Acknowledgements

This research was partially funded by the EU FP7 project SAMURAI. Dr. Wei-Shi Zheng was also additionally supported by the 985 project at Sun Yat-sen University.

References

[1] J. Davis, B. Kulis, P. Jain, S. Sra, and I. Dhillon. Information-theoretic metric learning. In ICML.
Figure 4. Examples of person re-identification on i-LIDS MCTS using PRDC. In each row, the leftmost image is the probe, the images in the middle are the top 20 matched gallery images with a red box highlighting the correct match, and the rightmost image shows a true match.

Figure 5. Examples of person re-identification on VIPeR using PRDC.

[2] P. Dollar, Z. Tu, H. Tao, and S. Belongie. Feature mining for image classification. In CVPR.
[3] M. Farenzena, L. Bazzani, A. Perina, M. Cristani, and V. Murino. Person re-identification by symmetry-driven accumulation of local features. In CVPR.
[4] N. Gheissari, T. Sebastian, and R. Hartley. Person reidentification using spatiotemporal appearance. In CVPR.
[5] A. Globerson and S. Roweis. Metric learning by collapsing classes. In NIPS.
[6] D. Gray, S. Brennan, and H. Tao. Evaluating appearance models for recognition, reacquisition, and tracking. In IEEE International Workshop on Performance Evaluation of Tracking and Surveillance.
[7] D. Gray and H. Tao. Viewpoint invariant pedestrian recognition with an ensemble of localized features. In ECCV.
[8] W. Hu, M. Hu, X. Zhou, J. Lou, T. Tan, and S. Maybank. Principal axis-based correspondence between multiple cameras for people tracking. PAMI, 28(4).
[9] J. Lee, R. Jin, and A. Jain. Rank-based distance metric learning: An application to image retrieval. In CVPR.
[10] U. Park, A. Jain, I. Kitahara, K. Kogure, and N. Hagita. ViSE: Visual search engine using multiple networked cameras. In ICPR.
[11] B. Prosser, W.-S. Zheng, S. Gong, and T. Xiang. Person re-identification by support vector ranking. In BMVC.
[12] M. Schultz and T. Joachims. Learning a distance metric from relative comparisons. In NIPS.
[13] UK Home Office. i-LIDS multiple camera tracking scenario definition.
[14] X. Wang, G. Doretto, T. Sebastian, J. Rittscher, and P. Tu. Shape and appearance context modeling. In ICCV.
[15] K. Weinberger, J. Blitzer, and L. Saul. Distance metric learning for large margin nearest neighbor classification. In NIPS.
[16] E. Xing, A. Ng, M. Jordan, and S. Russell.
Distance metric learning, with application to clustering with side-information. In NIPS.
[17] L. Yang, R. Jin, R. Sukthankar, and Y. Liu. An efficient algorithm for local distance metric learning. In AAAI.
[18] W.-S. Zheng, S. Gong, and T. Xiang. Associating groups of people. In BMVC.