
Modeling Chaotic Behavior of Stock Indices Using Intelligent Paradigms

Ajith Abraham, Ninan Sajith Philip and P. Saratchandran

Department of Computer Science, Oklahoma State University, Tulsa, Oklahoma, USA
Department of Physics, Cochin University of Science and Technology, India
School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore

Abstract. The use of intelligent systems for stock market prediction has been widely established. In this paper, we investigate how the seemingly chaotic behavior of stock markets can be well represented using several connectionist paradigms and soft computing techniques. To demonstrate the different techniques, we considered the Nasdaq-100 index of the Nasdaq Stock Market and the S&P CNX NIFTY stock index. We analyzed 7 years of Nasdaq-100 main index values and 4 years of NIFTY index values. This paper investigates the development of reliable and efficient techniques to model the seemingly chaotic behavior of stock markets. We considered an artificial neural network trained using the Levenberg-Marquardt algorithm, a Support Vector Machine (SVM), a Takagi-Sugeno neuro-fuzzy model and a Difference Boosting Neural Network (DBNN). The paper briefly explains how the different connectionist paradigms can be formulated using different learning methods and then investigates whether they can provide the required level of performance, i.e. whether they are sufficiently good and robust to provide a reliable forecast model for stock market indices. Experiment results reveal that all the connectionist paradigms considered could represent the behavior of the stock indices very accurately.

Keywords: connectionist paradigm, support vector machine, neural network, difference boosting, neuro-fuzzy, stock market.

1. INTRODUCTION

Prediction of stocks is generally believed to be a very difficult task: the process behaves more like a random walk and is time varying. The obvious complexity of the problem paves the way for the importance of intelligent prediction paradigms. During the last decade, stocks and futures traders have come to rely upon various types of intelligent systems to make trading decisions [][3][7][][8][9][6][3][8]. Several intelligent systems have in recent years been developed for modelling expertise, decision support, complicated automation tasks and so on [8][9][5][5][4][6][9][4][7]. In this paper, we analyse the seemingly chaotic behaviour of two well-known stock indices, namely the Nasdaq-100 index of the Nasdaq Stock Market [21] and the S&P CNX NIFTY stock index [22].

The Nasdaq-100 index reflects Nasdaq's largest companies across major industry groups, including computer hardware and software, telecommunications, retail/wholesale trade and biotechnology [21]. The Nasdaq-100 index is a modified capitalization-weighted index, designed to limit domination of the index by a few large stocks while generally retaining the capitalization ranking of companies. Through an investment in the Nasdaq-100 index tracking stock, investors can participate in the collective performance of many of the Nasdaq stocks that are often in the news or have become household names. Similarly, the S&P CNX NIFTY is a well-diversified 50-stock index accounting for 25 sectors of the economy [22]. It is used for a variety of purposes, such as benchmarking fund portfolios, index-based derivatives and index funds. The CNX indices are computed using a market capitalisation weighted method, wherein the level of the index reflects the total market value of all the stocks in the index relative to a particular base period. The method also takes into account constituent changes in the index and, importantly, corporate actions such as stock splits, rights issues, etc., without affecting the index value.

Figure 1. Training and test data sets for the Nasdaq-100 index

Our research investigates the performance of four different connectionist paradigms for modelling the Nasdaq-100 and NIFTY stock market indices. The four techniques considered are an artificial neural network trained using the Levenberg-Marquardt algorithm [6], a support vector machine [27], a difference boosting neural network [25] and a Takagi-Sugeno fuzzy inference system learned using a neural network algorithm (neuro-fuzzy model) [13]. Neural networks are excellent forecasting tools and can learn from scratch by adjusting the interconnections between layers. Support vector machines offer excellent learning capability based on statistical learning theory. Fuzzy inference systems are excellent for decision making under uncertainty. Neuro-fuzzy computing is a popular framework wherein neural network training algorithms are used to fine-tune the parameters of fuzzy inference systems.

We analysed Nasdaq-100 index values from January 1995 to January 2002 [21] and NIFTY index values from January 1998 to December 2001 [22]. For both indices, we divided the entire data into almost two equal parts. No special rules were used to select the training set other than ensuring a reasonable representation of the parameter space of the problem domain. The complexity of the training and test data sets for both indices is depicted in Figures 1 and 2 respectively.

In Section 2 we briefly describe the different connectionist paradigms, followed by the experimentation setup and results in Section 3. Some conclusions are provided towards the end.

Figure 2. Training and test data sets for the NIFTY index
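As a rough illustration of how such one-day-ahead patterns can be assembled from a raw daily index series, the following sketch scales the data and splits it into two equal, time-ordered halves. The synthetic series, the choice of the next day's closing value as the target, and all numeric settings are assumptions made for the example only, not a description of the authors' actual data files.

```python
import numpy as np

# Synthetic stand-in for a daily index series (open, low, high, close); in practice these
# columns would come from the downloaded Nasdaq-100 / NIFTY data, which are not reproduced here.
rng = np.random.default_rng(0)
close = 1000.0 + np.cumsum(rng.normal(0.0, 5.0, size=1000))
open_ = close + rng.normal(0.0, 2.0, size=close.size)
low = np.minimum(open_, close) - np.abs(rng.normal(0.0, 3.0, size=close.size))
high = np.maximum(open_, close) + np.abs(rng.normal(0.0, 3.0, size=close.size))
values = np.column_stack([open_, low, high, close])

# Scale to [0, 1], as the paper plots scaled index values.
scaled = (values - values.min(axis=0)) / (values.max(axis=0) - values.min(axis=0))

# One-day-ahead patterns: today's open/low/high/close -> the next day's (scaled) index value.
X, y = scaled[:-1], scaled[1:, 3]

# Roughly two equal halves, kept in time order, for training and testing.
split = len(X) // 2
X_train, y_train = X[:split], y[:split]
X_test, y_test = X[split:], y[split:]
print(X_train.shape, X_test.shape)
```

For the Nasdaq-100 experiments described in Section 3, only the opening, low and high values would be used as inputs; the NIFTY experiments use all four values.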

2. INTELLIGENT SYSTEMS: A CONNECTIONIST MODEL APPROACH

Connectionist models learn by adjusting the interconnections between layers. When the network is adequately trained, it is able to generalize, producing relevant outputs for unseen input data. Learning typically occurs by example through training, where the training algorithm iteratively adjusts the connection weights (synapses).

2.1 ARTIFICIAL NEURAL NETWORKS

The artificial neural network (ANN) methodology enables us to design useful nonlinear systems accepting large numbers of inputs, with the design based solely on instances of input-output relationships. Consider a training set of n argument-value pairs

{ (x_i, t_i) : i = 1, ..., n },

where x_i is a d-dimensional argument and t_i is the associated target value to be approximated by the neural network output. In most applications the training set is considered to be noisy, and our goal is not to reproduce it exactly but rather to construct a network function that generalizes well to new function values. We address the problem of selecting the weights w so as to learn the training set. The notion of closeness on the training set is typically formalized through an error function of the form

E(w) = (1/2) Σ_{i=1}^{n} (y_i - t_i)^2,    (1)

where y_i is the network output. Our target is to find a neural network η such that the output y_i = η(x_i, w) is close to the desired output t_i for the input x_i, where w denotes the strengths of the synaptic connections. The error E = E(w) is a function of w because y_i = η(x_i, w) depends on the parameters w defining the selected network η. The objective function E(w) for a neural network with many parameters defines a highly irregular surface with many local minima, large regions of little slope, and symmetries. The common node functions (tanh, sigmoid, logistic, etc.) are differentiable to arbitrary order through the chain rule of differentiation, which implies that the error is also differentiable to arbitrary order. Hence we are able to make a Taylor series expansion in w for E. We shall first discuss algorithms for minimizing E by assuming that we can truncate a Taylor series expansion about a point w_0 that is possibly a local minimum. The gradient (first partial derivative) vector is

g(w) = ∇E(w) = [ ∂E(w)/∂w_i ].    (2)

The gradient vector points in the direction of steepest increase of E, and its negative points in the direction of steepest decrease. The matrix of second partial derivatives, also known as the Hessian matrix, is

H(w) = [ H_ij(w) ] = [ ∂^2 E(w) / ∂w_i ∂w_j ].    (3)

The Taylor series for E, assumed twice continuously differentiable about w, can now be given as

E(w') = E(w) + g(w)^T (w' - w) + (1/2)(w' - w)^T H(w)(w' - w) + o(||w' - w||^2),    (4)

where o(d) denotes a remainder term that vanishes faster than d as d tends to zero. If, for example, there is a continuous third derivative at w, then the remainder term is of order ||w' - w||^3 and we can reduce (4) to the quadratic model

m(w') = E(w) + g(w)^T (w' - w) + (1/2)(w' - w)^T H(w)(w' - w).    (5)

Taking the gradient of the quadratic model (5) yields

∇m(w') = g(w) + H(w)(w' - w).    (6)

Setting this gradient to zero and solving for the minimizing w* yields

w* = w - H(w)^{-1} g(w).    (7)

The model m can now be expressed in terms of its minimum value at w* as

m(w') = m(w*) + (1/2)(w' - w*)^T H(w)(w' - w*),  with  m(w*) = E(w) - (1/2) g(w)^T H(w)^{-1} g(w),    (8)

a result that follows from (5) by completing the square or by recognizing that g(w*) = 0. Hence, starting from any initial value of the weight vector, in the quadratic case we can move in one step to the minimizing value when it exists. This is known as Newton's approach; it can also be used in the non-quadratic case, where H is the Hessian, provided H is positive definite.
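The following minimal sketch illustrates the single Newton step of (7) on a toy quadratic error surface. The matrix A and vector b are arbitrary choices that merely guarantee a positive definite Hessian; they are not related to the stock data or to any network considered in the paper.

```python
import numpy as np

# Toy quadratic error E(w) = 0.5 * w^T A w - b^T w, so g(w) = A w - b and H(w) = A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # positive definite Hessian (assumption of the example)
b = np.array([1.0, -1.0])

w = np.array([5.0, 5.0])                 # arbitrary starting point
g = A @ w - b                            # gradient at w, cf. (2)
H = A                                    # Hessian at w, cf. (3)

# Newton step of (7): jump straight to the minimizer of the quadratic model (5).
w_star = w - np.linalg.solve(H, g)
print(w_star, A @ w_star - b)            # the gradient at w_star is (numerically) zero
```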

2.1.1 LEVENBERG-MARQUARDT ALGORITHM

The Levenberg-Marquardt (LM) algorithm [6] exploits the fact that the error function is a sum of squares, as given in (1). Introduce the following notation for the error vector e and its Jacobian with respect to the network parameters w:

J = [ J_ij ],  J_ij = ∂e_j / ∂w_i,  i = 1, ..., p,  j = 1, ..., n.    (9)

The Jacobian is a large p × n matrix, all of whose elements are calculated directly by the backpropagation technique. The p-dimensional gradient g for the quadratic error function can be expressed as

g(w) = Σ_{k=1}^{n} e_k ∇e_k = J e,    (10)

and the Hessian matrix as

H_ij = ∂^2 E / ∂w_i ∂w_j = Σ_{k=1}^{n} [ (∂e_k/∂w_i)(∂e_k/∂w_j) + e_k ∂^2 e_k / ∂w_i ∂w_j ].

Hence, defining D = Σ_{k=1}^{n} e_k ∇^2 e_k yields the expression

H(w) = J J^T + D.    (11)

The key to the LM algorithm is to approximate this expression for the Hessian by replacing the matrix D, which involves second derivatives, by the much simpler positively scaled unit matrix μI. LM is a descent algorithm using this approximation in the form

M_k = [ J J^T + μ_k I ]^{-1},    w_{k+1} = w_k - α_k M_k g(w_k).    (12)

Successful use of LM requires an approximate line search to determine the rate α_k. The matrix J J^T is automatically symmetric and non-negative definite, and the typically large size of J may necessitate careful memory management in evaluating the product J J^T. Any positive μ ensures that M_k is positive definite, as required by the descent condition. The performance of the algorithm thus depends on the choice of μ. When the scalar μ is zero, this is just Newton's method using the approximate Hessian matrix; when μ is large, it becomes gradient descent with a small step size. As Newton's method is more accurate near a minimum, μ is decreased after each successful step (reduction in the performance function) and increased only when a tentative step would increase the performance function. In this way, the performance function is always reduced at each iteration of the algorithm.
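The sketch below applies the LM update of (12) to a small synthetic curve-fitting problem, purely to make the damping schedule concrete. The model, the data, and the simple accept/reject rule used in place of a line search for α_k are all assumptions of the example; this is not the MATLAB implementation used in the experiments reported later.

```python
import numpy as np

# Toy nonlinear least-squares problem: fit t ~ w0 * exp(w1 * x).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
t = 2.0 * np.exp(-1.5 * x) + 0.01 * rng.standard_normal(x.size)

def residuals(w):
    return w[0] * np.exp(w[1] * x) - t            # e_j, j = 1..n

def jacobian(w):
    # Paper's convention (9): J[i, j] = d e_j / d w_i, shape (p, n).
    return np.vstack([np.exp(w[1] * x),
                      w[0] * x * np.exp(w[1] * x)])

w = np.array([1.0, 0.0])
mu = 1e-2                                          # damping parameter mu in (12)
for _ in range(100):
    e, J = residuals(w), jacobian(w)
    g = J @ e                                      # gradient (10)
    M = J @ J.T + mu * np.eye(len(w))              # damped approximate Hessian of (12)
    w_trial = w - np.linalg.solve(M, g)
    if np.sum(residuals(w_trial) ** 2) < np.sum(e ** 2):
        w, mu = w_trial, mu * 0.5                  # success: accept and trust the Newton-like step more
    else:
        mu *= 2.0                                  # failure: back off towards gradient descent
print(w)   # approaches (2.0, -1.5)
```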

2.2 SUPPORT VECTOR MACHINES (SVM)

Support Vector Machines (SVMs) [27] combine several techniques from statistics, machine learning and neural networks. SVMs perform structural risk minimization: they create a classifier with minimized VC (Vapnik-Chervonenkis) dimension. If the VC dimension is low, the expected probability of error is low as well, which means good generalization. SVMs have the common capability of separating the classes with a linear decision surface. However, although an SVM uses a linear separating hyperplane to create its classifier, some problems cannot be linearly separated in the original input space. SVM then uses one of its most important ingredients, kernels, i.e. the concept of transforming linear algorithms into nonlinear ones via a map into a feature space. Figures 3 and 4 illustrate the two categories of data using Y = +1 and Y = -1 symbols.

Figure 3. The linearly separable case

Figure 4. The linearly inseparable case

2.2.1 LINEAR SVM

We consider N training data points {(x_1, y_1), (x_2, y_2), ..., (x_N, y_N)}, where x_i is in R^d and y_i is in {-1, +1}. We would like to learn a linear separating hyperplane classifier

f(x) = sign(w · x - b).    (13)

Furthermore, we want this hyperplane to have the maximum separating margin with respect to the two classes. Specifically, we want to find the hyperplane H0: w · x - b = 0 and two hyperplanes parallel to it and at equal distances from it,

H1: w · x - b = +1  and  H2: w · x - b = -1,    (14)

with the condition that there are no data points between H1 and H2 and that the distance between H1 and H2 is maximized. For any separating plane H0 and the corresponding H1 and H2, we can always normalize the coefficient vector w so that H1 is w · x - b = +1 and H2 is w · x - b = -1. Our aim is to maximize the distance between H1 and H2, so there will be some positive examples on H1 and some negative examples on H2. These examples are called support vectors, because only they participate in the definition of the separating hyperplane; the other examples can be removed and/or moved around as long as they do not cross the planes H1 and H2.

Recall that in two dimensions the distance from a point (x0, y0) to the line Ax + By + C = 0 is |Ax0 + By0 + C| / sqrt(A^2 + B^2). Similarly, the distance of a point on H1 to H0: w · x - b = 0 is |w · x - b| / ||w|| = 1/||w||, so the distance between H1 and H2 is 2/||w||. Hence, to maximize this distance we should minimize ||w||, subject to the condition that there are no data points between H1 and H2:

w · x_i - b ≥ +1 for positive examples (y_i = +1) and w · x_i - b ≤ -1 for negative examples (y_i = -1).

These two conditions can be combined into y_i (w · x_i - b) ≥ 1. The problem can now be formulated as

min_{w, b} (1/2) ||w||^2  subject to  y_i (w · x_i - b) ≥ 1.    (15)

This is a convex quadratic programming problem (in w, b) over a convex set. Introducing Lagrange multipliers α_1, ..., α_N ≥ 0, we have the following Lagrangian:

L(w, b, α) = (1/2) ||w||^2 - Σ_{i=1}^{N} α_i [ y_i (w · x_i - b) - 1 ].    (16)
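As a small numerical companion to (14) and (15), the sketch below checks the constraints y_i (w · x_i - b) ≥ 1 for a toy two-dimensional data set and reports the margin 2/||w||. The data points and the candidate (w, b) are arbitrary illustrations, not a trained SVM.

```python
import numpy as np

# Toy linearly separable data and a candidate separating hyperplane (w, b).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([+1, +1, -1, -1])

w = np.array([0.5, 0.5])        # hypothetical weight vector (assumption, not fitted)
b = 0.0

margins = y * (X @ w - b)
print(margins)                  # all values >= 1 means (w, b) is feasible for (15)
print(2.0 / np.linalg.norm(w))  # geometric margin between H1 and H2
```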

2.2.2 NON-LINEAR SVM

When the two classes are non-linearly distributed, SVM can transform the data points into another, high-dimensional space in which they become linearly separable. Let the transformation be Φ(·). In the high-dimensional space we solve the dual problem

L_D = Σ_{i=1}^{N} α_i - (1/2) Σ_{i,j} α_i α_j y_i y_j Φ(x_i) · Φ(x_j).    (17)

Suppose, in addition, that

Φ(x_i) · Φ(x_j) = k(x_i, x_j),    (18)

that is, the dot product in the high-dimensional space is equivalent to a kernel function evaluated in the input space. Then we need not be explicit about the transformation Φ(·), as long as we know that the kernel function k(x_i, x_j) is equivalent to the dot product in some other high-dimensional space. Mercer's condition can be used to determine whether a function can be used as a kernel function: there exists a mapping Φ and an expansion K(x, y) = Σ_i Φ_i(x) Φ_i(y) if and only if, for any g(x) such that ∫ g(x)^2 dx is finite,

∫∫ K(x, y) g(x) g(y) dx dy ≥ 0.    (19)

The foundations of SVM were developed by Vapnik [27], and SVMs are gaining popularity owing to many attractive features and promising empirical performance. The possibility of using different kernels allows learning methods such as Radial Basis Function Neural Networks (RBFNN) or multi-layer Artificial Neural Networks (ANN) to be viewed as particular cases of SVM, even though the optimized criteria are not the same [14]. While ANNs and RBFNNs optimize the mean squared error, which depends on the distribution of all the data, SVM optimizes a geometrical criterion, the margin, and is sensitive only to the extreme values and not to the distribution of the data in the feature space. The SVM approach transforms the data into a feature space F that usually has a huge dimension. It is interesting to note that SVM generalization depends on the geometrical characteristics of the training data, not on the dimension of the input space. Training a support vector machine leads to a quadratic optimization problem with bound constraints and one linear equality constraint. Vapnik [27] shows how training an SVM for the pattern recognition problem leads to the following quadratic optimization problem:

Minimize  W(α) = - Σ_{i=1}^{l} α_i + (1/2) Σ_{i=1}^{l} Σ_{j=1}^{l} y_i y_j α_i α_j k(x_i, x_j)    (20)

subject to  Σ_{i=1}^{l} y_i α_i = 0  and  0 ≤ α_i ≤ C for all i,    (21)

where l is the number of training examples and α is a vector of l variables, each component α_i corresponding to a training example (x_i, y_i). The solution of (20) is the vector α* for which (20) is minimized and (21) is fulfilled. We used SVMTorch [10] to simulate the SVM learning algorithm.
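The following sketch makes Mercer's condition (19) concrete for the Gaussian (RBF) kernel used later in the experiments: on any finite sample, the kernel matrix should be (numerically) positive semi-definite. The sample points and the width parameter below are arbitrary choices for illustration; this is not the SVMTorch code used by the authors.

```python
import numpy as np

# Gaussian (RBF) kernel k(x, z) = exp(-||x - z||^2 / (2 * sigma^2)); sigma is a free width parameter.
def gaussian_kernel_matrix(X, sigma=1.0):
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T      # pairwise squared distances
    return np.exp(-d2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))                         # 20 hypothetical input vectors
K = gaussian_kernel_matrix(X, sigma=3.0)

# Numerical check of (19) on this sample: the kernel matrix is PSD up to round-off.
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min())
```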

2.3 NEURO-FUZZY SYSTEM

Neuro-fuzzy (NF) computing is a popular framework for solving complex problems [2]. If we have knowledge expressed in linguistic rules, we can build a Fuzzy Inference System (FIS) [8]; if we have data, or can learn from a simulation (training), we can use ANNs. To build a FIS we have to specify the fuzzy sets, the fuzzy operators and the knowledge base; similarly, to construct an ANN for an application the user needs to specify the architecture and the learning algorithm. An analysis reveals that the drawbacks of these two approaches are complementary, and it is therefore natural to consider building an integrated system combining the two concepts. The learning capability is an advantage from the viewpoint of a FIS, while the automatic formation of a linguistic rule base is an advantage from the viewpoint of an ANN. Figure 5 depicts the six-layered architecture of a multiple-output ANFIS; the functionality of each layer is as follows.

Layer 1. Every node i in this layer has a node function O_i^1 = μ_{A_i}(x) for i = 1, 2, or O_i^1 = μ_{B_{i-2}}(y) for i = 3, 4. O_i^1 is the membership grade of a fuzzy set A (= A_1, A_2, B_1 or B_2), and it specifies the degree to which the given input x (or y) satisfies the quantifier A. Usually the node function can be any parameterized membership function. A Gaussian membership function is specified by two parameters, c (the membership function centre) and σ (the membership function width):

gaussian(x, c, σ) = exp( -((x - c)/σ)^2 ).

Parameters in this layer are referred to as premise parameters.

Layer 2. Every node in this layer multiplies the incoming signals and sends the product out. Each node output represents the firing strength of a rule: w_i = μ_{A_i}(x) × μ_{B_i}(y), i = 1, 2, .... In general, any T-norm operator that performs the fuzzy AND operation can be used as the node function in this layer.

Layer 3. The rule consequent parameters are determined in this layer: O_i^3 = f_i = p_i x + q_i y + r_i, where p_i, q_i and r_i are the rule consequent parameters.

Layer 4. Every node i in this layer has the node function O_i^4 = w_i f_i = w_i (p_i x + q_i y + r_i), where w_i is the firing strength computed in Layer 2.

Layer 5. Every node in this layer aggregates the firing strengths of the rules:

O^5 = Σ_i w_i.    (22)

Layer 6. Every i-th node in this layer calculates the individual outputs:

Output_i = O_i^6 = ( Σ_j w_j f_j ) / ( Σ_j w_j ),  i = 1, 2, ...,    (23)

where the sums run over the rules connected to output i.
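To make the layer descriptions concrete, the sketch below runs a single forward pass through a two-input, four-rule ANFIS of the kind shown in Figure 5, computing membership grades, firing strengths, first-order consequents and the normalized output of (23). All membership and consequent parameter values are arbitrary illustrations, not trained values.

```python
import numpy as np

def gaussian(v, c, s):
    return np.exp(-((v - c) / s) ** 2)        # Layer 1: membership grade

x, y = 0.4, 0.7

mu_A = np.array([gaussian(x, 0.2, 0.3), gaussian(x, 0.8, 0.3)])   # A1, A2
mu_B = np.array([gaussian(y, 0.3, 0.4), gaussian(y, 0.9, 0.4)])   # B1, B2

# Layer 2: firing strengths (product T-norm); rules = (A1,B1), (A1,B2), (A2,B1), (A2,B2)
w = np.array([mu_A[i] * mu_B[j] for i in (0, 1) for j in (0, 1)])

# Layers 3-4: first-order consequents f_i = p_i x + q_i y + r_i, weighted by w_i
p = np.array([0.5, -0.2, 1.0, 0.3])
q = np.array([0.1, 0.7, -0.4, 0.2])
r = np.array([0.0, 0.3, 0.1, -0.1])
f = p * x + q * y + r

# Layers 5-6: aggregate firing strengths and compute the normalized output of (23)
output = np.sum(w * f) / np.sum(w)
print(output)
```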

Figure 5. Architecture of ANFIS with multiple outputs

ANFIS uses a hybrid learning rule combining gradient descent and the least squares estimate [13]. Assume a single-output ANFIS represented by

output = F(I, S),    (24)

where I is the set of input variables and S is the set of parameters. If there exists a function H such that the composite function H ∘ F is linear in some of the elements of S, then these elements can be identified by the least squares method [13]. More formally, the parameter set S can be decomposed into two sets

S = S1 ⊕ S2    (25)

(where ⊕ represents direct sum), such that H ∘ F is linear in the elements of S2. Then, upon applying H to equation (24), we have

H(output) = H ∘ F(I, S),    (26)

which is linear in the elements of S2. Now, given values of the elements of S1, we can plug P training data pairs into (26) and obtain a matrix equation

A X = B,    (27)

where X is an unknown vector whose elements are the parameters in S2. If |S2| = M (M being the number of linear parameters), then the dimensions of A, X and B are P × M, M × 1 and P × 1 respectively.
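As a minimal sketch of this step, the matrix equation (27) can be solved in the least-squares sense directly; the matrices A and B below are random stand-ins for the matrices assembled during the ANFIS forward pass, and P and M are arbitrary. The result is numerically the same as the pseudo-inverse estimate (28) derived next.

```python
import numpy as np

rng = np.random.default_rng(0)
P, M = 100, 12                                      # hypothetical sizes: P training pairs, M linear parameters
A = rng.standard_normal((P, M))                     # stand-in for the design matrix of (27)
B = rng.standard_normal((P, 1))

X_pinv = np.linalg.pinv(A) @ B                      # pseudo-inverse form, cf. (28)
X_lstsq, *_ = np.linalg.lstsq(A, B, rcond=None)     # numerically preferable equivalent
print(np.allclose(X_pinv, X_lstsq))
```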

Since P is always greater than M, there is in general no exact solution to equation (27). Instead, a least squares estimate (LSE) of X, denoted X*, is sought that minimizes the squared error ||AX - B||^2. X* can be computed using the pseudo-inverse of A:

X* = (A^T A)^{-1} A^T B,    (28)

where A^T is the transpose of A and (A^T A)^{-1} A^T is the pseudo-inverse of A, provided A^T A is non-singular. Because of the computational complexity of (28), ANFIS deploys a sequential method, as follows. Let the i-th row vector of the matrix A defined in equation (27) be a_i^T and the i-th element of B be b_i^T; then X can be calculated iteratively using the sequential formulae

X_{i+1} = X_i + S_{i+1} a_{i+1} ( b_{i+1}^T - a_{i+1}^T X_i ),
S_{i+1} = S_i - ( S_i a_{i+1} a_{i+1}^T S_i ) / ( 1 + a_{i+1}^T S_i a_{i+1} ),  i = 0, 1, ..., P - 1,    (29)

where S_i (not to be confused with the parameter set S) is often called the covariance matrix, and the least squares estimate X* is equal to X_P. The initial conditions used to bootstrap (29) are X_0 = 0 and S_0 = γI, where γ is a large positive number and I is the identity matrix of dimension M × M. For a multi-output ANFIS, (29) is still applicable, except that output = F(I, S) becomes a column vector.

Each epoch of this hybrid learning procedure is composed of a forward pass and a backward pass. In the forward pass, we supply the input data; functional signals go forward to calculate each node output until the matrices A and B in (27) are obtained, and the parameters in S2 are identified by the sequential least squares formulae given in (29). After the parameters in S2 are identified, the functional signals keep going forward until the error measure is calculated. In the backward pass, the error rates propagate from the output layer back to the input layer, and the parameters in S1 are updated by the gradient method

Δα = -η ∂E/∂α,    (30)

where α is a generic premise parameter, η is a learning rate and E is the error measure. For given fixed values of the parameters in S1, the parameters in S2 thus found are guaranteed to be the global optimum point in the S2 parameter space, owing to the choice of the squared error measure.
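The recursion (29) can be checked directly against the batch estimate; the sketch below does exactly that for random A and B, using the stated initial conditions X_0 = 0 and S_0 = γI with an arbitrary large γ. This is only a numerical illustration of the formulae, not the ANFIS training code used in the experiments.

```python
import numpy as np

rng = np.random.default_rng(1)
P, M = 100, 12
A = rng.standard_normal((P, M))
B = rng.standard_normal((P, 1))

gamma = 1e6
X = np.zeros((M, 1))                      # X_0 = 0
S = gamma * np.eye(M)                     # S_0 = gamma * I

for i in range(P):
    a = A[i:i + 1, :].T                   # a_{i+1}, an M x 1 column vector
    b = B[i:i + 1, :]                     # b_{i+1}
    # Covariance update followed by the parameter update, exactly as in (29).
    S = S - (S @ a @ a.T @ S) / (1.0 + (a.T @ S @ a).item())
    X = X + S @ a @ (b - a.T @ X)

# The recursion converges to the batch least-squares estimate X* = X_P.
X_batch, *_ = np.linalg.lstsq(A, B, rcond=None)
print(np.allclose(X, X_batch, atol=1e-4))
```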

The procedure described above is mainly an offline learning version. It can, however, be modified into an online version by formulating the squared error measure as a weighted version that gives higher weighting factors to more recent data pairs. This amounts to adding a forgetting factor λ to (29):

X_{i+1} = X_i + S_{i+1} a_{i+1} ( b_{i+1}^T - a_{i+1}^T X_i ),
S_{i+1} = (1/λ) [ S_i - ( S_i a_{i+1} a_{i+1}^T S_i ) / ( λ + a_{i+1}^T S_i a_{i+1} ) ].    (31)

The value of λ lies between 0 and 1. The smaller λ is, the faster the effects of old data decay; however, a very small λ sometimes causes numerical instability and should be avoided.

2.4 DIFFERENCE BOOSTING NEURAL NETWORK (DBNN)

DBNN is based on the Bayes principle: it assumes the clustering of attribute values while boosting the attribute differences [25]. Boosting is an iterative process by which the network places emphasis on misclassified examples in the training set until they are correctly classified. The method considers the error produced by each example in the training set in turn and updates the connection weights associated with the probability P(U_m | C_k) of each attribute of that example, where U_m is the attribute value and C_k a particular class among the k different classes in the dataset. In this process, the probability density of identical attribute values flattens out and the differences get boosted. Instead of the serial classifiers used in the AdaBoost algorithm, the DBNN approach uses the same classifier throughout the training process. An error function is defined for each of the misclassified examples based on its distance from the computed probability of its nearest rival. The enhancement of the attribute weights is such that the error produced by each example decides the correction to its associated weights. Since it is likely that more than one class shares at least some of the same attribute values, this leads to a competitive update of their attribute weights, until either the classifier figures out the correct class or the allowed number of iterations is completed. The net effect is that the classifier becomes more and more dependent on the differences between the examples rather than on their similarities.

DBNN is basically a classification algorithm: it assigns output state labels to input patterns with some degree of confidence acquired from the training set. We modified the algorithm for time series prediction by approximating the time series by a class of slope predictions. A major limitation of such a revision is that the possible output states are limited to the output states seen in the training set, because the classification algorithm limits the number of possible classes (slope values) to those it encountered during the training period. These limitations will be apparent in the DBNN outputs.
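Since the paper only states that the time series was approximated by a class of slope predictions, the sketch below shows one plausible reading of that idea: the day-to-day change of the scaled index is discretized into a small, fixed set of slope classes on which a classifier such as DBNN could be trained. The bin edges and the toy series are assumptions of the example, not the authors' scheme.

```python
import numpy as np

# Discretize day-to-day changes of a (scaled) index into slope classes.
def to_slope_classes(series, edges=(-0.02, -0.005, 0.005, 0.02)):
    slopes = np.diff(series)                     # day-to-day change
    return np.digitize(slopes, edges)            # class labels 0 .. len(edges)

series = np.array([0.50, 0.52, 0.51, 0.51, 0.47, 0.48])
print(to_slope_classes(series))                  # e.g. [4 1 2 0 3]
```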

3. EXPERIMENTATION SETUP AND RESULTS

We considered about 7 years of daily stock data for the Nasdaq-100 index and 4 years for the NIFTY index. Our target was to develop efficient forecast models that could predict the index value of the following trade day based on the opening, closing and maximum values of the same on a given day. The training and test patterns for both indices (scaled values) are illustrated in Figures 1 and 2. For the Nasdaq-100 index the data sets were represented by the opening value, low value and high value; the NIFTY index data sets were represented by the opening value, low value, high value and closing value. We used the same training and test data sets to evaluate the different connectionist models. More details are reported in the following sections.

Experiments were carried out on a Pentium IV, 1.5 GHz machine with 256 MB RAM, and the codes were executed using MATLAB (ANN, ANFIS) and C++ (SVM, DBNN). Test data were presented to the trained connectionist network, and the output from the network was compared with the actual index values in the time series. The prediction performance of the different connectionist paradigms was assessed by quantifying the predictions obtained on an independent data set. The maximum absolute percentage error (MAP) and the mean absolute percentage error (MAPE) were used to study the performance of the trained forecasting models on the test data. MAP is defined as

MAP = max_i ( |P_actual,i - P_predicted,i| / P_predicted,i × 100 ),

where P_actual,i is the actual index value on day i and P_predicted,i is the forecast value of the index on that day. Similarly, MAPE is given as

MAPE = (1/N) Σ_{i=1}^{N} ( |P_actual,i - P_predicted,i| / P_actual,i ) × 100,

where N represents the total number of days.

ANN-LM algorithm
We used a feedforward neural network with 4 input nodes and a single hidden layer consisting of 6 neurons, with tanh-sigmoidal activation functions for the hidden neurons. The training was terminated after 5 epochs and took about 4 seconds per dataset.

Neuro-fuzzy training
We used 3 triangular membership functions for each input variable; 27 if-then fuzzy rules were learned for the Nasdaq-100 index and 81 if-then fuzzy rules for the NIFTY index. Training was terminated after a fixed number of epochs and took about 3 seconds per dataset.

Support Vector Machines and Difference Boosting Neural Network
Both the SVM (Gaussian kernel with its width parameter set to 3) and the DBNN took less than one second to learn the two data sets.
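The two error measures defined above translate directly into code; in the sketch below the arrays are placeholder values standing in for the actual and one-day-ahead predicted index values on the test set.

```python
import numpy as np

def map_error(actual, predicted):
    # Maximum absolute percentage error, as defined above.
    return np.max(np.abs(actual - predicted) / predicted) * 100.0

def mape(actual, predicted):
    # Mean absolute percentage error, as defined above.
    return np.mean(np.abs((actual - predicted) / actual)) * 100.0

actual = np.array([1105.0, 1120.0, 1098.0, 1133.0])       # placeholder test values
predicted = np.array([1110.0, 1105.0, 1102.0, 1125.0])
print(map_error(actual, predicted), mape(actual, predicted))
```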

Performance and Results Achieved

Table 1 summarizes the training and test results achieved for the two stock indices using the four different approaches, and Table 2 reports the correlation coefficients, MAP and MAPE values obtained on the test data. Figures 6 and 7 depict the test results for the one-day-ahead prediction of the Nasdaq-100 index and the NIFTY index respectively.

Figure 6. Test results showing the performance of the different methods for modeling the Nasdaq-100 index

Figure 7. Test results showing the performance of the different methods for modeling the NIFTY index

Table 1. Empirical comparison of the four learning methods (SVM, Neuro-Fuzzy, ANN-LM, DBNN): training and test root mean squared error (RMSE) for the Nasdaq-100 and NIFTY indices

Table 2. Statistical analysis of the four learning methods (SVM, Neuro-Fuzzy, ANN-LM, DBNN) on the test data: correlation coefficient, MAP and MAPE for the Nasdaq-100 and NIFTY indices

4. CONCLUSIONS

In this paper we have demonstrated how the chaotic behavior of stock indices can be well represented by connectionist paradigms. Empirical results on the two data sets using four different models clearly reveal the efficiency of the proposed techniques. In terms of RMSE values, SVM performed marginally better than the other models for the Nasdaq-100 index, while for the NIFTY index the ANN-LM approach gave the lowest generalization RMSE values. For both data sets, SVM had the lowest training time. For the Nasdaq-100 index, SVM had the highest correlation coefficient and the lowest MAPE value, but the lowest MAP value was obtained by DBNN. The highest correlation coefficient for the NIFTY index was shared by the SVM and ANN-LM approaches, but the lowest MAPE value was obtained by the neuro-fuzzy approach. It is interesting to note that for predicting both index values DBNN had the lowest MAP value. A low MAP value with DBNN is a crucial indicator for evaluating the stability of a market under unforeseen fluctuations. In the present example, the predictability assures us that a decrease in trade is only a temporary cyclic variation that is perfectly under control. In contrast, a chaotic fluctuation in the market would result in a larger MAP value and wider disagreement with the DBNN prediction. Although the same would be true of the other networks, since DBNN directly correlates its performance with Bayesian probability estimates, the effect will be more apparent in its case.

Our research was to predict the share price for the following trade day based on the opening, closing and maximum values of the same on a given day. Our experimentation results indicate that the most prominent parameters affecting share prices are their immediate opening and closing values. The fluctuations in the share market are chaotic in the sense that they depend heavily on the values of their immediately preceding fluctuations. Long-term trends exist, but they are slow variations, and this information is useful for long-term investment strategies. Our study focuses on short-term, on-floor trades, in which the risk is higher. However, the results of our study show that, even in the seemingly random fluctuations, there is an underlying deterministic feature that is directly enciphered in the opening, closing and maximum values of the index on any day, making prediction possible. The empirical results also show that the different techniques considered each have their own advantages and disadvantages. Our future research will be oriented towards determining the optimal way to combine the different intelligent systems using an ensemble approach [12], so as to complement the advantages and disadvantages of the different paradigms considered.

REFERENCES

[1] Abraham A., Nath B. and Mahanti P.K., Hybrid Intelligent Systems for Stock Market Analysis, Computational Science, Vassil N. Alexandrov et al. (Editors), Springer-Verlag, Germany, May 2001.
[2] Abraham A., Neuro-Fuzzy Systems: State-of-the-Art Modeling Techniques, Connectionist Models of Neurons, Learning Processes, and Artificial Intelligence, Jose Mira and Alberto Prieto (Eds.), Springer-Verlag, Germany, Granada, Spain, 2001.

[3] Abraham A., Philip N.S., Nath B. and Saratchandran P., Performance Analysis of Connectionist Paradigms for Modeling Chaotic Behavior of Stock Indices, Second International Workshop on Intelligent Systems Design and Applications, Computational Intelligence and Applications, Dynamic Publishers Inc., USA, 2002.
[4] Berkeley A.R., Nasdaq's technology floor: its president takes stock, IEEE Spectrum, Volume 34, 1997.
[5] Bischi G.I. and Valori V., Nonlinear effects in a discrete-time dynamic model of a stock market, Chaos, Solitons & Fractals.
[6] Bishop C.M., Neural Networks for Pattern Recognition, Oxford: Clarendon Press, 1995.
[7] Chan W.S. and Liu W.N., Diagnosing shocks in stock markets of southeast Asia, Australia, and New Zealand, Mathematics and Computers in Simulation, Volume 59.
[8] Cherkassky V., Fuzzy Inference Systems: A Critical Review, Computational Intelligence: Soft Computing and Fuzzy-Neuro Integration with Applications, Kaynak O., Zadeh L.A. et al. (Eds.), Springer, pp. 77-97, 1998.
[9] Cios K.J., Data Mining in Finance: Advances in Relational and Hybrid Methods, Neurocomputing, Volume 36.
[10] Collobert R. and Bengio S., SVMTorch: Support Vector Machines for Large-Scale Regression Problems, Journal of Machine Learning Research, Volume 1, pp. 143-160, 2001.
[11] Tay F.E.H. and Cao L.J., Modified support vector machines in financial time series forecasting, Neurocomputing, Volume 48.
[12] Hashem S., Optimal Linear Combinations of Neural Networks, Neural Networks, No. 3, 1995.
[13] Jang J.S.R., Sun C.T. and Mizutani E., Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence, Prentice Hall Inc., USA, 1997.
[14] Joachims T., Making Large-Scale SVM Learning Practical, Advances in Kernel Methods - Support Vector Learning, B. Schölkopf, C. Burges and A. Smola (Eds.), MIT Press, 1999.
[15] Kim K.J. and Han I., Genetic algorithms approach to feature discretization in artificial neural networks for the prediction of stock price index, Expert Systems with Applications.
[16] Koulouriotis D.E., Diakoulakis I.E. and Emiris D.M., A fuzzy cognitive map-based stock market model: synthesis, analysis and experimental results, The 10th IEEE International Conference on Fuzzy Systems.
[17] LeBaron B., Empirical regularities from interacting long- and short-memory investors in an agent-based stock market, IEEE Transactions on Evolutionary Computation, Volume 5, Issue 5.
[18] Leigh W., Modani N., Purvis R. and Roberts T., Stock market trading rule discovery using technical charting heuristics, Expert Systems with Applications.
[19] Leigh W., Purvis R. and Ragusa J.M., Forecasting the NYSE composite index with technical analysis, pattern recognizer, neural network, and genetic algorithm: a case study in romantic decision support, Decision Support Systems.

[20] Masters T., Advanced Algorithms for Neural Networks: A C++ Sourcebook, Wiley, New York, 1995.
[21] The Nasdaq Stock Market: http://www.nasdaq.com
[22] National Stock Exchange of India Limited: http://www.nseindia.com
[23] Oh K.J. and Kim K.J., Analyzing stock market tick data using piecewise nonlinear model, Expert Systems with Applications.
[24] Palma-dos-Reis A. and Zahedi F., Designing personalized intelligent financial decision support systems, Decision Support Systems, 26(1), pp. 31-47, 1999.
[25] Philip N.S. and Joseph K.B., Boosting the Differences: A Fast Bayesian Classifier Neural Network, Intelligent Data Analysis, IOS Press, Netherlands, Volume 4.
[26] Quah T.S. and Srinivasan B., Improving returns on stock investment through neural network selection, Expert Systems with Applications, 17(4), pp. 295-301, 1999.
[27] Vapnik V., The Nature of Statistical Learning Theory, Springer-Verlag, New York, 1995.
[28] Wang Y.F., Mining stock price using fuzzy rough set system, Expert Systems with Applications.
[29] Wuthrich B., Cho V., Leung S., Permunetilleke D., Sankaran K. and Zhang J., Daily stock market forecast from textual web data, IEEE International Conference on Systems, Man, and Cybernetics, Volume 3, 1998.


More information

Realistic Image Synthesis

Realistic Image Synthesis Realstc Image Synthess - Combned Samplng and Path Tracng - Phlpp Slusallek Karol Myszkowsk Vncent Pegoraro Overvew: Today Combned Samplng (Multple Importance Samplng) Renderng and Measurng Equaton Random

More information

Project Networks With Mixed-Time Constraints

Project Networks With Mixed-Time Constraints Project Networs Wth Mxed-Tme Constrants L Caccetta and B Wattananon Western Australan Centre of Excellence n Industral Optmsaton (WACEIO) Curtn Unversty of Technology GPO Box U1987 Perth Western Australa

More information

Trade Adjustment and Productivity in Large Crises. Online Appendix May 2013. Appendix A: Derivation of Equations for Productivity

Trade Adjustment and Productivity in Large Crises. Online Appendix May 2013. Appendix A: Derivation of Equations for Productivity Trade Adjustment Productvty n Large Crses Gta Gopnath Department of Economcs Harvard Unversty NBER Brent Neman Booth School of Busness Unversty of Chcago NBER Onlne Appendx May 2013 Appendx A: Dervaton

More information

Solving Factored MDPs with Continuous and Discrete Variables

Solving Factored MDPs with Continuous and Discrete Variables Solvng Factored MPs wth Contnuous and screte Varables Carlos Guestrn Berkeley Research Center Intel Corporaton Mlos Hauskrecht epartment of Computer Scence Unversty of Pttsburgh Branslav Kveton Intellgent

More information

Gender Classification for Real-Time Audience Analysis System

Gender Classification for Real-Time Audience Analysis System Gender Classfcaton for Real-Tme Audence Analyss System Vladmr Khryashchev, Lev Shmaglt, Andrey Shemyakov, Anton Lebedev Yaroslavl State Unversty Yaroslavl, Russa vhr@yandex.ru, shmaglt_lev@yahoo.com, andrey.shemakov@gmal.com,

More information

The OC Curve of Attribute Acceptance Plans

The OC Curve of Attribute Acceptance Plans The OC Curve of Attrbute Acceptance Plans The Operatng Characterstc (OC) curve descrbes the probablty of acceptng a lot as a functon of the lot s qualty. Fgure 1 shows a typcal OC Curve. 10 8 6 4 1 3 4

More information

Optimal Bidding Strategies for Generation Companies in a Day-Ahead Electricity Market with Risk Management Taken into Account

Optimal Bidding Strategies for Generation Companies in a Day-Ahead Electricity Market with Risk Management Taken into Account Amercan J. of Engneerng and Appled Scences (): 8-6, 009 ISSN 94-700 009 Scence Publcatons Optmal Bddng Strateges for Generaton Companes n a Day-Ahead Electrcty Market wth Rsk Management Taken nto Account

More information

An artificial Neural Network approach to monitor and diagnose multi-attribute quality control processes. S. T. A. Niaki*

An artificial Neural Network approach to monitor and diagnose multi-attribute quality control processes. S. T. A. Niaki* Journal of Industral Engneerng Internatonal July 008, Vol. 4, No. 7, 04 Islamc Azad Unversty, South Tehran Branch An artfcal Neural Network approach to montor and dagnose multattrbute qualty control processes

More information

J. Parallel Distrib. Comput.

J. Parallel Distrib. Comput. J. Parallel Dstrb. Comput. 71 (2011) 62 76 Contents lsts avalable at ScenceDrect J. Parallel Dstrb. Comput. journal homepage: www.elsever.com/locate/jpdc Optmzng server placement n dstrbuted systems n

More information

Fault tolerance in cloud technologies presented as a service

Fault tolerance in cloud technologies presented as a service Internatonal Scentfc Conference Computer Scence 2015 Pavel Dzhunev, PhD student Fault tolerance n cloud technologes presented as a servce INTRODUCTION Improvements n technques for vrtualzaton and performance

More information

A DATA MINING APPLICATION IN A STUDENT DATABASE

A DATA MINING APPLICATION IN A STUDENT DATABASE JOURNAL OF AERONAUTICS AND SPACE TECHNOLOGIES JULY 005 VOLUME NUMBER (53-57) A DATA MINING APPLICATION IN A STUDENT DATABASE Şenol Zafer ERDOĞAN Maltepe Ünversty Faculty of Engneerng Büyükbakkalköy-Istanbul

More information

Performance attribution for multi-layered investment decisions

Performance attribution for multi-layered investment decisions Performance attrbuton for mult-layered nvestment decsons 880 Thrd Avenue 7th Floor Ne Yor, NY 10022 212.866.9200 t 212.866.9201 f qsnvestors.com Inna Oounova Head of Strategc Asset Allocaton Portfolo Management

More information

An Analysis of Central Processor Scheduling in Multiprogrammed Computer Systems

An Analysis of Central Processor Scheduling in Multiprogrammed Computer Systems STAN-CS-73-355 I SU-SE-73-013 An Analyss of Central Processor Schedulng n Multprogrammed Computer Systems (Dgest Edton) by Thomas G. Prce October 1972 Techncal Report No. 57 Reproducton n whole or n part

More information

HowHow to Find the Best Online Stock Broker

HowHow to Find the Best Online Stock Broker A GENERAL APPROACH FOR SECURITY MONITORING AND PREVENTIVE CONTROL OF NETWORKS WITH LARGE WIND POWER PRODUCTION Helena Vasconcelos INESC Porto hvasconcelos@nescportopt J N Fdalgo INESC Porto and FEUP jfdalgo@nescportopt

More information

Descriptive Models. Cluster Analysis. Example. General Applications of Clustering. Examples of Clustering Applications

Descriptive Models. Cluster Analysis. Example. General Applications of Clustering. Examples of Clustering Applications CMSC828G Prncples of Data Mnng Lecture #9 Today s Readng: HMS, chapter 9 Today s Lecture: Descrptve Modelng Clusterng Algorthms Descrptve Models model presents the man features of the data, a global summary

More information

Damage detection in composite laminates using coin-tap method

Damage detection in composite laminates using coin-tap method Damage detecton n composte lamnates usng con-tap method S.J. Km Korea Aerospace Research Insttute, 45 Eoeun-Dong, Youseong-Gu, 35-333 Daejeon, Republc of Korea yaeln@kar.re.kr 45 The con-tap test has the

More information

Efficient Project Portfolio as a tool for Enterprise Risk Management

Efficient Project Portfolio as a tool for Enterprise Risk Management Effcent Proect Portfolo as a tool for Enterprse Rsk Management Valentn O. Nkonov Ural State Techncal Unversty Growth Traectory Consultng Company January 5, 27 Effcent Proect Portfolo as a tool for Enterprse

More information

Using Association Rule Mining: Stock Market Events Prediction from Financial News

Using Association Rule Mining: Stock Market Events Prediction from Financial News Usng Assocaton Rule Mnng: Stock Market Events Predcton from Fnancal News Shubhang S. Umbarkar 1, Prof. S. S. Nandgaonkar 2 1 Savtrba Phule Pune Unversty, Vdya Pratshtan s College of Engneerng, Vdya Nagar,

More information

Research Article Enhanced Two-Step Method via Relaxed Order of α-satisfactory Degrees for Fuzzy Multiobjective Optimization

Research Article Enhanced Two-Step Method via Relaxed Order of α-satisfactory Degrees for Fuzzy Multiobjective Optimization Hndaw Publshng Corporaton Mathematcal Problems n Engneerng Artcle ID 867836 pages http://dxdoorg/055/204/867836 Research Artcle Enhanced Two-Step Method va Relaxed Order of α-satsfactory Degrees for Fuzzy

More information

Fast Fuzzy Clustering of Web Page Collections

Fast Fuzzy Clustering of Web Page Collections Fast Fuzzy Clusterng of Web Page Collectons Chrstan Borgelt and Andreas Nürnberger Dept. of Knowledge Processng and Language Engneerng Otto-von-Guercke-Unversty of Magdeburg Unverstätsplatz, D-396 Magdeburg,

More information

A DYNAMIC CRASHING METHOD FOR PROJECT MANAGEMENT USING SIMULATION-BASED OPTIMIZATION. Michael E. Kuhl Radhamés A. Tolentino-Peña

A DYNAMIC CRASHING METHOD FOR PROJECT MANAGEMENT USING SIMULATION-BASED OPTIMIZATION. Michael E. Kuhl Radhamés A. Tolentino-Peña Proceedngs of the 2008 Wnter Smulaton Conference S. J. Mason, R. R. Hll, L. Mönch, O. Rose, T. Jefferson, J. W. Fowler eds. A DYNAMIC CRASHING METHOD FOR PROJECT MANAGEMENT USING SIMULATION-BASED OPTIMIZATION

More information

Calculating the high frequency transmission line parameters of power cables

Calculating the high frequency transmission line parameters of power cables < ' Calculatng the hgh frequency transmsson lne parameters of power cables Authors: Dr. John Dcknson, Laboratory Servces Manager, N 0 RW E B Communcatons Mr. Peter J. Ncholson, Project Assgnment Manager,

More information

PAS: A Packet Accounting System to Limit the Effects of DoS & DDoS. Debish Fesehaye & Klara Naherstedt University of Illinois-Urbana Champaign

PAS: A Packet Accounting System to Limit the Effects of DoS & DDoS. Debish Fesehaye & Klara Naherstedt University of Illinois-Urbana Champaign PAS: A Packet Accountng System to Lmt the Effects of DoS & DDoS Debsh Fesehaye & Klara Naherstedt Unversty of Illnos-Urbana Champagn DoS and DDoS DDoS attacks are ncreasng threats to our dgtal world. Exstng

More information

CHAPTER 5 RELATIONSHIPS BETWEEN QUANTITATIVE VARIABLES

CHAPTER 5 RELATIONSHIPS BETWEEN QUANTITATIVE VARIABLES CHAPTER 5 RELATIONSHIPS BETWEEN QUANTITATIVE VARIABLES In ths chapter, we wll learn how to descrbe the relatonshp between two quanttatve varables. Remember (from Chapter 2) that the terms quanttatve varable

More information

1 Example 1: Axis-aligned rectangles

1 Example 1: Axis-aligned rectangles COS 511: Theoretcal Machne Learnng Lecturer: Rob Schapre Lecture # 6 Scrbe: Aaron Schld February 21, 2013 Last class, we dscussed an analogue for Occam s Razor for nfnte hypothess spaces that, n conjuncton

More information

Loop Parallelization

Loop Parallelization - - Loop Parallelzaton C-52 Complaton steps: nested loops operatng on arrays, sequentell executon of teraton space DECLARE B[..,..+] FOR I :=.. FOR J :=.. I B[I,J] := B[I-,J]+B[I-,J-] ED FOR ED FOR analyze

More information