A Perfect QoS Routing Algorithm for Finding the Best Path for Dynamic Networks

Hazem M. El-Bakry
Faculty of Computer Science & Information Systems, Mansoura University, EGYPT
helbakry20@yahoo.com

Nikos Mastorakis
Dept. of Computer Science, Military Institutions of University Education (MIUE) - Hellenic Academy, Greece

Abstract:- It is highly desirable to protect data traffic from unexpected changes as well as to provide effective network utilization in the internetworking era. In this paper, the QoS management issue utilizing active network technology is discussed. The algorithm is based on the work presented in [15]. Active networks seem to be particularly useful in the context of QoS support. The Active QoS Routing (AQR) algorithm, which is based on on-demand routing, is implemented incorporating the product of available bit rate and delay for finding the best path for dynamic networks, using the active network test bed ANTS. It is inferred that, with background traffic, AQR finds alternative paths very quickly, and the delay and subsequently the jitter involved are reduced significantly. The variant of AQR implemented in this paper is demonstrated to be more useful in reducing jitter when the overall traffic in the network is heavy, and has useful application in finding effective QoS routes in ad-hoc networks as well as in defending against DDoS attacks by identifying the attack traffic path using QoS regulations. The main achievement of this paper is a fast attack detection algorithm. The algorithm is based on performing cross correlation in the frequency domain between the data traffic and the input weights of fast time delay neural networks (FTDNNs). It is proved mathematically and practically that the number of computation steps required by the presented FTDNNs is less than that needed by conventional time delay neural networks (CTDNNs). Simulation results using MATLAB confirm the theoretical computations.
Keywords:- Routing Algorithm, Data Protection, Best Path, Dynamic Networks, Fast Attack Detection, Neural Networks

1 Introduction
The widespread growth of the Internet and the development of streaming applications directed the Internet society to focus on the design and development of architectures and protocols that would provide the requested level of Quality of Service (QoS) [15-26]. QoS is an intuitive concept defined as the collective effect of the service performance which determines the degree of satisfaction of a user of the service, or a measure of how good a service is as presented to the user. It is expressed in user-understandable language and manifests itself in a number of parameters, all of which have either subjective or objective values. The goal of active networks (AN) is to support customized protocol mechanisms that can be introduced in a network. Differences among known AN approaches concern, e.g., the question of whether remote applications should be able to download protocol mechanisms to a node (router) or whether this right should be reserved to the operators of nodes, and the question of whether the code for these mechanisms should be carried as an additional payload by the data packets in transit or whether shipping and installing such code should be separated from the issue of data transfer. The simulation results of this paper are compared with the results presented in [15], and a high improvement in the QoS parameter jitter is observed when the proposed algorithm is applied to an active network with slight modifications. In addition, the main objective of this paper is to improve the speed of time delay neural networks for fast attack detection.
ISSN: 1109-2742, Issue 12, Volume 7, December 2008
The purpose is to
perform the testing process in the frequency domain instead of the time domain. This approach was successfully applied for sub-image detection using fast neural networks (FNNs), as proposed in [1,2,3]. Furthermore, it was used for fast face detection [7,9] and fast iris detection [8]. Another idea to further increase the speed of FNNs through image decomposition was suggested in [7]. FNNs for detecting a certain code in a one dimensional serial stream of sequential data were described in [4,5]. Compared with conventional neural networks, FNNs based on cross correlation between the tested data and the input weights of neural networks in the frequency domain showed a significant reduction in the number of computation steps required for certain data detection [1,2,3,4,5,7,8,9,11,12]. Here, we make use of the previous theory on FNNs implemented in the frequency domain to increase the speed of time delay neural networks for fast attack detection.

2 QoS Routing
Active Networks (AN) is a framework where network elements, essentially routers and switches, are programmable. Programs that are injected into the network are executed by the network elements to achieve higher flexibility and to present new capabilities. With the help of AN, programs can be injected into the network and executed within the network itself without involving the end systems. QoS routing is a term used for routing mechanisms which consider QoS. It suffers from the static nature of networking today. QoS routing is bound to the use of common metrics and procedures which usually rely on distributed network performance data (increasing network traffic) and sophisticated algorithms (increasing the processor load on routers), but usually yield considerable improvements only for certain classes of applications. In [16], the authors used randomness at the link level to achieve a balance between the safety rate and the delay of the routing path.

A) QOS SUPPORT IN ACTIVE NETWORKS
Possible utilizations of AN to support QoS roughly fall into the following categories [15]:
1.
Mechanisms which transfer application layer functionality into the network.
2. Mechanisms which are usually associated with layers 3 or 4: the AQR mechanism falls in this category.
3. Mechanisms which rely on non-active QoS provisioning mechanisms: here, AN are merely used to add greater flexibility to the specification of a QoS request.

B) AQR OPERATION
AQR is an on-demand QoS routing algorithm. The AQR algorithm can be described as follows, as in [15]:
1. The AQR sender calculates all non-cyclic paths to the destination from the link state routing table.
2. A probing packet carrying the QoS requirements, the code for QoS calculation, the sender's and receiver's addresses, and a list of visited nodes is sent to each first hop of these paths.
3. Upon receiving an AQR probing packet, an AQR-compliant transit node executes the AA code (contained in the packet or cached), which checks whether the minimum QoS requirements found in the packet can be met (if a threshold, say a maximum delay, is exceeded, the packet is dropped), compares and updates the QoS data, adds itself to the list of already visited nodes, and executes the code of the AQR sender, starting at step 2, except that no probing packets are sent to the source or to any other already visited node (packets are multicast in the proper direction at each AQR-compliant transit node).
4. Only packets which conform to the minimum QoS requirements reach the AQR receiver, where a list of valid paths is generated. After a predefined period, the best path is chosen and communicated to the sender. If there is more than one best path, the traffic is split among the best paths.

C) DESIGN CONSIDERATIONS
The resources used are network bandwidth and CPU cycles to load the network. The system assumes an overlay mode of deploying active code in the network. The security issues are taken care of by the ANTS system itself. For AQR, the QoS parameters are delay and available bandwidth. For each of these parameters, a channel is given the properties of the underlying network. The overall flow is depicted in the data flow diagram (Fig. 1).
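The four AQR steps above can be sketched as a toy simulation of probe flooding. This is a hedged illustration only: the topology, the per-link delay metric, and the `flood_probe` helper are hypothetical, and in the real protocol the probe logic runs as active code inside each ANTS node rather than at one central point.

```python
# Hedged sketch of AQR probe flooding (steps 1-4 above). Topology,
# delays, and the max-delay threshold are illustrative values.

def flood_probe(graph, delay, src, dst, max_delay):
    """Flood probes along all acyclic paths; drop any probe whose
    accumulated delay exceeds the QoS threshold (step 3)."""
    valid_paths = []

    def forward(node, visited, acc):
        if acc > max_delay:          # threshold exceeded: drop probe
            return
        if node == dst:              # probe reached the AQR receiver
            valid_paths.append((acc, visited))
            return
        for nxt in graph[node]:
            if nxt not in visited:   # never revisit a node (acyclic)
                forward(nxt, visited + [nxt], acc + delay[(node, nxt)])

    forward(src, [src], 0)
    # Step 4: the receiver collects valid paths and picks the best one.
    return min(valid_paths) if valid_paths else None

graph = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
delay = {("A", "B"): 5, ("B", "D"): 5, ("A", "C"): 2, ("C", "D"): 2,
         ("B", "A"): 5, ("D", "B"): 5, ("C", "A"): 2, ("D", "C"): 2}
best = flood_probe(graph, delay, "A", "D", max_delay=20)
```

With these toy delays the lower-delay path A-C-D wins over A-B-D; splitting traffic over ties (step 4) is omitted for brevity.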
There are three important modules in the system: the Application, Protocol and Capsule modules. The Application module consists of the Router, Sender and Receiver sub-modules. The Capsule module consists of the DataCapsule, ProbeCapsule and ReplyCapsule sub-modules.

3 Implementation Details
The proposed AQR algorithm is implemented for the sample network topology shown in Fig. 2, which is configured in ANTS. The routing protocol is implemented using the ANTS toolkit. ANTS is an execution environment (EE) running over the node OS Janos. The protocol is implemented as an active application that runs on each of the nodes of the network. In ANTS, there are three special classes to create capsules, protocols and active applications. Detailed information about ANTS is provided in [18]. A new protocol is developed by subclassing the virtual class Protocol. This requires identifying all of the different types of packet that will enter the network by their different forwarding routines. Each type of packet and its forwarding routine is specified by subclassing the virtual class Capsule. A new application is developed by subclassing the virtual class Application. An instance of the class Node represents the local ANTS runtime. A new protocol and application are used by creating instances of their classes and attaching them to a node. The application is connected to the node in order to send and receive capsules from the network. The protocol is registered with the node in order for the network to be able to obtain its code when it is needed. This model allows customization by the network users, assuming that a network of active nodes already exists and is up and running. The active nodes, however, will often be part of the system under study, particularly for experimentation with different topologies, application workloads, and node services. For these purposes, ANTS provides two facilities. Configuration tools allow a network topology, complete with applications, to be managed; this includes the calculation of routes and the initialization of local node configurations.
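The subclassing structure described above (Protocol, Capsule, Application, Node) can be mirrored in a minimal sketch. ANTS itself is a Java toolkit; the Python classes below only illustrate the relationships among the four classes and are emphatically not the real ANTS API.

```python
# Illustrative sketch of the ANTS class relationships described
# above: capsules subclass Capsule, a protocol bundles capsule
# types, applications attach to a Node, protocols register with it.
# Not the real (Java) ANTS API.

class Capsule:
    """A packet plus its forwarding routine (overridden per type)."""
    def evaluate(self, node):
        raise NotImplementedError

class Protocol:
    """A protocol is the set of capsule types that make it up."""
    def __init__(self, capsule_types):
        self.capsule_types = capsule_types

class Application:
    """Entity that sends/receives capsules through its node."""
    def __init__(self):
        self.node = None

class Node:
    """Local runtime: applications attach, protocols register."""
    def __init__(self):
        self.protocols, self.applications = [], []
    def register(self, protocol):
        self.protocols.append(protocol)   # code fetched on demand
    def attach(self, application):
        application.node = self           # app gains node services
        self.applications.append(application)

class ProbeCapsule(Capsule):              # one AQR capsule type
    def evaluate(self, node):
        pass                              # QoS check would go here

node = Node()
node.register(Protocol([ProbeCapsule]))
app = Application()
node.attach(app)
```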
A node extension architecture allows different nodes to support different service components, e.g., multicast, caching, transcoding, etc., as appropriate. Extensions are developed by subclassing the virtual class Extension. The Protocol class is extended to form the AQRProtocol class. There are three capsules: ProbeCapsule, ReplyCapsule and AQRDataCapsule. These capsules form the protocol itself.

A) Capsule Types
A capsule is a combination of a packet and its forwarding routine; the forwarding routine is executed at every active node the capsule visits while in the network. New types of capsule, with different forwarding routines, are developed by subclassing the virtual class Capsule. The capsule starts with the sequence id of the data capsule. Then the timestamp of when the capsule was sent is stored, followed by some flags and index values. The path index is the pointer to the next node to reach. The path valid flag is set true when the capsule carries a path to travel; otherwise the default shortest path is used to reach the destination. There is another flag, aqrflag, which is set to differentiate between capsules sent using AQR and capsules sent using SPR. Finally, the path to travel is stored. The active code of this capsule is the forwarding routine that guides the capsule to the destination. It checks the path valid flag and, if set, uses the path carried in the capsule; otherwise the shortest path is used to forward the capsule to the destination. For a valid path, the next node is obtained from the stored path using the path index pointer, and the data capsule is forwarded to this node. Once the nodes in the stored path are exhausted, the capsule has reached the destination and is delivered to the receiver application.

B) Application
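The forwarding routine of the data capsule just described can be sketched as follows. Field names mirror the prose (sequence id, timestamp, path valid flag, path index, carried path); the shortest-path fallback is a stub, and this is an illustration rather than the actual ANTS capsule code.

```python
# Hedged sketch of the AQR data capsule forwarding routine.
# Fields follow the prose above; shortest-path routing is stubbed.

class AQRDataCapsule:
    def __init__(self, seq, timestamp, path=None, aqr=True):
        self.seq = seq                  # sequence id of the capsule
        self.timestamp = timestamp      # time the capsule was sent
        self.aqr_flag = aqr             # AQR vs. plain SPR capsule
        self.path = path or []          # explicit path to travel
        self.path_valid = bool(path)    # True if a path is carried
        self.path_index = 0             # pointer to the next node

    def next_hop(self, shortest_path_hop):
        """Forwarding routine run at each active node."""
        if self.path_valid and self.path_index < len(self.path):
            hop = self.path[self.path_index]
            self.path_index += 1
            return hop                  # follow the carried path
        if self.path_valid:
            return None                 # path exhausted: deliver here
        return shortest_path_hop        # fall back to shortest path

cap = AQRDataCapsule(seq=1, timestamp=0.0, path=["C", "D"])
hops = [cap.next_hop("B"), cap.next_hop("B"), cap.next_hop("B")]
# hops == ["C", "D", None]: capsule delivered at the receiver
```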
Applications are the entities that make use of the network to send and receive capsules as well as run independent activities. New applications are developed by subclassing the virtual class Application, which provides access to the node and its services.

C) Other modules
The data traffic is generated by a thread, AQRDataSender, which runs on the sender node. This data traffic is received by another application on the receiver node, AQRDataReceiver. There are also other utility classes.

4 Configuration
Experimenting with an active network requires that a network topology be set up, and ANTS provides some tools and infrastructure conventions to automate this process. First, entire network configurations (node addresses, applications and all initialization parameters) are specified in text files that are read by the ConfigurationManager class to start one of its nodes locally. This includes creating the node runtime, applications, and extensions, connecting them to each other, and starting their operation.

A) Performance Analysis
The algorithm is run under various test scenarios and the test results are presented in this section. The QoS performance is analyzed for data traffic with and without background traffic. TG is a packet Traffic Generator tool that can be used to characterize the performance of packet-switched network communication protocols. The TG program generates and receives one-way packet traffic streams transmitted from the UNIX user-level process between traffic source and traffic sink nodes in a network. TG is used for generating the background traffic. The TG serving as the traffic source always logs datagram transmit times; this mode of operation may be useful for analyzing network blocking characteristics or for loading a network. The TG serving as a traffic sink logs all received datagrams. The behavior of the protocol is analyzed after the intermediate routers are loaded to make the default shortest path congested. For the same topology, shortest path routing is also tested. Two applications are run on the source and destination nodes.
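TG is a standalone tool; purely as a hedged illustration, the kind of load it generates here (fixed-size UDP datagrams paced to a target bit rate, with transmit times logged) could be approximated as below. The function names and the sleep-based pacing are our assumptions, not TG's implementation.

```python
# Hedged sketch of a TG-like UDP load source: fixed-size datagrams
# paced to a target bit rate, transmit timestamps logged as TG does.
import socket
import time

def pacing_interval(pkt_bytes, rate_bps):
    """Seconds between sends so pkt_bytes packets reach rate_bps."""
    return pkt_bytes * 8 / rate_bps

def send_burst(dst, pkt_bytes=500, rate_bps=100_000_000, count=10):
    """Send `count` datagrams to dst, logging transmit times."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    gap, log = pacing_interval(pkt_bytes, rate_bps), []
    payload = b"\x00" * pkt_bytes
    for _ in range(count):
        sock.sendto(payload, dst)
        log.append(time.time())      # TG-style transmit timestamp
        time.sleep(gap)
    return log

# 500-byte packets at 100 Mbps: one packet every 40 microseconds
gap = pacing_interval(500, 100_000_000)
```

At these rates sleep-based pacing is far too coarse in practice, which is one reason a dedicated tool like TG is used for the actual experiments.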
The active code in the capsule is used to route the capsule through the shortest path. The delay changes and the routes taken are recorded and analyzed.

B) Performance analysis without background traffic
In this test run, data capsules of length 100 bytes are used to test the protocol. No background traffic is used to load the shortest path; instead, the routers along the shortest path are loaded to increase its delay. The test is run in this scenario and the results are shown in Table 1 and Fig. 4. The delay values are relative and are not exact with respect to the sender in all the data presented.

C) Performance analysis with background traffic type 1
In this test run, data capsules of length 100 bytes are used to test the protocol, and background traffic is introduced along the shortest path to increase its delay. The traffic is introduced using the tool TG. The background traffic is UDP: packets of length 500 bytes at a rate of 100 Mbps.

5 Fast Attack Detection using Neural Networks
Finding a certain attack in the incoming serial data is a searching problem. First, neural networks are trained to classify attack from non-attack examples, and this is done in the time domain. In the attack detection phase, each position in the incoming matrix is tested for the presence or absence of an attack. At each position in the input one dimensional matrix, each sub-matrix is multiplied by a window of weights which has the same size as the sub-matrix. The outputs of the neurons in the hidden layer are multiplied by the weights of the output layer. When the final output is high, the sub-matrix under test contains an attack, and vice versa. Thus, we may conclude that this searching problem is a cross correlation between the incoming serial data and the weights of the neurons in the hidden layer. The convolution theorem in mathematical analysis says that a convolution of f with h is identical to the result of the following steps: let F and H be
the results of the Fourier transformation of f and h in the frequency domain, multiply F and H* in the frequency domain point by point, and then transform this product into the spatial domain via the inverse Fourier transform. As a result, these cross correlations can be represented by a product in the frequency domain. Thus, by using cross correlation in the frequency domain, a speed up of an order of magnitude can be achieved during the detection process [1,2,3,4,5,7,8,9,14].

Assume that the size of the attack code is 1xn. In the attack detection phase, a sub-matrix I of size 1xn (the sliding window) is extracted from the tested matrix, which has a size of 1xN. Such a sub-matrix, which may be an attack code, is fed to the neural network. Let W_i be the vector of weights between the input sub-matrix and hidden neuron (i); it has a size of 1xn. The output of hidden neuron (i) can be calculated as follows:

$h_i = g\left( \sum_{k=1}^{n} W_i(k)\, I(k) + b_i \right)$   (1)

where g is the activation function and b_i is the bias of hidden neuron (i). Equation (1) represents the output of each hidden neuron for a particular sub-matrix I. It can be applied to the whole input matrix Z as follows:

$h_i(u) = g\left( \sum_{k=-n/2}^{n/2} W_i(k)\, Z(u+k) + b_i \right)$   (2)

Eq. (2) represents a cross correlation operation. Given any two functions f and d, their cross correlation can be obtained by:

$d(x) \otimes f(x) = \sum_{n=-\infty}^{\infty} f(x+n)\, d(n)$   (3)

Therefore, Eq. (2) may be written as follows [1]:

$h_i = g\left( W_i \otimes Z + b_i \right)$   (4)

where h_i is the output of hidden neuron (i) and h_i(u) is the activity of hidden unit (i) when the sliding window is located at position (u), with (u) ∈ [N-n+1].

Now, the above cross correlation can be expressed in terms of the one dimensional Fast Fourier Transform as follows [1]:

$W_i \otimes Z = F^{-1}\left( F(Z) \cdot F^{*}(W_i) \right)$   (5)

Hence, by evaluating this cross correlation, a speed up ratio can be obtained compared with conventional neural networks. Also, the final output of the neural network can be evaluated as follows:

$O(u) = g\left( \sum_{i=1}^{q} W_o(i)\, h_i(u) + b_o \right)$   (6)

where q is the number of neurons in the hidden layer, O(u) is the output of the neural network when the sliding window is located at position (u) in the input matrix Z, and W_o is the weight vector between the hidden and output layers.

The complexity of cross correlation in the frequency domain can be analyzed as follows:

1- For a tested matrix of 1xN elements, the 1D-FFT requires Nlog2(N) complex computation steps [13]. Also, the same number of complex computation steps is required for computing the 1D-FFT of the weight matrix at each neuron in the hidden layer.

2- At each neuron in the hidden layer, the inverse 1D-FFT is computed. Therefore, q backward and (1+q) forward transforms have to be computed, so, for a given matrix under test, the total number of operations required to compute the 1D-FFTs is (2q+1)Nlog2(N).

3- The number of computation steps required by FTDNNs is complex and must be converted into a real version. It is known that the one dimensional Fast Fourier Transform requires (N/2)log2(N) complex multiplications and Nlog2(N) complex additions [13]. Every complex multiplication is realized by six real floating point operations and every complex addition is implemented by two real floating point operations. Therefore, the total number of real computation steps required to obtain the 1D-FFT of a 1xN matrix is:

$\rho = 6\left(\tfrac{N}{2}\log_2 N\right) + 2\left(N \log_2 N\right)$   (7)

which may be simplified to:

$\rho = 5N\log_2 N$   (8)

4- Both the input and the weight matrices should be dot multiplied in the frequency domain. Thus, a number of complex computation steps equal to qN should be considered. This means 6qN real operations will be added to the number of computation steps required by FTDNNs.
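Equations (2) and (5) can be checked numerically. The sketch below computes one hidden neuron's pre-activation values directly as sliding-window dot products (with the window indexed from its start rather than its center, which shifts but does not change the values) and again as the inverse transform of F(Z)·F*(W). A plain O(N²) DFT from the standard library stands in for the FFT, so only the equality, not the speed up, is demonstrated; all data values are toy numbers.

```python
# Numerical check of Eq. (5): cross correlation computed as
# IDFT(DFT(Z) . conj(DFT(W))) matches the direct sliding-window
# dot products of Eq. (2). A naive O(N^2) DFT stands in for the
# FFT; the paper's speed gain comes from using a true FFT.
import cmath

def dft(x, inverse=False):
    N, s = len(x), (1 if inverse else -1)
    out = [sum(x[k] * cmath.exp(s * 2j * cmath.pi * m * k / N)
               for k in range(N)) for m in range(N)]
    return [v / N for v in out] if inverse else out

Z = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0]   # tested data, size 1xN
W = [2.0, 0.0, 1.0]                  # one neuron's weights, 1xn
N, n = len(Z), len(W)

# frequency domain: zero-pad W to length N, multiply by conjugate
Wp = W + [0.0] * (N - n)
prod = [fz * fw.conjugate() for fz, fw in zip(dft(Z), dft(Wp))]
freq = [round(v.real, 6) for v in dft(prod, inverse=True)]

# time domain: direct dot product at each valid window position
direct = [sum(W[k] * Z[u + k] for k in range(n))
          for u in range(N - n + 1)]
# the first N-n+1 entries of the circular result equal `direct`
```

The last two entries of `freq` wrap around circularly, which is why the weight matrix must be zero-padded to the input length (point 5 of the complexity analysis) and only the first N-n+1 positions are used.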
5- In order to perform cross correlation in the frequency domain, the weight matrix must be extended to have the same size as the input matrix. So, a number of zeros equal to (N-n) must be added to the weight matrix; this requires a total of q(N-n) real computation steps for all neurons. Moreover, after computing the FFT of the weight matrix, the conjugate of this matrix must be obtained, adding qN real computation steps for all neurons. Also, a number of real computation steps equal to N is required to create the butterfly complex numbers $e^{-jK(2\pi n/N)}$, where 0<K<L. These (N/2) complex numbers are multiplied by the elements of the input matrix or by previous complex numbers during the computation of the FFT; creating a complex number requires two real floating point operations. Thus, the total number of computation steps required by FTDNNs becomes:

$\sigma = (2q+1)(5N\log_2 N) + 6qN + q(N-n) + qN + N$   (9)

which can be reformulated as:

$\sigma = (2q+1)(5N\log_2 N) + q(8N-n) + N$   (10)

6- Using a sliding window of size 1xn over the same matrix of 1xN elements, q(2n-1)(N-n+1) computation steps are required when using CTDNNs for certain attack detection or processing (n) input data. The theoretical speed up factor η can be evaluated as follows:

$\eta = \frac{q(2n-1)(N-n+1)}{(2q+1)(5N\log_2 N) + q(8N-n) + N}$   (11)

CTDNNs and FTDNNs are shown in Figures 6 and 7 respectively. Time delay neural networks accept serial input data of fixed size (n); therefore, the number of input neurons equals (n). Instead of treating (n) inputs at a time, the proposed new approach is to collect all the incoming data together in a long vector (for example 100xn). The input data is then tested by the time delay neural network as a single pattern of length L (L=100xn). Such a test is performed in the frequency domain as described above. The combined attack in the incoming data may have real or complex values in the form of a one or two dimensional array.
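Equations (10) and (11) can be turned into a small calculator comparing the two operation counts. The parameter values below (q hidden neurons, window length n, input length N) are illustrative choices, not values taken from the paper's experiments.

```python
# Operation counts from Eqs. (10)-(11): sigma for FTDNNs versus
# the direct CTDNN count q(2n-1)(N-n+1), and their ratio eta.
import math

def ftdnn_steps(q, n, N):
    """Eq. (10): sigma = (2q+1)(5N log2 N) + q(8N-n) + N."""
    return (2 * q + 1) * 5 * N * math.log2(N) + q * (8 * N - n) + N

def ctdnn_steps(q, n, N):
    """Direct sliding-window count for real weights and inputs."""
    return q * (2 * n - 1) * (N - n + 1)

def eta(q, n, N):
    """Eq. (11): theoretical speed up factor."""
    return ctdnn_steps(q, n, N) / ftdnn_steps(q, n, N)

# illustrative sizes: 30 hidden neurons, n=400 window, N=10000 input
ratio = eta(q=30, n=400, N=10000)    # comes out greater than 1
```

The ratio grows with the window length n, since the CTDNN count scales with n while the FTDNN count depends on n only through the small q(8N-n) term.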
Complex-valued neural networks have many applications in fields dealing with complex numbers, such as telecommunications, speech recognition and image processing with the Fourier transform [6,10]. In complex-valued neural networks, the inputs, weights, thresholds and the activation function all have complex values. In this section, formulas for the speed up ratio with different types of inputs (real/complex) are presented, and the speed up ratio for one and two dimensional incoming input matrices is derived. The operation of FTDNNs depends on computing the Fast Fourier Transform of both the input and weight matrices. After dot multiplication of the resulting two matrices in the frequency domain, the inverse Fast Fourier Transform of the final matrix is determined. Here, there is an excellent advantage of FTDNNs that should be mentioned: the Fast Fourier Transform already deals with complex numbers, so there is no change in the number of computation steps required by FTDNNs. Therefore, the speed up ratio in the case of complex-valued time delay neural networks can be evaluated as follows.

1) In case of real inputs
A) For a one dimensional input matrix
Multiplication of (n) complex-valued weights by (n) real inputs requires (2n) real operations. This produces (n) real numbers and (n) imaginary numbers. The addition of these numbers requires (2n-2) real operations. The multiplication and addition operations are repeated (N-n+1) times for all possible sub-matrices in the incoming input matrix. In addition, all of these procedures are repeated at each neuron in the hidden layer. Therefore, the number of computation steps required by conventional neural networks can be calculated as:

$\theta = 2q(2n-1)(N-n+1)$   (12)

The speed up ratio in this case can be computed as follows:

$\eta = \frac{2q(2n-1)(N-n+1)}{(2q+1)(5N\log_2 N) + q(8N-n) + N}$   (13)

The practical speed up ratio for searching short successive (n) data in a long input vector (L) using complex-valued time delay neural networks
is shown in Figure 8. This was performed using a 700 MHz processor and MATLAB.

B) For a two dimensional input matrix
Multiplication of (n^2) complex-valued weights by (n^2) real inputs requires (2n^2) real operations. This produces (n^2) real numbers and (n^2) imaginary numbers. The addition of these numbers requires (2n^2-2) real operations. The multiplication and addition operations are repeated (N-n+1)^2 times for all possible sub-matrices in the incoming input matrix. In addition, all of these procedures are repeated at each neuron in the hidden layer. Therefore, the number of computation steps required by conventional neural networks can be calculated as:

$\theta = 2q(2n^2-1)(N-n+1)^2$   (14)

The speed up ratio in this case can be computed as follows:

$\eta = \frac{2q(2n^2-1)(N-n+1)^2}{(2q+1)(5N^2\log_2 N^2) + q(8N^2-n^2) + N^2}$   (15)

The practical speed up ratio for detecting an (nxn) real-valued sub-matrix in a large real-valued matrix (NxN) using complex-valued time delay neural networks is shown in Fig. 9. This was performed using a 700 MHz processor and MATLAB.

2) In case of complex inputs
A) For a one dimensional input matrix
Multiplication of (n) complex-valued weights by (n) complex inputs requires (6n) real operations. This produces (n) real numbers and (n) imaginary numbers. The addition of these numbers requires (2n-2) real operations. Therefore, the number of computation steps required by conventional neural networks can be calculated as:

$\theta = 2q(4n-1)(N-n+1)$   (16)

The speed up ratio in this case can be computed as follows:

$\eta = \frac{2q(4n-1)(N-n+1)}{(2q+1)(5N\log_2 N) + q(8N-n) + N}$   (17)

The practical speed up ratio for searching short successive complex (n) data in a long complex-valued input vector (L) using complex-valued time delay neural networks is shown in Fig. 10. This was performed using a 700 MHz processor and MATLAB.

B) For a two dimensional input matrix
Multiplication of (n^2) complex-valued weights by (n^2) complex inputs requires (6n^2) real operations. This produces (n^2) real numbers and (n^2) imaginary numbers.
The addition of these numbers requires (2n^2-2) real operations. Therefore, the number of computation steps required by conventional neural networks can be calculated as:

$\theta = 2q(4n^2-1)(N-n+1)^2$   (18)

The speed up ratio in this case can be computed as follows:

$\eta = \frac{2q(4n^2-1)(N-n+1)^2}{(2q+1)(5N^2\log_2 N^2) + q(8N^2-n^2) + N^2}$   (19)

The practical speed up ratio for detecting an (nxn) complex-valued sub-matrix in a large complex-valued matrix (NxN) using complex-valued neural networks is shown in Fig. 11. This was performed using a 700 MHz processor and MATLAB.

An interesting point is that the memory capacity is reduced when using FTDNNs, because the number of variables is reduced compared with CTDNNs. The neural algorithm presented here can be inserted very easily in any anti-attack gateway software.

6 Conclusion
The AQR protocol is implemented using ANTS and the performance of the AQR algorithm is analyzed. It is inferred that, with background traffic, AQR finds alternative paths quickly. It reduces the delay and subsequently reduces the jitter involved. The variant of AQR using the product of available bit rate and delay for finding the best path is useful in reducing jitter when the overall traffic in the network is heavy, and helps to maintain the jitter in networks with more bursty traffic. It is also inferred that the performance of AQR is better than that of SPR in both cases. Performance is analyzed for various traffic classes. Probe capsules are sent along each path from the source to the destination to find the best path. By tuning the probing frequency to an optimal value, the traffic caused by the probe
capsules can be reduced. Compared with the data traffic, probe capsules are small in number. The topology should be known by all nodes to find the best path. In this paper, a dynamic topology is considered and flooding is used to find the best path. This finds application in finding effective QoS routes in ad-hoc networks as well as in defending against DDoS attacks by identifying the attack traffic path using QoS regulations. The flooding overhead involved is unavoidable, and some pay-off measures need to be identified. It is also proposed to choose the best path based on error rate so that the loss of probe capsules can also be taken into consideration.

A new approach for fast attack detection has been presented. The strategy has been realized by using a new design for time delay neural networks. Theoretical computations have shown that FTDNNs require fewer computation steps than conventional ones. This has been achieved by applying cross correlation in the frequency domain between the incoming serial data and the input weights of the time delay neural networks. Simulation results using MATLAB have confirmed this proof. Furthermore, the memory complexity has been reduced when using the fast neural algorithm. In addition, this algorithm can be combined into any anti-attack gateway software.

References:
[1] H. M. El-Bakry, and Q. Zhao, "A Modified Cross Correlation in the Frequency Domain for Fast Pattern Detection Using Neural Networks," International Journal of Signal Processing, vol. 1, no. 3, pp. 188-194, 2004.
[2] H. M. El-Bakry, and Q. Zhao, "Fast Object/Face Detection Using Neural Networks and Fast Fourier Transform," International Journal of Signal Processing, vol. 1, no. 3, pp. 182-187, 2004.
[3] H. M. El-Bakry, and Q. Zhao, "Fast Pattern Detection Using Normalized Neural Networks and Cross Correlation in the Frequency Domain," EURASIP Journal on Applied Signal Processing, Special Issue on Advances in Intelligent Vision Systems: Methods and Applications Part I, vol. 2005, no. 13, 1 August 2005, pp. 2054-2060.
[4] H. M. El-Bakry, and Q.
Zhao, "A Fast Neural Algorithm for Serial Code Detection in a Stream of Sequential Data," International Journal of Information Technology, vol. 2, no. 1, pp. 71-90, 2005.
[5] H. M. El-Bakry, and H. Stoyan, "FNNs for Code Detection in Sequential Data Using Neural Networks for Communication Applications," Proc. of the First International Conference on Cybernetics and Information Technologies, Systems and Applications: CITSA 2004, 21-25 July 2004, Orlando, Florida, USA, vol. IV, pp. 150-153.
[6] A. Hirose, Complex-Valued Neural Networks: Theories and Applications, Series on Innovative Intelligence, vol. 5, Nov. 2003.
[7] H. M. El-Bakry, "Face Detection Using Fast Neural Networks and Image Decomposition," Neurocomputing Journal, vol. 48, 2002, pp. 1039-1046.
[8] H. M. El-Bakry, "Human Iris Detection Using Fast Cooperative Modular Neural Nets and Image Decomposition," Machine Graphics & Vision Journal (MG&V), vol. 11, no. 4, 2002, pp. 498-512.
[9] H. M. El-Bakry, "Automatic Human Face Recognition Using Modular Neural Networks," Machine Graphics & Vision Journal (MG&V), vol. 10, no. 1, 2001, pp. 47-73.
[10] S. Jankowski, A. Lozowski, and M. Zurada, "Complex-valued Multistate Neural Associative Memory," IEEE Trans. on Neural Networks, vol. 7, 1996, pp. 1491-1496.
[11] H. M. El-Bakry, and Q. Zhao, "Fast Pattern Detection Using Neural Networks Realized in Frequency Domain," Proc. of the International Conference on Pattern Recognition and Computer Vision, The Second World Enformatika Congress WEC'05, Istanbul, Turkey, 25-27 Feb. 2005, pp. 89-92.
[12] H. M. El-Bakry, and Q. Zhao, "Sub-Image Detection Using Fast Neural Processors and Image Decomposition," Proc. of the International Conference on Pattern Recognition and Computer Vision, The Second World Enformatika Congress WEC'05, Istanbul, Turkey, 25-27 Feb. 2005, pp. 85-88.
[13] J. W. Cooley, and J. W. Tukey, "An Algorithm for the Machine Calculation of Complex Fourier Series," Math. Comput., vol. 19, 1965, pp. 297-301.
[14] R. Klette, and Zamperoni, Handbook of Image Processing Operators, John Wiley & Sons Ltd, 1996.
[15] Michael Welzl, Alfred Chal, and Max Mühlhäuser, "An Approach to Flexible QoS Routing with Active Networks," Proceedings of the Fourth Annual International Workshop on Active Middleware Services (AMS 02), 2002.
[16] Wang Jianxin, Wang Weiping, Chen Jian'er, and Chen Songqiao, "A Randomized QoS Routing Algorithm on Networks with Inaccurate Link-State Information," Proc. of the 16th World Computer Congress, International Conference on Communication Technology, Beijing, Aug. 2000, pp. 1617-1622.
[17] Stavros Vrontis, Irene Sygkouna, Maria Chantzara, and Eystathios Sykas, "Enabling Distributed QoS Management Utilizing Active Network Technology," National Technical University of Athens.
[18] David J. Wetherall, John V. Guttag, and David L. Tennenhouse, "ANTS: A Toolkit for Building and Dynamically Deploying Network Protocols," Software Devices and Systems Group, Laboratory for Computer Science, Massachusetts Institute of Technology, April 1998.
[19] Juraj Sucík, and František Jakab, "Measurement and Evaluation of Quality of Service Parameters in Computer Networks," Department of Computers and Informatics, Technical University of Košice, Letná 9, 041 20 Košice, Slovak Republic.
[20] Roch A. Guérin, and Ariel Orda, "QoS Routing in Networks with Inaccurate Information: Theory and Algorithms," IEEE/ACM Transactions on Networking (TON), vol. 7, no. 3, pp. 350-364, June 1999.
[21] Dean H. Lorenz, and Ariel Orda, "QoS Routing in Networks with Uncertain Parameters," IEEE/ACM Transactions on Networking (TON), vol. 6, no. 6, pp. 768-778, Dec. 1998.
[22] B. Rajagopalan, and H. Sandick, "A Framework for QoS-based Routing in the Internet," RFC 2386, IETF, Aug. 1998.
[23] M. Reisslein, K. W. Ross, and S. Rajagopal, "Guaranteeing Statistical QoS to Regulated Traffic: The Single Node Case," Proceedings of IEEE INFOCOM '99, New York, March 1999, pp. 1061-1062.
[24] M. Reisslein, K. W. Ross, and S. Rajagopal,
"Guaranteeing Statistical QoS to Regulated Traffic: The Multiple Node Case," Proceedings of the 37th IEEE Conference on Decision and Control (CDC), Tampa, December 1998, pp. 531-531.
[25] David Wetherall, Ulana Legedza, and John Guttag, "Introducing New Internet Services: Why and How," Software Devices and Systems Group, Laboratory for Computer Science, Massachusetts Institute of Technology, July 1998.
[26] P. Hurley, J.-Y. Le Boudec, P. Thiran, and M. Kara, "ABE: Providing a Low-Delay Service within Best Effort," IEEE Network Magazine, vol. 15, no. 3, May/June 2001.

Table 1. Performance data without background traffic.
                   SPR    AQR
Max. delay (ms)    8364   7006
Min. delay (ms)    5554   5554
Avg. delay (ms)    5760   5584
Max. jitter (ms)   2810   1452

Table 2. Performance data with background traffic type 1.
                   SPR    AQR
Max. delay (ms)    1825   1827
Min. delay (ms)    1806   1806
Avg. delay (ms)    1808   1806
Max. jitter (ms)   19     21
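A small consistency check on Tables 1 and 2: in every column the reported maximum jitter equals the difference between the maximum and minimum delay, which suggests jitter is reported as the delay spread. The interpretation, not the table numbers, is our assumption:

```python
# Check that max. jitter = max delay - min delay holds for every
# column of Tables 1 and 2 (delay-spread reading of "jitter").
tables = {
    "SPR/no-bg": dict(max_d=8364, min_d=5554, jitter=2810),
    "AQR/no-bg": dict(max_d=7006, min_d=5554, jitter=1452),
    "SPR/bg-1":  dict(max_d=1825, min_d=1806, jitter=19),
    "AQR/bg-1":  dict(max_d=1827, min_d=1806, jitter=21),
}
spread_matches = all(r["max_d"] - r["min_d"] == r["jitter"]
                     for r in tables.values())
```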
Fig. 1. Data flow diagram (blocks: active QoS routing; QoS improvement; behavior in split traffic; comparison with shortest path routing; behavior under tolerance parameter).
Fig. 2. Sample network topology.
Fig. 3. Structure of AQR protocol.
Fig. 4. Delay response without background traffic.
Fig. 5. Delay response with background traffic type 1.
Fig. 6. Classical time delay neural networks: dot multiplication in the time domain between the (n) input data and the weights of the hidden layer; serial input data 1:N is presented in groups of (n) elements, shifted by a step of one element each time.
Fig. 7. Fast time delay neural networks: cross correlation in the frequency domain between the total (N) input data and the weights of the hidden layer.
Fig. 8. Practical speed up ratio for time delay neural networks in the case of a one dimensional real-valued input matrix and complex-valued weights (plotted for n=400, 625 and 900 against the length of the input matrix).
Fig. 9. Practical speed up ratio when using FTDNNs in the case of a two dimensional real-valued input matrix and complex-valued weights (plotted for n=20, 25 and 30 against the size of the input matrix).
Fig. 10. Practical speed up ratio when using FTDNNs in the case of a one dimensional complex-valued input matrix and complex-valued weights (plotted for n=400, 625 and 900 against the length of the input matrix).
Fig. 11. Practical speed up ratio when using FTDNNs in the case of a two dimensional complex-valued input matrix and complex-valued weights (plotted for n=20, 25 and 30 against the size of the input matrix).