Dynamic intelligent cleaning model of dirty electric load data
Energy Conversion and Management 49 (2008)

Dynamic intelligent cleaning model of dirty electric load data

Zhang Xiaoxing a,*, Sun Caixin b

a State Key Laboratory of Power Transmission Equipment & System Security and New Technology, Chongqing University, Chongqing, China
b The Key Laboratory of High Voltage Engineering and Electrical New Technology, Ministry of Education, Electrical Engineering College of Chongqing University, Chongqing, PR China

Received 13 January 2006; received in revised form 16 April 2006; accepted 19 August 2007. Available online 25 October 2007.

Abstract

The load database derived from the supervisory control and data acquisition (SCADA) system contains a certain amount of dirty data, which must be carefully and reasonably adjusted before the data are used for electric load forecasting or power system analysis. This paper proposes a dynamic and intelligent data cleaning model based on data mining theory. First, the Kohonen clustering network is improved on the basis of fuzzy soft clustering so that it performs the parallel calculation of fuzzy c-means soft clustering. Second, the proposed dynamic algorithm automatically finds new clustering centers (the characteristic curves of the data) as the sample data are updated. Finally, the clustering network is combined with a radial basis function neural network (RBFNN) to form an intelligent adjusting model that identifies the dirty data. The rapid and dynamic performance of the model makes it suitable for real-time calculation, and its efficiency and accuracy are confirmed by test results on electrical load data from Chongqing.

© 2007 Elsevier Ltd. All rights reserved.

Keywords: Dirty data; Data mining; Kohonen clustering network; RBF neural network; Dynamic adjusting

1. Introduction

High accuracy of load forecasting for power systems improves the security of the power system and reduces generation costs.
Load forecasting is closely related to power system operations such as dispatch scheduling, preventive maintenance planning for generators and reliability evaluation of power systems. In addition, accurately estimated loads are key data for electric power price forecasting in electric power markets. Many studies on load forecasting have sought to improve prediction accuracy using various conventional methods such as regression models, expert systems, artificial neural networks, fuzzy inference and hybrid algorithms [1-7]. Because of transmission errors in the information channel, as well as faults of the remote terminal unit (RTU), the load data derived from the supervisory control and data acquisition (SCADA) system contain some dirty data. (* Corresponding author. E-mail address: mikezxx@tom.com (X. Zhang).) Direct use of these load data may degrade the accuracy of load forecasting, so it is necessary to identify and adjust the dirty data, which is an important step of data mining [8]. Various methods have been proposed to identify and adjust dirty data, but no systematic method yet solves this problem effectively in all respects. Sequential probabilistic ratio analysis has been used as an outlier detection tool for stationary time series [9], but this method requires information about the data set parameters, such as the data distribution, which is unknown in many cases. Learning vector quantization (LVQ) has been used to remove dirty data in Ref. [10]. This method regards the data as an array of vectors: if one element in a vector is dirty, the whole vector is eliminated. Because it cannot identify the exact location of the dirty data, a great deal of useful information is lost at the same time. In this paper, a dynamic and intelligent model with three layers, based on data mining theory, is proposed.
The first layer extracts the characteristic curve from the load using a Kohonen clustering network improved by the fuzzy soft clustering algorithm. In the second layer, a radial basis function neural network (RBFNN) is used to construct a pattern classifier that identifies dirty data. In the third layer, the value of each dirty datum is replaced by the weighted sum of the corresponding values, at the same position, in the two characteristic curves with the largest membership grades. As the sample data are updated, the proposed dynamic clustering algorithm automatically searches for new vectors, namely new characteristic curves. This model remedies the deficiencies of the methods cited above and offers high accuracy, real-time operation and dynamic behavior. The efficiency and accuracy of the model are confirmed by test results on electrical load data from Chongqing.

2. Principle and structure of the intelligent adjusting model of dirty data

Similarity and smoothness are two important characteristics of electrical load curves. The several peak times in a daily curve are generally the same from day to day, and neighboring points usually vary little, whereas dirty data obviously destroy the smoothness. The similarity, however, remains unchanged because the amount of dirty data is small. Therefore, characteristic patterns can be extracted, using the clustering algorithms of data mining theory, from many load curves that may contain dirty data; a classification algorithm then separates each load curve against the characteristic curves, and the dirty data are finally recognized. The structure of the model is shown in Fig. 1. The first layer is a kind of improved Kohonen network (FKCN, fuzzy Kohonen clustering network). The curve under inspection, x_j, is the input of the FKCN.
If the characteristic curve corresponding to a neuron has the greatest similarity to x_j, that neuron outputs 1 and excites the corresponding RBF sub-network. The second layer is an RBF sub-network attached to each clustering center. After training, it identifies the dirty data and locates them accurately: if an output cell of the RBF is close or equal to 1, the corresponding input cell represents dirty data. The third layer adjusts the dirty data. The detailed principles of the model layers are described as follows.

2.1. Load data clustering (the first layer)

Data clustering is used to extract the characteristic curve from the load. Clustering algorithms assess the interaction among patterns by organizing the patterns into clusters so that patterns within a cluster are more similar to each other than to patterns belonging to different clusters. Neural networks such as Kohonen clustering networks (KCNs) have been successfully applied to pattern recognition and clustering [11-14]. One advantage of this approach is that it needs no prior knowledge of the number of clusters present in the data set. However, KCNs suffer from several major problems [15]. First, KCNs are heuristic procedures, so termination is not based on optimizing any model of the process or its data. Second, the final weight vectors usually depend on the input sequence. Third, different initial conditions usually yield different results. Fourth, several parameters of the KCN algorithm, such as the learning rate, the size of the update neighborhood and the strategy for altering these two parameters during learning, must be varied from one data set to another to achieve useful results. A fuzzy Kohonen clustering network (FKCN) model has been proposed by Bezdek in Ref. [15].
This method overcomes some of the difficulties described above by combining the best features of the self-organizing structure of KCNs and the fuzzy clustering model of the FCM. In this paper, the FKCN algorithm is employed to cluster the load data.

Fig. 1. The intelligent adjusting model of dirty data. (First layer: the FKCN network receives the normalized data to be cleaned; second layer: one RBF sub-network per clustering center; third layer: adjustment of the identified dirty data.)
2.1.1. Kohonen clustering networks (KCNs)

The Kohonen model is a neural network that simulates the hypothesized self-organization process carried out in the human brain when input data are presented [11]. The network is composed of two layers: an input layer formed by a set of units (one for each feature of the input) and an output layer formed by units, or neurons, arranged in a two-dimensional grid. Each neuron has a vector of coefficients associated with it, which can be interpreted as weights attached to the edges that connect the p input nodes to the c output nodes. The aggregate of the c weight vectors (the network weight vector v_i) is adjusted during learning. Given an input vector, the neurons in the output layer compete among themselves, and the winner (the neuron whose weight has the minimum distance from the input) updates its weights and those of a predefined set of neighbors. The process continues until the weight vectors stabilize. In this method, a learning rate that decreases with time must be defined in order to force termination; the update neighborhood must also be defined and is likewise reduced with time. The details of the KCN algorithm can be found in Ref. [11].

2.1.2. Fuzzy c-means algorithm (FCM)

Fuzzy c-means clustering [16-18] groups similar objects into the same class, but the resulting partition is fuzzy: the patterns are not assigned exclusively to a single class, but partially to all classes. The goal is to optimize the clustering criterion so as to achieve high intra-cluster similarity and low inter-cluster similarity using p-dimensional feature vectors. The theoretical basis of the method is only briefly reviewed here. Let X = {x_1, x_2, ..., x_n} denote a data set in which each element is a vector of dimension p; the data set X is to be partitioned into c fuzzy clusters.
A c-partition of X can be represented by u_{ik}, where u_{ik} takes values in the interval [0, 1] and represents the membership of x_k in cluster i, 1 ≤ i ≤ c, 1 ≤ k ≤ n. In general, [u_{ik}] can be written as a c × n matrix U satisfying the condition

\sum_{i=1}^{c} u_{ik} = 1    (1)

The fuzzy c-means algorithm consists of an iterative optimization of the objective function

J_m(U, v) = \sum_{k=1}^{n} \sum_{i=1}^{c} (u_{ik})^m D_{ik}    (2)

where the parameter m ∈ (1, ∞) determines the fuzziness of the partition (in this paper, m = 2.0), v = {v_1, v_2, ..., v_c} with v_i the cluster center of class i, and

D_{ik} = (d_{ik})^2 = \|x_k - v_i\|_A^2    (3)

is the distance in the A norm from x_k to v_i (A is any positive definite p × p matrix). For a given partition, the cluster centers are calculated as

v_i = \frac{\sum_{k=1}^{n} (u_{ik})^m x_k}{\sum_{k=1}^{n} (u_{ik})^m}    (4)

and a new partition is obtained as

u_{ik} = \left[ \sum_{j=1}^{c} (d_{ik}/d_{jk})^{2/(m-1)} \right]^{-1}    (5)

The iterative optimization of the objective function continues until a stopping criterion is met, usually when the distance between U matrices at successive iterations falls below a threshold, that is,

E_t = \|U_t - U_{t-1}\| < \epsilon    (6)

FCM is a gradual optimization process with slow convergence.

2.1.3. FKCN

The fuzzy Kohonen clustering network [15] is a neural network that combines the two methods described above, KCN and FCM. The structure of this self-organizing network consists of two layers, input and output. The input layer is composed of n nodes, where n is the number of features, while the output layer is formed by c nodes, where c is the number of clusters to be found. Every input node is fully connected to all output nodes, with an adjustable weight v_i assigned to each connection. Given an input vector, the neurons in the output layer update their weights according to a pre-defined learning rate a.
This approach integrates the fuzzy membership u_{ik} of the FCM into the following update rule:

v_{i,t} = v_{i,t-1} + a_{ik,t} (x_k - v_{i,t-1})    (7)

where the learning rate a is defined as

a_{ik,t} = (u_{ik,t})^{m_t}    (8)

m_t = m_0 - (m_0 - 1) t / T    (9)

Here m_0 is any constant greater than one, t is the current iteration and T is the iteration limit. The steps of the algorithm are:

Step 1: Fix c, and set ε to a small positive constant.
Step 2: Initialize the weight vectors (cluster centers) v_0 = {v_{1,0}, v_{2,0}, ..., v_{c,0}}. Choose m_0 > 1 and the maximal number of iterative steps T.
Step 3: For t = 1, 2, ..., T:
(a) Compute all learning rates a_{ik,t} as defined in Eq. (8).
(b) Update all weight vectors v_{i,t} with

v_{i,t} = v_{i,t-1} + \frac{\sum_{k=1}^{n} a_{ik,t} (x_k - v_{i,t-1})}{\sum_{s=1}^{n} a_{is,t}}    (10)

(c) Compute E_t for the stopping criterion. If E_t < ε then stop; otherwise proceed to the next t.
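As a concrete illustration, the FKCN iteration above can be sketched in Python. This is a sketch under stated assumptions, not the authors' code: batch updates, the Euclidean norm for A, and a small numerical floor on m_t so that Eq. (5) stays well defined as m_t approaches 1; the function and parameter names are our own.

```python
import numpy as np

def fkcn(X, c, m0=2.0, T=100, eps=1e-5, V0=None, rng=None):
    """Batch FKCN sketch: FCM memberships (Eq. (5)) drive the
    Kohonen-style center update of Eq. (10), with the fuzziness
    exponent m_t annealed according to Eq. (9).

    X : (n, p) data matrix, one curve per row; c : number of clusters.
    Returns the centers V (c, p) and the membership matrix U (c, n).
    """
    rng = np.random.default_rng(rng)
    n, p = X.shape
    # Step 2: initialize the weight vectors (cluster centers).
    V = (X[rng.choice(n, size=c, replace=False)].astype(float)
         if V0 is None else np.asarray(V0, dtype=float).copy())
    U_prev = np.zeros((c, n))
    for t in range(1, T + 1):
        m_t = max(m0 - (m0 - 1.0) * t / T, 1.05)   # Eq. (9), floored for stability
        # Squared Euclidean distances d_ik^2, shape (c, n), floored to avoid 0/0.
        d2 = np.maximum(((X[None] - V[:, None]) ** 2).sum(-1), 1e-12)
        # Step 3(a): FCM membership of x_k in cluster i, Eq. (5), fuzziness m_t.
        U = 1.0 / ((d2[:, None] / d2[None, :]) ** (1.0 / (m_t - 1.0))).sum(axis=1)
        A = U ** m_t                                # learning rates a_ik, Eq. (8)
        # Step 3(b): batch update of every center, Eq. (10).
        V += (A[..., None] * (X[None] - V[:, None])).sum(axis=1) / A.sum(axis=1)[:, None]
        # Step 3(c): stopping criterion E_t < eps, Eq. (6).
        if np.abs(U - U_prev).max() < eps:
            break
        U_prev = U
    return V, U
```

With c = 2 and two well separated groups of curves, the returned rows of V approach the group means, i.e. the characteristic curves.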
2.2. Dynamic soft clustering by using SFKN

The sample data form a time sequence and should be updated dynamically as time elapses, so dirty data adjustment is also a dynamic process. In this paper, a detection threshold u_0 is introduced, and the algorithm proceeds as follows:

Step 1: Initialize the dynamic detection threshold u_0.
Step 2: Introduce x_j^+ and x_j^-, where x_j^+ denotes sample data newly added to the data set and x_j^- denotes eliminated sample data. The current sample data can be expressed as X = {{x_j} + {x_j^+} - {x_j^-}}.
Step 3: Calculate u_{i(j-j^-)}, the membership grade of the remaining data x_j - x_j^- toward the clustering center vector v_i. Set u_{i(j-j^-)} = max{u_{i(j-j^-)}}. If u_{i(j-j^-)} < u_0, eliminate v_i, update c = c - 1, and set to 0 all weights whose related nodes connect with this node in the FKCN; if u_{i(j-j^-)} ≥ u_0, retain v_i.
Step 4: Calculate u_{ij^+}, the membership grade of x_j^+ toward each clustering center v_i. Set u_{ij^+} = max{u_{ij^+}}. If u_{ij^+} < u_0, continue with Step 5; if u_{ij^+} ≥ u_0, the algorithm finishes.
Step 5: Introduce a new clustering center v_i^+ with initial value x_j^+, set c = c + 1 and keep the other clustering centers unchanged; then take x_j^+ as the input of the FKCN and calculate the new clustering centers according to the FKCN algorithm.

In Step 5, most of the clustering centers remain unchanged despite the small variation in the network structure.
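The dynamic center maintenance of Steps 1-5 can be sketched as follows. This is a sketch, not the authors' code: plain Euclidean FCM memberships with m = 2 are assumed, the FKCN weight bookkeeping of Step 3 is omitted, and a new center is simply seeded at the unmatched curve, leaving out the refining FKCN pass of Step 5.

```python
import numpy as np

def dynamic_update(V, X_keep, X_new, u0, m=2.0):
    """Sketch of the dynamic clustering-center update.

    V      : (c, p) current clustering centers
    X_keep : (n, p) retained sample curves ({x_j} minus {x_j^-})
    X_new  : (q, p) newly added sample curves {x_j^+}
    u0     : detection threshold on the maximal membership grade
    """
    def memberships(X, V):
        # FCM membership (Eq. (5)) of each curve toward each center.
        d2 = np.maximum(((X[:, None] - V[None]) ** 2).sum(-1), 1e-12)  # (n, c)
        inv = d2 ** (-1.0 / (m - 1.0))
        return inv / inv.sum(axis=1, keepdims=True)

    # Step 3: a center survives only if some retained curve still has a
    # membership grade >= u0 in it; otherwise it is eliminated (c = c - 1).
    V = V[memberships(X_keep, V).max(axis=0) >= u0]
    # Steps 4-5: a new curve whose best membership is below u0 seeds a
    # new clustering center (c = c + 1); otherwise nothing changes.
    for x in X_new:
        if memberships(x[None], V).max() < u0:
            V = np.vstack([V, x])
    return V
```

Because most centers survive unchanged, the parameters of the old network can be reused when the FKCN is retrained, so the refinement converges quickly.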
At the same time, the membership grades of the original data toward these cluster centers also remain unchanged, so the parameters of the original network can be reused in the new network, which therefore converges quickly.

2.3. Pattern classifying of dirty data (the second layer)

In the second layer of the model, an RBF network is used to construct a pattern classifier for dirty data because of its fast convergence and strong classification ability.

2.3.1. Radial basis function network

The radial basis function (RBF) network [18-21] is a three-layer neural network comprising input, hidden and output layers. The input layer connections are not weighted, so each hidden node receives each input value unaltered. The hidden nodes are the radial basis function units; their transfer function is non-monotonic, in contrast to the monotonic sigmoid of back-propagation networks. The output nodes are simple summations. The transfer function of the hidden layer in the RBF network is often a Gaussian function:

a_i = \exp(-\|x - v_i\|^2 / r_i^2)    (11)

where a_i is the activation of the ith node in the hidden layer, x ∈ R^n is an input vector, v_i is the center vector of the ith node, r_i is the bandwidth of the ith node, and \|\cdot\| denotes the Euclidean norm. The output y_j of the network is given by

y_j = \sum_{i=1}^{m} w_{ji} a_i    (12)

where w_{ji} is the connection weight between the hidden layer and the output layer, and m denotes the number of nodes in the hidden layer.

2.3.2. The dirty data locating algorithm

Each clustering center from the FKCN corresponds to one RBFNN, and the value of the clustering center is selected as the center of the Gaussian function of each RBF.
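The forward pass defined by Eqs. (11) and (12) is compact enough to sketch directly (the squared-norm Gaussian form is assumed, and the names below are illustrative, not from the paper):

```python
import numpy as np

def rbf_forward(x, centers, widths, W):
    """RBF network forward pass.

    x       : (p,) input load curve
    centers : (m, p) Gaussian center vectors v_i (here, FKCN cluster centers)
    widths  : (m,) bandwidths r_i
    W       : (q, m) hidden-to-output weights w_ji
    Returns the (q,) output vector y.
    """
    # Hidden activations, Eq. (11): a_i = exp(-||x - v_i||^2 / r_i^2)
    a = np.exp(-((x[None, :] - centers) ** 2).sum(axis=1) / widths ** 2)
    # Linear output layer, Eq. (12): y_j = sum_i w_ji a_i
    return W @ a
```

In the classifier of this layer, one such network is trained per clustering center, and an output close to ±1 flags the corresponding load point as dirty.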
Each RBF's input layer has 96 nodes (corresponding to the 96 load points per day), and the output layer also has 96 nodes. Suppose that only a single dirty datum is present at a time and the rest of the data are normal; with 96 sampling points, the number of dirty data patterns is 96 × 2 = 192. The input and output sample data sets are created as follows:

Step 1: Choose clustering center v_i as the input of the RBFNN, that is, x_0 = v_i, with corresponding output y_0 = (0, 0, ..., 0).
Step 2: Give the first element of v_i a deviation, x_1 = (v_i(1) + e, v_i(2), ..., v_i(p)); this produces a sample containing dirty data, with output y_1 = (1, 0, ..., 0). Give the second element of v_i a deviation, x_2 = (v_i(1), v_i(2) + e, ..., v_i(p)); this produces another sample containing dirty data, with output y_2 = (0, 1, ..., 0). Continue this operation for the remaining elements of v_i to obtain a sample data set with positive deviation.
Step 3: Change the deviation e to -e and replace 1 in the output vectors by -1. Repeat Step 2 to obtain a sample data set with negative deviation.

The trained network can identify and locate dirty data accurately no matter how the dirty data appear in the curve, whether as a single dirty element or as a series.

2.4. Recognition and adjustment of dirty data (the third layer)

The amendment of the dirty data is realized in the third layer. The value of each dirty datum located in the second layer is adjusted by replacing it with the weighted sum of the corresponding values, at the same position, in the two characteristic curves with the largest membership grades. If the sub-maximal membership grade is less than 0.2, then the
value of the characteristic curve with the maximal membership grade is used directly. For example, suppose dirty data exist in curve x_j from point t1 to t2, and v_{i1}, v_{i2} are the two clustering centers with the largest membership grades; then the amendment of the dirty data can be expressed as

x'_j(t) = \frac{u_{i1,j}}{u_{i1,j} + u_{i2,j}} v'_{i1}(t) + \frac{u_{i2,j}}{u_{i1,j} + u_{i2,j}} v'_{i2}(t)    (13)

v'_{i1}(t) = v_{i1}(t) \left[ \frac{x_j(t1-1)}{v_{i1}(t1-1)} + \frac{x_j(t2+1)}{v_{i1}(t2+1)} \right] / 2    (14)

v'_{i2}(t) = v_{i2}(t) \left[ \frac{x_j(t1-1)}{v_{i2}(t1-1)} + \frac{x_j(t2+1)}{v_{i2}(t2+1)} \right] / 2    (15)

where t ∈ [t1, t2].

3. The analysis of results

Data for workdays and weekends are fed into the FKCN separately, because these two kinds of load curves are obviously different. This separation reduces the amount of training calculation and the number of clustering centers, increases the calculation speed and improves the efficiency of the model. The following example is derived from electrical load data from April to September 2003 of the Jiangbei power supply bureau in Chongqing, China.

3.1. Normalization of load data

Similarity and smoothness of curves are the main considerations in this system. Since the varying amplitude of the curves influences their similarity, the load is normalized to eliminate this influence:

x'_L(i) = \frac{x_L(i)}{\sum_{i=1}^{96} x_L(i)}    (16)

3.2. Training results

The clustering center and the curves after normalization are shown in Fig. 2.

Fig. 2. Normalized curves and clustering center of one type of loads.

3.3. Adjusting results

Fig. 3. Identification of dirty data. 1: load curve; 2: clustering center with maximal membership grade; 3: clustering center with sub-maximal membership grade; 4: curve after adjustment.

Load data from October 2003 (outside the training set) are adjusted at random. Fig.
3 is a typical load curve with dirty data, where the amendment method and the corresponding results are clearly presented.

3.4. Comparison of accuracy between FKCN and KCN

To illustrate the advantages of the FKCN employed in this paper, the FKCN in the first layer of the model is replaced by a general Kohonen network. On the basis of daily load data from October 2003, some dirty data are added artificially. The results of the two methods are shown in Table 1, where one can see that the accuracy of the proposed method is higher than that of the KCN method.

Table 1. Comparison of the two amendment models. Columns: dirty data points; error before adjustment (%); general Kohonen (%); SFKN (%).

3.5. Dynamic updating algorithm

To verify the efficiency of this algorithm, dirty data of 5 days in December 2003 are first adjusted without the dynamic updating algorithm (the sample data are from April to September 2003). Then the model with the dynamic updating algorithm is used, and the sample data set is updated up to the day before the identification day. The results are shown in Table 2. The results of the dynamic updating algorithm are satisfactory, because its error is smaller than that of the non-dynamic algorithm. The dynamic updating algorithm uses the latest adjusted clustering centers and vectors, which increases the membership grades between the load curves and the clustering centers. Consequently, the dynamic updating algorithm improves not only the accuracy of identification but also the adjustment precision of the dirty data.

Table 2. The result of random checks of load data. Columns: date no.; count of dirty data; non-dynamic algorithm (failed to judge, misjudged); dynamic updating algorithm (failed to judge, misjudged); total.

4. Conclusion

The analysis of examples shows that the FKCN algorithm improves the capability of Kohonen clustering networks and obtains the clustering centers more quickly and reasonably, overcoming the disadvantages of the Kohonen algorithm. The proposed dynamic updating algorithm adjusts the clustering centers automatically on the basis of newly added data, and the RBF networks identify the exact location of dirty data thanks to their strong pattern recognition ability. The dynamic intelligent adjusting model proposed in this paper can process data dynamically with higher accuracy and faster convergence.

References

[1] Rahman S, Bhatnagar R. An expert system based algorithm for short term load forecast. IEEE Trans Power Syst 1988;3(2).
[2] Mori H, Kobayashi H. Optimal fuzzy inference for short-term load forecasting. IEEE Trans Power Syst 1996;11(1).
[3] Song Kyung-Bin, Baek Young-Sik, Hun Hong Dug, Jang Gilsoo. Short-term load forecasting for the holidays using fuzzy linear regression method. IEEE Trans Power Syst 2005;20(1).
[4] Kim KH. Development of fuzzy expert system for short-term load forecasting on special days. IEEE Trans Power Syst 1998;47(7).
[5] Nazarko J, Zalewski W. The fuzzy regression approach to peak load estimation in power distribution systems. IEEE Trans Power Syst 1999;4.
[6] Charytoniuk W, Chen M-S. Very short-term load forecasting using artificial neural networks. IEEE Trans Power Syst 2000;15(1).
[7] Ling SH, Leung FHF, Lam HK, et al.
Short-term electric load forecasting based on a neural fuzzy network. IEEE Trans Ind Electron 2003;50(6).
[8] Fayyad UM et al., editors. Advances in knowledge discovery and data mining. AAAI Press/MIT Press.
[9] Cho Kokyo. Outlier detection for stationary time series. J Stat Plan Infer 2001.
[10] Karayiannis NB. An axiomatic approach to soft learning vector quantization and clustering. IEEE Trans Neural Networks 1999;10(5).
[11] Kohonen T. Self-organization and associative memory. 3rd ed. Berlin: Springer.
[12] Huntsberger T, Ajjimarangsee P. Parallel self-organization feature maps for unsupervised pattern recognition. Int J Gen Syst 1989.
[13] Hartigan J. Clustering algorithms. New York: Wiley.
[14] Dubes R, Jain A. Algorithms that cluster data. Englewood Cliffs: Prentice Hall.
[15] Tsao EC, Bezdek JC. Fuzzy Kohonen clustering networks. Pattern Recogn 1994;27(5).
[16] Dubois D, Prade H. Fuzzy sets and systems: theory and applications. New York: Academic Press.
[17] Bezdek JC. Pattern recognition with fuzzy objective function algorithms. New York: Plenum Press.
[18] Broomhead DS, Lowe D. Multivariable functional interpolation and adaptive networks. Complex Syst 1988;2.
[19] Moody TJ, Darken CJ. Fast learning in networks of locally tuned processing units. Neural Comput 1989;1.
[20] Bishop CM. Neural networks for pattern recognition. Oxford: Clarendon Press.
[21] Chen S, Cowan CFN, Grant PM. Orthogonal least squares learning algorithm for radial basis function networks. IEEE Trans Neural Networks 1991;2:302-9.
More informationThe Combination Forecasting Model of Auto Sales Based on Seasonal Index and RBF Neural Network
, pp.67-76 http://dx.doi.org/10.14257/ijdta.2016.9.1.06 The Combination Forecasting Model of Auto Sales Based on Seasonal Index and RBF Neural Network Lihua Yang and Baolin Li* School of Economics and
More informationSelf Organizing Maps: Fundamentals
Self Organizing Maps: Fundamentals Introduction to Neural Networks : Lecture 16 John A. Bullinaria, 2004 1. What is a Self Organizing Map? 2. Topographic Maps 3. Setting up a Self Organizing Map 4. Kohonen
More informationA Study of Web Log Analysis Using Clustering Techniques
A Study of Web Log Analysis Using Clustering Techniques Hemanshu Rana 1, Mayank Patel 2 Assistant Professor, Dept of CSE, M.G Institute of Technical Education, Gujarat India 1 Assistant Professor, Dept
More informationThe Research of Data Mining Based on Neural Networks
2011 International Conference on Computer Science and Information Technology (ICCSIT 2011) IPCSIT vol. 51 (2012) (2012) IACSIT Press, Singapore DOI: 10.7763/IPCSIT.2012.V51.09 The Research of Data Mining
More informationUsing Data Mining for Mobile Communication Clustering and Characterization
Using Data Mining for Mobile Communication Clustering and Characterization A. Bascacov *, C. Cernazanu ** and M. Marcu ** * Lasting Software, Timisoara, Romania ** Politehnica University of Timisoara/Computer
More informationA Study on the Comparison of Electricity Forecasting Models: Korea and China
Communications for Statistical Applications and Methods 2015, Vol. 22, No. 6, 675 683 DOI: http://dx.doi.org/10.5351/csam.2015.22.6.675 Print ISSN 2287-7843 / Online ISSN 2383-4757 A Study on the Comparison
More informationCustomer Relationship Management using Adaptive Resonance Theory
Customer Relationship Management using Adaptive Resonance Theory Manjari Anand M.Tech.Scholar Zubair Khan Associate Professor Ravi S. Shukla Associate Professor ABSTRACT CRM is a kind of implemented model
More informationHow To Solve The Cluster Algorithm
Cluster Algorithms Adriano Cruz adriano@nce.ufrj.br 28 de outubro de 2013 Adriano Cruz adriano@nce.ufrj.br () Cluster Algorithms 28 de outubro de 2013 1 / 80 Summary 1 K-Means Adriano Cruz adriano@nce.ufrj.br
More informationAdvanced Ensemble Strategies for Polynomial Models
Advanced Ensemble Strategies for Polynomial Models Pavel Kordík 1, Jan Černý 2 1 Dept. of Computer Science, Faculty of Information Technology, Czech Technical University in Prague, 2 Dept. of Computer
More informationViSOM A Novel Method for Multivariate Data Projection and Structure Visualization
IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 13, NO. 1, JANUARY 2002 237 ViSOM A Novel Method for Multivariate Data Projection and Structure Visualization Hujun Yin Abstract When used for visualization of
More informationTolerance of Radial Basis Functions against Stuck-At-Faults
Tolerance of Radial Basis Functions against Stuck-At-Faults Ralf Eickhoff 1 and Ulrich Rückert 1 Heinz Nixdorf Institute System and Circuit Technology University of Paderborn, Germany eickhoff,rueckert@hni.upb.de
More informationAn Analysis on Density Based Clustering of Multi Dimensional Spatial Data
An Analysis on Density Based Clustering of Multi Dimensional Spatial Data K. Mumtaz 1 Assistant Professor, Department of MCA Vivekanandha Institute of Information and Management Studies, Tiruchengode,
More informationAutomated Stellar Classification for Large Surveys with EKF and RBF Neural Networks
Chin. J. Astron. Astrophys. Vol. 5 (2005), No. 2, 203 210 (http:/www.chjaa.org) Chinese Journal of Astronomy and Astrophysics Automated Stellar Classification for Large Surveys with EKF and RBF Neural
More informationComparison of Supervised and Unsupervised Learning Classifiers for Travel Recommendations
Volume 3, No. 8, August 2012 Journal of Global Research in Computer Science REVIEW ARTICLE Available Online at www.jgrcs.info Comparison of Supervised and Unsupervised Learning Classifiers for Travel Recommendations
More informationAmerican International Journal of Research in Science, Technology, Engineering & Mathematics
American International Journal of Research in Science, Technology, Engineering & Mathematics Available online at http://www.iasir.net ISSN (Print): 2328-349, ISSN (Online): 2328-3580, ISSN (CD-ROM): 2328-3629
More informationA New Approach For Estimating Software Effort Using RBFN Network
IJCSNS International Journal of Computer Science and Network Security, VOL.8 No.7, July 008 37 A New Approach For Estimating Software Using RBFN Network Ch. Satyananda Reddy, P. Sankara Rao, KVSVN Raju,
More informationData Mining and Neural Networks in Stata
Data Mining and Neural Networks in Stata 2 nd Italian Stata Users Group Meeting Milano, 10 October 2005 Mario Lucchini e Maurizo Pisati Università di Milano-Bicocca mario.lucchini@unimib.it maurizio.pisati@unimib.it
More informationLarge-Scale Data Sets Clustering Based on MapReduce and Hadoop
Journal of Computational Information Systems 7: 16 (2011) 5956-5963 Available at http://www.jofcis.com Large-Scale Data Sets Clustering Based on MapReduce and Hadoop Ping ZHOU, Jingsheng LEI, Wenjun YE
More informationAn Overview of Knowledge Discovery Database and Data mining Techniques
An Overview of Knowledge Discovery Database and Data mining Techniques Priyadharsini.C 1, Dr. Antony Selvadoss Thanamani 2 M.Phil, Department of Computer Science, NGM College, Pollachi, Coimbatore, Tamilnadu,
More informationMethod of Combining the Degrees of Similarity in Handwritten Signature Authentication Using Neural Networks
Method of Combining the Degrees of Similarity in Handwritten Signature Authentication Using Neural Networks Ph. D. Student, Eng. Eusebiu Marcu Abstract This paper introduces a new method of combining the
More informationCOMPARISON OF OBJECT BASED AND PIXEL BASED CLASSIFICATION OF HIGH RESOLUTION SATELLITE IMAGES USING ARTIFICIAL NEURAL NETWORKS
COMPARISON OF OBJECT BASED AND PIXEL BASED CLASSIFICATION OF HIGH RESOLUTION SATELLITE IMAGES USING ARTIFICIAL NEURAL NETWORKS B.K. Mohan and S. N. Ladha Centre for Studies in Resources Engineering IIT
More informationCLUSTERING LARGE DATA SETS WITH MIXED NUMERIC AND CATEGORICAL VALUES *
CLUSTERING LARGE DATA SETS WITH MIED NUMERIC AND CATEGORICAL VALUES * ZHEUE HUANG CSIRO Mathematical and Information Sciences GPO Box Canberra ACT, AUSTRALIA huang@cmis.csiro.au Efficient partitioning
More informationComparing large datasets structures through unsupervised learning
Comparing large datasets structures through unsupervised learning Guénaël Cabanes and Younès Bennani LIPN-CNRS, UMR 7030, Université de Paris 13 99, Avenue J-B. Clément, 93430 Villetaneuse, France cabanes@lipn.univ-paris13.fr
More informationK-Means Clustering Tutorial
K-Means Clustering Tutorial By Kardi Teknomo,PhD Preferable reference for this tutorial is Teknomo, Kardi. K-Means Clustering Tutorials. http:\\people.revoledu.com\kardi\ tutorial\kmean\ Last Update: July
More informationPLAANN as a Classification Tool for Customer Intelligence in Banking
PLAANN as a Classification Tool for Customer Intelligence in Banking EUNITE World Competition in domain of Intelligent Technologies The Research Report Ireneusz Czarnowski and Piotr Jedrzejowicz Department
More informationFUZZY CLUSTERING ANALYSIS OF DATA MINING: APPLICATION TO AN ACCIDENT MINING SYSTEM
International Journal of Innovative Computing, Information and Control ICIC International c 0 ISSN 34-48 Volume 8, Number 8, August 0 pp. 4 FUZZY CLUSTERING ANALYSIS OF DATA MINING: APPLICATION TO AN ACCIDENT
More informationDetection of DDoS Attack Scheme
Chapter 4 Detection of DDoS Attac Scheme In IEEE 802.15.4 low rate wireless personal area networ, a distributed denial of service attac can be launched by one of three adversary types, namely, jamming
More informationNEURAL NETWORKS IN DATA MINING
NEURAL NETWORKS IN DATA MINING 1 DR. YASHPAL SINGH, 2 ALOK SINGH CHAUHAN 1 Reader, Bundelkhand Institute of Engineering & Technology, Jhansi, India 2 Lecturer, United Institute of Management, Allahabad,
More informationMethodology for Emulating Self Organizing Maps for Visualization of Large Datasets
Methodology for Emulating Self Organizing Maps for Visualization of Large Datasets Macario O. Cordel II and Arnulfo P. Azcarraga College of Computer Studies *Corresponding Author: macario.cordel@dlsu.edu.ph
More informationLecture 6. Artificial Neural Networks
Lecture 6 Artificial Neural Networks 1 1 Artificial Neural Networks In this note we provide an overview of the key concepts that have led to the emergence of Artificial Neural Networks as a major paradigm
More informationVisualization of large data sets using MDS combined with LVQ.
Visualization of large data sets using MDS combined with LVQ. Antoine Naud and Włodzisław Duch Department of Informatics, Nicholas Copernicus University, Grudziądzka 5, 87-100 Toruń, Poland. www.phys.uni.torun.pl/kmk
More informationNetwork (Tree) Topology Inference Based on Prüfer Sequence
Network (Tree) Topology Inference Based on Prüfer Sequence C. Vanniarajan and Kamala Krithivasan Department of Computer Science and Engineering Indian Institute of Technology Madras Chennai 600036 vanniarajanc@hcl.in,
More informationRole of Neural network in data mining
Role of Neural network in data mining Chitranjanjit kaur Associate Prof Guru Nanak College, Sukhchainana Phagwara,(GNDU) Punjab, India Pooja kapoor Associate Prof Swami Sarvanand Group Of Institutes Dinanagar(PTU)
More informationINTELLIGENT ENERGY MANAGEMENT OF ELECTRICAL POWER SYSTEMS WITH DISTRIBUTED FEEDING ON THE BASIS OF FORECASTS OF DEMAND AND GENERATION Chr.
INTELLIGENT ENERGY MANAGEMENT OF ELECTRICAL POWER SYSTEMS WITH DISTRIBUTED FEEDING ON THE BASIS OF FORECASTS OF DEMAND AND GENERATION Chr. Meisenbach M. Hable G. Winkler P. Meier Technology, Laboratory
More informationUNIVERSITY OF BOLTON SCHOOL OF ENGINEERING MS SYSTEMS ENGINEERING AND ENGINEERING MANAGEMENT SEMESTER 1 EXAMINATION 2015/2016 INTELLIGENT SYSTEMS
TW72 UNIVERSITY OF BOLTON SCHOOL OF ENGINEERING MS SYSTEMS ENGINEERING AND ENGINEERING MANAGEMENT SEMESTER 1 EXAMINATION 2015/2016 INTELLIGENT SYSTEMS MODULE NO: EEM7010 Date: Monday 11 th January 2016
More informationNeural Network Applications in Stock Market Predictions - A Methodology Analysis
Neural Network Applications in Stock Market Predictions - A Methodology Analysis Marijana Zekic, MS University of Josip Juraj Strossmayer in Osijek Faculty of Economics Osijek Gajev trg 7, 31000 Osijek
More informationA Framework for Intelligent Online Customer Service System
A Framework for Intelligent Online Customer Service System Yiping WANG Yongjin ZHANG School of Business Administration, Xi an University of Technology Abstract: In a traditional customer service support
More informationClustering. Danilo Croce Web Mining & Retrieval a.a. 2015/201 16/03/2016
Clustering Danilo Croce Web Mining & Retrieval a.a. 2015/201 16/03/2016 1 Supervised learning vs. unsupervised learning Supervised learning: discover patterns in the data that relate data attributes with
More informationIntrusion Detection via Machine Learning for SCADA System Protection
Intrusion Detection via Machine Learning for SCADA System Protection S.L.P. Yasakethu Department of Computing, University of Surrey, Guildford, GU2 7XH, UK. s.l.yasakethu@surrey.ac.uk J. Jiang Department
More informationEM Clustering Approach for Multi-Dimensional Analysis of Big Data Set
EM Clustering Approach for Multi-Dimensional Analysis of Big Data Set Amhmed A. Bhih School of Electrical and Electronic Engineering Princy Johnson School of Electrical and Electronic Engineering Martin
More informationA Stock Pattern Recognition Algorithm Based on Neural Networks
A Stock Pattern Recognition Algorithm Based on Neural Networks Xinyu Guo guoxinyu@icst.pku.edu.cn Xun Liang liangxun@icst.pku.edu.cn Xiang Li lixiang@icst.pku.edu.cn Abstract pattern respectively. Recent
More informationLVQ Plug-In Algorithm for SQL Server
LVQ Plug-In Algorithm for SQL Server Licínia Pedro Monteiro Instituto Superior Técnico licinia.monteiro@tagus.ist.utl.pt I. Executive Summary In this Resume we describe a new functionality implemented
More informationRandom forest algorithm in big data environment
Random forest algorithm in big data environment Yingchun Liu * School of Economics and Management, Beihang University, Beijing 100191, China Received 1 September 2014, www.cmnt.lv Abstract Random forest
More informationResearch on the Performance Optimization of Hadoop in Big Data Environment
Vol.8, No.5 (015), pp.93-304 http://dx.doi.org/10.1457/idta.015.8.5.6 Research on the Performance Optimization of Hadoop in Big Data Environment Jia Min-Zheng Department of Information Engineering, Beiing
More informationDecision-making with the AHP: Why is the principal eigenvector necessary
European Journal of Operational Research 145 (2003) 85 91 Decision Aiding Decision-making with the AHP: Why is the principal eigenvector necessary Thomas L. Saaty * University of Pittsburgh, Pittsburgh,
More informationAn Algorithm for Automatic Base Station Placement in Cellular Network Deployment
An Algorithm for Automatic Base Station Placement in Cellular Network Deployment István Törős and Péter Fazekas High Speed Networks Laboratory Dept. of Telecommunications, Budapest University of Technology
More informationMachine Learning with MATLAB David Willingham Application Engineer
Machine Learning with MATLAB David Willingham Application Engineer 2014 The MathWorks, Inc. 1 Goals Overview of machine learning Machine learning models & techniques available in MATLAB Streamlining the
More informationNeural Networks in Data Mining
IOSR Journal of Engineering (IOSRJEN) ISSN (e): 2250-3021, ISSN (p): 2278-8719 Vol. 04, Issue 03 (March. 2014), V6 PP 01-06 www.iosrjen.org Neural Networks in Data Mining Ripundeep Singh Gill, Ashima Department
More informationMapReduce Approach to Collective Classification for Networks
MapReduce Approach to Collective Classification for Networks Wojciech Indyk 1, Tomasz Kajdanowicz 1, Przemyslaw Kazienko 1, and Slawomir Plamowski 1 Wroclaw University of Technology, Wroclaw, Poland Faculty
More informationD A T A M I N I N G C L A S S I F I C A T I O N
D A T A M I N I N G C L A S S I F I C A T I O N FABRICIO VOZNIKA LEO NARDO VIA NA INTRODUCTION Nowadays there is huge amount of data being collected and stored in databases everywhere across the globe.
More informationInternational Journal of Computer Science Trends and Technology (IJCST) Volume 2 Issue 3, May-Jun 2014
RESEARCH ARTICLE OPEN ACCESS A Survey of Data Mining: Concepts with Applications and its Future Scope Dr. Zubair Khan 1, Ashish Kumar 2, Sunny Kumar 3 M.Tech Research Scholar 2. Department of Computer
More informationDesign call center management system of e-commerce based on BP neural network and multifractal
Available online www.jocpr.com Journal of Chemical and Pharmaceutical Research, 2014, 6(6):951-956 Research Article ISSN : 0975-7384 CODEN(USA) : JCPRC5 Design call center management system of e-commerce
More informationVisualization of Topology Representing Networks
Visualization of Topology Representing Networks Agnes Vathy-Fogarassy 1, Agnes Werner-Stark 1, Balazs Gal 1 and Janos Abonyi 2 1 University of Pannonia, Department of Mathematics and Computing, P.O.Box
More informationSupport Vector Machines with Clustering for Training with Very Large Datasets
Support Vector Machines with Clustering for Training with Very Large Datasets Theodoros Evgeniou Technology Management INSEAD Bd de Constance, Fontainebleau 77300, France theodoros.evgeniou@insead.fr Massimiliano
More informationSynchronization of sampling in distributed signal processing systems
Synchronization of sampling in distributed signal processing systems Károly Molnár, László Sujbert, Gábor Péceli Department of Measurement and Information Systems, Budapest University of Technology and
More informationSELECTING NEURAL NETWORK ARCHITECTURE FOR INVESTMENT PROFITABILITY PREDICTIONS
UDC: 004.8 Original scientific paper SELECTING NEURAL NETWORK ARCHITECTURE FOR INVESTMENT PROFITABILITY PREDICTIONS Tonimir Kišasondi, Alen Lovren i University of Zagreb, Faculty of Organization and Informatics,
More informationNetwork Traffic Prediction Based on the Wavelet Analysis and Hopfield Neural Network
Netork Traffic Prediction Based on the Wavelet Analysis and Hopfield Neural Netork Sun Guang Abstract Build a mathematical model is the key problem of netork traffic prediction. Traditional single netork
More informationCredit Card Fraud Detection Using Self Organised Map
International Journal of Information & Computation Technology. ISSN 0974-2239 Volume 4, Number 13 (2014), pp. 1343-1348 International Research Publications House http://www. irphouse.com Credit Card Fraud
More informationAn Anomaly-Based Method for DDoS Attacks Detection using RBF Neural Networks
2011 International Conference on Network and Electronics Engineering IPCSIT vol.11 (2011) (2011) IACSIT Press, Singapore An Anomaly-Based Method for DDoS Attacks Detection using RBF Neural Networks Reyhaneh
More informationChapter ML:XI (continued)
Chapter ML:XI (continued) XI. Cluster Analysis Data Mining Overview Cluster Analysis Basics Hierarchical Cluster Analysis Iterative Cluster Analysis Density-Based Cluster Analysis Cluster Evaluation Constrained
More informationPrediction of Stock Performance Using Analytical Techniques
136 JOURNAL OF EMERGING TECHNOLOGIES IN WEB INTELLIGENCE, VOL. 5, NO. 2, MAY 2013 Prediction of Stock Performance Using Analytical Techniques Carol Hargreaves Institute of Systems Science National University
More informationCredal classification of uncertain data using belief functions
23 IEEE International Conference on Systems, Man, and Cybernetics Credal classification of uncertain data using belief functions Zhun-ga Liu a,c,quanpan a, Jean Dezert b, Gregoire Mercier c a School of
More informationNTC Project: S01-PH10 (formerly I01-P10) 1 Forecasting Women s Apparel Sales Using Mathematical Modeling
1 Forecasting Women s Apparel Sales Using Mathematical Modeling Celia Frank* 1, Balaji Vemulapalli 1, Les M. Sztandera 2, Amar Raheja 3 1 School of Textiles and Materials Technology 2 Computer Information
More informationPrediction Model for Crude Oil Price Using Artificial Neural Networks
Applied Mathematical Sciences, Vol. 8, 2014, no. 80, 3953-3965 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/ams.2014.43193 Prediction Model for Crude Oil Price Using Artificial Neural Networks
More informationAUTOMATION OF ENERGY DEMAND FORECASTING. Sanzad Siddique, B.S.
AUTOMATION OF ENERGY DEMAND FORECASTING by Sanzad Siddique, B.S. A Thesis submitted to the Faculty of the Graduate School, Marquette University, in Partial Fulfillment of the Requirements for the Degree
More informationPredict the Popularity of YouTube Videos Using Early View Data
000 001 002 003 004 005 006 007 008 009 010 011 012 013 014 015 016 017 018 019 020 021 022 023 024 025 026 027 028 029 030 031 032 033 034 035 036 037 038 039 040 041 042 043 044 045 046 047 048 049 050
More information