One Solution to XOR problem using Multilayer Perceptron having Minimum Configuration


International Journal of Science and Engineering, Volume 3, Number. PP: . IJSE. ISSN: .

Vaibhav Kant Singh
Department of Computer Science and Engineering, Institute of Technology, Guru Ghasidas Vishwavidyalaya, Central University, Bilaspur (C.G.), India
vibhu200427@gmail.com

Abstract- Artificial Neural Network (ANN) is the branch of Computer Science that deals with the construction of programs having an analogy with the Biological Neural Network (BNN). Various types of ANN systems are used to solve a variety of problems. Looking into the history of ANN development, we encounter the concept of linear separability. Problems that are linearly separable are solved easily by making use of the single-layer perceptron model proposed by Rosenblatt. The XOR problem, whose solution is discussed in this paper, is a non-linearly separable problem; it is a complex problem and requires a new type of ANN system for its solution. In this paper we present the architectural graph and signal-flow graph representing an ANN equivalent to a minimum-configuration Multilayer Perceptron (MLP). We utilize the hyperbolic tangent function as the activation function for the hidden layer and the threshold function as the activation function for the output layer. The learning employed is error-correction learning and the algorithm employed is the Back Propagation (BPN) algorithm. In this paper one solution is proposed for the XOR problem.

Keywords: ANN, BNN, Activation Function, Hyperbolic Tangent Function, BPN, MLP.

I. INTRODUCTION TO ARTIFICIAL NEURAL NETWORK

Artificial Neural Network is a parallel and distributed processor, simulated on a digital computer, whose working is analogous to the working of the human brain. Humans have a nervous system that performs operations in parallel after receiving inputs from the five basic sense organs.
The cells responsible for processing the stimulus obtained from the environment are called nerve cells or neurons. ANN resembles the human brain in two aspects: knowledge is acquired from the environment through an interactive process of weight change, and inter-neuron connection strengths, i.e. synaptic weights, are used to store the acquired knowledge. With every iteration of the learning process the ANN becomes more knowledgeable about the environment in which it is operating. ANNs are represented using three techniques, namely block-diagram representation, signal-flow graph and architectural graph. The basic components of a neuron are a set of adjustable synaptic weights attached to the inputs and a bias, a summing junction, and a linear or non-linear activation function. The three basic elements of any ANN are the neuron, the network topology and the learning algorithm. The learning algorithms employed in ANN are classified into three basic types at the first level: supervised learning, reinforcement learning and unsupervised learning. Under supervised learning come error-correction and stochastic learning; error-correction learning is further classified into LMS and BPN. Unsupervised learning, on the other hand, is classified into Hebbian and competitive learning. Some of the neural network systems include SOFM (Self-Organizing Feature Map), Perceptron, MLP, Neocognitron, ADALINE (Adaptive Linear Neural Element), MADALINE (Multiple ADALINE), LVQ (Learning Vector Quantization), AM (Associative Memory), BAM (Bidirectional Associative Memory), Boltzmann machine, BSB (Brain-State-in-a-Box), Cauchy machines, Hopfield network, ART (Adaptive Resonance Theory), RBF (Radial Basis Function), RNN (Recurrent Neural Network), etc.

II. SOLUTION OF AND, OR, NAND AND NOR GATES USING MCCULLOCH AND PITTS MODEL

(1) AND GATE: A.B is the logic for the AND gate, where A and B are input values.

Figure 1. Block diagram representing a model of a single neuron in ANN displaying the solution for the AND, OR, NOR and NAND gates.

vk = (Σ xi.wi) + bk = uk + bk, since uk = Σ xi.wi

From the neural network framework we can see that four parameters are required for generating the output yk, i.e. A, B, W1 and W2. The truth table for the AND, OR, NAND and NOR gates is given below:

Table-1 Truth Table of the AND, OR, NAND and NOR gates

A B | AND = A.B | OR = A+B | NAND = (A.B)' | NOR = (A+B)'
0 0 |     0     |    0     |      1        |      1
0 1 |     0     |    1     |      1        |      0
1 0 |     0     |    1     |      1        |      0
1 1 |     1     |    1     |      0        |      0

In this case we will be using the threshold function as the activation function, defined as:

Ψ(vk) = 1 if vk ≥ 2; 0 if vk < 2, where 2 is the threshold value.

Table-2 Derivation of the solution of the AND gate using a single neuron in ANN

Now we will consider the four input patterns, taking W1 = W2 = 1 and bk = 0.
a) When A=0, B=0, W1=1, W2=1: vk = Σ xi.wi = A.W1 + B.W2 = 0x1 + 0x1 = 0. Since 0 < 2, Ψ(vk) = yk = 0. Therefore yk = 0 when A=0 and B=0.
b) When A=0, B=1, W1=1, W2=1: vk = 0x1 + 1x1 = 1. Since 1 < 2, Ψ(vk) = yk = 0. Therefore yk = 0 when A=0 and B=1.
c) When A=1, B=0, W1=1, W2=1: vk = 1x1 + 0x1 = 1. Since 1 < 2, Ψ(vk) = yk = 0. Therefore yk = 0 when A=1 and B=0.
d) When A=1, B=1, W1=1, W2=1: vk = 1x1 + 1x1 = 2. Since 2 ≥ 2, Ψ(vk) = yk = 1. Therefore yk = 1 when A=1 and B=1.

From Table-2 we conclude that with W1 = W2 = 1 and the threshold value set to 2 we obtain an ANN equivalent to the AND gate.

(2) OR GATE: A+B is the logic for the OR gate, where A and B are input values.

vk = (Σ xi.wi) + bk = uk + bk, since uk = Σ xi.wi

In this case we will be using the threshold function as the activation function, defined as:

IJSE, Volume 3, Number 2. V K Singh.

Ψ(vk) = 1 if vk ≥ 1; 0 if vk < 1, where 1 is the threshold value.

Table-3 Derivation of the solution of the OR gate using a single neuron in ANN

Now we will consider the four input patterns, taking W1 = W2 = 1 and bk = 0.
a) When A=0, B=0, W1=1, W2=1: vk = Σ xi.wi = A.W1 + B.W2 = 0x1 + 0x1 = 0. Since 0 < 1, Ψ(vk) = yk = 0. Therefore yk = 0 when A=0 and B=0.
b) When A=0, B=1, W1=1, W2=1: vk = 0x1 + 1x1 = 1. Since 1 ≥ 1, Ψ(vk) = yk = 1. Therefore yk = 1 when A=0 and B=1.
c) When A=1, B=0, W1=1, W2=1: vk = 1x1 + 0x1 = 1. Since 1 ≥ 1, Ψ(vk) = yk = 1. Therefore yk = 1 when A=1 and B=0.
d) When A=1, B=1, W1=1, W2=1: vk = 1x1 + 1x1 = 2. Since 2 > 1, Ψ(vk) = yk = 1. Therefore yk = 1 when A=1 and B=1.

From Table-3 we conclude that with W1 = W2 = 1 and the threshold value set to 1 we obtain an ANN equivalent to the OR gate.

(3) NAND GATE: (A.B)' is the logic for the NAND gate, where A and B are input values.

vk = (Σ xi.wi) + bk = uk + bk, since uk = Σ xi.wi

In this case we will be using the threshold function as the activation function.
The definition of the threshold function is:

Ψ(vk) = 1 if vk ≥ -1; 0 if vk < -1, where -1 is the threshold value.

Table-4 Derivation of the solution of the NAND gate using a single neuron in ANN

Now we will consider the four input patterns, taking W1 = W2 = -1 and bk = 0.
a) When A=0, B=0, W1=-1, W2=-1: vk = Σ xi.wi = A.W1 + B.W2 = 0x(-1) + 0x(-1) = 0. Since 0 > -1, Ψ(vk) = yk = 1. Therefore yk = 1 when A=0 and B=0.
b) When A=0, B=1, W1=-1, W2=-1: vk = 0x(-1) + 1x(-1) = -1. Since -1 ≥ -1, Ψ(vk) = yk = 1. Therefore yk = 1 when A=0 and B=1.
c) When A=1, B=0, W1=-1, W2=-1: vk = 1x(-1) + 0x(-1) = -1. Since -1 ≥ -1, Ψ(vk) = yk = 1. Therefore yk = 1 when A=1 and B=0.
d) When A=1, B=1, W1=-1, W2=-1: vk = 1x(-1) + 1x(-1) = -2. Since -2 < -1, Ψ(vk) = yk = 0. Therefore yk = 0 when A=1 and B=1.

From Table-4 we conclude that with W1 = W2 = -1 and the threshold value set to -1 we obtain an ANN equivalent to the NAND gate.

(4) NOR GATE: (A+B)' is the logic for the NOR gate, where A and B are input values.

vk = (Σ xi.wi) + bk = uk + bk, since uk = Σ xi.wi

In this case we will be using the threshold function as the activation function. The definition of the threshold function is:

Ψ(vk) = 1 if vk ≥ 0; 0 if vk < 0, where 0 is the threshold value.

Table-5 Derivation of the solution of the NOR gate using a single neuron in ANN

Now we will consider the four input patterns, taking W1 = W2 = -1 and bk = 0.
a) When A=0, B=0, W1=-1, W2=-1: vk = Σ xi.wi = A.W1 + B.W2 = 0x(-1) + 0x(-1) = 0. Since 0 ≥ 0, Ψ(vk) = yk = 1. Therefore yk = 1 when A=0 and B=0.
b) When A=0, B=1, W1=-1, W2=-1: vk = 0x(-1) + 1x(-1) = -1. Since -1 < 0, Ψ(vk) = yk = 0. Therefore yk = 0 when A=0 and B=1.
c) When A=1, B=0, W1=-1, W2=-1: vk = 1x(-1) + 0x(-1) = -1. Since -1 < 0, Ψ(vk) = yk = 0. Therefore yk = 0 when A=1 and B=0.
d) When A=1, B=1, W1=-1, W2=-1: vk = 1x(-1) + 1x(-1) = -2. Since -2 < 0, Ψ(vk) = yk = 0. Therefore yk = 0 when A=1 and B=1.

From Table-5 we conclude that with W1 = W2 = -1 and the threshold value set to 0 we obtain an ANN equivalent to the NOR gate.

III. PROBLEM STATEMENT

The solutions proposed for the problems in the above section portray a domain which exhibits a common characteristic, called linear separability. The definition of linear separability is: two sets of points A and B in an n-dimensional space are called linearly separable if n+1 real numbers w1, w2, ..., w(n+1) exist such that every point (x1, x2, ..., xn) in A satisfies Σ wi.xi ≥ w(n+1), and every point (x1, x2, ..., xn) in B satisfies Σ wi.xi < w(n+1). Rosenblatt in 1958 proposed the perceptron model for solving linearly separable problems using a supervised learning algorithm, named the perceptron convergence algorithm. Since XOR is a non-linearly separable problem, it requires a special proposal for its solution: the outputs that XOR produces cannot be classified from the inputs using one line in two dimensions.
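The single-neuron solutions derived in Tables 2 to 5 can be checked mechanically in a few lines of code; the sketch below is ours (the function name `threshold_neuron` is not from the paper) and simply replays each weight/threshold setting over all four input patterns:

```python
def threshold_neuron(a, b, w1, w2, theta):
    # Induced local field vk = A*W1 + B*W2 (bias bk = 0),
    # passed through the threshold activation with threshold theta.
    vk = a * w1 + b * w2
    return 1 if vk >= theta else 0

# (W1, W2, threshold) settings taken from Tables 2-5.
gates = {
    "AND":  (1, 1, 2),
    "OR":   (1, 1, 1),
    "NAND": (-1, -1, -1),
    "NOR":  (-1, -1, 0),
}
patterns = [(0, 0), (0, 1), (1, 0), (1, 1)]
for name, (w1, w2, theta) in gates.items():
    outputs = [threshold_neuron(a, b, w1, w2, theta) for a, b in patterns]
    print(name, outputs)
```

Running the loop reproduces the output columns of Table-1, confirming that each gate is realized by a single McCulloch-Pitts neuron.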
Table-6 Representation of the inputs in two dimensions, separated into two classes on the basis of the output produced. a) Graph representing the linear separability of the AND and NAND gates, where the inputs can be classified into two classes. b) Graph representing the linear separability of the OR and NOR gates, where the inputs can be classified into two classes.

IV. LITERATURE SURVEY

In [1] Abu-Mostafa and St. Jacques formalized the information capacity of a general form of memory, estimating the number of bits of information that can be stored in the Hopfield model of associative memory. In [2] Amari proposed an advanced theory of learning and self-organization, covering backpropagation and its generalization as well as the formation of topological maps and neural representations of information. In [3] Akaike reviewed the classical maximum likelihood estimation procedure and introduced a new estimate, the minimum information theoretical criterion (AIC) estimate (MAICE), designed for the purpose of statistical identification. In [4] Atiya and

Abu-Mostafa developed a method for the storage of analog vectors, i.e. vectors whose components are real-valued, for the Hopfield continuous-time network. In [5] Barron established approximation properties of a class of ANNs; it is shown that feedforward networks with one layer of sigmoidal nonlinearities achieve an integrated squared error of order O(1/n), where n is the number of nodes. In [6] Bruck showed that the convergence properties of the Hopfield model depend on the structure of the interconnection matrix W and the method by which the nodes are updated. In [7] Freeman's aim is to exemplify the two modes of information described in the paper. In [8] the authors proposed a theoretical framework for backpropagation (BP) in order to identify some of its limitations as a general learning procedure and the reasons for its success in several experiments on pattern recognition. In [9] Giles et al. proved that one method, recurrent cascade correlation, due to its topology has fundamental limitations in representation and therein its learning capabilities. In [10] Cardoso and Laheld introduced a class of adaptive algorithms for source separation which implements an adaptive version of equivariant estimation and is henceforth called EASI. In [11] the authors Feldkamp and Puskorius presented a coherent neural-network-based framework for solving various signal processing problems.

V. MULTILAYER PERCEPTRON

Multilayer Perceptron, as the name implies, concerns multiple layers of neurons. Generally there are three distinguishing features of the Multilayer Perceptron:

A. The neurons present in the network are generally non-linear, i.e. the activation function used at each neuron is generally non-linear. A sigmoid function, either the logistic function or the hyperbolic tangent function, is generally used as the activation function.

B.
The neurons present in the network offer a high degree of connectivity; generally they are fully connected. Within the network, input nodes may make direct connections with the output node, the first hidden layer may have connections with the third hidden layer, and several variations of this sort may exist between the neurons of the MLP.

C. In MLP the hidden neurons are meant to capture higher-order statistics. Either the number of hidden neurons in the same layer or the number of hidden layers may be increased to transform the problem into a simpler form.

The learning algorithm used in the case of the Multilayer Perceptron is called the Back Propagation Network (BPN) algorithm. BPN is a supervised learning algorithm; it employs error-correction learning. BPN comprises two passes in its framework: a forward pass and a backward pass. In the forward pass, the actual output is generated for the current input. Then, since the learning is supervised error-correction learning, the actual output is compared with the desired output. If an error is acknowledged in the forward pass, the error invokes a control mechanism which propagates weight changes through the network in the backward direction. The procedure continues until the system, i.e. the ANN, learns all the patterns applicable to that domain. BPN training requires the following steps:

STEP 1: Select the next training pair from the training set; apply the input vector to the network input.
STEP 2: Calculate the output of the network.
STEP 3: Calculate the error between the network output and the desired output (the target vector from the training pair).
STEP 4: Adjust the weights of the network in a way that minimizes the error.
STEP 5: Repeat Steps 1 through 4 for each vector in the training set until the error for the entire set is acceptably low.
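Steps 1 through 5 can be sketched as a small training loop. The code below trains a 2-2-1 logistic-sigmoid network on the XOR patterns with plain per-pattern backpropagation; the network size, learning rate, random initialization and epoch count are our illustrative choices, not values prescribed by the paper:

```python
import math
import random

random.seed(0)

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# 2 inputs -> 2 hidden -> 1 output; small random weights and biases.
w_h = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(2)]
b_h = [random.uniform(-1, 1) for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(2)]
b_o = random.uniform(-1, 1)
eta = 0.5  # learning-rate constant

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(x):
    h = [sigmoid(w_h[j][0] * x[0] + w_h[j][1] * x[1] + b_h[j]) for j in range(2)]
    y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + b_o)
    return h, y

err_before = sum((d - forward(x)[1]) ** 2 for x, d in data)

for epoch in range(10000):                 # Step 5: repeat until error is low
    for x, d in data:                      # Step 1: next training pair
        h, y = forward(x)                  # Step 2: forward pass
        e = d - y                          # Step 3: error vs. desired output
        delta_o = e * y * (1 - y)          # Step 4: backward pass (delta rule)
        for j in range(2):
            # Hidden delta uses the pre-update output weight.
            delta_h = delta_o * w_o[j] * h[j] * (1 - h[j])
            w_o[j] += eta * delta_o * h[j]
            w_h[j][0] += eta * delta_h * x[0]
            w_h[j][1] += eta * delta_h * x[1]
            b_h[j] += eta * delta_h
        b_o += eta * delta_o

err_after = sum((d - forward(x)[1]) ** 2 for x, d in data)
print(f"total squared error: {err_before:.3f} -> {err_after:.3f}")
for x, d in data:
    print(x, "->", round(forward(x)[1], 3))
```

With these settings the network typically converges to the XOR mapping, although gradient descent on XOR can occasionally stall in a local minimum, which is one motivation for the fixed-weight construction in Section VII.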
The correction applied to Wji(n) is defined by the delta rule:

ΔWji(n) = -η ∂ξ(n)/∂Wji(n)

Here, η is the learning-rate constant and ξ(n) is the cost function, i.e. the instantaneous value of the error energy.
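The delta rule can be sanity-checked numerically: for a single sigmoid neuron the analytic partial derivative ∂ξ/∂w should agree with a finite-difference estimate. The one-neuron setup and its values below are our own illustration, not from the paper:

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# One neuron, one input: y = sigmoid(w * x),
# error energy xi = 0.5 * (d - y)^2 for desired output d.
x, d, w = 1.0, 1.0, 0.3

def xi(w_):
    return 0.5 * (d - sigmoid(w_ * x)) ** 2

# Analytic gradient via the chain rule: dxi/dw = -(d - y) * y * (1 - y) * x
y = sigmoid(w * x)
grad = -(d - y) * y * (1 - y) * x

# Central finite-difference estimate of the same derivative.
eps = 1e-6
grad_fd = (xi(w + eps) - xi(w - eps)) / (2 * eps)
print(grad, grad_fd)  # the two estimates agree closely

eta = 0.5
delta_w = -eta * grad  # the delta-rule correction applied to w
```

Because the analytic and numerical gradients match, the weight correction ΔW = -η ∂ξ/∂W indeed moves the weight in the direction that decreases the error energy.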

Figure 2. Architectural graph representing an MLP having two hidden layers.

VI. MINIMUM CONFIGURATION MULTILAYER PERCEPTRON

Figure 3. Architectural graph representing an MLP having minimum configuration, i.e. a linear activation function could be used in the output layer.

The MLP, besides having multiple layers in which every element exhibits non-linearity by virtue of a non-linear activation function, provides a variant in which there are one or more hidden layers of non-linear elements while the output layer contains linear elements. That is, in a minimum-configuration MLP the output layer may consist of one or more linear neurons.

VII. FIRST SOLUTION TO THE XOR PROBLEM USING MINIMUM CONFIGURATION MULTILAYER PERCEPTRON

Figure 4. Architectural graph representing the minimum-configuration MLP for the solution to the XOR problem described below.

Table-7 Truth Table for the XOR gate (y = x1 ⊕ x2)

x1 x2 | y
0  0  | 0
0  1  | 1
1  0  | 1
1  1  | 0

In the first solution to the XOR problem, two neurons are present in the hidden layer; the activation function used in the hidden layer is the hyperbolic tangent function, and the activation function used in the output layer is the threshold function. The derivation of the solution to the XOR problem is given below for the four possible inputs. Figure 5 shows the internal configuration of the MLP and is used to derive the solution.

Figure 5. Signal-flow graph representing the first solution to the XOR problem using the minimum-configuration MLP.

In this solution, node 1 computes the induced local field v1 = -x1 + x2 + 0.5 and node 2 computes v2 = x1 - x2 + 0.5; nodes 3 and 4 apply the hyperbolic tangent function to v1 and v2 respectively, and node 5 forms v5 = -y3 - y4 and applies a threshold function with threshold -0.5.

CASE 1: When x1 = 0 and x2 = 0.
At node 1 the signal value will be v1 = -0 + 0 + 0.5 = 0.5 ...(2)
At node 2 the signal value will be v2 = 0 - 0 + 0.5 = 0.5 ...(4)
Here, the activation function used for the hidden layer is the hyperbolic tangent function, defined as:
tanh(x) = sinh(x)/cosh(x) = (e^x - e^-x)/(e^x + e^-x) ...(5)
where x is the induced local field value, e is the base of the natural logarithm, also known as Euler's number, and the range is [-1, +1].
At node 3 the signal value, from Eqs. (2) and (5), will be y3 = tanh(0.5) ≈ 0.4621 ...(6)
At node 4 the signal value, from Eqs. (4) and (5), will be y4 = tanh(0.5) ≈ 0.4621 ...(7)
At node 5 the signal value, from Eqs. (6) and (7), will be v5 = -y3 - y4 ...(8)
= -0.4621 - 0.4621 = -0.9242 ...(9)
In the output layer the function used is the threshold function, defined as:
y = Ψ(v5) = 1 if v5 ≥ -0.5; 0 if v5 < -0.5 ...(10)
From Eqs. (9) and (10), the value of the output for the first case, i.e. x1 = 0 and x2 = 0, will be
y = 0, since -0.9242 < -0.5 ...(11)

CASE 2: When x1 = 1 and x2 = 0.
At node 1 the signal value will be v1 = -1 + 0 + 0.5 = -0.5 ...(12)
At node 2 the signal value will be v2 = 1 - 0 + 0.5 = 1.5 ...(13)
At node 3 the signal value, from Eqs. (12) and (5), will be y3 = tanh(-0.5) ≈ -0.4621 ...(14)
At node 4 the signal value, from Eqs. (13) and (5), will be y4 = tanh(1.5) ≈ 0.9051 ...(15)
At node 5 the signal value, from Eqs. (14), (15) and (8), will be v5 = 0.4621 - 0.9051 = -0.4430 ...(16)
From Eqs. (16) and (10), the value of the output for the second case, i.e. x1 = 1 and x2 = 0, will be
y = 1, since -0.4430 > -0.5 ...(17)
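The hyperbolic tangent values used in this derivation can be checked directly against the definition in Eq. (5); a quick sketch using only the standard library (the helper name `tanh_by_def` is ours):

```python
import math

def tanh_by_def(x):
    # Eq. (5): tanh(x) = (e^x - e^-x) / (e^x + e^-x)
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

for v in (0.5, 1.5, -0.5):
    # The definitional form matches the library implementation,
    # and all values fall in the stated range [-1, +1].
    assert abs(tanh_by_def(v) - math.tanh(v)) < 1e-12
    assert -1.0 < tanh_by_def(v) < 1.0
    print(f"tanh({v:+.1f}) = {tanh_by_def(v):+.4f}")
```

This reproduces the constants used in the four cases: tanh(0.5) ≈ 0.4621, tanh(1.5) ≈ 0.9051 and tanh(-0.5) ≈ -0.4621.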

CASE 3: When x1 = 0 and x2 = 1.
At node 1 the signal value will be v1 = -0 + 1 + 0.5 = 1.5 ...(18)
At node 2 the signal value will be v2 = 0 - 1 + 0.5 = -0.5 ...(19)
At node 3 the signal value, from Eqs. (18) and (5), will be y3 = tanh(1.5) ≈ 0.9051 ...(20)
At node 4 the signal value, from Eqs. (19) and (5), will be y4 = tanh(-0.5) ≈ -0.4621 ...(21)
At node 5 the signal value, from Eqs. (20), (21) and (8), will be v5 = -0.9051 + 0.4621 = -0.4430 ...(22)
From Eqs. (22) and (10), the value of the output for the third case, i.e. x1 = 0 and x2 = 1, will be
y = 1, since -0.4430 > -0.5 ...(23)

CASE 4: When x1 = 1 and x2 = 1.
At node 1 the signal value will be v1 = -1 + 1 + 0.5 = 0.5 ...(24)
At node 2 the signal value will be v2 = 1 - 1 + 0.5 = 0.5 ...(25)
At node 3 the signal value, from Eqs. (24) and (5), will be y3 = tanh(0.5) ≈ 0.4621 ...(26)
At node 4 the signal value, from Eqs. (25) and (5), will be y4 = tanh(0.5) ≈ 0.4621 ...(27)
At node 5 the signal value, from Eqs. (26), (27) and (8), will be v5 = -0.4621 - 0.4621 = -0.9242 ...(28)
From Eqs. (28) and (10), the value of the output y for input x1 = 1 and x2 = 1 will be
y = 0, since -0.9242 < -0.5 ...(29)

VIII. CONCLUSION

From Eqs. (11), (17), (23) and (29) it is concluded that the proposed solution proves that it is possible to solve the XOR problem using a minimum-configuration Multilayer Perceptron. MLP provides a very good framework for solving problems that specify a non-linearly separable domain. The hyperbolic tangent function, which is a non-linear function, is utilized as an activation function for squashing the induced local field value, i.e. the activation value produced after summation, to produce the output. By the inclusion of a hidden layer it was possible to solve a problem that is complex.
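The four cases can also be verified end to end in code. The weight assignment below (hidden fields -x1 + x2 + 0.5 and x1 - x2 + 0.5, output field -y3 - y4, threshold -0.5) is one assignment consistent with the signal values in Cases 1 to 4, offered as an illustrative reading of Figure 5 rather than a definitive transcription of it:

```python
import math

def xor_mlp(x1, x2):
    # Hidden layer: two tanh neurons (minimum configuration).
    v1 = -x1 + x2 + 0.5        # node 1 induced local field
    v2 = x1 - x2 + 0.5         # node 2 induced local field
    y3 = math.tanh(v1)         # node 3 output
    y4 = math.tanh(v2)         # node 4 output
    # Output layer: single threshold neuron, threshold -0.5 (Eq. 10).
    v5 = -y3 - y4
    return 1 if v5 >= -0.5 else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_mlp(a, b))  # reproduces XOR: 0, 1, 1, 0
```

The loop reproduces the truth table of Table-7, matching the hand derivation of Eqs. (11), (17), (23) and (29).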

REFERENCES
[1] Y.S. Abu-Mostafa and J.M. St. Jacques, "Information capacity of the Hopfield model," IEEE Transactions on Information Theory, vol. IT-31, pp. .
[2] S. Amari, "Mathematical foundations of neurocomputing," Proceedings of the IEEE, vol. 78, pp. .
[3] H. Akaike, "A new look at the statistical model identification," IEEE Transactions on Automatic Control, vol. AC-19, pp. .
[4] A.F. Atiya and Y.S. Abu-Mostafa, "An analog feedback associative memory," IEEE Transactions on Neural Networks, vol. 4, pp. .
[5] A.R. Barron, "Universal approximation bounds for superpositions of a sigmoidal function," IEEE Transactions on Information Theory, vol. 39, pp. .
[6] J. Bruck, "On the convergence properties of the Hopfield model," Proceedings of the IEEE, vol. 78, pp. .
[7] W.J. Freeman, "Why neural networks don't yet fly: Inquiry into the neurodynamics of biological intelligence," IEEE International Conference on Neural Networks, vol. II, pp. 1-7, San Diego, CA.
[8] M. Gori and A. Tesi, "On the problem of local minima in backpropagation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, pp. .
[9] C.L. Giles, D. Chen, G.Z. Sun, H.H. Chen, Y.C. Lee and M.W. Goudreau, "Constructive learning of recurrent neural networks: Limitations of recurrent cascade correlation with a simple solution," IEEE Transactions on Neural Networks, vol. 6, pp. .
[10] J.F. Cardoso and B. Laheld, "Equivariant adaptive source separation," IEEE Transactions on Signal Processing, vol. 44, pp. .
[11] L.A. Feldkamp and G.V. Puskorius, "A signal processing framework based on dynamic neural networks with application to problems in adaptation, filtering and classification," Proceedings of the IEEE, vol. 86, 1998.


More information

SEMINAR OUTLINE. Introduction to Data Mining Using Artificial Neural Networks. Definitions of Neural Networks. Definitions of Neural Networks

SEMINAR OUTLINE. Introduction to Data Mining Using Artificial Neural Networks. Definitions of Neural Networks. Definitions of Neural Networks SEMINAR OUTLINE Introduction to Data Mining Using Artificial Neural Networks ISM 611 Dr. Hamid Nemati Introduction to and Characteristics of Neural Networks Comparison of Neural Networks to traditional

More information

129: Artificial Neural Networks. Ajith Abraham Oklahoma State University, Stillwater, OK, USA 1 INTRODUCTION TO ARTIFICIAL NEURAL NETWORKS

129: Artificial Neural Networks. Ajith Abraham Oklahoma State University, Stillwater, OK, USA 1 INTRODUCTION TO ARTIFICIAL NEURAL NETWORKS 129: Artificial Neural Networks Ajith Abraham Oklahoma State University, Stillwater, OK, USA 1 Introduction to Artificial Neural Networks 901 2 Neural Network Architectures 902 3 Neural Network Learning

More information

Comparison of Supervised and Unsupervised Learning Algorithms for Pattern Classification

Comparison of Supervised and Unsupervised Learning Algorithms for Pattern Classification Comparison of Supervised and Unsupervised Learning Algorithms for Pattern Classification R. Sathya Professor, Dept. of MCA, Jyoti Nivas College (Autonomous), Professor and Head, Dept. of Mathematics, Bangalore,

More information

Novelty Detection in image recognition using IRF Neural Networks properties

Novelty Detection in image recognition using IRF Neural Networks properties Novelty Detection in image recognition using IRF Neural Networks properties Philippe Smagghe, Jean-Luc Buessler, Jean-Philippe Urban Université de Haute-Alsace MIPS 4, rue des Frères Lumière, 68093 Mulhouse,

More information

SELECTING NEURAL NETWORK ARCHITECTURE FOR INVESTMENT PROFITABILITY PREDICTIONS

SELECTING NEURAL NETWORK ARCHITECTURE FOR INVESTMENT PROFITABILITY PREDICTIONS UDC: 004.8 Original scientific paper SELECTING NEURAL NETWORK ARCHITECTURE FOR INVESTMENT PROFITABILITY PREDICTIONS Tonimir Kišasondi, Alen Lovren i University of Zagreb, Faculty of Organization and Informatics,

More information

Biological Neurons and Neural Networks, Artificial Neurons

Biological Neurons and Neural Networks, Artificial Neurons Biological Neurons and Neural Networks, Artificial Neurons Neural Computation : Lecture 2 John A. Bullinaria, 2015 1. Organization of the Nervous System and Brain 2. Brains versus Computers: Some Numbers

More information

Analecta Vol. 8, No. 2 ISSN 2064-7964

Analecta Vol. 8, No. 2 ISSN 2064-7964 EXPERIMENTAL APPLICATIONS OF ARTIFICIAL NEURAL NETWORKS IN ENGINEERING PROCESSING SYSTEM S. Dadvandipour Institute of Information Engineering, University of Miskolc, Egyetemváros, 3515, Miskolc, Hungary,

More information

Power Prediction Analysis using Artificial Neural Network in MS Excel

Power Prediction Analysis using Artificial Neural Network in MS Excel Power Prediction Analysis using Artificial Neural Network in MS Excel NURHASHINMAH MAHAMAD, MUHAMAD KAMAL B. MOHAMMED AMIN Electronic System Engineering Department Malaysia Japan International Institute

More information

Comparison of K-means and Backpropagation Data Mining Algorithms

Comparison of K-means and Backpropagation Data Mining Algorithms Comparison of K-means and Backpropagation Data Mining Algorithms Nitu Mathuriya, Dr. Ashish Bansal Abstract Data mining has got more and more mature as a field of basic research in computer science and

More information

Electroencephalography Analysis Using Neural Network and Support Vector Machine during Sleep

Electroencephalography Analysis Using Neural Network and Support Vector Machine during Sleep Engineering, 23, 5, 88-92 doi:.4236/eng.23.55b8 Published Online May 23 (http://www.scirp.org/journal/eng) Electroencephalography Analysis Using Neural Network and Support Vector Machine during Sleep JeeEun

More information

Data quality in Accounting Information Systems

Data quality in Accounting Information Systems Data quality in Accounting Information Systems Comparing Several Data Mining Techniques Erjon Zoto Department of Statistics and Applied Informatics Faculty of Economy, University of Tirana Tirana, Albania

More information

Preface. C++ Neural Networks and Fuzzy Logic:Preface. Table of Contents

Preface. C++ Neural Networks and Fuzzy Logic:Preface. Table of Contents C++ Neural Networks and Fuzzy Logic by Valluru B. Rao MTBooks, IDG Books Worldwide, Inc. ISBN: 1558515526 Pub Date: 06/01/95 Table of Contents Preface The number of models available in neural network literature

More information

Chapter 12 Discovering New Knowledge Data Mining

Chapter 12 Discovering New Knowledge Data Mining Chapter 12 Discovering New Knowledge Data Mining Becerra-Fernandez, et al. -- Knowledge Management 1/e -- 2004 Prentice Hall Additional material 2007 Dekai Wu Chapter Objectives Introduce the student to

More information

How To Use Neural Networks In Data Mining

How To Use Neural Networks In Data Mining International Journal of Electronics and Computer Science Engineering 1449 Available Online at www.ijecse.org ISSN- 2277-1956 Neural Networks in Data Mining Priyanka Gaur Department of Information and

More information

Iranian J Env Health Sci Eng, 2004, Vol.1, No.2, pp.51-57. Application of Intelligent System for Water Treatment Plant Operation.

Iranian J Env Health Sci Eng, 2004, Vol.1, No.2, pp.51-57. Application of Intelligent System for Water Treatment Plant Operation. Iranian J Env Health Sci Eng, 2004, Vol.1, No.2, pp.51-57 Application of Intelligent System for Water Treatment Plant Operation *A Mirsepassi Dept. of Environmental Health Engineering, School of Public

More information

Neural Computation - Assignment

Neural Computation - Assignment Neural Computation - Assignment Analysing a Neural Network trained by Backpropagation AA SSt t aa t i iss i t i icc aa l l AA nn aa l lyy l ss i iss i oo f vv aa r i ioo i uu ss l lee l aa r nn i inn gg

More information

Data Mining Using Neural Networks: A Guide for Statisticians

Data Mining Using Neural Networks: A Guide for Statisticians Data Mining Using Neural Networks: A Guide for Statisticians Basilio de Bragança Pereira UFRJ - Universidade Federal do Rio de Janeiro Calyampudi Radhakrishna Rao PSU - Penn State University June 2009

More information

EVALUATION OF NEURAL NETWORK BASED CLASSIFICATION SYSTEMS FOR CLINICAL CANCER DATA CLASSIFICATION

EVALUATION OF NEURAL NETWORK BASED CLASSIFICATION SYSTEMS FOR CLINICAL CANCER DATA CLASSIFICATION EVALUATION OF NEURAL NETWORK BASED CLASSIFICATION SYSTEMS FOR CLINICAL CANCER DATA CLASSIFICATION K. Mumtaz Vivekanandha Institute of Information and Management Studies, Tiruchengode, India S.A.Sheriff

More information

ANN Based Fault Classifier and Fault Locator for Double Circuit Transmission Line

ANN Based Fault Classifier and Fault Locator for Double Circuit Transmission Line International Journal of Computer Sciences and Engineering Open Access Research Paper Volume-4, Special Issue-2, April 2016 E-ISSN: 2347-2693 ANN Based Fault Classifier and Fault Locator for Double Circuit

More information

Forecasting of Economic Quantities using Fuzzy Autoregressive Model and Fuzzy Neural Network

Forecasting of Economic Quantities using Fuzzy Autoregressive Model and Fuzzy Neural Network Forecasting of Economic Quantities using Fuzzy Autoregressive Model and Fuzzy Neural Network Dušan Marček 1 Abstract Most models for the time series of stock prices have centered on autoregressive (AR)

More information

APPLICATION OF ARTIFICIAL NEURAL NETWORKS USING HIJRI LUNAR TRANSACTION AS EXTRACTED VARIABLES TO PREDICT STOCK TREND DIRECTION

APPLICATION OF ARTIFICIAL NEURAL NETWORKS USING HIJRI LUNAR TRANSACTION AS EXTRACTED VARIABLES TO PREDICT STOCK TREND DIRECTION LJMS 2008, 2 Labuan e-journal of Muamalat and Society, Vol. 2, 2008, pp. 9-16 Labuan e-journal of Muamalat and Society APPLICATION OF ARTIFICIAL NEURAL NETWORKS USING HIJRI LUNAR TRANSACTION AS EXTRACTED

More information

A Prediction Model for Taiwan Tourism Industry Stock Index

A Prediction Model for Taiwan Tourism Industry Stock Index A Prediction Model for Taiwan Tourism Industry Stock Index ABSTRACT Han-Chen Huang and Fang-Wei Chang Yu Da University of Science and Technology, Taiwan Investors and scholars pay continuous attention

More information

Comparison Between Multilayer Feedforward Neural Networks and a Radial Basis Function Network to Detect and Locate Leaks in Pipelines Transporting Gas

Comparison Between Multilayer Feedforward Neural Networks and a Radial Basis Function Network to Detect and Locate Leaks in Pipelines Transporting Gas A publication of 1375 CHEMICAL ENGINEERINGTRANSACTIONS VOL. 32, 2013 Chief Editors:SauroPierucci, JiříJ. Klemeš Copyright 2013, AIDIC ServiziS.r.l., ISBN 978-88-95608-23-5; ISSN 1974-9791 The Italian Association

More information

SMORN-VII REPORT NEURAL NETWORK BENCHMARK ANALYSIS RESULTS & FOLLOW-UP 96. Özer CIFTCIOGLU Istanbul Technical University, ITU. and

SMORN-VII REPORT NEURAL NETWORK BENCHMARK ANALYSIS RESULTS & FOLLOW-UP 96. Özer CIFTCIOGLU Istanbul Technical University, ITU. and NEA/NSC-DOC (96)29 AUGUST 1996 SMORN-VII REPORT NEURAL NETWORK BENCHMARK ANALYSIS RESULTS & FOLLOW-UP 96 Özer CIFTCIOGLU Istanbul Technical University, ITU and Erdinç TÜRKCAN Netherlands Energy Research

More information

Real-Time Credit-Card Fraud Detection using Artificial Neural Network Tuned by Simulated Annealing Algorithm

Real-Time Credit-Card Fraud Detection using Artificial Neural Network Tuned by Simulated Annealing Algorithm Proc. of Int. Conf. on Recent Trends in Information, Telecommunication and Computing, ITC Real-Time Credit-Card Fraud Detection using Artificial Neural Network Tuned by Simulated Annealing Algorithm Azeem

More information

Neural Networks algorithms and applications

Neural Networks algorithms and applications Neural Networks algorithms and applications By Fiona Nielsen 4i 12/12-2001 Supervisor: Geert Rasmussen Niels Brock Business College 1 Introduction Neural Networks is a field of Artificial Intelligence

More information

Impact of Feature Selection on the Performance of Wireless Intrusion Detection Systems

Impact of Feature Selection on the Performance of Wireless Intrusion Detection Systems 2009 International Conference on Computer Engineering and Applications IPCSIT vol.2 (2011) (2011) IACSIT Press, Singapore Impact of Feature Selection on the Performance of ireless Intrusion Detection Systems

More information

Neural Network Applications in Stock Market Predictions - A Methodology Analysis

Neural Network Applications in Stock Market Predictions - A Methodology Analysis Neural Network Applications in Stock Market Predictions - A Methodology Analysis Marijana Zekic, MS University of Josip Juraj Strossmayer in Osijek Faculty of Economics Osijek Gajev trg 7, 31000 Osijek

More information

Artificial neural networks

Artificial neural networks Artificial neural networks Now Neurons Neuron models Perceptron learning Multi-layer perceptrons Backpropagation 2 It all starts with a neuron 3 Some facts about human brain ~ 86 billion neurons ~ 10 15

More information

Use of Artificial Neural Network in Data Mining For Weather Forecasting

Use of Artificial Neural Network in Data Mining For Weather Forecasting Use of Artificial Neural Network in Data Mining For Weather Forecasting Gaurav J. Sawale #, Dr. Sunil R. Gupta * # Department Computer Science & Engineering, P.R.M.I.T& R, Badnera. 1 gaurav.sawale@yahoo.co.in

More information

SUCCESSFUL PREDICTION OF HORSE RACING RESULTS USING A NEURAL NETWORK

SUCCESSFUL PREDICTION OF HORSE RACING RESULTS USING A NEURAL NETWORK SUCCESSFUL PREDICTION OF HORSE RACING RESULTS USING A NEURAL NETWORK N M Allinson and D Merritt 1 Introduction This contribution has two main sections. The first discusses some aspects of multilayer perceptrons,

More information

Data Mining. Supervised Methods. Ciro Donalek donalek@astro.caltech.edu. Ay/Bi 199ab: Methods of Computa@onal Sciences hcp://esci101.blogspot.

Data Mining. Supervised Methods. Ciro Donalek donalek@astro.caltech.edu. Ay/Bi 199ab: Methods of Computa@onal Sciences hcp://esci101.blogspot. Data Mining Supervised Methods Ciro Donalek donalek@astro.caltech.edu Supervised Methods Summary Ar@ficial Neural Networks Mul@layer Perceptron Support Vector Machines SoLwares Supervised Models: Supervised

More information

Learning to Process Natural Language in Big Data Environment

Learning to Process Natural Language in Big Data Environment CCF ADL 2015 Nanchang Oct 11, 2015 Learning to Process Natural Language in Big Data Environment Hang Li Noah s Ark Lab Huawei Technologies Part 1: Deep Learning - Present and Future Talk Outline Overview

More information

SURVIVABILITY ANALYSIS OF PEDIATRIC LEUKAEMIC PATIENTS USING NEURAL NETWORK APPROACH

SURVIVABILITY ANALYSIS OF PEDIATRIC LEUKAEMIC PATIENTS USING NEURAL NETWORK APPROACH 330 SURVIVABILITY ANALYSIS OF PEDIATRIC LEUKAEMIC PATIENTS USING NEURAL NETWORK APPROACH T. M. D.Saumya 1, T. Rupasinghe 2 and P. Abeysinghe 3 1 Department of Industrial Management, University of Kelaniya,

More information

MANAGING QUEUE STABILITY USING ART2 IN ACTIVE QUEUE MANAGEMENT FOR CONGESTION CONTROL

MANAGING QUEUE STABILITY USING ART2 IN ACTIVE QUEUE MANAGEMENT FOR CONGESTION CONTROL MANAGING QUEUE STABILITY USING ART2 IN ACTIVE QUEUE MANAGEMENT FOR CONGESTION CONTROL G. Maria Priscilla 1 and C. P. Sumathi 2 1 S.N.R. Sons College (Autonomous), Coimbatore, India 2 SDNB Vaishnav College

More information

Tennis Winner Prediction based on Time-Series History with Neural Modeling

Tennis Winner Prediction based on Time-Series History with Neural Modeling Tennis Winner Prediction based on Time-Series History with Neural Modeling Amornchai Somboonphokkaphan, Suphakant Phimoltares, and Chidchanok Lursinsap Abstract Tennis is one of the most popular sports

More information

A New Approach For Estimating Software Effort Using RBFN Network

A New Approach For Estimating Software Effort Using RBFN Network IJCSNS International Journal of Computer Science and Network Security, VOL.8 No.7, July 008 37 A New Approach For Estimating Software Using RBFN Network Ch. Satyananda Reddy, P. Sankara Rao, KVSVN Raju,

More information

Neural Networks for Data Mining

Neural Networks for Data Mining ONLINE CHAPTER 6 Neural Networks for Data Mining Learning Objectives Understand the concept and different types of artificial neural networks (ANN) Learn the advantages and limitations of ANN Understand

More information

OPTIMUM LEARNING RATE FOR CLASSIFICATION PROBLEM

OPTIMUM LEARNING RATE FOR CLASSIFICATION PROBLEM OPTIMUM LEARNING RATE FOR CLASSIFICATION PROBLEM WITH MLP IN DATA MINING Lalitha Saroja Thota 1 and Suresh Babu Changalasetty 2 1 Department of Computer Science, King Khalid University, Abha, KSA 2 Department

More information

THREE DIMENSIONAL REPRESENTATION OF AMINO ACID CHARAC- TERISTICS

THREE DIMENSIONAL REPRESENTATION OF AMINO ACID CHARAC- TERISTICS THREE DIMENSIONAL REPRESENTATION OF AMINO ACID CHARAC- TERISTICS O.U. Sezerman 1, R. Islamaj 2, E. Alpaydin 2 1 Laborotory of Computational Biology, Sabancı University, Istanbul, Turkey. 2 Computer Engineering

More information

AUTOMATION OF ENERGY DEMAND FORECASTING. Sanzad Siddique, B.S.

AUTOMATION OF ENERGY DEMAND FORECASTING. Sanzad Siddique, B.S. AUTOMATION OF ENERGY DEMAND FORECASTING by Sanzad Siddique, B.S. A Thesis submitted to the Faculty of the Graduate School, Marquette University, in Partial Fulfillment of the Requirements for the Degree

More information

NTC Project: S01-PH10 (formerly I01-P10) 1 Forecasting Women s Apparel Sales Using Mathematical Modeling

NTC Project: S01-PH10 (formerly I01-P10) 1 Forecasting Women s Apparel Sales Using Mathematical Modeling 1 Forecasting Women s Apparel Sales Using Mathematical Modeling Celia Frank* 1, Balaji Vemulapalli 1, Les M. Sztandera 2, Amar Raheja 3 1 School of Textiles and Materials Technology 2 Computer Information

More information

Time Series Data Mining in Rainfall Forecasting Using Artificial Neural Network

Time Series Data Mining in Rainfall Forecasting Using Artificial Neural Network Time Series Data Mining in Rainfall Forecasting Using Artificial Neural Network Prince Gupta 1, Satanand Mishra 2, S.K.Pandey 3 1,3 VNS Group, RGPV, Bhopal, 2 CSIR-AMPRI, BHOPAL prince2010.gupta@gmail.com

More information

Application of Neural Networks to Character Recognition

Application of Neural Networks to Character Recognition Proceedings of Students/Faculty Research Day, CSIS, Pace University, May 4th, 2007 Application of Neural Networks to Character Recognition Abstract Dong Xiao Ni Seidenberg School of CSIS, Pace University,

More information

Follow links Class Use and other Permissions. For more information, send email to: permissions@pupress.princeton.edu

Follow links Class Use and other Permissions. For more information, send email to: permissions@pupress.princeton.edu COPYRIGHT NOTICE: David A. Kendrick, P. Ruben Mercado, and Hans M. Amman: Computational Economics is published by Princeton University Press and copyrighted, 2006, by Princeton University Press. All rights

More information

Machine Learning: Multi Layer Perceptrons

Machine Learning: Multi Layer Perceptrons Machine Learning: Multi Layer Perceptrons Prof. Dr. Martin Riedmiller Albert-Ludwigs-University Freiburg AG Maschinelles Lernen Machine Learning: Multi Layer Perceptrons p.1/61 Outline multi layer perceptrons

More information

Utilization of Neural Network for Disease Forecasting

Utilization of Neural Network for Disease Forecasting Utilization of Neural Network for Disease Forecasting Oyas Wahyunggoro 1, Adhistya Erna Permanasari 1, and Ahmad Chamsudin 1,2 1 Department of Electrical Engineering and Information Technology, Gadjah

More information

UNIVERSITY OF BOLTON SCHOOL OF ENGINEERING MS SYSTEMS ENGINEERING AND ENGINEERING MANAGEMENT SEMESTER 1 EXAMINATION 2015/2016 INTELLIGENT SYSTEMS

UNIVERSITY OF BOLTON SCHOOL OF ENGINEERING MS SYSTEMS ENGINEERING AND ENGINEERING MANAGEMENT SEMESTER 1 EXAMINATION 2015/2016 INTELLIGENT SYSTEMS TW72 UNIVERSITY OF BOLTON SCHOOL OF ENGINEERING MS SYSTEMS ENGINEERING AND ENGINEERING MANAGEMENT SEMESTER 1 EXAMINATION 2015/2016 INTELLIGENT SYSTEMS MODULE NO: EEM7010 Date: Monday 11 th January 2016

More information

Using artificial intelligence for data reduction in mechanical engineering

Using artificial intelligence for data reduction in mechanical engineering Using artificial intelligence for data reduction in mechanical engineering L. Mdlazi 1, C.J. Stander 1, P.S. Heyns 1, T. Marwala 2 1 Dynamic Systems Group Department of Mechanical and Aeronautical Engineering,

More information

Forecasting of Indian Rupee (INR) / US Dollar (USD) Currency Exchange Rate Using Artificial Neural Network

Forecasting of Indian Rupee (INR) / US Dollar (USD) Currency Exchange Rate Using Artificial Neural Network Forecasting of Indian Rupee (INR) / US Dollar (USD) Currency Exchange Rate Using Artificial Neural Network Yusuf Perwej 1 and Asif Perwej 2 1 M.Tech, MCA, Department of Computer Science & Information System,

More information

Neural Networks in Quantitative Finance

Neural Networks in Quantitative Finance Neural Networks in Quantitative Finance Master Thesis submitted to Prof. Dr. Wolfgang Härdle Institute for Statistics and Econometrics CASE - Center for Applied Statistics and Economics Humboldt-Universität

More information

Self Organizing Maps: Fundamentals

Self Organizing Maps: Fundamentals Self Organizing Maps: Fundamentals Introduction to Neural Networks : Lecture 16 John A. Bullinaria, 2004 1. What is a Self Organizing Map? 2. Topographic Maps 3. Setting up a Self Organizing Map 4. Kohonen

More information

Design call center management system of e-commerce based on BP neural network and multifractal

Design call center management system of e-commerce based on BP neural network and multifractal Available online www.jocpr.com Journal of Chemical and Pharmaceutical Research, 2014, 6(6):951-956 Research Article ISSN : 0975-7384 CODEN(USA) : JCPRC5 Design call center management system of e-commerce

More information

degrees of freedom and are able to adapt to the task they are supposed to do [Gupta].

degrees of freedom and are able to adapt to the task they are supposed to do [Gupta]. 1.3 Neural Networks 19 Neural Networks are large structured systems of equations. These systems have many degrees of freedom and are able to adapt to the task they are supposed to do [Gupta]. Two very

More information

Advanced analytics at your hands

Advanced analytics at your hands 2.3 Advanced analytics at your hands Neural Designer is the most powerful predictive analytics software. It uses innovative neural networks techniques to provide data scientists with results in a way previously

More information

HYBRID PROBABILITY BASED ENSEMBLES FOR BANKRUPTCY PREDICTION

HYBRID PROBABILITY BASED ENSEMBLES FOR BANKRUPTCY PREDICTION HYBRID PROBABILITY BASED ENSEMBLES FOR BANKRUPTCY PREDICTION Chihli Hung 1, Jing Hong Chen 2, Stefan Wermter 3, 1,2 Department of Management Information Systems, Chung Yuan Christian University, Taiwan

More information

Artificial Neural Networks are bio-inspired mechanisms for intelligent decision support. Artificial Neural Networks. Research Article 2014

Artificial Neural Networks are bio-inspired mechanisms for intelligent decision support. Artificial Neural Networks. Research Article 2014 An Experiment to Signify Fuzzy Logic as an Effective User Interface Tool for Artificial Neural Network Nisha Macwan *, Priti Srinivas Sajja G.H. Patel Department of Computer Science India Abstract Artificial

More information

A Content based Spam Filtering Using Optical Back Propagation Technique

A Content based Spam Filtering Using Optical Back Propagation Technique A Content based Spam Filtering Using Optical Back Propagation Technique Sarab M. Hameed 1, Noor Alhuda J. Mohammed 2 Department of Computer Science, College of Science, University of Baghdad - Iraq ABSTRACT

More information

INTELLIGENT ENERGY MANAGEMENT OF ELECTRICAL POWER SYSTEMS WITH DISTRIBUTED FEEDING ON THE BASIS OF FORECASTS OF DEMAND AND GENERATION Chr.

INTELLIGENT ENERGY MANAGEMENT OF ELECTRICAL POWER SYSTEMS WITH DISTRIBUTED FEEDING ON THE BASIS OF FORECASTS OF DEMAND AND GENERATION Chr. INTELLIGENT ENERGY MANAGEMENT OF ELECTRICAL POWER SYSTEMS WITH DISTRIBUTED FEEDING ON THE BASIS OF FORECASTS OF DEMAND AND GENERATION Chr. Meisenbach M. Hable G. Winkler P. Meier Technology, Laboratory

More information

Neural Network Predictor for Fraud Detection: A Study Case for the Federal Patrimony Department

Neural Network Predictor for Fraud Detection: A Study Case for the Federal Patrimony Department DOI: 10.5769/C2012010 or http://dx.doi.org/10.5769/c2012010 Neural Network Predictor for Fraud Detection: A Study Case for the Federal Patrimony Department Antonio Manuel Rubio Serrano (1,2), João Paulo

More information

Tolerance of Radial Basis Functions against Stuck-At-Faults

Tolerance of Radial Basis Functions against Stuck-At-Faults Tolerance of Radial Basis Functions against Stuck-At-Faults Ralf Eickhoff 1 and Ulrich Rückert 1 Heinz Nixdorf Institute System and Circuit Technology University of Paderborn, Germany eickhoff,rueckert@hni.upb.de

More information

TRAINING A LIMITED-INTERCONNECT, SYNTHETIC NEURAL IC

TRAINING A LIMITED-INTERCONNECT, SYNTHETIC NEURAL IC 777 TRAINING A LIMITED-INTERCONNECT, SYNTHETIC NEURAL IC M.R. Walker. S. Haghighi. A. Afghan. and L.A. Akers Center for Solid State Electronics Research Arizona State University Tempe. AZ 85287-6206 mwalker@enuxha.eas.asu.edu

More information

Impelling Heart Attack Prediction System using Data Mining and Artificial Neural Network

Impelling Heart Attack Prediction System using Data Mining and Artificial Neural Network General Article International Journal of Current Engineering and Technology E-ISSN 2277 4106, P-ISSN 2347-5161 2014 INPRESSCO, All Rights Reserved Available at http://inpressco.com/category/ijcet Impelling

More information

SEARCH AND CLASSIFICATION OF "INTERESTING" BUSINESS APPLICATIONS IN THE WORLD WIDE WEB USING A NEURAL NETWORK APPROACH

SEARCH AND CLASSIFICATION OF INTERESTING BUSINESS APPLICATIONS IN THE WORLD WIDE WEB USING A NEURAL NETWORK APPROACH SEARCH AND CLASSIFICATION OF "INTERESTING" BUSINESS APPLICATIONS IN THE WORLD WIDE WEB USING A NEURAL NETWORK APPROACH Abstract Karl Kurbel, Kirti Singh, Frank Teuteberg Europe University Viadrina Frankfurt

More information