One Solution to XOR problem using Multilayer Perceptron having Minimum Configuration


International Journal of Science and Engineering, Volume 3, Number 2 (IJSE)

One Solution to XOR problem using Multilayer Perceptron having Minimum Configuration

Vaibhav Kant Singh, Department of Computer Science and Engineering, Institute of Technology, Guru Ghasidas Vishwavidyalaya, Central University, Bilaspur (C.G.), India

Abstract -- Artificial Neural Network (ANN) is the branch of Computer Science concerned with constructing programs that are analogous to the Biological Neural Network (BNN). Various types of ANN systems are used to solve a variety of problems. Looking into the history of ANN development, we encounter the concept of linear separability. Problems that are linearly separable are solved easily by making use of the single-layer perceptron model proposed by Rosenblatt. The XOR problem, whose solution is discussed in this paper, is not linearly separable. It is a complex problem and requires a different type of ANN system for its solution. In this paper we present the architectural graph and signal-flow graph of an ANN equivalent to a minimum-configuration multilayer perceptron (MLP). We use the hyperbolic tangent function as the activation function for the hidden layer and the threshold function as the activation function for the output layer. The learning employed is error-correction learning, and the algorithm employed is the back-propagation (BPN) algorithm. One solution to the XOR problem is proposed.

Keywords -- ANN, BNN, Activation Function, Hyperbolic Tangent Function, BPN, MLP.

I. INTRODUCTION TO ARTIFICIAL NEURAL NETWORK

An Artificial Neural Network is a parallel and distributed processor, simulated on a digital computer, whose working is analogous to that of the human brain. Humans have a nervous system that performs operations in parallel after receiving inputs from the five basic sense organs.
The cells responsible for processing stimuli obtained from the environment are called nerve cells, or neurons. An ANN resembles the human brain in two respects: knowledge is acquired from the environment through an interactive learning process of weight change, and interneuron connection strengths (synaptic weights) are used to store the acquired knowledge. With every iteration of the learning process the ANN becomes more knowledgeable about the environment in which it operates. ANNs are represented using three techniques: block-diagram representation, signal-flow graph and architectural graph. The basic components of a neuron are a set of adjustable synaptic weights attached to the inputs, a bias, a summing junction, and a linear or nonlinear activation function. The three basic elements of any ANN are the neuron, the network topology and the learning algorithm. At the first level, the learning algorithms employed in ANNs are classified into three basic types: supervised learning, reinforcement learning and unsupervised learning. Under supervised learning come error-correction and stochastic learning; error-correction learning is further classified into LMS and BPN. Unsupervised learning, on the other hand, is classified into Hebbian and competitive learning. Some neural-network systems include SOFM (Self-Organizing Feature Map), Perceptron, MLP, Neocognitron, ADALINE (Adaptive Linear Neural Element), MADALINE (Multiple ADALINE), LVQ (Learning Vector Quantization), AM (Associative Memory), BAM (Bidirectional Associative Memory), Boltzmann machine, BSB (Brain-State-in-a-Box), Cauchy machine, Hopfield network, ART (Adaptive Resonance Theory), RBF (Radial Basis Function) networks, RNN (Recurrent Neural Network), etc.

II. SOLUTION OF AND, OR, NAND AND NOR GATES USING MCCULLOCH AND PITTS MODEL

(1) AND GATE: A.B is the logic for the AND gate, where A and B are input values.
Figure 1. Block diagram representing a model of a single neuron in an ANN, displaying the solution for the AND, OR, NOR and NAND gates.

vk = (Σ xi.wi) + bk = uk + bk, since uk = Σ xi.wi

From the neural-network framework we see that four parameters are required for generating the output yk: A, B, W1 and W2. The truth table for the AND, OR, NAND and NOR gates is given below:

Table 1. Truth table of the AND, OR, NAND and NOR gates
A B | AND = A.B | OR = A+B | NAND = (A.B)' | NOR = (A+B)'
0 0 |    0      |    0     |      1        |      1
0 1 |    0      |    1     |      1        |      0
1 0 |    0      |    1     |      1        |      0
1 1 |    1      |    1     |      0        |      0

In this case we use the threshold function as the activation function, defined as: Ψ(vk) = 1 if vk ≥ 2, and 0 if vk < 2, where 2 is the threshold value.

Table 2. Derivation of the solution of the AND gate using a single neuron in an ANN
We consider the four input patterns, taking W1 = W2 = 1 and bk = 0:
a) When A=0, B=0: vk = Σ xi.wi = A.W1 + B.W2 = 0×1 + 0×1 = 0. Since 0 < 2, Ψ(vk) = yk = 0 when A=0 and B=0.
b) When A=0, B=1: vk = 0×1 + 1×1 = 1. Since 1 < 2, Ψ(vk) = yk = 0 when A=0 and B=1.
c) When A=1, B=0: vk = 1×1 + 0×1 = 1. Since 1 < 2, Ψ(vk) = yk = 0 when A=1 and B=0.
d) When A=1, B=1: vk = 1×1 + 1×1 = 2. Since 2 ≥ 2, Ψ(vk) = yk = 1 when A=1 and B=1.

From Table 2 we conclude that with W1 = W2 = 1 and the threshold set to 2, we obtain an ANN equivalent to the AND gate.

(2) OR GATE: A+B is the logic for the OR gate, where A and B are input values.
vk = (Σ xi.wi) + bk = uk + bk, since uk = Σ xi.wi
In this case we use the threshold function as the activation function. The definition of the threshold function is:
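The single-neuron computation of Table 2 can be sketched in Python. This is an illustration, not code from the paper; the function name `threshold_neuron` is ours.

```python
def threshold_neuron(inputs, weights, theta, bias=0.0):
    """McCulloch-Pitts style neuron: fires (outputs 1) when the induced
    local field v_k = sum(x_i * w_i) + b_k reaches the threshold theta."""
    v = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if v >= theta else 0

# AND gate: W1 = W2 = 1, bias = 0, threshold = 2, as in Table 2.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, threshold_neuron((a, b), (1, 1), theta=2))
```

Lowering the threshold to 1 with the same weights yields the OR gate of the next subsection.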
Ψ(vk) = 1 if vk ≥ 1, and 0 if vk < 1, where 1 is the threshold value.

Table 3. Derivation of the solution of the OR gate using a single neuron in an ANN
We consider the four input patterns, taking W1 = W2 = 1 and bk = 0:
a) When A=0, B=0: vk = 0×1 + 0×1 = 0. Since 0 < 1, Ψ(vk) = yk = 0 when A=0 and B=0.
b) When A=0, B=1: vk = 0×1 + 1×1 = 1. Since 1 ≥ 1, Ψ(vk) = yk = 1 when A=0 and B=1.
c) When A=1, B=0: vk = 1×1 + 0×1 = 1. Since 1 ≥ 1, Ψ(vk) = yk = 1 when A=1 and B=0.
d) When A=1, B=1: vk = 1×1 + 1×1 = 2. Since 2 > 1, Ψ(vk) = yk = 1 when A=1 and B=1.

From Table 3 we conclude that with W1 = W2 = 1 and the threshold set to 1, we obtain an ANN equivalent to the OR gate.

(3) NAND GATE: (A.B)' is the logic for the NAND gate, where A and B are input values.
vk = (Σ xi.wi) + bk = uk + bk, since uk = Σ xi.wi
In this case we use the threshold function as the activation function.
The definition of the threshold function here is inverted: Ψ(vk) = 1 if vk ≤ 1, and 0 if vk > 1, where 1 is the threshold value.

Table 4. Derivation of the solution of the NAND gate using a single neuron in an ANN
We consider the four input patterns, taking W1 = W2 = 1 and bk = 0:
a) When A=0, B=0: vk = 0×1 + 0×1 = 0. Since 0 ≤ 1, Ψ(vk) = yk = 1 when A=0 and B=0.
b) When A=0, B=1: vk = 0×1 + 1×1 = 1. Since 1 ≤ 1, Ψ(vk) = yk = 1 when A=0 and B=1.
c) When A=1, B=0: vk = 1×1 + 0×1 = 1. Since 1 ≤ 1, Ψ(vk) = yk = 1 when A=1 and B=0.
d) When A=1, B=1: vk = 1×1 + 1×1 = 2. Since 2 > 1, Ψ(vk) = yk = 0 when A=1 and B=1.

From Table 4 we conclude that with W1 = W2 = 1 and a threshold of 1 (firing when vk ≤ 1), we obtain an ANN equivalent to the NAND gate.

(4) NOR GATE: (A+B)' is the logic for the NOR gate, where A and B are input values.
vk = (Σ xi.wi) + bk = uk + bk, since uk = Σ xi.wi
In this case we use the threshold function as the activation function. The definition of the threshold function is:
Ψ(vk) = 1 if vk ≤ 0, and 0 if vk > 0, where 0 is the threshold value.

Table 5. Derivation of the solution of the NOR gate using a single neuron in an ANN
We consider the four input patterns, taking W1 = W2 = 1 and bk = 0:
a) When A=0, B=0: vk = 0×1 + 0×1 = 0. Since 0 ≤ 0, Ψ(vk) = yk = 1 when A=0 and B=0.
b) When A=0, B=1: vk = 0×1 + 1×1 = 1. Since 1 > 0, Ψ(vk) = yk = 0 when A=0 and B=1.
c) When A=1, B=0: vk = 1×1 + 0×1 = 1. Since 1 > 0, Ψ(vk) = yk = 0 when A=1 and B=0.
d) When A=1, B=1: vk = 1×1 + 1×1 = 2. Since 2 > 0, Ψ(vk) = yk = 0 when A=1 and B=1.

From Table 5 we conclude that with W1 = W2 = 1 and a threshold of 0 (firing when vk ≤ 0), we obtain an ANN equivalent to the NOR gate.

III. PROBLEM STATEMENT

The solutions proposed in the preceding section all portray a domain exhibiting a common characteristic: linear separability. Two sets of points A and B in an n-dimensional space are called linearly separable if n+1 real numbers w1, w2, ..., w(n+1) exist such that every point (x1, x2, ..., xn) in A satisfies Σ wi.xi ≥ w(n+1), and every point (x1, x2, ..., xn) in B satisfies Σ wi.xi < w(n+1). Rosenblatt in 1958 proposed the perceptron model for solving linearly separable problems, using a supervised learning algorithm named the perceptron convergence algorithm. Since XOR is not linearly separable, it requires a special proposal for its solution: the outputs that XOR produces cannot be classified from the inputs using one line in two dimensions.
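The non-separability claim can be checked by brute force. The sketch below (the search grid and helper names are our own assumptions, not from the paper) finds a single threshold neuron for OR but fails for XOR, for every weight/threshold combination on the grid:

```python
import itertools

def fires(x, w, theta):
    """Single threshold neuron: 1 when the induced local field reaches theta."""
    return 1 if sum(xi * wi for xi, wi in zip(x, w)) >= theta else 0

def realizable(target):
    """Search a small grid of weights and thresholds (-3.0 to 3.0, step 0.5)
    for a single neuron reproducing the target truth table."""
    grid = [n / 2 for n in range(-6, 7)]
    for w1, w2, theta in itertools.product(grid, repeat=3):
        if all(fires((a, b), (w1, w2), theta) == target[(a, b)]
               for a in (0, 1) for b in (0, 1)):
            return True
    return False

OR_GATE = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}
XOR_GATE = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
print(realizable(OR_GATE))   # a separating line exists for OR
print(realizable(XOR_GATE))  # none exists for XOR
```

The XOR failure is not an artifact of the grid: the four constraints (theta > 0, w1 ≥ theta, w2 ≥ theta, w1 + w2 < theta) are jointly contradictory for any real weights.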
Table 6. Representation of the inputs in two dimensions, separated into two classes on the basis of the output they produce. a) Graph representing linear separability in the AND and NAND gates, where the inputs can be classified into two classes. b) Graph representing linear separability in the OR and NOR gates, where the inputs can be classified into two classes.

IV. LITERATURE SURVEY

In [1] Abu-Mostafa and St. Jacques formalized the information capacity of a general form of memory, estimating the number of bits of information that can be stored in the Hopfield model of associative memory. In [2] Amari proposed an advanced theory of learning and self-organization, covering back-propagation and its generalization, as well as the formation of topological maps and neural representations of information. In [3] Akaike reviewed the classical maximum-likelihood estimation procedure and introduced a new estimate, the minimum information-theoretic criterion (AIC) estimate (MAICE), designed for the purpose of statistical identification. In [4] Atiya and
Abu-Mostafa developed a method for the storage of analog vectors, i.e. vectors whose components are real-valued, for the Hopfield continuous-time network. In [5] Barron established approximation properties of a class of ANNs, showing that feedforward networks with one layer of sigmoidal nonlinearities achieve an integrated squared error of order O(1/n), where n is the number of nodes. In [6] Bruck showed that the convergence properties of the Hopfield model depend on the structure of the interconnection matrix W and the method by which the nodes are updated. In [7] Freeman's aim is to exemplify the two modes of information described in the paper. In [8] the authors proposed a theoretical framework for back-propagation (BP) in order to identify some of its limitations as a general learning procedure and the reasons for its success in several experiments on pattern recognition. In [9] Giles et al. proved that one method, recurrent cascade correlation, has fundamental limitations in representation, and therein in its learning capabilities, due to its topology. In [10] Cardoso and Laheld introduced a class of adaptive algorithms for source separation that implements an adaptive version of equivariant estimation and is henceforth called EASI. In [11] Feldkamp and Puskorius presented a coherent neural-net-based framework for solving various signal-processing problems.

V. MULTILAYER PERCEPTRON

A multilayer perceptron, as the name implies, consists of multiple layers of neurons. Generally there are three distinguishing features of a multilayer perceptron:

A. The neurons present in the network are generally nonlinear, i.e. the activation function used at each neuron is generally a nonlinear sigmoid function: either the logistic function or the hyperbolic tangent function.

B.
The neurons present in the network offer a high degree of connectivity; generally they are fully connected. Input nodes may connect directly to the output node, the first hidden layer may have connections to the third hidden layer, and several variations of this sort may exist between the neurons of the MLP.

C. In an MLP the hidden neurons are meant to capture higher-order statistics. One may either increase the number of hidden neurons in a layer or increase the number of hidden layers to transform the problem into a simpler form.

The learning algorithm used for the multilayer perceptron is the back-propagation (BPN) algorithm. BPN is a supervised learning algorithm employing error-correction learning, and it comprises two passes: a forward pass and a backward pass. In the forward pass, the actual output is generated for the current set of inputs. Because the learning is supervised error-correction learning, the actual output is then compared with the desired output. If an error is detected in the forward pass, it invokes a control mechanism that propagates weight changes through the network in the backward direction. The procedure continues until the system, i.e. the ANN, learns all the patterns applicable to the domain. BPN training requires the following steps:

STEP 1: Select the next training pair from the training set; apply the input vector to the network input.
STEP 2: Calculate the output of the network.
STEP 3: Calculate the error between the network output and the target vector from the training pair.
STEP 4: Adjust the weights of the network in a way that minimizes the error.
STEP 5: Repeat Steps 1 through 4 for each vector in the training set until the error for the entire set is acceptably low.
The correction ΔWji(n) applied to the weight Wji(n) is defined by the delta rule:

ΔWji(n) = −η ∂ξ(n)/∂Wji(n)

Here η is the learning-rate constant and ξ(n) is the cost function, i.e. the instantaneous value of the error energy.
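The five training steps and the delta rule can be sketched as a loop. This is a hedged illustration, not the paper's program: we assume a slightly larger hidden layer than the minimum two (for reliable convergence), and a differentiable sigmoid output unit in place of the hard threshold so that the gradient in the delta rule exists; names such as `forward` are ours.

```python
import math
import random

random.seed(1)
ETA = 0.5  # the learning-rate constant eta from the delta rule
N_HIDDEN = 4

# 2-input network: tanh hidden units (each row: [w1, w2, bias]), sigmoid output.
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(N_HIDDEN)]
w_o = [random.uniform(-1, 1) for _ in range(N_HIDDEN + 1)]  # last entry: bias

def forward(x1, x2):
    z = [math.tanh(w[0] * x1 + w[1] * x2 + w[2]) for w in w_h]
    v = sum(wj * zj for wj, zj in zip(w_o, z)) + w_o[-1]
    return z, 1.0 / (1.0 + math.exp(-v))

patterns = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def total_error():
    return sum((d - forward(x1, x2)[1]) ** 2 for (x1, x2), d in patterns)

err_before = total_error()
for epoch in range(2000):                       # STEP 5: sweep the training set
    for (x1, x2), d in patterns:                # STEP 1: next training pair
        z, y = forward(x1, x2)                  # STEP 2: forward pass
        e = d - y                               # STEP 3: error vs. target
        delta_o = e * y * (1 - y)               # local gradient at the output
        for j in range(N_HIDDEN):               # STEP 4: delta-rule updates
            delta_h = delta_o * w_o[j] * (1 - z[j] ** 2)
            w_h[j][0] += ETA * delta_h * x1
            w_h[j][1] += ETA * delta_h * x2
            w_h[j][2] += ETA * delta_h
            w_o[j] += ETA * delta_o * z[j]
        w_o[-1] += ETA * delta_o
err_after = total_error()
print(err_before, "->", err_after)
```

Each weight change is exactly ΔW = −η ∂ξ/∂W for the squared-error cost, computed by back-propagating the output error through the tanh derivative 1 − z².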
Figure 2. Architectural graph representing an MLP having two hidden layers.

VI. MINIMUM CONFIGURATION MULTILAYER PERCEPTRON

Figure 3. Architectural graph representing an MLP having minimum configuration, i.e. a linear activation function could be used in the output layer.

Besides the standard MLP, in which every element of every layer exhibits nonlinearity by virtue of a nonlinear activation function, there is a variant with one or more hidden layers of nonlinear elements but linear elements in the output layer. In a minimum-configuration MLP, the output layer may thus contain one or more linear neurons.
VII. FIRST SOLUTION TO THE XOR PROBLEM USING MINIMUM CONFIGURATION MULTILAYER PERCEPTRON

Figure 4. Architectural graph representing the minimum-configuration MLP for the solution to the XOR problem described below.

Table 7. Truth table for the XOR gate
x1 x2 | y = x1 XOR x2
0  0  | 0
0  1  | 1
1  0  | 1
1  1  | 0

In the first solution to the XOR problem, two neurons are present in the hidden layer and the activation function used there is the hyperbolic tangent function. In the output layer the activation function used is the threshold function. The derivation of the solution is given below for the four possible inputs. Figure 5 shows the internal configuration of the MLP and is used to derive the solution.

Figure 5. Signal-flow graph representing the first solution to the XOR problem using a minimum-configuration MLP.
CASE 1: When x1 = 0 and x2 = 0, the induced local field at node 1 is v1 = 0.5 (Eqs. 1-2) and at node 2 is v2 = 0.5 (Eqs. 3-4). The activation function used for the hidden layer is the hyperbolic tangent function, defined as

tanh(x) = sinh(x)/cosh(x) = (e^x − e^(−x))/(e^x + e^(−x)) ...(5)

where x is the induced local field value, e is the base of the natural logarithm (Euler's number), and the range is [−1, +1]. From Eqs. (2) and (5), the signal at node 3 is tanh(0.5) ≈ 0.4621 ...(6). From Eqs. (4) and (5), the signal at node 4 is likewise tanh(0.5) ≈ 0.4621 ...(7). Combining Eqs. (6) and (7) through the output-layer weights (Eq. 8) gives the induced local field at node 5 ...(9). The function used in the output layer is the threshold function:

φ(v) = 1 if v ≥ θ, and 0 if v < θ ...(10)

From Eqs. (9) and (10), since the node-5 field is below the threshold, the output for the first case, x1 = 0 and x2 = 0, is y = 0 ...(11).

CASE 2: When x1 = 1 and x2 = 0, the signal at node 1 is v1 = 0.5 ...(12) and at node 2 is v2 = 1.5 ...(13). From Eqs. (12) and (5), the signal at node 3 is tanh(0.5) ≈ 0.4621 ...(14). From Eqs. (13) and (5), the signal at node 4 is tanh(1.5) ≈ 0.9051 ...(15). From Eqs. (14), (15) and (8), the induced local field at node 5 is obtained ...(16). From Eqs. (16) and (10), since the field exceeds the threshold, the output for the second case, x1 = 1 and x2 = 0, is y = 1 ...(17).

CASE 3: When x1 = 0 and x2 = 1, the signal at node 1 is v1 = 1.5 ...(18) and at node 2 is v2 = 0.5 ...(19). From Eqs. (18) and (5), the signal at node 3 is tanh(1.5) ≈ 0.9051 ...(20). From Eqs. (19) and (5), the signal at node 4 is tanh(0.5) ≈ 0.4621 ...(21). From Eqs. (20), (21) and (8), the induced local field at node 5 is obtained ...(22). From Eqs. (22) and (10), since the field exceeds the threshold, the output for the third case, x1 = 0 and x2 = 1, is y = 1 ...(23).

CASE 4: When x1 = 1 and x2 = 1, the signal at node 1 is v1 = 0.5 ...(24) and at node 2 is v2 = 0.5 ...(25). From Eqs. (24) and (5), the signal at node 3 is tanh(0.5) ≈ 0.4621 ...(26). From Eqs. (25) and (5), the signal at node 4 is tanh(0.5) ≈ 0.4621 ...(27). From Eqs. (26), (27) and (8), the induced local field at node 5 is obtained ...(28). From Eqs. (28) and (10), since the field is below the threshold, the output for x1 = 1 and x2 = 1 is y = 0 ...(29).

VIII. CONCLUSION

From Eqs. (11), (17), (23) and (29) it is concluded that the proposed solution proves that the XOR problem can be solved using a minimum-configuration multilayer perceptron. The MLP provides a very useful framework for solving problems whose domains are not linearly separable. The hyperbolic tangent function, a nonlinear function, is utilized as the activation function for squashing the induced local field (the activation value produced after summation) to produce the hidden-layer output. By the inclusion of a hidden layer it was possible to solve a problem that is complex for a single-layer network.
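The derivation can be verified numerically. The weight values below are an assumed illustrative assignment, not necessarily the paper's own (the exact weights appear only in Figure 5, which is not reproduced in this copy); they yield hidden-layer induced local fields of magnitude 0.5 and 1.5 as in Cases 1-4, and a threshold output neuron that reproduces the XOR truth table:

```python
import math

def xor_mlp(x1, x2):
    # Hidden layer: two tanh neurons (assumed weights and biases, chosen so
    # the induced local fields have magnitude 0.5 or 1.5 as in Cases 1-4).
    v1 = -1.0 * x1 + 1.0 * x2 + 0.5
    v2 = 1.0 * x1 - 1.0 * x2 + 0.5
    z1, z2 = math.tanh(v1), math.tanh(v2)
    # Output layer: one threshold neuron (assumed weights -1, -1; threshold -0.5).
    v3 = -1.0 * z1 - 1.0 * z2
    return 1 if v3 >= -0.5 else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xor_mlp(x1, x2))  # prints the XOR truth table 0,1,1,0
```

For the equal inputs, both hidden fields are 0.5, so v3 = −2·tanh(0.5) ≈ −0.924 < −0.5 and y = 0; for the mixed inputs, the fields are −0.5 and 1.5, so v3 = tanh(0.5) − tanh(1.5) ≈ −0.443 ≥ −0.5 and y = 1.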
REFERENCES
[1] Y.S. Abu-Mostafa and J.M. St. Jacques, "Information capacity of the Hopfield model," IEEE Transactions on Information Theory, vol. IT-31.
[2] S. Amari, "Mathematical foundations of neurocomputing," Proceedings of the IEEE, vol. 78.
[3] H. Akaike, "A new look at the statistical model identification," IEEE Transactions on Automatic Control, vol. AC-19.
[4] A.F. Atiya and Y.S. Abu-Mostafa, "An analog feedback associative memory," IEEE Transactions on Neural Networks, vol. 4.
[5] A.R. Barron, "Universal approximation bounds for superpositions of a sigmoidal function," IEEE Transactions on Information Theory, vol. 39.
[6] J. Bruck, "On the convergence properties of the Hopfield model," Proceedings of the IEEE, vol. 78.
[7] W.J. Freeman, "Why neural networks don't yet fly: Inquiry into the neurodynamics of biological intelligence," IEEE International Conference on Neural Networks, vol. II, San Diego, CA.
[8] M. Gori and A. Tesi, "On the problem of local minima in backpropagation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14.
[9] C.L. Giles, D. Chen, G.Z. Sun, H.H. Chen, Y.C. Lee and M.W. Goudreau, "Constructive learning of recurrent neural networks: Limitations of recurrent cascade correlation with a simple solution," IEEE Transactions on Neural Networks, vol. 6.
[10] J.F. Cardoso and B. Laheld, "Equivariant adaptive source separation," IEEE Transactions on Signal Processing, vol. 44.
[11] L.A. Feldkamp and G.V. Puskorius, "A signal processing framework based on dynamic neural networks with application to problems in adaptation, filtering and classification," Proceedings of the IEEE, vol. 86, 1998.
More informationMachine Learning and Data Mining 
Machine Learning and Data Mining  Perceptron Neural Networks Nuno Cavalheiro Marques (nmm@di.fct.unl.pt) Spring Semester 2010/2011 MSc in Computer Science Multi Layer Perceptron Neurons and the Perceptron
More informationNeural Networks. Introduction to Artificial Intelligence CSE 150 May 29, 2007
Neural Networks Introduction to Artificial Intelligence CSE 150 May 29, 2007 Administration Last programming assignment has been posted! Final Exam: Tuesday, June 12, 11:302:30 Last Lecture Naïve Bayes
More informationKeywords: Image complexity, PSNR, LevenbergMarquardt, Multilayer neural network.
Global Journal of Computer Science and Technology Volume 11 Issue 3 Version 1.0 Type: Double Blind Peer Reviewed International Research Journal Publisher: Global Journals Inc. (USA) Online ISSN: 09754172
More informationSEMINAR OUTLINE. Introduction to Data Mining Using Artificial Neural Networks. Definitions of Neural Networks. Definitions of Neural Networks
SEMINAR OUTLINE Introduction to Data Mining Using Artificial Neural Networks ISM 611 Dr. Hamid Nemati Introduction to and Characteristics of Neural Networks Comparison of Neural Networks to traditional
More informationChapter 4: Artificial Neural Networks
Chapter 4: Artificial Neural Networks CS 536: Machine Learning Littman (Wu, TA) Administration icml03: instructional Conference on Machine Learning http://www.cs.rutgers.edu/~mlittman/courses/ml03/icml03/
More informationA Simple Feature Extraction Technique of a Pattern By Hopfield Network
A Simple Feature Extraction Technique of a Pattern By Hopfield Network A.Nag!, S. Biswas *, D. Sarkar *, P.P. Sarkar *, B. Gupta **! Academy of Technology, Hoogly  722 *USIC, University of Kalyani, Kalyani
More informationNEURAL NETWORKS IN DATA MINING
NEURAL NETWORKS IN DATA MINING 1 DR. YASHPAL SINGH, 2 ALOK SINGH CHAUHAN 1 Reader, Bundelkhand Institute of Engineering & Technology, Jhansi, India 2 Lecturer, United Institute of Management, Allahabad,
More informationUsing Neural Networks for Pattern Classification Problems
Using Neural Networks for Pattern Classification Problems Converting an Image Camera captures an image Image needs to be converted to a form that can be processed by the Neural Network Converting an Image
More informationComparison of Supervised and Unsupervised Learning Algorithms for Pattern Classification
Comparison of Supervised and Unsupervised Learning Algorithms for Pattern Classification R. Sathya Professor, Dept. of MCA, Jyoti Nivas College (Autonomous), Professor and Head, Dept. of Mathematics, Bangalore,
More informationBack Propagation Neural Network for Wireless Networking
International Journal of Computer Sciences and Engineering Open Access Review Paper Volume4, Issue4 EISSN: 23472693 Back Propagation Neural Network for Wireless Networking Menal Dahiya Maharaja Surajmal
More informationNeural Networks in Data Mining
IOSR Journal of Engineering (IOSRJEN) ISSN (e): 22503021, ISSN (p): 22788719 Vol. 04, Issue 03 (March. 2014), V6 PP 0106 www.iosrjen.org Neural Networks in Data Mining Ripundeep Singh Gill, Ashima Department
More informationElectroencephalography Analysis Using Neural Network and Support Vector Machine during Sleep
Engineering, 23, 5, 8892 doi:.4236/eng.23.55b8 Published Online May 23 (http://www.scirp.org/journal/eng) Electroencephalography Analysis Using Neural Network and Support Vector Machine during Sleep JeeEun
More informationFORECASTING THE JORDANIAN STOCK PRICES USING ARTIFICIAL NEURAL NETWORK
1 FORECASTING THE JORDANIAN STOCK PRICES USING ARTIFICIAL NEURAL NETWORK AYMAN A. ABU HAMMAD Civil Engineering Department Applied Science University P. O. Box: 926296, Amman 11931 Jordan SOUMA M. ALHAJ
More informationNeural Networks and Back Propagation Algorithm
Neural Networks and Back Propagation Algorithm Mirza Cilimkovic Institute of Technology Blanchardstown Blanchardstown Road North Dublin 15 Ireland mirzac@gmail.com Abstract Neural Networks (NN) are important
More informationAmerican International Journal of Research in Science, Technology, Engineering & Mathematics
American International Journal of Research in Science, Technology, Engineering & Mathematics Available online at http://www.iasir.net ISSN (Print): 2328349, ISSN (Online): 23283580, ISSN (CDROM): 23283629
More informationSolving Nonlinear Equations Using Recurrent Neural Networks
Solving Nonlinear Equations Using Recurrent Neural Networks Karl Mathia and Richard Saeks, Ph.D. Accurate Automation Corporation 71 Shallowford Road Chattanooga, Tennessee 37421 Abstract A class of recurrent
More informationIntroduction to Neural Networks : Revision Lectures
Introduction to Neural Networks : Revision Lectures John A. Bullinaria, 2004 1. Module Aims and Learning Outcomes 2. Biological and Artificial Neural Networks 3. Training Methods for Multi Layer Perceptrons
More informationNovelty Detection in image recognition using IRF Neural Networks properties
Novelty Detection in image recognition using IRF Neural Networks properties Philippe Smagghe, JeanLuc Buessler, JeanPhilippe Urban Université de HauteAlsace MIPS 4, rue des Frères Lumière, 68093 Mulhouse,
More information3 An Illustrative Example
Objectives An Illustrative Example Objectives  Theory and Examples 2 Problem Statement 2 Perceptron  TwoInput Case 4 Pattern Recognition Example 5 Hamming Network 8 Feedforward Layer 8 Recurrent
More informationInternational Journal of Electronics and Computer Science Engineering 1449
International Journal of Electronics and Computer Science Engineering 1449 Available Online at www.ijecse.org ISSN 22771956 Neural Networks in Data Mining Priyanka Gaur Department of Information and
More informationBiological Neurons and Neural Networks, Artificial Neurons
Biological Neurons and Neural Networks, Artificial Neurons Neural Computation : Lecture 2 John A. Bullinaria, 2015 1. Organization of the Nervous System and Brain 2. Brains versus Computers: Some Numbers
More informationComparison of Kmeans and Backpropagation Data Mining Algorithms
Comparison of Kmeans and Backpropagation Data Mining Algorithms Nitu Mathuriya, Dr. Ashish Bansal Abstract Data mining has got more and more mature as a field of basic research in computer science and
More informationAnalecta Vol. 8, No. 2 ISSN 20647964
EXPERIMENTAL APPLICATIONS OF ARTIFICIAL NEURAL NETWORKS IN ENGINEERING PROCESSING SYSTEM S. Dadvandipour Institute of Information Engineering, University of Miskolc, Egyetemváros, 3515, Miskolc, Hungary,
More informationSELECTING NEURAL NETWORK ARCHITECTURE FOR INVESTMENT PROFITABILITY PREDICTIONS
UDC: 004.8 Original scientific paper SELECTING NEURAL NETWORK ARCHITECTURE FOR INVESTMENT PROFITABILITY PREDICTIONS Tonimir Kišasondi, Alen Lovren i University of Zagreb, Faculty of Organization and Informatics,
More informationPower Prediction Analysis using Artificial Neural Network in MS Excel
Power Prediction Analysis using Artificial Neural Network in MS Excel NURHASHINMAH MAHAMAD, MUHAMAD KAMAL B. MOHAMMED AMIN Electronic System Engineering Department Malaysia Japan International Institute
More informationIntroduction of the Radial Basis Function (RBF) Networks
Introduction of the Radial Basis Function (RBF) Networks Adrian G. Bors adrian.bors@cs.york.ac.uk Department of Computer Science University of York York, YO10 5DD, UK Abstract In this paper we provide
More informationDynamic neural network with adaptive Gauss neuron activation function
Dynamic neural network with adaptive Gauss neuron activation function Dubravko Majetic, Danko Brezak, Branko Novakovic & Josip Kasac Abstract: An attempt has been made to establish a nonlinear dynamic
More informationSOFTWARE EFFORT ESTIMATION USING RADIAL BASIS FUNCTION NEURAL NETWORKS Ana Maria Bautista, Angel Castellanos, Tomas San Feliu
International Journal Information Theories and Applications, Vol. 21, Number 4, 2014 319 SOFTWARE EFFORT ESTIMATION USING RADIAL BASIS FUNCTION NEURAL NETWORKS Ana Maria Bautista, Angel Castellanos, Tomas
More informationNeural Machine Translation by Jointly Learning to Align and Translate
Neural Machine Translation by Jointly Learning to Align and Translate Neural Traduction Automatique par Conjointement Apprentissage Pour Aligner et Traduire Dzmitry Bahdanau KyungHyun Cho Yoshua Bengio
More informationData quality in Accounting Information Systems
Data quality in Accounting Information Systems Comparing Several Data Mining Techniques Erjon Zoto Department of Statistics and Applied Informatics Faculty of Economy, University of Tirana Tirana, Albania
More informationIntroduction to Neural Networks
Introduction to Neural Networks 2nd Year UG, MSc in Computer Science http://www.cs.bham.ac.uk/~jxb/inn.html Lecturer: Dr. John A. Bullinaria http://www.cs.bham.ac.uk/~jxb John A. Bullinaria, 2004 Module
More informationANN Based Fault Classifier and Fault Locator for Double Circuit Transmission Line
International Journal of Computer Sciences and Engineering Open Access Research Paper Volume4, Special Issue2, April 2016 EISSN: 23472693 ANN Based Fault Classifier and Fault Locator for Double Circuit
More informationIntroduction to Neural Computation. Neural Computation
Introduction to Neural Computation Level 4/M Neural Computation Level 3 Website: http://www.cs.bham.ac.uk/~jxb/inc.html Lecturer: Dr. John A. Bullinaria John A. Bullinaria, 2015 Module Administration and
More informationPreface. C++ Neural Networks and Fuzzy Logic:Preface. Table of Contents
C++ Neural Networks and Fuzzy Logic by Valluru B. Rao MTBooks, IDG Books Worldwide, Inc. ISBN: 1558515526 Pub Date: 06/01/95 Table of Contents Preface The number of models available in neural network literature
More information6. Feedforward mapping networks
6. Feedforward mapping networks Fundamentals of Computational Neuroscience, T. P. Trappenberg, 2002. Lecture Notes on Brain and Computation ByoungTak Zhang Biointelligence Laboratory School of Computer
More informationNeural Computation  Assignment
Neural Computation  Assignment Analysing a Neural Network trained by Backpropagation AA SSt t aa t i iss i t i icc aa l l AA nn aa l lyy l ss i iss i oo f vv aa r i ioo i uu ss l lee l aa r nn i inn gg
More informationLecture 8 Artificial neural networks: Unsupervised learning
Lecture 8 Artificial neural networks: Unsupervised learning Introduction Hebbian learning Generalised Hebbian learning algorithm Competitive learning Selforganising computational map: Kohonen network
More informationForecasting of Economic Quantities using Fuzzy Autoregressive Model and Fuzzy Neural Network
Forecasting of Economic Quantities using Fuzzy Autoregressive Model and Fuzzy Neural Network Dušan Marček 1 Abstract Most models for the time series of stock prices have centered on autoregressive (AR)
More informationIranian J Env Health Sci Eng, 2004, Vol.1, No.2, pp.5157. Application of Intelligent System for Water Treatment Plant Operation.
Iranian J Env Health Sci Eng, 2004, Vol.1, No.2, pp.5157 Application of Intelligent System for Water Treatment Plant Operation *A Mirsepassi Dept. of Environmental Health Engineering, School of Public
More informationChapter 7. Diagnosis and Prognosis of Breast Cancer using Histopathological Data
Chapter 7 Diagnosis and Prognosis of Breast Cancer using Histopathological Data In the previous chapter, a method for classification of mammograms using wavelet analysis and adaptive neurofuzzy inference
More informationEVALUATION OF NEURAL NETWORK BASED CLASSIFICATION SYSTEMS FOR CLINICAL CANCER DATA CLASSIFICATION
EVALUATION OF NEURAL NETWORK BASED CLASSIFICATION SYSTEMS FOR CLINICAL CANCER DATA CLASSIFICATION K. Mumtaz Vivekanandha Institute of Information and Management Studies, Tiruchengode, India S.A.Sheriff
More informationChapter 12 Discovering New Knowledge Data Mining
Chapter 12 Discovering New Knowledge Data Mining BecerraFernandez, et al.  Knowledge Management 1/e  2004 Prentice Hall Additional material 2007 Dekai Wu Chapter Objectives Introduce the student to
More informationAPPLICATION OF ARTIFICIAL NEURAL NETWORKS USING HIJRI LUNAR TRANSACTION AS EXTRACTED VARIABLES TO PREDICT STOCK TREND DIRECTION
LJMS 2008, 2 Labuan ejournal of Muamalat and Society, Vol. 2, 2008, pp. 916 Labuan ejournal of Muamalat and Society APPLICATION OF ARTIFICIAL NEURAL NETWORKS USING HIJRI LUNAR TRANSACTION AS EXTRACTED
More informationIntroduction to Neural Networks for Senior Design
Introduction to Neural Networks for Senior Design Intro1 Neural Networks: The Big Picture Artificial Intelligence Neural Networks Expert Systems Machine Learning not ruleoriented ruleoriented Intro2
More informationIdentification of NonClassical Boundary Conditions with the Aid of Artificial Neural Networks
University of Tartu Faculty of Mathematics and Computer Science Institute of Computer Science Information Technology Mairit Vikat Identification of NonClassical Boundary Conditions with the Aid of Artificial
More informationIntroduction to Artificial Neural Networks. Introduction to Artificial Neural Networks
Introduction to Artificial Neural Networks v.3 August Michel Verleysen Introduction  Introduction to Artificial Neural Networks p Why ANNs? p Biological inspiration p Some examples of problems p Historical
More informationLearning to Process Natural Language in Big Data Environment
CCF ADL 2015 Nanchang Oct 11, 2015 Learning to Process Natural Language in Big Data Environment Hang Li Noah s Ark Lab Huawei Technologies Part 1: Deep Learning  Present and Future Talk Outline Overview
More informationArtificial neural networks
Artificial neural networks Now Neurons Neuron models Perceptron learning Multilayer perceptrons Backpropagation 2 It all starts with a neuron 3 Some facts about human brain ~ 86 billion neurons ~ 10 15
More informationBank efficiency evaluation using a neural networkdea method
Iranian Journal of Mathematical Sciences and Informatics Vol. 4, No. 2 (2009), pp. 3348 Bank efficiency evaluation using a neural networkdea method G. Aslani a,s.h.momenimasuleh,a,a.malek b and F. Ghorbani
More informationNeural Networks algorithms and applications
Neural Networks algorithms and applications By Fiona Nielsen 4i 12/122001 Supervisor: Geert Rasmussen Niels Brock Business College 1 Introduction Neural Networks is a field of Artificial Intelligence
More informationComparison Between Multilayer Feedforward Neural Networks and a Radial Basis Function Network to Detect and Locate Leaks in Pipelines Transporting Gas
A publication of 1375 CHEMICAL ENGINEERINGTRANSACTIONS VOL. 32, 2013 Chief Editors:SauroPierucci, JiříJ. Klemeš Copyright 2013, AIDIC ServiziS.r.l., ISBN 9788895608235; ISSN 19749791 The Italian Association
More informationData Mining Using Neural Networks: A Guide for Statisticians
Data Mining Using Neural Networks: A Guide for Statisticians Basilio de Bragança Pereira UFRJ  Universidade Federal do Rio de Janeiro Calyampudi Radhakrishna Rao PSU  Penn State University June 2009
More informationData Mining. Supervised Methods. Ciro Donalek donalek@astro.caltech.edu. Ay/Bi 199ab: Methods of Computa@onal Sciences hcp://esci101.blogspot.
Data Mining Supervised Methods Ciro Donalek donalek@astro.caltech.edu Supervised Methods Summary Ar@ficial Neural Networks Mul@layer Perceptron Support Vector Machines SoLwares Supervised Models: Supervised
More informationRealTime CreditCard Fraud Detection using Artificial Neural Network Tuned by Simulated Annealing Algorithm
Proc. of Int. Conf. on Recent Trends in Information, Telecommunication and Computing, ITC RealTime CreditCard Fraud Detection using Artificial Neural Network Tuned by Simulated Annealing Algorithm Azeem
More informationNeural Networks. CAP5610 Machine Learning Instructor: GuoJun Qi
Neural Networks CAP5610 Machine Learning Instructor: GuoJun Qi Recap: linear classifier Logistic regression Maximizing the posterior distribution of class Y conditional on the input vector X Support vector
More informationA Prediction Model for Taiwan Tourism Industry Stock Index
A Prediction Model for Taiwan Tourism Industry Stock Index ABSTRACT HanChen Huang and FangWei Chang Yu Da University of Science and Technology, Taiwan Investors and scholars pay continuous attention
More information