Artificial Neural Computation Systems


Spring 2003, Technical University of Szczecin, Department of Electrical Engineering
Lecturer: Prof. Adam Krzyzak, PS

Lecture 5: Multilayer Perceptrons

1.1 Structure
1.2 Training
1.3 Notation
1.4 The cost function
1.5 Adapting output neurons
1.6 Adapting hidden neurons
1.7 Summary so far
1.8 Steepest descent training modes (sequential mode, batch mode)
1.9 Activation functions (logistic sigmoid, hyperbolic tangent, softmax)


1. Multilayer Perceptrons

- Workhorse of ANNs
- Universal approximator
- Successfully applied to many difficult and diverse problems, most often function approximation

1.1 Structure

An MLP consists of an input layer, one or more hidden layers, and an output layer (see Fig. 4.1). The activation functions of the hidden layer neurons are nonlinear (usually tanh/sigmoid); otherwise the total network would be equivalent to a single-layer linear network! Output layer activation functions can be linear or nonlinear, most often sigmoid, tanh, or softmax. The input signal propagates through the network in a forward direction, layer by layer.

[Figure 4.1: architectural graph of a multilayer perceptron.]

1.2 Training

Typically trained using a supervised error-correction learning algorithm with steepest descent, or one of its faster-converging friends. The trick lies in credit assignment: how to compute the effect of the weights in the hidden layers on the cost function, i.e., how to compute the gradient of the cost with respect to the weights. This is done using the so-called error back-propagation algorithm.

Two passes through the layers of an MLP network:
1. In the forward pass, the response of the network to an input vector is computed and all the synaptic weights are kept fixed.
2. During the backward pass, the error signal is propagated backward through the network, and the weights are adjusted using an error-correction rule. After adjustment, the output of the network will be closer to the desired response.

1.3 Notation

The indices i, j, and k refer to neurons in different layers, in that order from left to right. In iteration n, the n-th training vector is presented to the network. $E(n)$ refers to the instantaneous sum of squared errors, or error energy, at iteration n.

- $E_{av}$ is the average of $E(n)$ over all n.
- $e_j(n)$ is the error signal at the output of neuron j for iteration n.
- $d_j(n)$ is the desired response for neuron j.
- $y_j(n)$ is the output of neuron j for iteration n.
- $w_{ji}(n)$ is the weight connecting the output of neuron i to the input of neuron j at iteration n. The correction applied to this weight is denoted by $\Delta w_{ji}(n)$.
- $v_j(n)$ denotes the local field of neuron j at iteration n (the weighted sum of inputs plus the bias of that neuron).
- The activation function (nonlinearity) associated with neuron j is denoted by $\varphi_j(\cdot)$.

- $b_j$ denotes the bias applied to neuron j, corresponding to the weight $w_{j0} = b_j$ and a fixed input +1.
- $x_i(n)$ denotes the i-th element of the input vector.
- $\eta$ denotes the learning-rate parameter.
- $m_l$ denotes the number of neurons in layer l. The network has L layers. For the output layer, the notation $m_L = M$ is also used.

1.4 The cost function

The error signal at the output of neuron j at iteration n is defined by

$$ e_j(n) = d_j(n) - y_j(n) \tag{5.1} $$

when neuron j is an output node of the whole network. The total instantaneous error energy $E(n)$ for all the neurons in the output layer is

$$ E(n) = \frac{1}{2} \sum_{j \in C} e_j^2(n) \tag{5.2} $$

where the set C contains all the neurons in the output layer. Let N be the total number of training vectors (examples, patterns). Then the average squared error energy is

$$ E_{av} = \frac{1}{N} \sum_{n=1}^{N} E(n) \tag{5.3} $$
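To make these two quantities concrete, here is a minimal NumPy sketch (the function names are ours, chosen for illustration):

```python
import numpy as np

def instantaneous_error(d, y):
    """E(n) = 1/2 * sum of squared error signals over the output layer, Eq. (5.2)."""
    e = d - y                       # e_j(n) = d_j(n) - y_j(n), Eq. (5.1)
    return 0.5 * np.sum(e ** 2)

def average_error(D, Y):
    """E_av, the mean of E(n) over all N training vectors, Eq. (5.3);
    D and Y hold one desired/actual output vector per row."""
    return np.mean([instantaneous_error(d, y) for d, y in zip(D, Y)])
```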

The objective is to derive a learning algorithm for minimizing $E_{av}$ with respect to the free parameters. Weights are updated on a pattern-by-pattern basis during each epoch (an epoch is one complete presentation of the entire training set). In other words, each update is an instantaneous stochastic gradient step based on a single sample only. The average of these updates over one epoch estimates the gradient of $E_{av}$.

1.5 Adapting output neurons

Consider now Figure 4.3, showing neuron j. It is fed by a set of function signals produced by a layer of neurons to its left, indexed by i.

The local field $v_j(n)$ of neuron j is clearly

$$ v_j(n) = \sum_{i=0}^{m} w_{ji}(n) y_i(n) \tag{5.4} $$

The function signal $y_j(n)$ appearing at the output of neuron j at iteration n is then

$$ y_j(n) = \varphi_j(v_j(n)) \tag{5.5} $$

The correction $\Delta w_{ji}(n)$ made to the synaptic weight $w_{ji}(n)$ is proportional to the partial derivative $\partial E(n)/\partial w_{ji}(n)$ of the instantaneous error.

Using the chain rule of calculus, this gradient can be expressed as follows:

$$ \frac{\partial E(n)}{\partial w_{ji}(n)} = \frac{\partial E(n)}{\partial e_j(n)} \frac{\partial e_j(n)}{\partial y_j(n)} \frac{\partial y_j(n)}{\partial v_j(n)} \frac{\partial v_j(n)}{\partial w_{ji}(n)} \tag{5.6} $$

Differentiating both sides of Eq. (5.2) with respect to $e_j(n)$, we get

$$ \frac{\partial E(n)}{\partial e_j(n)} = e_j(n) \tag{5.7} $$

Differentiating Eq. (5.1) with respect to $y_j(n)$ yields

$$ \frac{\partial e_j(n)}{\partial y_j(n)} = -1 \tag{5.8} $$

Differentiating Eq. (5.5) with respect to $v_j(n)$, we get

$$ \frac{\partial y_j(n)}{\partial v_j(n)} = \varphi_j'(v_j(n)) \tag{5.9} $$

where $\varphi_j'$ denotes the derivative of $\varphi_j$. Finally, differentiating (5.4) with respect to $w_{ji}(n)$ yields

$$ \frac{\partial v_j(n)}{\partial w_{ji}(n)} = y_i(n) \tag{5.10} $$

Inserting these partial derivatives into (5.6) yields

$$ \frac{\partial E(n)}{\partial w_{ji}(n)} = -e_j(n)\,\varphi_j'(v_j(n))\,y_i(n) \tag{5.11} $$

If we use steepest descent to optimize the weights, the correction $\Delta w_{ji}(n)$ applied to the weight $w_{ji}(n)$ is defined by the delta rule:

$$ \Delta w_{ji}(n) = -\eta \frac{\partial E(n)}{\partial w_{ji}(n)} \tag{5.12} $$

where $\eta$ is the learning-rate parameter. Inserting (5.11) into (5.12) yields

$$ \Delta w_{ji}(n) = \eta\, \delta_j(n)\, y_i(n) \tag{5.13} $$

where the local gradient is defined by

$$ \delta_j(n) = -\frac{\partial E(n)}{\partial v_j(n)} = e_j(n)\, \varphi_j'(v_j(n)) \tag{5.14} $$
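As a sketch, Eqs. (5.13) and (5.14) for a whole output layer at once; the names and the `phi_prime` argument (the derivative of the layer's activation function) are ours, not from the lecture:

```python
import numpy as np

def output_layer_update(eta, d, y, v, y_prev, phi_prime):
    """Delta rule for output neurons, Eqs. (5.13)-(5.14):
    delta_j(n) = e_j(n) * phi'_j(v_j(n)),
    Delta w_ji(n) = eta * delta_j(n) * y_i(n)."""
    e = d - y                            # error signals, Eq. (5.1)
    delta = e * phi_prime(v)             # local gradients, shape (M,)
    dW = eta * np.outer(delta, y_prev)   # one correction per weight w_ji
    return dW, delta
```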

1.6 Adapting hidden neurons

There is no desired response available for neuron j. Question: how do we compute the responsibility of this neuron for the error made at the output?

The error signal for a hidden neuron can be determined recursively in terms of the error signals of all the neurons connected to it. This is what makes back-propagation efficient. Figure 4.4 illustrates the situation where neuron j is a hidden node. Using Eq. (5.14), we may rewrite the local gradient $\delta_j(n)$ for hidden neuron j as follows:

$$ \delta_j(n) = -\frac{\partial E(n)}{\partial y_j(n)} \frac{\partial y_j(n)}{\partial v_j(n)} = -\frac{\partial E(n)}{\partial y_j(n)}\, \varphi_j'(v_j(n)) \tag{5.15} $$

The partial derivative $\partial E(n)/\partial y_j(n)$ may be calculated as follows.

From Figure 4.4 we see that

$$ E(n) = \frac{1}{2} \sum_{k \in C} e_k^2(n), \qquad \text{neuron } k \text{ is an output node} \tag{5.16} $$

Differentiating this with respect to the function signal $y_j(n)$ and using the chain rule, we get

$$ \frac{\partial E(n)}{\partial y_j(n)} = \sum_k e_k(n) \frac{\partial e_k(n)}{\partial y_j(n)} = \sum_k e_k(n) \frac{\partial e_k(n)}{\partial v_k(n)} \frac{\partial v_k(n)}{\partial y_j(n)} \tag{5.17} $$

From Figure 4.4 we note that when neuron k is an output node,

$$ e_k(n) = d_k(n) - y_k(n) = d_k(n) - \varphi_k(v_k(n)) \tag{5.18} $$

so that

$$ \frac{\partial e_k(n)}{\partial v_k(n)} = -\varphi_k'(v_k(n)) \tag{5.19} $$

Figure 4.4 also shows that the local field of neuron k is

$$ v_k(n) = \sum_{j=0}^{m} w_{kj}(n) y_j(n) \tag{5.20} $$

where the bias term is again included as the weight $w_{k0}(n)$. Differentiating this with respect to $y_j(n)$ yields

$$ \frac{\partial v_k(n)}{\partial y_j(n)} = w_{kj}(n) \tag{5.21} $$

Inserting these expressions into (5.17), we get the desired partial derivative

$$ \frac{\partial E(n)}{\partial y_j(n)} = -\sum_k e_k(n)\, \varphi_k'(v_k(n))\, w_{kj}(n) = -\sum_k \delta_k(n)\, w_{kj}(n) \tag{5.22} $$

Here again $\delta_k(n)$ denotes the local gradient for neuron k. Finally, inserting (5.22) into (5.15) yields the back-propagation formula for the local gradient $\delta_j(n)$:

$$ \delta_j(n) = \varphi_j'(v_j(n)) \sum_k \delta_k(n)\, w_{kj}(n) \tag{5.23} $$

This holds when neuron j is hidden.
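In vector form, Eq. (5.23) is a single matrix-vector product. A minimal sketch, assuming the weight layout `W_next[k, j]` = $w_{kj}$:

```python
import numpy as np

def hidden_delta(v_hidden, W_next, delta_next, phi_prime):
    """Back-propagation formula, Eq. (5.23):
    delta_j(n) = phi'_j(v_j(n)) * sum_k delta_k(n) * w_kj(n).
    With W_next of shape (m_next, m_hidden), the product
    W_next.T @ delta_next evaluates the sum over k for every
    hidden neuron j at once."""
    return phi_prime(v_hidden) * (W_next.T @ delta_next)
```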

Let us briefly study the factors in this formula:

$$ \delta_j(n) = \varphi_j'(v_j(n)) \sum_k \delta_k(n)\, w_{kj}(n) $$

- $\varphi_j'(v_j(n))$ depends solely on the activation function $\varphi_j(\cdot)$ of the hidden neuron j.
- The local gradients $\delta_k(n)$ require knowledge of the error signals $e_k(n)$ of the neurons in the next (right-hand side) layer.
- The synaptic weights $w_{kj}(n)$ describe the connections of neuron j to the neurons in the next layer to the right.

Thus propagating the errors backwards takes only about the same amount of computation as the forward pass.

1.7 Summary so far

The correction $\Delta w_{ji}(n)$ of the weight connecting neuron i to neuron j is described by Eq. (5.13) (in Haykin, Eq. (4.25) above Figure …).

The local gradient $\delta_j(n)$ is computed from Eq. (5.14) if neuron j lies in the output layer. If neuron j lies in a hidden layer, the local gradient is computed from Eq. (5.23). So, back-propagation is a computationally efficient way of evaluating the gradient of the cost function with respect to the weights of the neurons in the hidden layers. Once we have the gradient, any optimization method can be applied; we are not limited to steepest descent only. Thus "back-propagation network" is an incorrect term! Likewise, back-propagation ≠ steepest descent, as often erroneously appears in the literature!
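Putting Eqs. (5.13), (5.14), and (5.23) together, a sequential-mode step for a one-hidden-layer MLP with logistic activations might look as follows. This is a toy sketch under our own naming, not the lecture's code; biases are omitted but could be folded in via a constant +1 input as described in Section 1.3:

```python
import numpy as np

def train_step(x, d, W1, W2, eta=0.1, a=1.0):
    """One sequential-mode correction for a one-hidden-layer MLP
    with logistic activations in both layers (a minimal sketch)."""
    def phi(v):
        return 1.0 / (1.0 + np.exp(-a * v))

    # Forward pass: weights kept fixed.
    v1 = W1 @ x                 # hidden local fields, Eq. (5.4)
    y1 = phi(v1)                # hidden outputs, Eq. (5.5)
    v2 = W2 @ y1
    y2 = phi(v2)                # network outputs

    # Backward pass: local gradients, using phi'(v) = a*y*(1 - y).
    e = d - y2                                    # Eq. (5.1)
    delta2 = a * y2 * (1 - y2) * e                # output layer, Eq. (5.14)
    delta1 = a * y1 * (1 - y1) * (W2.T @ delta2)  # hidden layer, Eq. (5.23)

    # Delta rule, Eq. (5.13): Delta w_ji = eta * delta_j * y_i.
    W2 = W2 + eta * np.outer(delta2, y1)
    W1 = W1 + eta * np.outer(delta1, x)
    return W1, W2
```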

1.8 Steepest Descent Training Modes

One complete presentation of the entire training set is called an epoch. The learning process is continued over several epochs. Learning could be stopped when the weight values and biases stabilize, and the average squared error converges to some minimum value (more later!). It is useful to present the training samples in a randomized order during each epoch. One may use either the sequential (on-line, stochastic) or the batch learning mode.

Sequential Mode

The weights are updated after presenting each training example (input vector). This is what we derived earlier!

Advantages:
- Simple to implement
- Requires less storage
- Perhaps less likely to get trapped in a local minimum

Batch Mode

The weights are updated after each epoch only.

All the training examples are presented once before updating the weights and biases. In batch mode, the cost function is the average squared error

$$ E_{av} = \frac{1}{2N} \sum_{n=1}^{N} \sum_{j \in C} e_j^2(n) \tag{5.24} $$

The synaptic weight is updated using the batch delta rule

$$ \Delta w_{ji} = -\eta \frac{\partial E_{av}}{\partial w_{ji}} = -\frac{\eta}{N} \sum_{n=1}^{N} e_j(n) \frac{\partial e_j(n)}{\partial w_{ji}} \tag{5.25} $$

The partial derivative $\partial e_j(n)/\partial w_{ji}$ may be computed as in the sequential mode.

Advantages:
- Provides an accurate estimate of the gradient vector
- Convergence, at least to a local minimum, is guaranteed
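The two modes can be contrasted in a short sketch; `grad_fn` is a hypothetical helper of ours, assumed to return the per-pattern gradient $\partial E(n)/\partial W$, e.g. computed with the back-propagation formulas above:

```python
import numpy as np

def run_epoch(W, X, D, grad_fn, eta, mode="sequential"):
    """One epoch in either training mode (a sketch)."""
    order = np.random.permutation(len(X))     # randomized presentation order
    if mode == "sequential":
        for n in order:                       # update after every pattern
            W = W - eta * grad_fn(W, X[n], D[n])
    else:                                     # batch: accumulate over the epoch,
        g = sum(grad_fn(W, X[n], D[n]) for n in order) / len(X)
        W = W - eta * g                       # then apply one update, Eq. (5.25)
    return W
```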

1.9 Activation Functions

The derivative of the activation function $\varphi(\cdot)$ is needed in computing the local gradient $\delta$. Therefore, $\varphi(\cdot)$ must be continuous and differentiable. It would be computationally advantageous if $\varphi'(v)$ is expressible as a function of $v$ and $\varphi(v)$ only. In MLP networks, two forms of sigmoidal nonlinearities are commonly used as activation functions:
1. Logistic sigmoid
2. Hyperbolic tangent

A generalization of the logistic function is:
3. Softmax

Logistic sigmoid

$$ y = \varphi(v) = \frac{1}{1 + \exp(-av)}, \qquad a > 0, \; -\infty < v < \infty $$

For clarity, we have omitted here the neuron index j and the iteration number n. The range of $\varphi(v)$, and hence the output $y = \varphi(v)$, always lies in the interval $0 \le y \le 1$. The derivative of $y = \varphi(v)$ can be expressed in terms of the output y as

$$ \varphi'(v) = a\, y\, (1 - y) $$

This formula allows writing the local gradient $\delta_j(n)$ in a simple form. If neuron j is an output node,

$$ \delta_j(n) = e_j(n)\, \varphi_j'(v_j(n)) = a\,[d_j(n) - y_j(n)]\, y_j(n)\,[1 - y_j(n)] \tag{5.26} $$

And for a hidden node:

$$ \delta_j(n) = \varphi_j'(v_j(n)) \sum_k \delta_k(n)\, w_{kj}(n) = a\, y_j(n)\,[1 - y_j(n)] \sum_k \delta_k(n)\, w_{kj}(n) \tag{5.27} $$
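A quick numerical check of the identity $\varphi'(v) = a\,y\,(1-y)$; the slope $a = 1.5$ is an arbitrary choice of ours for illustration:

```python
import numpy as np

a = 1.5                                       # any slope a > 0
phi = lambda v: 1.0 / (1.0 + np.exp(-a * v))

v = np.linspace(-4.0, 4.0, 9)
y = phi(v)

direct = a * np.exp(-a * v) / (1.0 + np.exp(-a * v)) ** 2   # phi'(v) directly
via_y = a * y * (1.0 - y)                                   # via the output only
assert np.allclose(direct, via_y)
```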

Hyperbolic tangent function

$$ y = \varphi(v) = a \tanh(bv) $$

where a and b are positive constants. In fact, the hyperbolic tangent is just the logistic function rescaled and biased. Its derivative with respect to v is

$$ \varphi'(v) = ab\,[1 - \tanh^2(bv)] = \frac{b}{a}\,[a - y][a + y] $$

Using this, the local gradients of output neurons and hidden neurons can be simplified to Eqs. (4.37) and (4.38) in Haykin.
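The same kind of check for the tanh derivative identity; the constants $a = 1.7159$ and $b = 2/3$ are values often suggested in Haykin, but any positive pair works:

```python
import numpy as np

a, b = 1.7159, 2.0 / 3.0        # example constants
phi = lambda v: a * np.tanh(b * v)

v = np.linspace(-3.0, 3.0, 7)
y = phi(v)

direct = a * b * (1.0 - np.tanh(b * v) ** 2)   # phi'(v) directly
via_y = (b / a) * (a - y) * (a + y)            # via the output only
assert np.allclose(direct, via_y)
```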


Softmax

$$ y_k = \varphi(v_k) = \frac{\exp(v_k)}{\sum_{k'} \exp(v_{k'})} $$

Used in the output layer for classification problems with 1-of-C target value coding (there are as many output neurons as there are classes to recognize; only the output target for the correct class is one, the others are zeros). Softmax provides posterior probabilities of class membership; it is a generalization of the logistic function for multiple classes. One of the output units can be left unconnected, since the other outputs sum to one.
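A small sketch of the softmax computation; the max-subtraction trick is standard numerical practice, not from the lecture, and leaves the result unchanged:

```python
import numpy as np

def softmax(v):
    """y_k = exp(v_k) / sum_k' exp(v_k'). Subtracting max(v) avoids
    overflow for large local fields without changing the result."""
    z = np.exp(v - np.max(v))
    return z / np.sum(z)

y = softmax(np.array([2.0, 1.0, 0.1]))
print(y, y.sum())    # outputs are positive and sum to one (posterior-like)
```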
