Machine Learning and Data Mining - Perceptron Neural Networks
1 Machine Learning and Data Mining - Perceptron Neural Networks Nuno Cavalheiro Marques ([email protected]) Spring Semester 2010/2011 MSc in Computer Science
2 Multi Layer Perceptron Neurons and the Perceptron Model Logic with neurons (McCulloch and Pitts, 1943). $x_0$ is usually set to 1 for all samples; in that case $w_0$ is called the neuron bias.
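As an illustration (not from the original slides), a minimal perceptron unit in Python using the bias convention above ($x_0 = 1$, $w_0$ as bias); the AND weight vector is an assumed example:

```python
import numpy as np

def perceptron_output(weights, x):
    """Single perceptron unit: weighted sum followed by a step activation.

    weights[0] plays the role of the bias w_0 (i.e. x_0 is implicitly 1).
    """
    net = weights[0] + np.dot(weights[1:], x)   # w_0 * 1 + sum_i w_i * x_i
    return 1 if net > 0 else 0                  # step (threshold) activation

# Example: an assumed weight vector that makes the unit compute logical AND.
w_and = np.array([-1.5, 1.0, 1.0])
print([perceptron_output(w_and, np.array(p)) for p in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# -> [0, 0, 0, 1]
```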
3 Multi Layer Perceptron Most Relevant Work in Neural Networks McCulloch and Pitts, 1943: introduce neural networks as computing devices in A Logical Calculus of the Ideas Immanent in Nervous Activity. Hebb, 1949: the basic model of network self-organization. Rosenblatt, 1958: learning with a teacher and the Perceptron. Minsky and Papert (1969): show that a perceptron cannot learn a simple XOR. The multilayer (feedforward) perceptron (MLP) and backpropagation of the error: Rumelhart, Hinton, and Williams (1986).
4 Learning Perceptron Weights Backpropagation Basic Algorithm Backpropagation learning in networks of perceptrons. BP (and its variants) is the best way to train MLPs. Generalized Delta Rule: $\Delta \vec{w} = -\eta \nabla E[\vec{w}]$
5 Why Several Neural Network Models? Models, Paradigms and Methods The exact description of a biological neuron is still unknown. In neurophysiology, each cell is a complex dynamical system controlled by electric fields and chemical neurotransmitters. (Mathematical) Model: a representation of an existing system which presents (some) knowledge in usable form; a finite set of variables and interactions. (E.g. Scientific) Method: a way to solve a given problem, e.g. bodies of techniques for investigating phenomena and for acquiring, correcting and integrating knowledge; the collection of data through observation and experimentation, and the formulation and testing of hypotheses.
6 Why Several Neural Network Models? Models, Paradigms and Methods for ANN Paradigm: a model or case example of a general theory; paradigms for some biological neuronal behaviour. Available models are incorrect or incomplete regarding the biological neuron. What is the goal? Understand the biological brain? Get better machine learning methods? Solve specific problems? Question for each new model: what is its added value? (Can we solve it with a previous model?) E.g. Self-Organizing Maps (SOM): an unsupervised model that can be used for clustering and visualizing data.
7 Additional Models ANN models: JASTAP, Recurrent, SOM, ART, PNN, RBF... Relevant problems for new models: How to train/learn? How to understand and represent knowledge? What is this model's contribution?
8 Additional Models Spiking models Implement more neuro-realistic models, taking time into account. The input is a time series. Some works use spike input trains.
9 Additional Models Issues on Any Neural Network How can NNs learn in more complex models? Genetic algorithms can learn weights and architecture. Generalised BP exists for spiking neurons, but the model is harder to understand. What are the advantages of more complex models? Even in simple MLPs: only sub-optimal learning (e.g. Vapnik). Simulate or understand biology? Berger: bio-realistic differential population models. Are spiking models good models of biological neurons? In SOM: hundreds of simple units, similar to biological neural maps. Hans Moravec: why use silicon to duplicate biology? Hölldobler: MLPs can implement any logical function kernel.
10 Backpropagation Basic Algorithm Backpropagation learning in networks of perceptrons. BP (and its variants) is the best way to train MLPs. Generalized Delta Rule: $\Delta \vec{w} = -\eta \nabla E[\vec{w}]$
11 Gradient Descent: concepts (from [TM97]) I Consider a simpler linear unit, where: $o = w_0 + w_1 x_1 + \dots + w_n x_n$. Let's learn the $w_i$'s that minimize the squared error of the simpler linear unit: $E[\vec{w}] \equiv \frac{1}{2} \sum_{d \in D} (t_d - o_d)^2$, where $D$ is the set of training examples.
12 Gradient Descent: concepts (from [TM97]) II Gradient: $\nabla E[\vec{w}] \equiv \left[ \frac{\partial E}{\partial w_0}, \frac{\partial E}{\partial w_1}, \dots, \frac{\partial E}{\partial w_n} \right]$ Training rule: $\Delta \vec{w} = -\eta \nabla E[\vec{w}]$, i.e., $\Delta w_i = -\eta \frac{\partial E}{\partial w_i}$
13 Gradient Descent: concepts (from [TM97]) III $\frac{\partial E}{\partial w_i} = \frac{\partial}{\partial w_i} \frac{1}{2} \sum_{d \in D} (t_d - o_d)^2 = \frac{1}{2} \sum_d \frac{\partial}{\partial w_i} (t_d - o_d)^2 = \frac{1}{2} \sum_d 2 (t_d - o_d) \frac{\partial}{\partial w_i} (t_d - o_d) = \sum_d (t_d - o_d) \frac{\partial}{\partial w_i} (t_d - \vec{w} \cdot \vec{x}_d)$, so $\frac{\partial E}{\partial w_i} = \sum_d (t_d - o_d)(-x_{i,d})$
14 Gradient Descent Algorithm for Perceptron (from [TM97]) I Gradient-Descent(training examples, η) Each training example is a pair of the form $\langle \vec{x}, t \rangle$, where $\vec{x}$ is the vector of input values and $t$ is the target output value. η is the learning rate (e.g., 0.05). Note: [TM97] is assuming the linear unit. Initialize each $w_i$ to some small random value. Until the termination condition is met, Do: Initialize each $\Delta w_i$ to zero. For each $\langle \vec{x}, t \rangle$ in training examples, Do: Input the instance $\vec{x}$ to the unit and compute the output $o$.
15 Gradient Descent Algorithm for Perceptron (from [TM97]) II For each linear unit weight $w_i$, Do: $\Delta w_i \leftarrow \Delta w_i + \eta (t - o) x_i$. For each linear unit weight $w_i$, Do: $w_i \leftarrow w_i + \Delta w_i$.
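A minimal sketch (an illustration, not the lecture's own code) of this batch gradient-descent procedure for a linear unit in Python; the toy dataset and the smaller learning rate used in the call are assumptions:

```python
import numpy as np

def gradient_descent_linear_unit(examples, eta=0.05, epochs=100):
    """Batch gradient descent for a linear unit o = w . x (after [TM97]).

    `examples` is a list of (x, t) pairs; each x already contains the
    constant component x_0 = 1 so that w_0 acts as the bias.
    """
    n = len(examples[0][0])
    w = np.random.uniform(-0.05, 0.05, size=n)   # small random initial weights
    for _ in range(epochs):
        delta_w = np.zeros(n)                    # initialize each Δw_i to zero
        for x, t in examples:
            o = np.dot(w, x)                     # linear unit output
            delta_w += eta * (t - o) * x         # accumulate Δw_i += η (t - o) x_i
        w += delta_w                             # apply the batch update
    return w

# Usage sketch: recover the weights of a noisy linear target t = 2 x_1 - 1.
rng = np.random.default_rng(0)
data = [(np.array([1.0, x1]), 2.0 * x1 - 1.0 + 0.01 * rng.standard_normal())
        for x1 in rng.uniform(-1.0, 1.0, 50)]
print(gradient_descent_linear_unit(data, eta=0.01))   # η kept small for this batch size; result ≈ [-1, 2]
```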
16 Sigmoid versus step function The step function (1 if x > 0, else 0) is not differentiable. Sigmoid functions are similar but differentiable. The function $\sigma(x) = \frac{1}{1 + e^{-x}}$ has the nice property: $\frac{\partial \sigma(x)}{\partial x} = \sigma(x)(1 - \sigma(x))$
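A small illustrative sketch (not part of the slides) of the sigmoid and of its derivative computed through the property above:

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid σ(x) = 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    """dσ/dx = σ(x) (1 - σ(x)), reusing the sigmoid value itself."""
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid(0.0), sigmoid_derivative(0.0))   # 0.5 0.25 (maximum slope at x = 0)
```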
17 Error Gradient for a Sigmoid Unit (from [TM97]) I $\frac{\partial E}{\partial w_i} = \frac{\partial}{\partial w_i} \frac{1}{2} \sum_{d \in D} (t_d - o_d)^2 = \frac{1}{2} \sum_d \frac{\partial}{\partial w_i} (t_d - o_d)^2 = \frac{1}{2} \sum_d 2 (t_d - o_d) \frac{\partial}{\partial w_i} (t_d - o_d) = \sum_d (t_d - o_d) \left( -\frac{\partial o_d}{\partial w_i} \right) = -\sum_d (t_d - o_d) \frac{\partial o_d}{\partial net_d} \frac{\partial net_d}{\partial w_i}$
18 Error Gradient for a Sigmoid Unit (from [TM97]) II But we know: $\frac{\partial o_d}{\partial net_d} = \frac{\partial \sigma(net_d)}{\partial net_d} = o_d (1 - o_d)$ and $\frac{\partial net_d}{\partial w_i} = \frac{\partial (\vec{w} \cdot \vec{x}_d)}{\partial w_i} = x_{i,d}$. So: $\frac{\partial E}{\partial w_i} = -\sum_{d \in D} (t_d - o_d)\, o_d (1 - o_d)\, x_{i,d}$
19 MLP Backpropagation Algorithm (from [TM97]) I Initialize all weights to small random numbers. Until satisfied, Do: For each training example, Do: 1. Input the training example to the network and compute the network outputs. 2. For each output unit k: $\delta_k \leftarrow o_k (1 - o_k)(t_k - o_k)$. 3. For each hidden unit h: $\delta_h \leftarrow o_h (1 - o_h) \sum_{k \in outputs} w_{h,k}\, \delta_k$.
20 MLP Backpropagation Algorithm (from [TM97]) II 4. Update each network weight $w_{i,j}$: $w_{i,j} \leftarrow w_{i,j} + \Delta w_{i,j}$, where $\Delta w_{i,j} = \eta\, \delta_j\, x_{i,j}$.
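The sketch below is an illustration of this stochastic backpropagation loop for a one-hidden-layer MLP of sigmoid units; the network shapes, the appended hidden bias unit and the XOR usage example are assumptions, not the lecture's code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_mlp(examples, n_hidden, eta=0.5, epochs=10000, seed=0):
    """Stochastic backpropagation for a one-hidden-layer MLP of sigmoid units (after [TM97]).

    `examples` is a list of (x, t) pairs of 1-D numpy arrays; inputs should
    already contain a constant 1 component to act as the input-layer bias.
    """
    rng = np.random.default_rng(seed)
    n_in, n_out = len(examples[0][0]), len(examples[0][1])
    W_ih = rng.uniform(-0.5, 0.5, (n_hidden, n_in))        # input -> hidden weights
    W_ho = rng.uniform(-0.5, 0.5, (n_out, n_hidden + 1))   # hidden (+ bias unit) -> output
    for _ in range(epochs):
        for x, t in examples:
            # 1. forward pass; a constant 1 is appended as the hidden-layer bias unit
            h = np.append(sigmoid(W_ih @ x), 1.0)
            o = sigmoid(W_ho @ h)
            # 2. output errors: δ_k = o_k (1 - o_k)(t_k - o_k)
            delta_o = o * (1 - o) * (t - o)
            # 3. hidden errors: δ_h = o_h (1 - o_h) Σ_k w_h,k δ_k  (bias unit excluded)
            back = W_ho.T @ delta_o
            delta_h = h[:-1] * (1 - h[:-1]) * back[:-1]
            # 4. weight updates: Δw_i,j = η δ_j x_i,j
            W_ho += eta * np.outer(delta_o, h)
            W_ih += eta * np.outer(delta_h, x)
    return W_ih, W_ho

# Usage sketch: XOR, the function a single perceptron cannot learn.
xor = [(np.array([1.0, a, b]), np.array([float(a != b)]))
       for a in (0.0, 1.0) for b in (0.0, 1.0)]
W_ih, W_ho = train_mlp(xor, n_hidden=4)
for x, t in xor:
    out = sigmoid(W_ho @ np.append(sigmoid(W_ih @ x), 1.0))
    print(x[1:], t, out)   # outputs should approach the 0/1 targets
```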
21 Backpropagation properties More on Backpropagation (from [TM97]) Gradient descent over the entire network weight vector. Easily generalized to directed graphs (Elman). Will find a local, not necessarily global, error minimum. In practice, often works well (can run multiple times). Often include weight momentum α: $\Delta w_{i,j}(n) = \eta\, \delta_j\, x_{i,j} + \alpha\, \Delta w_{i,j}(n-1)$. Minimizes error over training examples. Will it generalize well to subsequent examples? Training can take thousands of iterations (slow!). Using the network after training is very fast.
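An illustrative helper (hypothetical names, not from the slides) showing how the momentum term accumulates successive updates taken in the same direction:

```python
import numpy as np

def momentum_update(w, grad_step, prev_delta, alpha=0.9):
    """One weight update with momentum: Δw(n) = grad_step + α Δw(n-1).

    `grad_step` is the plain backprop step η δ_j x_i,j for this iteration;
    the returned delta must be remembered and passed back on the next call.
    """
    delta = grad_step + alpha * prev_delta
    return w + delta, delta

# Usage sketch: repeated identical gradient steps get amplified by momentum.
w, prev = np.zeros(2), np.zeros(2)
for _ in range(5):
    w, prev = momentum_update(w, np.array([0.1, -0.1]), prev)
print(w)   # magnitude larger than 5 * 0.1 because momentum accumulates
```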
22 Backpropagation properties MLP-Decision Regions Single Perceptron: linear discriminator unit One hidden layer: any logic function
23 Backpropagation properties Overfitting in Neural Networks Too many units in the hidden layers can prevent good generalization. Black box: non-localized knowledge representation.
24 Backpropagation properties Local Minima with the Backpropagation Algorithm Local minima: the usual problem with greedy methods. A wrong starting point can be problematic. The usual near-zero random initialization can have problems. We should have a good starting point (based on our knowledge). We should avoid traps in the learning space (based on our knowledge).
25 Visualization of the Neural Network Using Visualization for Learning Analysis Discovering what the neural network learned: Sensitivity Analysis, Rule Generation. Standard Graphics. Neural Network Graphic. Hinton Diagram for weights. Clustering and Segmentation Visualization. Using domain knowledge to view the output.
26 Visualization of the Neural Network Tom Mitchell's Binary Encoding (in Mitchell, 1997)
27 Visualization of the Neural Network Tom Mitchell's Iterations for Binary Encoding (in Mitchell, 1997)
28 Visualization of the Neural Network POS Tagger Network Graphic (figure: input, hidden and output layers, with units labelled by POS tags such as PCT, DET, N, ADJ, V, PREP, CONJ, WH, ADV, HAVE, Vt, BE, PRONt, DO, AUX, PRON, PN, INT)
29 Visualization of the Neural Network POS Tagger weights (figure: view of the weights between input and hidden units and between hidden and output units)
30 NN as a Logic Function Including Knowledge in Neural Networks Logic and neural networks are two founding areas of AI. Logic is applied in computational logic or logic programming: we can reason over structured objects. Neural networks are particularly good for modeling sub-symbolic information: they can learn, adapt to new environments and degrade gracefully. They are used in industrial processes or controllers, namely because, after training, specialized hardware gives a very fast response time. The core method intends to integrate both approaches.
31 NN as a Logic Function Issues on Neuro-Symbolic Integration I BP doesn't work so well with strict logic rules ([BHM08]). In machine learning we need ways to guide learning. Propositional fixation (only atomic propositions): care should be taken while encoding symbols; FOL and logical arguments can be taken into account. Neural networks can implement a universal Turing machine, but we need recursion (difficult to understand). In general, there are no clear semantics in the integrated information. Only by understanding the starting model and the learning that is taking place can we advance beyond black-box approaches to machine learning.
32 NN as a Logic Function Issues on Neuro-Symbolic Integration II Neural network usage in data mining requires good knowledge and explanation of data. Can the basic ideas be extended to other models and knowledge representations?
33 Symbolic or Subsymbolic How can neural networks take advantage of expressed knowledge? Knowledge extraction achieved by machine learning in neural networks (backpropagation): after proper training, networks extract meaningful knowledge from data. Implicitly: classification and clustering models. Explicitly: the neuro-symbolic approach; new relevant connections in the learned model (i.e. we can understand the neural network). Domain knowledge (rules) is used to guide learning: feed-forward neural networks have many parameters; rules bias learning to a nearby local minimum. Recursion (and neuro-computation) can be inserted with Elman neural networks.
34 Neuro-Symbolic Networks: the Core Method Core method basic algorithm [HK94] Implements the $T_P$ operator (next logical step). If appropriate, recursion will lead to stable models. Let m be the number of propositions in P. The input and output layers each have size m; each unit has threshold 0.5 and represents one proposition. For each clause $A \leftarrow L_1 \wedge L_2 \wedge \dots \wedge L_k$ (with $l \geq 0$) in P: 1. Add a binary unit c to the hidden layer. 2. For each literal $L_j$, $1 \leq j \leq k$, connect the input unit for $L_j$ to c with weight $\omega$ if $L_j$ is a positive literal, or $-\omega$ if $L_j$ is a negative literal. 3. Set the threshold $T_c$ of unit c to $l\,\omega - 0.5$, where l is the number of positive literals.
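As an illustrative sketch only (not the authors' implementation; the clause representation and function names are assumptions), the construction and one $T_P$ step can be written as:

```python
def build_core_hidden_layer(clauses, omega=1.0):
    """Hidden layer of the core-method network for a propositional program P (after [HK94]).

    `clauses` is a list of (head, body) pairs, where each body is a list of
    (proposition_name, is_positive) literals.  Input/output units (one per
    proposition, threshold 0.5) are represented implicitly by the set of
    propositions that are currently true.
    """
    hidden = []
    for head, body in clauses:
        # weight ω for a positive body literal, -ω for a negative one
        weights = {lit: (omega if positive else -omega) for lit, positive in body}
        l = sum(1 for _, positive in body if positive)   # number of positive literals
        hidden.append({"head": head,
                       "weights": weights,
                       "threshold": l * omega - 0.5})    # threshold T_c of hidden unit c
    return hidden

def tp_step(hidden, interpretation):
    """One application of the T_P operator using binary threshold units."""
    result = set()
    for unit in hidden:
        net = sum(w for lit, w in unit["weights"].items() if lit in interpretation)
        if net > unit["threshold"]:
            result.add(unit["head"])
    return result

# Usage sketch for P = { a <- b ∧ ¬c,  b <- }:
hidden = build_core_hidden_layer([("a", [("b", True), ("c", False)]), ("b", [])])
print(tp_step(hidden, set()))     # {'b'}: the fact fires immediately
print(tp_step(hidden, {"b"}))     # {'a', 'b'} (set order may vary)
```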
35 Neuro-Symbolic Networks: the Core Method Discussion of ω while encoding rules For ω = 0 the network stops improving at some point. For ω > 0 the network outperforms the ω = 0 network. For ω = 5 the network did not learn very well. Avoiding Local Minima With errors on only a few samples, back-propagation can get stuck in a local minimum.
36 Neuro-Symbolic Networks: the Core Method The MLP's Sigmoid Function [M09] Understanding Backpropagation Learning in the Core Method Networks
37 Neuro-Symbolic Networks: the Core Method Backpropagation and NeSy encoding I Parameter ω gives stability to rules while training. Recall that the gradient trains sigmoid units with: $\frac{\partial E}{\partial w_i} = -\sum_d (t_d - o_d)\, \frac{\partial o_d}{\partial net_d}\, x_{i,d}$. (1) Bigger values of ω make the sigmoid approximate a step function and make $\frac{\partial o_d}{\partial net_d}$ go near zero: good if we are certain about the rules, not advisable for rules needing revision. There is a learning zone that depends on ω and net. Free units (with near-zero weights) are the first to learn.
38 Probabilistic/Numeric Encodings Basic Core Method for Probabilistic/Numeric Encodings How to handle other inputs (e.g. a probability of some observation)? There is a problem with the direct use of probabilities in the core method. We need additional constraints, e.g.: $X > T_X$. But can the network learn the best value of $T_X$? Use a sigmoid neuron having X as input and bias $-\omega\, T_X$: its output $\frac{1}{1 + e^{-\omega (X - T_X)}}$ exceeds $1 - \alpha$ (i.e. the condition is taken as true) iff $e^{-\omega (X - T_X)} < \frac{\alpha}{1 - \alpha}$, i.e. iff X is sufficiently above $T_X$.
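A tiny illustrative sketch (assumed parameter values, not from the slides) of such a threshold-encoding sigmoid unit: a single sigmoid neuron whose weight ω and bias -ω·T_X implement the test X > T_X:

```python
import numpy as np

def soft_threshold(x, omega=10.0, t_x=0.7):
    """Sigmoid encoding of the numeric test X > T_X: output ≈ 1 iff X > T_X.

    Larger ω makes the transition sharper (closer to a hard threshold);
    the bias is -ω * T_X, so the threshold T_X itself is a learnable quantity.
    """
    return 1.0 / (1.0 + np.exp(-(omega * x - omega * t_x)))

for x in (0.2, 0.69, 0.71, 0.95):
    print(x, round(soft_threshold(x), 3))
# 0.2 -> ~0.007   0.69 -> ~0.475   0.71 -> ~0.525   0.95 -> ~0.924
```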
39 Probabilistic/Numeric Encodings Illustrative Example: Learning AND Figure: initial core-method neural networks (1HL original core method and 2HL extended core method), before training and initial weight disturbance.
40 Probabilistic/Numeric Encodings Differences in Backprop for Learning AND Figure: Train error for learning an AND with four distinct encodings.
41 Elman Neural Network [E1990] The Core Method cannot encode variable-length patterns. Novel methods for encoding temporal rules extend the power of neuro-symbolic integration. Elman neural networks allow rules with preconditions at non-deterministic time (e.g. important in NLP [FP01]). (Analogical) recursive connections allow super-Turing power (e.g. Siegelmann, 1994/1999). Learning space: bias learning to a nearby local minimum and allow complex (but controlled) feedback loops.
42 The Classic Elman Neural Network Elman Neural Networks (figure: a single neuronal unit and the Elman neural network architecture)
43 The Classic Elman Neural Network Learning in Elman Neural Networks Non-parametric statistical models learning from examples. The error backpropagation algorithm iteratively adjusts the weights. A semi-recursive neural network trained using Backpropagation Through Time. Finds patterns in sequences of values.
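For illustration only (a sketch under assumed shapes and names, not the lecture's implementation), the forward pass of an Elman network: the hidden state is copied into context units and fed back as extra input at the next time step:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def elman_forward(xs, W_in, W_ctx, W_out):
    """Forward pass of a simple Elman network over an input sequence `xs`.

    W_in:  (n_hidden, n_in)      input -> hidden weights
    W_ctx: (n_hidden, n_hidden)  context (previous hidden state) -> hidden weights
    W_out: (n_out, n_hidden)     hidden -> output weights
    """
    context = np.zeros(W_ctx.shape[0])                 # context units start at zero
    outputs = []
    for x in xs:
        hidden = sigmoid(W_in @ x + W_ctx @ context)   # current hidden state
        outputs.append(sigmoid(W_out @ hidden))        # output at this time step
        context = hidden.copy()                        # copy hidden state into context
    return outputs

# Usage sketch with random weights and a 3-step sequence of 2-D inputs.
rng = np.random.default_rng(0)
W_in, W_ctx, W_out = rng.normal(size=(4, 2)), rng.normal(size=(4, 4)), rng.normal(size=(1, 4))
seq = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
print(elman_forward(seq, W_in, W_ctx, W_out))   # one output per time step
```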
44 Main References Main References Mitchell, T.M.: Machine Learning. McGraw-Hill (March 1997). Steffen Hölldobler and Yvonne Kalinke: Towards a Massively Parallel Computational Model for Logic Programming. In Proceedings of the ECAI'94 Workshop on Combining Symbolic and Connectionist Processing. ECAI (1994). Jeffrey L. Elman: Finding Structure in Time. Cognitive Science, vol. 14, 1990.
45 Other References Other References I David E. Rumelhart, Geoffrey E. Hinton and Ronald Williams: Learning Representations by Back-propagating Errors. Nature (323), October 1986. Sebastian Bader, Steffen Hölldobler and Nuno Marques: Guiding Backprop by Inserting Rules. In Proceedings of the ECAI Workshop on Neuro-Symbolic Integration. ECCAI (2008). Nuno Marques: An Extension of the Core Method for Continuous Values. In New Trends in Artificial Intelligence. APPIA (2009).