Artificial neural networks




Now: Neurons; Neuron models; Perceptron learning; Multi-layer perceptrons; Backpropagation

It all starts with a neuron

Some facts about the human brain: ~86 billion neurons, ~10^15 synapses. (Fig: I. Goodfellow)

Neuron: The basic information-processing element of neural systems. The neuron receives input signals generated by other neurons through its dendrites, integrates these signals in its body, then generates its own signal (a series of electric pulses) that travels along the axon, which in turn makes contact with the dendrites of other neurons. The points of contact between neurons are called synapses. http://animatlab.com/help/documentation/neural-network-editor/neural-simulation-plug-ins/firing-rate-neural-plug-in/neuron-basics

Neuron: The pulses generated by the neuron travel along the axon as an electrical wave. Once these pulses reach the synapses at the end of the axon, they open chemical vesicles, exciting the other neuron. (Slide credit: Erol Sahin)

Neuron (Carlson, 1992) http://animatlab.com/help/documentation/neural-network-editor/neural-simulation-plug-ins/firing-rate-neural-plug-in/neuron-basics

The biological neuron - 2 (Carlson, 1992) http://animatlab.com/help/documentation/neural-network-editor/neural-simulation-plug-ins/firing-rate-neural-plug-in/neuron-basics

http://www.billconnelly.net/?p=291

Artificial neuron

History of artificial neurons: The Threshold Logic Unit, or Linear Threshold Unit, a.k.a. the McCulloch-Pitts neuron (1943). The perceptron by Rosenblatt (1958) already considered more flexible weight values in the neurons, and was used in machines with adaptive capabilities. The representation of the threshold value as a bias term was introduced by Bernard Widrow in 1960 (see ADALINE). In the late 1980s, when research on neural networks regained strength, neurons with more continuous activation shapes started to be considered. The possibility of differentiating the activation function allows the direct use of gradient descent and other optimization algorithms for the adjustment of the weights. Neural networks also started to be used as a general function-approximation model. The best-known training algorithm, called backpropagation, has been rediscovered several times, but its first development goes back to the work of Paul Werbos (1974).

Slide credit: Erol Şahin


McCulloch-Pitts Neuron (McCulloch & Pitts, 1943): Binary input-output. Can represent Boolean functions. No training. net = Σ_i w_i x_i; the neuron outputs 1 if net ≥ 0, and 0 otherwise. http://www.tau.ac.il/~tsirel/dump/static/knowino.org/wiki/artificial_neural_network.html

McCulloch-Pitts Neuron, implementing AND: Let w_x1 and w_x2 be 1, and w_xb be -2. When the input is 1 & 1, net is 0 (the neuron fires); when one input is 0, net is -1; when the input is 0 & 0, net is -2 (no firing). http://www.webpages.ttu.edu/dleverin/neural_network/neural_networks.html http://www.tau.ac.il/~tsirel/dump/static/knowino.org/wiki/artificial_neural_network.html
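The AND unit above can be sketched in a few lines of Python. This is a minimal sketch, assuming the convention on this slide: binary inputs, fixed weights, and a neuron that fires (outputs 1) exactly when net ≥ 0. The function names are illustrative, not from the original.

```python
# A minimal McCulloch-Pitts neuron: fires (outputs 1) when net >= 0.
def mcculloch_pitts(inputs, weights, bias_weight):
    net = sum(w * x for w, x in zip(weights, inputs)) + bias_weight
    return 1 if net >= 0 else 0

# AND gate from the slide: w_x1 = w_x2 = 1, w_xb = -2.
def and_gate(x1, x2):
    return mcculloch_pitts([x1, x2], [1, 1], -2)

print(and_gate(1, 1))  # 1  (net = 0, neuron fires)
print(and_gate(1, 0))  # 0  (net = -1)
print(and_gate(0, 0))  # 0  (net = -2)
```

Note that the weights here are chosen by hand; as the slide says, there is no training in this model.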

McCulloch-Pitts Neuron (Wikipedia): Initially, only a simple model was considered, with binary inputs and outputs, some restrictions on the possible weights, and a more flexible threshold value. From the beginning it was noticed that any Boolean function could be implemented by networks of such devices, which is easily seen from the fact that one can implement the AND and OR functions and combine them in disjunctive or conjunctive normal form. Researchers also soon realized that cyclic networks, with feedback through neurons, could define dynamical systems with memory, but most of the research concentrated (and still does) on strictly feed-forward networks because of the smaller difficulty they present.

McCulloch-Pitts Neuron: Binary input-output is a big limitation. Also called [...] "caricature models", since they are intended to reflect one or more neurophysiological observations, but without regard to realism [...] (Wikipedia). No training! No learning! They were nonetheless useful in inspiring research into connectionist models.

Hebb's Postulate (Hebb, 1949): "When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased." In short: neurons that fire together, wire together. In other words: Δw_ij ∝ x_i x_j.

Hebb's Learning Law: Very simple to formulate as a learning rule: if the activations of the two neurons, y_1 and y_2, are both on (+1), then the weight between them grows; otherwise (off: 0) the weight remains the same. However, when a bipolar activation scheme {-1, +1} is used, the weight can also decrease, namely when the activations of the two neurons do not match. (Slide credit: Erol Sahin)
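The rule above can be sketched as a one-line weight update. This is a sketch under the slide's bipolar {-1, +1} convention; the learning rate `eta` is an added assumption (the slide does not specify one).

```python
# Hebbian weight update, Δw = eta * y_i * y_j:
# matching activations grow the weight, mismatched ones shrink it.
def hebb_update(w, y_i, y_j, eta=0.1):
    return w + eta * y_i * y_j

w = 0.0
w = hebb_update(w, +1, +1)  # fire together -> weight increases
w = hebb_update(w, +1, -1)  # mismatch (bipolar case) -> weight decreases
```

With binary {0, 1} activations the product y_i * y_j is zero unless both neurons are on, which recovers the first formulation: grow when both fire, otherwise leave the weight unchanged.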


And many others: Widrow & Hoff, 1962; Grossberg, 1969; Kohonen, 1972; von der Malsburg, 1973; Narendra & Thathachar, 1974; Palm, 1980; Hopfield, 1982.

Let's go back to a biological neuron. A biological neuron has: dendrites, a soma, and an axon. Its firing is discrete, unlike most artificial neurons, and the firing rate, rather than the response function, is what is critical.

Neurone vs. Node: A very crude abstraction; many details are overlooked. The spherical cow problem!

Spherical cow. Q: How does a physicist milk a cow? A: Well, first let us consider a spherical cow... (https://en.wikipedia.org/wiki/spherical_cow) Or: "Milk production at a dairy farm was low, so the farmer wrote to the local university, asking for help from academia. A multidisciplinary team of professors was assembled, headed by a theoretical physicist, and two weeks of intensive on-site investigation took place. The scholars then returned to the university, notebooks crammed with data, where the task of writing the report was left to the team leader. Shortly thereafter the physicist returned to the farm, saying to the farmer, 'I have the solution, but it only works in the case of spherical cows in a vacuum'." https://www.washingtonpost.com/news/wonk/wp/2013/09/04/the-coase-theorem-is-widely-cited-in-economics-ronald-coase-hated-it/

Let us take a closer look at perceptrons. The initial proposal of connectionist networks: Rosenblatt, 1950s and 60s. Essentially a linear discriminant composed of nodes and weights. [Diagram: inputs x_1, ..., x_n with weights w_1, ..., w_n and a bias weight w_0 feeding a single output unit o.] Activation function: o(x) = 1 if w_0 + w_1 x_1 + ... + w_n x_n > 0, and 0 otherwise. Or, simply, o(x) = sgn(w · x), where x is augmented with a constant input x_0 = 1 for the bias.
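The activation function above can be sketched directly. This is a minimal sketch of the slide's output rule o(x) = 1 if w_0 + Σ w_i x_i > 0, else 0; the example weights below are hand-picked for illustration and are not from the original.

```python
# Perceptron output: a linear discriminant with a hard threshold.
def perceptron(x, w, w0):
    net = w0 + sum(wi * xi for wi, xi in zip(w, x))
    return 1 if net > 0 else 0

# Hand-picked weights that realize OR as a linear discriminant:
print(perceptron([0, 0], [1, 1], -0.5))  # 0
print(perceptron([1, 0], [1, 1], -0.5))  # 1
print(perceptron([1, 1], [1, 1], -0.5))  # 1
```

Since the weighted sum is linear in x, the decision boundary is a hyperplane, which is why the slide calls the perceptron a linear discriminant.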

Perceptron, clearer structure: retina → associative units (fixed weights) → response unit (variable weights), with a step activation function: f(y_in) = 1 if y_in > θ; 0 if -θ ≤ y_in ≤ θ; -1 if y_in < -θ. (Slide adapted from Dr. Nigel Crook, Oxford Brookes University; slide credit: Erol Sahin)

Perceptron - activation: simple matrix multiplication, as we have seen in the previous lecture.

Wx = [ w_11  w_12 ] [ x_1 ]  =  [ w_11 x_1 + w_12 x_2 ]  =  [ Y_1 ]
     [ w_21  w_22 ] [ x_2 ]     [ w_21 x_1 + w_22 x_2 ]     [ Y_2 ]

That is, Y_i = Σ_j w_ij x_j. (Slide adapted from Dr. Nigel Crook, Oxford Brookes University; slide credit: Erol Sahin)
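The matrix form of the activation can be checked numerically. A sketch with NumPy, using illustrative weight and input values (not from the slide): the product W @ x computes exactly Y_i = Σ_j W[i, j] * x[j].

```python
import numpy as np

# W[i, j] is the weight from input j to output unit i.
W = np.array([[0.5, -1.0],
              [2.0,  0.25]])
x = np.array([1.0, 2.0])

# One matrix multiplication computes all unit activations at once:
# Y = [0.5*1 - 1.0*2, 2.0*1 + 0.25*2]
Y = W @ x
print(Y)  # [-1.5  2.5]
```

Computing a whole layer as one matrix-vector product, rather than looping over units, is the same formulation used throughout modern neural-network libraries.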