Biological Neurons and Neural Networks, Artificial Neurons




Biological Neurons and Neural Networks, Artificial Neurons
Neural Computation: Lecture 2
John A. Bullinaria, 2015

1. Organization of the Nervous System and Brain
2. Brains versus Computers: Some Numbers
3. Biological Neurons and Neural Networks
4. Rate Coding versus Spike Time Coding
5. The McCulloch-Pitts Neuron
6. Some Useful Notation: Vectors, Matrices, Functions
7. The McCulloch-Pitts Neuron Equation

The Nervous System

The human nervous system can be broken down into three stages, which may be represented in block diagram form as:

  Stimulus → Receptors → Neural Network/Brain → Effectors → Response

The receptors collect information from the environment (e.g. photons on the retina). The effectors generate interactions with the environment (e.g. they activate muscles). The flow of information/activation is represented by arrows, both feedforward and feedback. Naturally, this module will be primarily concerned with how the neural network in the middle works, but understanding its inputs and outputs is also important.

Levels of Brain Organization

The brain contains both large-scale and small-scale anatomical structures, and different functions take place at the higher and lower levels. There is a hierarchy of interwoven levels of organization:

1. Molecules and ions
2. Synapses
3. Neuronal microcircuits
4. Dendritic trees
5. Neurons
6. Local circuits
7. Inter-regional circuits
8. Central nervous system

The ANNs studied in this module are mostly approximations of levels 5 and 6.

Brains versus Computers: Some Numbers

1. There are approximately 10 billion neurons in the human cortex, compared with tens of thousands of processors in the most powerful parallel computers.
2. The lack of processing units can be compensated for by speed. The typical operating speeds of biological neurons are measured in milliseconds (10^-3 s), while current silicon chips can usually operate in nanoseconds (10^-9 s).
3. Each biological neuron is connected to several thousand other neurons, similar to the connectivity in powerful parallel computers.
4. The human brain is extremely energy efficient, using approximately 10^-16 joules per operation per second, whereas the best computers today use around 10^-6 joules per operation per second.
5. Brains have been evolving for tens of millions of years, but computers have only been evolving for a few decades, though different mechanisms are involved.

Slice Through a Real Brain

(Image source: http://library.med.utah.edu/webpath/histhtml/neuranat/neuranca.html)

Structure of a Human Brain

Basic Components of Biological Neurons

1. The majority of neurons encode their activations or outputs as a series of brief electrical pulses (i.e. spikes or action potentials).
2. The neuron's cell body (soma) processes the incoming activations and converts them into output activations.
3. The neuron's nucleus contains the genetic material in the form of DNA. This exists in most types of cells, not just neurons.
4. Dendrites are fibres which emanate from the cell body and provide the receptive zones that receive activation from other neurons.
5. Axons are fibres acting as transmission lines that send activation to other neurons.
6. The junctions that allow signal transmission between the axons and dendrites are called synapses. The process of transmission is by diffusion of chemicals called neurotransmitters across the synaptic cleft.

Schematic Diagram of a Biological Neuron

Neural Signal Processing

The key components of neural signal processing are:

1. Signals from connected neurons are collected by the dendrites.
2. The cell body (soma) sums the incoming signals (spatially and temporally).
3. When sufficient input is received (i.e. a threshold is exceeded), the neuron generates an action potential or spike (i.e. it "fires").
4. That action potential is transmitted along the axon to other neurons, or to structures outside the nervous system (e.g. muscles).
5. If sufficient input is not received (i.e. the threshold is not exceeded), the inputs quickly decay and no action potential is generated.
6. Timing is clearly important: input signals must arrive together, and strong inputs will generate more action potentials per unit time.
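The summation, decay, and threshold behaviour described in the steps above can be sketched with a minimal leaky integrate-and-fire model. This is an illustrative sketch, not part of the lecture: the threshold, leak factor, and input values are all assumed for demonstration.

```python
# Minimal leaky integrate-and-fire neuron: it sums incoming signals,
# lets the membrane potential decay between time steps, and fires
# (emits a spike) when a threshold is exceeded.
# All parameter values here are illustrative assumptions.

def simulate(inputs, threshold=1.0, leak=0.9):
    """Return the list of time steps at which the neuron fires."""
    potential = 0.0
    spikes = []
    for t, x in enumerate(inputs):
        potential = potential * leak + x   # decay, then integrate input
        if potential >= threshold:         # threshold exceeded: fire
            spikes.append(t)
            potential = 0.0                # reset after the spike
    return spikes

# Weak, spaced-out inputs decay away and never reach threshold;
# the same inputs delivered close together make the cell fire.
weak  = [0.3, 0.0, 0.0, 0.3, 0.0, 0.0, 0.3, 0.0, 0.0]
burst = [0.3, 0.3, 0.3, 0.3, 0.0, 0.0, 0.0, 0.0, 0.0]
print(simulate(weak))   # []  - no spikes
print(simulate(burst))  # [3] - fires once the summed input crosses threshold
```

This illustrates points 5 and 6: inputs that do not arrive together decay before the threshold is reached, while temporally clustered inputs summate and trigger a spike.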

Neuron Action Potential

From: Principles of Neurocomputing for Science & Engineering, Ham & Kostanic, McGraw-Hill, 2001.

Excitatory Postsynaptic Potential

From: Principles of Neurocomputing for Science & Engineering, Ham & Kostanic, McGraw-Hill, 2001.

Rate Coding versus Spike Time Coding

In biological neural networks, the individual spike timings are often important, so spike time coding is the most realistic representation for artificial neural networks. However, averages of spike rates across time or populations of neurons carry a lot of the useful information, and so rate coding is a useful approximation.

Spike coding is more powerful, but the computer models are much more complicated and more difficult to train. Rate coding blurs the information coded in individual neurons, but usually leads to simpler models with differentiable outputs, which (as we shall see later) is important for generating efficient learning algorithms.

Sigmoid-shaped activation functions in the rate coding approach follow from the cumulative effect of Gaussian distributed spikes.

The McCulloch-Pitts Neuron

A simple rate coding model of real neurons, also known as a Threshold Logic Unit:

[Diagram: inputs in_1, in_2, ..., in_n feeding a single processing unit with output out]

1. A set of synapses (i.e. connections) brings in activations from other neurons.
2. A processing unit sums the inputs, and then applies a non-linear activation function (which is also often called a threshold, transfer, or squashing function).
3. An output line transmits the result to other neurons.

Some Useful Notation

We often need to deal with ordered sets of numbers, which we write as vectors, e.g.

  x = (x_1, x_2, x_3, ..., x_n),   y = (y_1, y_2, y_3, ..., y_m)

The components x_i can be added up to give a scalar (number), e.g.

  s = x_1 + x_2 + x_3 + ... + x_n = Σ_{i=1}^{n} x_i

Two vectors of the same length may be added to give another vector, e.g.

  z = x + y = (x_1 + y_1, x_2 + y_2, ..., x_n + y_n)

Two vectors of the same length may be multiplied to give a scalar, e.g.

  p = x · y = x_1 y_1 + x_2 y_2 + ... + x_n y_n = Σ_{i=1}^{n} x_i y_i

To avoid any ambiguity or confusion, we will mostly be using the component notation (i.e. explicit indices and summation signs) throughout this module.
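The component notation above translates directly into code with explicit indices. A minimal sketch (the example values are arbitrary):

```python
# Component-notation examples: scalar sum, vector addition, and the
# dot (scalar) product, written with explicit indices as in the notes.

x = [1.0, 2.0, 3.0]
y = [4.0, 5.0, 6.0]
n = len(x)

s = sum(x[i] for i in range(n))            # s = Σ_i x_i
z = [x[i] + y[i] for i in range(n)]        # z_i = x_i + y_i
p = sum(x[i] * y[i] for i in range(n))     # p = Σ_i x_i y_i  (dot product)

print(s, z, p)   # 6.0 [5.0, 7.0, 9.0] 32.0
```

Writing the sums out with explicit indices, rather than using vectorized library calls, mirrors the component notation the module will use throughout.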

The Power of the Notation: Matrices

We can use the same vector component notation to represent more complex things with many dimensions/indices. For two indices we have matrices, e.g. an m × n matrix

  w = ( w_11  w_12  ...  w_1n )
      ( w_21  w_22  ...  w_2n )
      (  :     :          :   )
      ( w_m1  w_m2  ...  w_mn )

Matrices of the same size can be added or subtracted component by component. An m × n matrix a can be multiplied with an n × p matrix b to give an m × p matrix c. This becomes straightforward if we write it in terms of components:

  c_ik = Σ_{j=1}^{n} a_ij b_jk

An n component vector can be regarded as a 1 × n or n × 1 matrix.
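The component formula for the matrix product can be implemented directly as nested loops over the indices i, j, k. A minimal sketch (the example matrices are arbitrary):

```python
# Matrix product via the component formula c_ik = Σ_j a_ij b_jk,
# where a is m x n, b is n x p, and the result c is m x p.

def matmul(a, b):
    m, n, p = len(a), len(b), len(b[0])
    c = [[0] * p for _ in range(m)]
    for i in range(m):
        for k in range(p):
            c[i][k] = sum(a[i][j] * b[j][k] for j in range(n))
    return c

a = [[1, 2, 3],
     [4, 5, 6]]          # 2 x 3
b = [[1, 0],
     [0, 1],
     [1, 1]]             # 3 x 2
print(matmul(a, b))      # [[4, 5], [10, 11]]  (a 2 x 2 matrix)
```

Note how the summed-over index j runs over the inner dimension n, which the two matrices must share.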

Some Useful Functions

A function y = f(x) describes a relationship (input-output mapping) from x to y.

Example 1: The threshold or step function step(x) is defined as

  step(x) = 1 if x ≥ 0
            0 if x < 0

[Plot: step(x) against x, jumping from 0 to 1 at x = 0]

Example 2: The logistic sigmoid function Sigmoid(x) is defined as

  Sigmoid(x) = 1 / (1 + e^(-x))

[Plot: Sigmoid(x) against x, an S-shaped curve rising from 0 to 1]

This is a smoothed (differentiable) form of the threshold function.
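Both functions are one-liners in code, which makes their relationship easy to check numerically:

```python
import math

# The two activation functions defined above: the hard threshold (step)
# and its smooth, differentiable relative, the logistic sigmoid.

def step(x):
    return 1 if x >= 0 else 0

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

print(step(-2), step(0), step(2))     # 0 1 1
print(sigmoid(0))                     # 0.5 - the sigmoid's midpoint
# For large |x| the sigmoid approaches the step function:
print(round(sigmoid(8), 3), round(sigmoid(-8), 3))   # 1.0 0.0
```

The differentiability of the sigmoid, unlike the step function, is what later makes gradient-based learning algorithms possible.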

The McCulloch-Pitts Neuron Equation

Using the above notation, it is possible to write down a simple equation for the output out of a McCulloch-Pitts neuron as a function of its n inputs in_i:

  out = step( Σ_{i=1}^{n} in_i − θ )

where θ is the neuron's activation threshold. We can easily see that:

  out = 1 if Σ_{k=1}^{n} in_k ≥ θ
  out = 0 if Σ_{k=1}^{n} in_k < θ

Note that the McCulloch-Pitts neuron is an extremely simplified model of real biological neurons. Some of its missing features include: non-binary inputs and outputs, non-linear summation, smooth thresholding, stochasticity, and temporal information processing. Nevertheless, McCulloch-Pitts neurons are computationally very powerful: one can show that networks of such neurons are capable of universal computation.
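The neuron equation above can be sketched directly in code, and with suitable thresholds a single unit computes basic logic gates, which is the root of the universal-computation claim. (The NOT gate below uses a negatively weighted, i.e. inhibitory, input; that is a small extension of the purely binary model, added here for illustration.)

```python
# A McCulloch-Pitts unit: out = step(Σ_i in_i − θ),
# where θ (theta) is the activation threshold.

def mp_neuron(inputs, theta):
    return 1 if sum(inputs) - theta >= 0 else 0

def AND(a, b):
    return mp_neuron([a, b], theta=2)  # fires only when both inputs fire

def OR(a, b):
    return mp_neuron([a, b], theta=1)  # fires when at least one input fires

def NOT(a):
    # Inhibitory (negative-weighted) input - an illustrative extension.
    return mp_neuron([-a], theta=0)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", "AND:", AND(a, b), "OR:", OR(a, b))
print("NOT:", NOT(0), NOT(1))   # NOT: 1 0
```

Since {AND, OR, NOT} is a complete set of Boolean operations, networks of such units can in principle compute any Boolean function.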

Overview and Reading

1. Biological neurons, consisting of a cell body, axons, dendrites and synapses, are able to process and transmit neural activation.
2. The McCulloch-Pitts neuron model (Threshold Logic Unit) is a crude rate-coding approximation to real neurons, which performs a simple summation and thresholding function on activation levels.
3. Appropriate mathematical notation facilitates the specification and programming of artificial neurons and networks of artificial neurons.

Reading

1. Haykin-1999: Sections 1.1, 1.2, 1.3
2. Ham & Kostanic: Sections 1.2, 1.3
3. Beale & Jackson: Sections 1.2, 3.1, 3.2
4. Gurney: Sections 2.1, 2.2