A Stochastic Calculus Approach to Learning in Spike Models
A Stochastic Calculus Approach to Learning in Spike Models

Adriana Climescu-Haulica
Laboratoire de Modélisation et Calcul
Institut d'Informatique et Mathématiques Appliquées de Grenoble
51, rue des Mathématiques, Grenoble cedex 9, France
adriana.climescu@imag.fr

Abstract

Some aspects of Hebb's rule are formalized by means of an SRM_0 model in which refractoriness and external input are neglected. Using tools from stochastic calculus, it is shown explicitly that Hebb's rule is a 0-1 rule based on a local learning window. Assuming knowledge of the membrane potential, learning inequalities are proposed, reflecting the existence of a delay within which learning can be accomplished. The model admits a natural space-time generalization.
1 The model

We consider a neuron i that receives spike input from N neurons. The membrane potential u_i(t) is described by an SRM_0 model (Gerstner), where refractoriness and external input are neglected:

  u_i(t) = \frac{1}{N} \sum_{j=1}^{N} \int_0^{\infty} w_{ij}(t-s) \, \epsilon(s) \, S_j(t-s) \, ds

Here w_{ij}(t) is the weight of the (i,j) synapse at time t, \epsilon(s) is the response kernel modelling the postsynaptic potential, and S_j is the spike train of neuron j,

  S_j(t) = \sum_{f \in F_j} \delta(t - t_j^f),

where F_j is the set of spiking times of neuron j. Using the change of variable t - s = x, the membrane potential is expressed as a stochastic integral with respect to the Poisson process \Pi_j(t) = \sum_{f \in F_j} 1_{\{t \ge t_j^f\}}:

  u_i(t) = \frac{1}{N} \sum_{j=1}^{N} \int_0^t w_{ij}(x) \, \epsilon(t-x) \, d\Pi_j(x)
         = \frac{1}{N} \sum_{j=1}^{N} \sum_{f \in F_j} w_{ij}(t_j^f) \, \epsilon(t - t_j^f).
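The spike-sum form of the potential lends itself to direct simulation. The following is a minimal numerical sketch, not from the paper: it draws homogeneous Poisson spike trains for the N inputs and assumes a simple exponential response kernel \epsilon(s) = e^{-s/\tau} and constant illustrative weights; all parameter values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 20          # number of presynaptic neurons (illustrative)
T = 0.5         # simulation horizon in seconds
rate = 20.0     # Poisson firing rate of each input, in Hz
tau = 0.02      # decay constant of the assumed exponential PSP kernel

def eps(s):
    """Response kernel eps(s): exponential decay, zero for s < 0."""
    return np.where(s >= 0.0, np.exp(-s / tau), 0.0)

# Homogeneous Poisson spike trains: for each neuron j, the spike count on
# [0, T] is Poisson(rate * T) and the spike times are uniform on [0, T].
spike_trains = [np.sort(rng.uniform(0.0, T, rng.poisson(rate * T)))
                for _ in range(N)]

w = rng.uniform(0.5, 1.5, N)   # constant weights w_ij, purely illustrative

def u_i(t):
    """u_i(t) = (1/N) * sum_j sum_{f in F_j} w_ij * eps(t - t_j^f)."""
    total = 0.0
    for j in range(N):
        past = spike_trains[j][spike_trains[j] <= t]  # spikes before t
        total += w[j] * eps(t - past).sum()
    return total / N

print(u_i(0.25))
```

Since the kernel and weights are non-negative, the resulting potential is a non-negative, piecewise-smooth function that jumps at each incoming spike time.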
Figure 1: N spike neurons as input.
2 The 0-1 Hebbian learning rule

Proposition. There is a vector-valued stochastic process m(t) = (m_1(t), m_2(t), ..., m_N(t)) such that the conditional probability

  P(\Pi_j(t) \in A \mid u_i[0,t] = h[0,t]) = \lambda_A(m_j(h[0,t]))
    = \begin{cases} 1 & \text{if } m_j(h[0,t]) \in A \\ 0 & \text{if } m_j(h[0,t]) \notin A \end{cases}

where each component of the process m(t) is constructed by the relation

  m_j(h[0,t]) = \sum_{k=1}^{\infty} \frac{1}{\lambda_k^j}
      \langle H_j(I[0,t]), e_k^j \rangle_{L^2[\tilde\nu_j(t)]} \,
      \langle h[0,t], e_k^j \rangle_{L^2[\tilde\nu_j(t)]}

with H_j = \sqrt{R_u^j} the square-root functional associated with the covariance functional of the membrane potential,

  R_u^j(h)(t) = \int_0^t C_u(t,x) \, h(x) \, d\tilde\nu_j(x) = H_j H_j^*(h)(t),

whose kernel is

  C_u(t,x) = \int_0^{t \wedge x} w_{ij}(s) \epsilon(t-s) \, w_{ij}(s) \epsilon(x-s) \, d\nu_j(s).

Here \lambda_k^j and e_k^j are the eigenvalues and eigenvectors of the covariance functional R_u^j.
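The eigenvalues \lambda_k^j and eigenvectors e_k^j can be approximated numerically by discretizing the covariance kernel C_u(t,x) on a time grid and diagonalizing the resulting symmetric matrix. The sketch below is illustrative, not from the paper: it assumes an exponential kernel, constant unit weights, and Lebesgue measure in place of \nu_j.

```python
import numpy as np

tau = 0.02           # decay constant of the assumed exponential kernel
T = 0.2              # time horizon
n = 100              # grid resolution (illustrative)
ts = np.linspace(0.0, T, n)
dt = ts[1] - ts[0]

def eps(s):
    """Assumed response kernel: exponential decay, zero for s < 0."""
    return np.where(s >= 0.0, np.exp(-s / tau), 0.0)

w = np.ones(n)       # hypothetical constant weight w_ij(s) = 1

# Discretize C_u(t, x) = int_0^{min(t,x)} w(s) eps(t-s) w(s) eps(x-s) ds
C = np.zeros((n, n))
for a in range(n):
    for b in range(n):
        m = min(a, b)
        s = ts[:m + 1]
        C[a, b] = np.sum(w[:m + 1]**2 * eps(ts[a] - s) * eps(ts[b] - s)) * dt

# C is symmetric positive semi-definite, so eigh applies; numpy returns
# eigenvalues in ascending order, which we reverse to get lambda_1 >= ...
lam, e = np.linalg.eigh(C)
lam, e = lam[::-1], e[:, ::-1]
print(lam[:5])
```

The columns of `e` are the discretized eigenfunctions e_k^j; truncating the spectral sum defining m_j at the first few eigenvalues gives a finite-rank approximation of the process in the Proposition.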
Definition. An interval W_L = [n - D, n] is a counting window learning set associated with the spike train S_j if there is a spike time t_i^f of the neuron i such that

  \Pi_j(t_i^f) = n   and   \Pi_j(t_i^f) - \Pi_j(t_i^f - t) = D - 1.

Hebb's rule becomes

  P(\Pi_j(t) \in W_L \mid u_i[0,t] = h[0,t])
    = \begin{cases} 1 & \text{if } m_j(h[0,t]) \in W_L \\ 0 & \text{if } m_j(h[0,t]) \notin W_L \end{cases}

With an a priori interpretation, the Hebb rule is a detection criterion: among the N neurons firing into neuron i, which ones become wired with i?

If m_j(u_i[0,t]) \in W_L, then the synapse (i,j) is strengthened.
If m_j(u_i[0,t]) \notin W_L, then the synapse (i,j) vanishes.
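Operationally, the 0-1 rule is a membership test of m_j against the counting window. A small illustrative sketch follows; the value of m_j is taken as given here, since computing it requires the spectral construction of the Proposition, and all numbers are made up.

```python
import numpy as np

def counting_process(spike_times, t):
    """Pi_j(t): number of spikes of neuron j up to and including time t."""
    return int(np.sum(np.asarray(spike_times) <= t))

def hebb_01_rule(m_j, n, D):
    """0-1 Hebbian decision on the counting window W_L = [n - D, n].

    Returns True (strengthen the synapse) iff m_j lies in W_L,
    False (the synapse vanishes) otherwise.
    """
    return (n - D) <= m_j <= n

# Toy example: neuron j fired 5 times before the postsynaptic spike of i.
spikes_j = [0.01, 0.04, 0.09, 0.15, 0.21]
t_f_i = 0.25                            # postsynaptic spike time of neuron i
n = counting_process(spikes_j, t_f_i)   # n = 5, so W_L = [3, 5] for D = 2
print(hebb_01_rule(m_j=4, n=n, D=2))    # 4 in [3, 5] -> True: strengthen
print(hebb_01_rule(m_j=1, n=n, D=2))    # 1 not in [3, 5] -> False: vanish
```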
3 Learning inequalities

With an a posteriori interpretation, the Hebb rule gives access to the synaptic weights modified by a learning process. Assume that the synapse (i,j) was strengthened. Then

  n - D \le m_j(u_i[0,t]) \le n    (1)

with \Pi_j(t_i^f) = n and \Pi_j(t_i^f) - \Pi_j(t_i^f - t) = D - 1.

Since the stochastic process m_j(t) can be assimilated with a Poisson process, the study of the inequalities (1) reduces to the study of the equation

  \sum_{k=1}^{\infty} \frac{1}{\lambda_k^j}
      \langle H_j(I[0,t]), e_k^j \rangle_{L^2[\tilde\nu_j(t)]} \,
      \langle u_i[0,t], e_k^j \rangle_{L^2[\tilde\nu_j(t)]} = l,

with l an integer in the interval [n - D, n]. In order to determine w_{ij}(t) from this equation, an extraction algorithm is needed. The learning inequalities reflect the existence of a delay within which learning can be accomplished.
4 Generalization to a space-time approach

In this framework, the membrane potential can be modelled as a stochastic integral with respect to a space-time Poisson process:

  u_i(t,x) = \int_{[0,t] \times D_x} w_{ij}(s,y) \, \epsilon(t-s, x-y) \, d\Pi_j(s,y)

with

  \Pi_j(t,x) = \sum_{f \in F_j} \delta(t - t_j^f) \, \delta(x - x_j^f).

Definition. An interval W_L = [n - D, n] is a counting window learning set associated with the spike train S_j if there are a spike time t_i^f and a spike place x_i^f for the neuron i such that

  \Pi_j(t_i^f, x_i^f) = n   and   \Pi_j(t_i^f, x_i^f) - \Pi_j(t_i^f - t, x_i^f - x) = D - 1.

The Hebb rule becomes:

If m_j(u_i([0,t] \times D_x)) \in W_L, then the synapse (i,j) is strengthened.
If m_j(u_i([0,t] \times D_x)) \notin W_L, then the synapse (i,j) vanishes.

This formulation exhibits the local aspect, in both time and space, of the Hebb rule.
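A homogeneous space-time Poisson process on a rectangle can be sampled by drawing a Poisson number of points and placing them uniformly; the associated counting process \Pi_j(t,x) then counts the points below (t,x). A minimal sketch with a one-dimensional spatial domain D_x = [0, L] and purely illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

T = 1.0            # time horizon
L = 1.0            # spatial extent of D_x (one-dimensional for simplicity)
intensity = 30.0   # mean number of points per unit of (time x space)

# Homogeneous space-time Poisson process on [0, T] x [0, L]: the number of
# points is Poisson(intensity * T * L) and the points are uniform on the
# rectangle.
n_pts = rng.poisson(intensity * T * L)
t_f = rng.uniform(0.0, T, n_pts)   # spike times t_j^f
x_f = rng.uniform(0.0, L, n_pts)   # spike places x_j^f

def Pi_j(t, x):
    """Counting process Pi_j(t, x): points with t_f <= t and x_f <= x."""
    return int(np.sum((t_f <= t) & (x_f <= x)))

print(Pi_j(T, L) == n_pts)  # every sampled point lies in the rectangle
```

A counting window test against W_L then works exactly as in the purely temporal case, with the window anchored at the pair (t_i^f, x_i^f).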
5 Conclusions

The stochastic integral with respect to a Poisson process is a natural framework for spike response models. The Hebb rule is expressed as a detection criterion depending on the membrane potential. As kernels of the covariance functional of the membrane potential, the synaptic weights are involved in learning inequalities; in order to extract them, an algorithmic solution is needed. Space-time spiking neurons can be modelled by means of space-time Poisson processes, and the above remarks apply to the space-time model as well.

References

[1] A. Climescu-Haulica, Calcul stochastique appliqué aux problèmes de détection des signaux aléatoires, Ph.D. Thesis, EPFL, Lausanne, 1999.
[2] W. Gerstner and W.M. Kistler, Spiking Neuron Models: Single Neurons, Populations, Plasticity, Cambridge University Press,