Multiplication With Neurons

University of Edinburgh
School of Informatics

Multiplication With Neurons

Author: Panagiotis E. Nezis
Supervisor: Dr. Mark van Rossum

MSc Thesis in Informatics
August 17, 2008

ἓν οἶδα ὅτι οὐδὲν οἶδα

Socrates
I know that I don't know

Καὶ πάσχειν δή τι ὑπὸ τοῦ χρόνου, καθάπερ καὶ λέγειν εἰώθαμεν ὅτι κατατήκει ὁ χρόνος, καὶ γηράσκει πάνθ' ὑπὸ τοῦ χρόνου, καὶ ἐπιλανθάνεται διὰ τὸν χρόνον, ἀλλ' οὐ μεμάθηκεν, οὐδὲ νέον γέγονεν οὐδὲ καλόν· φθορᾶς γὰρ αἴτιος καθ' αὑτὸν μᾶλλον ὁ χρόνος· ἀριθμὸς γὰρ κινήσεως, ἡ δὲ κίνησις ἐξίστησι τὸ ὑπάρχον.

Aristotle, Physics, Book IV, Ch. 12
A thing, then, will be affected by time, just as we are accustomed to say that time wastes things away, and that all things grow old through time, and that there is oblivion owing to the lapse of time, but we do not say the same of getting to know or of becoming young or fair. For time is by its nature the cause rather of decay, since it is the number of change, and change removes what is.

Translated from the Greek by R. P. Hardie and R. K. Gaye


Abstract

Experimental evidence supports the existence of multiplicative mechanisms in the nervous system [23]. The exact way multiplication is implemented in neurons is unclear; however, there is a lot of interest in its details, driven by the experimental observations that imply its existence. In this thesis we used feed-forward networks of integrate-and-fire neurons in order to approximate multiplication. The main hypothesis is that the minimum function can give a multiplicative-like response. Networks that implement the minimum of two inputs were created and tested. The results show that the hypothesis was correct, and we successfully approximated multiplication in most cases. The limitations, together with some interesting observations such as the importance of spike timing, are also described.


Acknowledgments

I would like to thank my supervisor, Mark van Rossum, for his enthusiasm, encouragement and insight; our discussions were as enjoyable as they were productive. I am also grateful to all the other professors I have had, both at the University of Edinburgh and at the National Technical University of Athens, for turning me into a scientist. There are also a number of people who may not have been directly involved in this project, but without whom things would have been much harder. Last but not least, my family receives my deepest gratitude and love for their faith and support during my current and previous studies.


Declaration

I declare that this thesis was composed by myself, that the work contained herein is my own except where explicitly stated otherwise in the text, and that this work has not been submitted for any other degree or professional qualification except as specified.

(Panagiotis Evangelou Nezis)


Contents

Abstract
Acknowledgments
Declaration

1 Introduction
  1.1 Proposal
  1.2 Layout of the Thesis

2 Integrate-and-Fire Neuron Models
  2.1 Introduction
  2.2 Biological Background
    2.2.1 Anatomy of a Neuron
    2.2.2 Membrane and Ion Channels
    2.2.3 Synapses
  2.3 Electrical Properties of Cells
    2.3.1 Membrane Voltage - Resting Potential
    2.3.2 Spike Generation
    2.3.3 Membrane Capacitance & Resistance
    2.3.4 Synaptic Reversal Potential and Conductance
    2.3.5 Electrical Structure of Neurons
  2.4 The Integrate-and-Fire Model
    2.4.1 Nonleaky Integrate-and-Fire Neuron
    2.4.2 Leaky Integrate-and-Fire Neuron
    2.4.3 Synaptic Input

3 Multiplication in the Nervous System
  3.1 Introduction
  3.2 Importance of Multiplication
    3.2.1 Function Approximation
    3.2.2 Relationship Between Operators
    3.2.3 Multiplication and Decision Making
  3.3 Biological Evidence of Multiplication
    3.3.1 Barn Owl's Auditory System
    3.3.2 The Lobula Giant Movement Detector (LGMD) of Locusts
    3.3.3 Other Evidence
  3.4 Existing Models
    3.4.1 Multiplication via Silent Inhibition
    3.4.2 Spike Coincidence Detector

4 Multiplication with Networks of I&F Neurons
  4.1 Introduction
  4.2 Aim of the Thesis
  4.3 Firing Rates and Rate Coding
    4.3.1 Firing Rates
    4.3.2 Rate Coding
  4.4 Excitation vs. Inhibition
    4.4.1 Subtractive Effects of Inhibitory Synapses
    4.4.2 Rectification
    4.4.3 Power-law Nonlinearities
  4.5 Approximating Multiplication
  4.6 Proposed Networks
    4.6.1 Network 1
    4.6.2 Network 2

5 Simulation Results
  5.1 Introduction
  5.2 Neuron's Behavior
  5.3 Adjusting the Parameters
  5.4 Multiplication of Firing Rates
    5.4.1 Experimental Procedure
    5.4.2 Experiment 1
    5.4.3 Experiment 2
    5.4.4 Experiment 3
  5.5 Comparison of the Two Networks
  5.6 Spike Timing is Important

6 Discussion
  6.1 Introduction
  6.2 Achievements and Limitations
  6.3 Future Work
  6.4 Final Remarks

A Simulating Biological Neural Networks using SIMULINK
  A.1 Introduction
  A.2 SIMULINK
    A.2.1 Advantages of Simulink
    A.2.2 S-functions
  A.3 The Biological Neural Networks SIMULINK Library (BNNSL)
    A.3.1 Current Sources
    A.3.2 Output Devices
    A.3.3 Neuronal Models
    A.3.4 BNNSL in Action

Bibliography

Chapter 1
Introduction

The way information is transmitted in the brain is a central problem in neuroscience. Neurons respond to a certain stimulus by generating sequences of action potentials, called spike trains. Spike trains are stochastic, and repeated presentations of the same stimulus do not cause identical firing patterns. It is believed that information is encoded in the spatiotemporal pattern of these trains of action potentials. There is a debate between those who support that information is embedded in temporal codes and researchers in favor of rate coding. In the rate coding hypothesis, one of the oldest ideas about neural coding [3], information is embedded in the mean firing rates of a population of neurons. Temporal coding, on the other hand, relies on the precise timing of action potentials and inter-spike intervals. The aim of this thesis is to explore how networks of rate-coding neurons can multiply signals.

In the literature, experimental evidence can be found that supports the existence of multiplicative mechanisms in the nervous system. Studies have shown that optomotor control in the fly relies on neural circuits performing multiplication [14], [12]. More recent experiments have found a multiplicative-like response in auditory neurons of the barn owl's midbrain [23], [9]. The exact way multiplication is implemented in neurons is unclear; however, there is a lot of interest in its details, driven by the experimental observations that imply its existence. Koch and Poggio [18] have discussed different biophysical properties, present in single cells, capable of producing multiplicative interactions. In this thesis we use integrate-and-fire neurons, which do not include the nonlinearities Koch and Poggio propose. As a result, the main aim is to approximate multiplication within the limits of these neuronal models.
1.1 Proposal

In this project we use feed-forward networks of integrate-and-fire neurons. The aim of these small population models is not to perform exact multiplication, since this is not possible, but to approximate it. Synaptic input is fed into the neurons along with a noisy bias current. The synapses may be either excitatory or inhibitory. An excitatory synapse is one in which an action potential in the presynaptic cell increases the probability of an action potential occurring in the postsynaptic cell. A postsynaptic potential is considered inhibitory when the resulting change in membrane voltage makes it more difficult for the cell to fire an action potential, lowering the firing rate of the neuron. Inhibitory postsynaptic potentials are the opposite of excitatory postsynaptic potentials (EPSPs), which result from the flow of ions like sodium into the cell. In our case inhibition is implemented through GABA_A synapses with a reversal potential equal to the resting one [3]. This is called shunting inhibition, and it has been shown to have a subtractive effect on the firing rate in most circumstances (the shunting conductance is independent

of the firing rate) [16], despite its divisive effect on subthreshold amplitudes. Since the firing rate of a neuron cannot take a negative value, the output will be a rectified copy of the input, which is the difference between the excitatory and inhibitory synaptic inputs. This rectification is the only nonlinearity present in the neuronal model. We are going to combine it with excitation and subtractive inhibition in order to approximate multiplication. In particular, the minimum function will be used to approximate multiplication; functions like minimum and maximum, which parallel the Boolean AND and OR operations, can easily be implemented using rate-coding neurons.

1.2 Layout of the Thesis

The contents of this thesis are structured so that the non-specialist reader is presented first with all the background knowledge needed. The aim was to make the thesis as self-contained as possible. Readers who are familiar with the concepts presented in the background chapter can skip it or read it selectively. The remainder of this thesis is outlined as follows.

Chapter 2 presents the background knowledge needed for a non-specialist reader to understand the rest of this thesis. The main aim of this chapter is to present the integrate-and-fire neuron model, but first the necessary underlying biological concepts are described. We present the anatomy of a neuron, and we analyze the electrical properties of neural cells and how action potentials are generated, before giving the equations that describe the integrate-and-fire model. This chapter (or part of it) can be skipped by somebody familiar with this background material.

In Chapter 3 we motivate the importance of this thesis. Initially we explain abstractly the necessity of a multiplicative operation in perceptive tasks and describe its relation to the Boolean AND operation. Next, we present experimental evidence of multiplicative operations in the neural system.
The fact that the mechanisms implementing such multiplicative operations are not well researched, even though there are multiple reports of neural multiplication, motivated this thesis. Finally, in the same chapter we present some of the models that researchers have proposed.

In Chapter 4 we present our approach to the problem of multiplication-like operations in the brain. Initially we show that an integrate-and-fire neuron with an excitatory and an inhibitory input acts as a rectifying unit. Next we show that multiplication can be approached with the minimum function, given that we do not care about the exact product of two firing rates but about a proportional relation. Finally we present two feed-forward networks of I&F neurons that implement the minimum function and were used in the simulations.

The results of our research can be seen in Chapter 5. The simple networks proposed in Chapter 4 are able to implement multiplication-like operations; however, their performance is not the same. We show which of the two networks performs better and try to analyze why this happens. We also demonstrate another important fact: spike timing is important, even when dealing only with rate-coding networks.

Finally, in Chapter 6 we discuss the results of this thesis and propose some extensions that could be pursued if time permitted. In order to run the simulations we created a SIMULINK library specific to integrate-and-fire neurons. The Appendix describes how SIMULINK works, its advantages compared to other approaches, the library we created, and some examples of its usage.
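The central idea of the thesis, that a rectified minimum of two firing rates can serve as a proxy for their product when only a proportional relation is required, can be previewed numerically. The sketch below is illustrative only; the grid of normalised rates and the correlation measure are choices made here, not taken from the thesis:

```python
# Compare min(x, y) with the product x * y over a grid of normalised
# firing rates, to see in what sense the minimum is "multiplicative-like".
import numpy as np

rates = np.linspace(0.0, 1.0, 21)      # normalised firing rates in [0, 1]
x, y = np.meshgrid(rates, rates)

product = x * y                        # exact multiplication
minimum = np.minimum(x, y)             # what the proposed networks compute

# The surfaces differ, but they are strongly correlated, and both share
# the AND-like property that a zero input forces a zero output.
corr = np.corrcoef(product.ravel(), minimum.ravel())[0, 1]
print(f"correlation between min(x, y) and x * y: {corr:.3f}")
```

On such a grid the correlation is high (above 0.9), which is the "proportional relation" the thesis aims for; the agreement degrades when one input is much larger than the other, since the minimum then ignores the larger input entirely.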

Chapter 2
Integrate-and-Fire Neuron Models

2.1 Introduction

The nervous system, which is responsible for every action we make, has a magnificent structure comprising billions of neurons connected to each other in an intricate pattern [13]. Neurons are the elementary processing units of the brain and communicate with short electrical pulses known as spikes or action potentials. It is believed that information is transmitted through firing sequences of spikes. Although spiking neurons are the majority of cells present in the cortex, there is also a large number of glial cells that play a supporting role and are responsible for the energy supply of the brain. In Figure 2.1 we can see coloured hippocampal neurons and glial cells.

Figure 2.1 Left: hippocampal neurons (green) and glial cells (red). Right: hippocampal neuron expressing monomeric Green Fluorescent Protein (GFP).

Biological research has produced detailed knowledge about the biophysical mechanisms underlying neuronal functionality and spike generation. From a modeling perspective, this knowledge can be used to construct neuron models for computer simulations of neurons and neural networks. These simulations can help us understand how information is encoded into neural signals and how the network connectivity determines the firing activity. A large number of neuron models have been proposed, ranging from complex realistic descriptions of the

biophysical mechanisms to simplified models involving a small number of differential equations. These simplified models may seem unrealistic, but they are very useful for the study and analysis of large neural systems.

Figure 2.2 Diagram of a typical neuron. (Image taken from Wikipedia.)

In this chapter we present the integrate-and-fire model, one of the most widely used neuron models, which uses just one differential equation to describe the membrane potential of a neuron in terms of the current it receives (injected current and synaptic inputs). This is the model we use for the multiplication networks in this thesis. Before that, we describe some underlying biological concepts, such as the anatomy of neurons and the electrical properties of the membrane.

2.2 Biological Background

Before describing the integrate-and-fire model it is helpful to give some biological background about neurons and biological cells in general. In this section the anatomy of neurons is described, along with the structure of cellular membranes, the operation of the ion channels which are responsible for spike generation, and finally synapses and synaptic transmission.

2.2.1 Anatomy of a Neuron

Neurons are electrically excitable cells in the nervous system that process and transmit information. They are the most important units of the brain and of the whole nervous system. There is a wide variety in the shape, size, and electrochemical properties of neurons, which can be explained by the diverse functions they perform. In Figure 2.2 we can see a diagram of the anatomy of a typical neuron. The soma is the central part of the neuron, where all the computational procedures such as spike generation occur.

Several branched tendrils are attached to the soma. Each neuron has multiple dendrites, which play a critical role in integrating synaptic inputs and in determining the extent to which action potentials are produced by the neuron. There is just one axon, a long nerve fiber which can extend tens, hundreds, or even tens of thousands of times the diameter of the soma in length. In contrast with the dendrites, the axon conducts electrical impulses away from the neuron's cell body, acting as a transmission line. Action potentials almost always begin at the axon hillock (the part of the neuron where the soma and the axon are connected) and travel down the axon. Finally, synapses pass information from a presynaptic cell to a postsynaptic cell. We will see synapses and synaptic transmission in more detail in a following paragraph.

2.2.2 Membrane and Ion Channels

The cell membrane is a selectively permeable lipid bilayer found in all cells. It contains a wide variety of biological molecules, mainly proteins and lipids, which play a significant role in many cellular processes such as ion channel conductance and cell signaling. In Figure 2.3 we can see an illustration of the membrane. The lipid bilayer is approximately 7 nm thick and is responsible for regulating the movement of materials into and out of cells. Besides the phospholipid bilayer, the membrane includes several membrane proteins, which determine its selective permeability and its passive and active transport mechanisms.

Figure 2.3 Illustration of a cell membrane. We can see the phospholipid bilayer and some of the proteins, lipids and other biological molecules that it contains. Among these proteins we can see an ion channel. (Image taken from Wikipedia.)

The most important proteins for neural functionality are the ion channels, integral membrane proteins through which ions can cross the membrane. There are many such channels, most of them highly selective, allowing only a single type of ion to pass through them. The phospholipid bilayer is nearly impermeable to ions, so these proteins are the elementary units underlying principal functionalities such as spike generation and electrical signaling (within and between neurons).

2.2.3 Synapses

Synapses are specialized junctions responsible for the communication between neurons. There are two main types of synapses: chemical synapses and electrical synapses, which are also known as gap junctions [6]. Chemical synapses are the most important and most numerous in the nervous system. Although gap junctions are very important parts of the nervous system (for example, they are particularly important in cardiac muscle [2]), in this thesis we will assume that only chemical synapses are present on the dendritic tree. In the following paragraphs we briefly describe how a synapse works.

In chemical synapses transmission is mediated by a chemical called a neurotransmitter [31]. Synaptic transmission begins when an action potential reaches the presynaptic axon terminal. The resulting depolarization of the presynaptic membrane initiates a sequence of events leading to neurotransmitter release and activation of receptors on the postsynaptic membrane. An illustration of a synapse can be seen in Figure 2.4.

Figure 2.4 Illustration of a synapse and synaptic transmission. (Image taken from [21].)

The axon terminal contains a pool of synaptic vesicles, small spheres filled with neurotransmitter. When the axon terminal is depolarized, voltage-gated calcium (Ca) channels open and calcium ions (Ca2+) rush into the axon terminal.
Some of these ions bind to synaptic vesicles, bringing them closer to the presynaptic membrane and causing the vesicle and synaptic membranes to fuse; finally the neurotransmitter is released into the extracellular space. Some of the neurotransmitter molecules bind to special receptor molecules on the postsynaptic membrane. The response of the postsynaptic cell varies, since it depends on the particular transmitter-receptor combination. In excitatory synapses, the neurotransmitter causes the opening of channels which let through potassium (K+) and sodium (Na+) ions. Inhibitory synapses, on the other hand, mainly activate chloride (Cl-) channels.

2.3 Electrical Properties of Cells

A neural cell can be modeled using electrical components like resistors, capacitors and voltage sources. The resulting electrical circuits are used for computational simulations and approximate the behavior of real cells sufficiently well.

2.3.1 Membrane Voltage - Resting Potential

If one measures the intracellular (V_i) and extracellular (V_e) potentials of a neuron, one will observe the existence of a voltage difference (V_m) across its membrane:

    V_m(t) = V_i(t) - V_e(t)    (2.1)

Different intracellular and extracellular concentrations of ions are responsible for this voltage. Most of the time V_m is negative (except when a spike occurs). If the neuron is at rest (the sum of the ionic currents flowing into and out of the membrane is zero), the electrical potential across the membrane is called the resting potential V_rest. For a typical neuron V_rest is about -70 mV.

2.3.2 Spike Generation

Figure 2.5 Generation of an action potential.

The uniqueness of neurons is their ability to produce, propagate and decode spike trains. Before presenting the way a neural cell can be represented electrically, it is interesting to see how a spike is generated. If we inject the neuron with current, V_m increases. When the membrane potential reaches a certain threshold, enough voltage-gated sodium channels open that the relative ionic permeability favors sodium (Na) over potassium (K). The number of channels that open depends on the injected current, or equivalently on the voltage across the membrane. This explains why higher potentials cause faster spiking. When the cell is at rest there is a negative electrical potential inside it. The opening of the channels makes Na+ ions flow into the cell, causing a rapid depolarization of the membrane. The flow of positively charged ions into the cell drives the membrane to a potential close to E_Na.
After that, the voltage-gated sodium channels inactivate and the voltage-gated potassium channels open. As a result, K+ ions rush out of the cell through the open channels, causing the membrane potential to become negative again. Since at this time there is very little sodium permeability, the potential approaches E_K, causing a hyperpolarization below the resting potential until the potassium channels close again. This is the process of spike generation. Sodium channels cannot be activated again until some time has passed (this time is known as the absolute refractory period).

2.3.3 Membrane Capacitance & Resistance

Capacitance C_m. The neuron membrane, as we have already seen, is an insulating layer consisting mainly of lipids and proteins. However, both the intracellular and extracellular solutions contain ions and have conducting properties. So the role of the insulating membrane is equivalent to that of a capacitor in an electrical circuit. The actual membrane capacitance C_m is specified in terms of the specific capacitance per unit area c_m, measured in units of farad per square centimeter (F/cm^2). If A is the area of a cell (in cm^2), then the actual capacitance C_m (in F) is given by:

    C_m = c_m * A    (2.2)

C_m is proportional to the membrane area A, so the bigger the neuron, the larger its capacitance. Given that the charge distributed on a surface is proportional to the capacitance (Q = CV), we can see that larger neurons have larger amounts of ions (charge) distributed across their membranes. A typical value for the specific capacitance, which was used in our simulations, is c_m = 1 µF/cm^2.

Resistance R_m. The ion channels allow ionic current to flow through the cell's membrane. Since there is a difference between the membrane voltage V_m and the resting voltage V_rest of the cell, we can model the current flow through the ionic channels with a simple resistance R_m. The actual membrane resistance R_m is specified in terms of the specific resistance (or resistivity) r_m, measured in units of ohm-square-centimeters (Ω cm^2). If A is the area of a cell (in cm^2), then the actual resistance R_m (measured in Ω) is given by:

    R_m = r_m / A    (2.3)

We can see that R_m is inversely proportional to the membrane area A, so big neurons are more leaky than smaller cells. A typical value for the resistivity, which was used in our simulations, is r_m = 20 kΩ cm^2.

2.3.4 Synaptic Reversal Potential and Conductance

An ionic reversal potential V_syn^rev is associated with every synapse.
At this potential there is no net flux of ions through the ionic channel, and the membrane potential across it is stabilized at V_syn^rev [17]. For an excitatory synapse the reversal potential is about 0 mV, while for an inhibitory one V_syn^rev has a value close to the neuron's resting potential (-70 mV). It has been experimentally observed that spiking activity in the presynaptic cell causes a conductance change in the membrane of the postsynaptic cell. This synaptic conductance g_syn(t) depends on the presence of presynaptic action potentials and changes with time. It increases almost instantly to a maximum value g_max and then subsides exponentially within a few milliseconds; this decay time is the synaptic time constant tau_syn. Although ionic channels and synaptic transmission are highly nonlinear phenomena, the presence of a synapse in a membrane patch can be modeled satisfactorily by the synaptic conductance g_syn(t) in series with the synapse's reversal potential V_syn^rev.
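As a quick numerical check of Eqs. (2.2) and (2.3), and of the conductance decay just described, consider a hypothetical cell. The area, peak conductance and synaptic time constant below are assumed example values, while c_m and r_m are typical textbook values:

```python
# Passive membrane quantities for a hypothetical cell (Eqs. 2.2 and 2.3),
# plus the exponential decay of a synaptic conductance after a spike.
import math

c_m = 1e-6        # specific capacitance, F/cm^2 (typical value)
r_m = 20e3        # specific resistance, Ohm cm^2 (typical value)
area = 1e-4       # cell membrane area in cm^2 (assumed example value)

C_m = c_m * area          # actual capacitance in farads
R_m = r_m / area          # actual resistance in ohms
tau_m = R_m * C_m         # membrane time constant; note the area cancels out
print(f"C_m = {C_m:.1e} F, R_m = {R_m:.1e} Ohm, tau_m = {tau_m * 1e3:.0f} ms")

# Synaptic conductance: near-instantaneous rise to g_max, exponential decay.
g_max, tau_syn = 1e-9, 5e-3    # 1 nS peak, 5 ms decay (assumed values)
def g_syn(t, t_spike=0.0):
    """Conductance t seconds after a presynaptic spike at t_spike."""
    return g_max * math.exp(-(t - t_spike) / tau_syn)

# After one synaptic time constant the conductance has dropped to 1/e.
assert abs(g_syn(tau_syn) - g_max / math.e) < 1e-18
```

The cancellation of the area in tau_m = R_m C_m = r_m c_m is why the membrane time constant, unlike C_m and R_m individually, does not depend on cell size.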

2.3.5 Electrical Structure of Neurons

Using the aforementioned electrical properties of neural cells, we can describe the dynamics of the membrane potential V_m(t) in response to an input current using a single RC circuit. The existence of a chemical synapse can be modeled by adding the synaptic conductance g_syn(t) and the reversal potential V_syn^rev in parallel with the RC circuit.

Figure 2.6 Equivalent electrical circuits of a simple neuron (a) and a neuron with a fast chemical synapse (b).

2.4 The Integrate-and-Fire Model

The integrate-and-fire (I&F) model is a very simple neuron model used widely to simulate and analyse neural systems [3]. Despite its simplicity, the I&F model captures key features of real neuron behaviour, like rapid spike generation. The integrate-and-fire model emphasizes the subthreshold membrane voltage properties and does not take into account the complex mechanisms responsible for spike generation, such as the ionic channels. The exclusion of such difficult-to-model biophysical mechanisms makes the I&F model amenable to mathematical analysis and ideal for simulations involving large numbers of neurons. Other neuron models, like the Hodgkin-Huxley model [1], capture the biological mechanisms better but are too complex to be used in computational simulations of larger networks. For example, the Hodgkin-Huxley model describes both the subthreshold and the spiking behavior of the membrane potential, but it uses four coupled differential equations. In 1907 Lapicque [19] introduced the I&F model, a passive circuit consisting of a resistor and a capacitor in parallel, which represent the leakage and the capacitance of the membrane. In this simple model the capacitor is charged until a certain voltage threshold is reached.
At this point a spike occurs (the capacitor discharges) and the voltage is reset to a specific value (V_reset). There are two basic versions of the integrate-and-fire model, described below.

2.4.1 Nonleaky Integrate-and-Fire Neuron

The nonleaky (or perfect) I&F model includes only a single capacitance C, which is charged until a fixed and stationary voltage threshold V_thr is reached. This model does not take into account the membrane resistance, and hence the leak current, which makes it unphysiological. However, it is very simple to describe mathematically. Assuming an input current I(t), the differential equation governing the voltage is:

    C dV(t)/dt = I(t)    (2.4)

When V_thr is reached at time t_i, a spike δ(t - t_i) is triggered and the voltage is reset to V_reset. For t_ref seconds following the spike generation, any input is shunted to ground, making another spike during this absolute refractory period impossible [17].

2.4.2 Leaky Integrate-and-Fire Neuron

In the more general leaky model, the summed contributions to the membrane potential decay with a characteristic time constant tau_m, called the membrane time constant. Again, when the membrane voltage V_m reaches a fixed threshold V_thr, an action potential is initiated. After the spike the voltage is reset to a resting value V_rest, and the neuron is inactivated for a brief time corresponding to the absolute refractory period. The model is described by the following differential equation:

    C_m dV_m(t)/dt = I_leak(t) + I_noise(t) + I_in(t)    (2.5)

where I_leak(t) is the current due to the passive leak of the membrane, I_noise(t) is the current due to noise (0 for noise-free neurons), and I_in(t) is the input current (injected through an electrode, I_inj(t), and/or arising from synaptic input, I_syn(t)). So there are two components of I_in(t):

    I_in(t) = I_inj(t) + I_syn(t)    (2.6)

The leak current is given by:

    I_leak(t) = -(1/R_m) [V_m(t) - V_rest] = -(C_m/tau_m) [V_m(t) - V_rest]    (2.7)

where tau_m = R_m C_m is the passive membrane time constant, depending solely on the membrane capacitance C_m and leak resistance R_m. For our simulations we used a membrane time constant of tau_m = 20 ms.

2.4.3 Synaptic Input

Although the study of a neuron's response to injected current pulses and noise is interesting from an experimental perspective, it is not realistic. In a real cell the main source of input current is synaptic input. Each neuron is synaptically connected to multiple other neurons through its dendrites.
When an external stimulus is presented to an organism (for example a visual stimulus), some cells activate, and the generated spike trains propagate through the axons of the activated neurons, acting as inputs to the cells connected to them. Assuming a presynaptic spike at time t_spike, the postsynaptic current I_syn(t) applied to the neuron at time t is given by

    I_syn(t) = g(t) (V_syn^rev - V_m(t))    (2.8)

which describes an AMPA synapse when the synaptic conductance g(t) decays exponentially:

    g(t) = g_max e^{-(t - t_spike)/tau_syn}    (2.9)

In the previous equations, V_syn^rev is the synapse's reversal potential, g_max the maximum synaptic conductance, and tau_syn the synapse's time constant.
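Putting Eqs. (2.5)-(2.9) together, a leaky integrate-and-fire neuron driven by a single excitatory synapse can be simulated with a few lines of forward-Euler integration. All parameter values below are illustrative choices rather than the settings used in the thesis:

```python
# Forward-Euler simulation of a leaky I&F neuron (Eqs. 2.5-2.9) receiving
# regular presynaptic spikes through one excitatory AMPA-like synapse.
import numpy as np

dt, T = 1e-4, 0.5                    # time step and total duration (s)
tau_m, C_m = 20e-3, 200e-12          # membrane time constant, capacitance
R_m = tau_m / C_m                    # leak resistance, from tau_m = R_m C_m
V_rest, V_thr, V_reset = -70e-3, -54e-3, -70e-3   # potentials (V), assumed
t_ref = 2e-3                         # absolute refractory period (s)

E_syn = 0.0                          # excitatory reversal potential ~ 0 mV
g_max, tau_syn = 10e-9, 5e-3         # peak conductance and decay (assumed)
pre = np.arange(0.05, T, 0.01)       # presynaptic spike train at 100 Hz

V, g, last, out = V_rest, 0.0, -np.inf, []
for step in range(int(T / dt)):
    t = step * dt
    g *= np.exp(-dt / tau_syn)                       # decay, Eq. (2.9)
    g += g_max * np.any(np.abs(pre - t) < dt / 2)    # jump at each spike
    if t - last < t_ref:
        continue                     # refractory: voltage held at reset
    I_leak = -(V - V_rest) / R_m     # leak current, Eq. (2.7)
    I_syn = g * (E_syn - V)          # synaptic current, Eq. (2.8)
    V += dt * (I_leak + I_syn) / C_m # membrane update, Eq. (2.5)
    if V >= V_thr:                   # threshold crossing triggers a spike
        out.append(t)
        V, last = V_reset, t

print(f"{len(out)} output spikes in {T} s")
```

With these values the mean synaptic conductance (about 5 nS) pulls the steady-state voltage above threshold, so the neuron fires regularly once the presynaptic train starts at t = 0.05 s.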

Chapter 3
Multiplication in the Nervous System

3.1 Introduction

In the literature, experimental evidence can be found that supports the existence of multiplicative mechanisms in the nervous system. Studies have shown that optomotor control in the fly relies on neural circuits performing multiplication [12], [14]. More recent experiments have found a multiplicative-like response in auditory neurons of the barn owl's midbrain [23]. The exact way multiplication is implemented in neurons is unclear; however, there is a lot of interest in its details, driven by the experimental observations that imply its existence. Koch and Poggio [18] have discussed different biophysical properties, present in single cells, capable of producing multiplicative interactions. Other neuronal models implementing multiplicative operations can also be found in the literature (for example [27]). In this chapter we first try to show why multiplication is important and how it could play a central role in decision making and perceptive tasks. Next, we present biological evidence of multiplicative operations in the neural system, and at the end we describe some of the models that can be found in the literature.

3.2 Importance of Multiplication

The simplest neuron models operate under a regime of thresholding: if the sum of all inputs, excitatory and inhibitory (inhibitory synapses have a negative weight while excitatory ones have a positive weight), exceeds a certain threshold, then the neuron is active; otherwise there is no spike generation. This binary threshold function is the only nonlinearity present in the model. In artificial neural networks, sigmoid functions are used to give a smoother input-output relationship. The threshold function may be the dominant nonlinearity present in neurons, but it is not the only one. As we will see in the next section, the literature is full of experimental evidence that supports the presence of multiplicative operations in the nervous system.
Given that multiplication is the simplest possible nonlinearity, neuronal networks implementing multiplicative interactions gain considerable information-processing power [18]. Below we will try to show how powerful this simple operation is, and we will highlight its connection with the logical AND operation. We will also see how important multiplication is for decision making tasks.

3.2.1 Function Approximation

The Weierstrass approximation theorem states that every continuous function defined on an interval [a, b] can be uniformly approximated as closely as desired by a polynomial function. More

23 12 Chapter 3 - Multiplication in the Nervous System formally the theorem has the following statement: Theorem Suppose f is a continuous complex-valued function defined on the real interval [a, b]. For every ǫ >, there exists a polynomial function p over C such that for all x in [a, b], we have f(x) p(x) < ǫ, or equivalently, the supremum norm f p < ǫ. If f is real-valued, the polynomial function can be taken over R. The only nonlinear operation present in the construction of a polynomial is multiplication. As a result if neural networks are capable of doing multiplicative-like operators then they could approximate under weak conditions all smooth input-output transductions [18]. A polynomial can be expressed as the sum of a set of monominals. A monominal of order k can be modeled with a multiplicative neural unit which has k inputs. P(x) = a 1 + b 1 x 1 + b 2 x 2 + c 1 x c 2x 1 x (3.1) Relationship Between Operators In order to understand the importance of multiplication we should first understand that multiplication is in fact a close relative of another far more fundamental operation, the logical AND ( ) operation. In Boolean algebra x 1... x i... x n is true only if x i is true for all i. If there exists some x i which is false then the whole expression is false. This behavior is similar to the multiplication with zero in classical algebra: x =, x R. More strictly, the behavior of the operator is similar to the minimum function. On the other hand, the second more common Boolean operation OR ( ) can be parallelized with addition or more strictly with the maximum function. On the following table we illustrate these relations, in a truth-like table form. Relationship Between Operators x y x y x y min(x, y) x y x + y max(x, y) Multiplication and Decision Making The parallelism between multiplication and the logical AND operation could explain the importance of multiplicative neural mechanisms from a decision making perspective. 
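The correspondences in the table can be checked mechanically. The short script below (illustrative, not part of the thesis) verifies that for binary inputs the product coincides with both AND and the minimum, while OR coincides with the maximum, and that plain addition overshoots OR when both inputs are active:

```python
# Check the operator correspondences for all binary input pairs.
for x in (0, 1):
    for y in (0, 1):
        # AND behaves like multiplication and like the minimum.
        assert (x and y) == x * y == min(x, y)
        # OR behaves like the maximum.
        assert (x or y) == max(x, y)

# Addition only loosely parallels OR: it overshoots when both inputs fire.
assert 1 + 1 != max(1, 1)
print("AND/OR correspondences verified")
```

This is the discrete core of the idea pursued later in the thesis: the minimum function is the rate-level analogue of the logical AND.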
Logical AND (∧) operations are fundamental in such tasks and, generally, in the organization of perception. We will try to demonstrate this importance with a simplified example. Imagine a bird whose diet includes an edible red flower. In order to discriminate this flower from another, similar one, our bird detects its characteristic odour. So it eats only those red flowers which also have the desired smell. This is nothing more than an AND-based perceptive task. Assume that there are two regions in the bird's neural system, one responsible for recognising the red color and the other for recognising the desired odour. The outputs of these neural regions are combined in a third region which decides if a specific flower is edible or not. If the two outputs were just added, then a very strong output of the color detector alone would produce a stronger output signal than a modest output of both detectors. That could lead the bird to eat a poisonous flower. If on the other hand the outputs of the two detectors are multiplied, then the performance would be better. The absence of one feature (color or odour) would suppress the output and prevent the

bird from classifying the flower as edible. If on the other hand both features are present but weak, then the multiplicative operation would lead to a supra-linear enhancement of the output signal. Through this intuitive example we showed that perceptive tasks which include AND (∧) operations can be modeled better using multiplication than simple addition. However, it is not known to what extent multiplicative-like mechanisms are present in the nervous system. In the next section we review the literature, presenting evidence of such multiplicative behaviors. Note, however, that for binary signals, when imposing a threshold, the difference between the AND operation and addition is minor.

3.3 Biological Evidence of Multiplication

Multiplicative operations are thought to be important in sensory processing. Although research on this topic is limited, there is significant experimental evidence that reinforces the idea of multiplicative biophysical mechanisms. The most interesting clue of multiplicative properties of neurons can be found in the auditory system. There is also evidence that multiplication is carried out in the nervous system for motion perception tasks [18]. In the following sections we will present these clues, trying to underline the importance of multiplication.

3.3.1 Barn Owl's Auditory System

Barn owls are able to use their very accurate directional hearing to strike prey in complete darkness. This impressive capability is based on the owl's very complex auditory system, which among other specializations includes asymmetric external ears. As a consequence of this asymmetry, the owl's auditory system computes both interaural time (ITD) and level (ILD) differences in order to create a two-dimensional map of auditory space [22]. Interaural level differences (ILDs) vary with elevation, allowing barn owls to use ILDs in order to localize sounds in the vertical plane.
Similarly, interaural time differences (ITDs) are used for localization in the horizontal plane. Neuronal sensitivity to these binaural cues first appears in the owl's brainstem, with separate nuclei responsible for processing ILDs and ITDs. Both ITD and ILD information are merged in space-specific neurons that respond maximally to sounds coming from a particular direction in space. The parallel pathways that process this information merge in a region known as the external nucleus of the inferior colliculus (ICx), eventually leading to the construction of a neural map of auditory space (see Figure 3.1). The research of Pena and Konishi [22] suggests that the space-specific neurons in the barn owl ICx tune to the location of an auditory stimulus by multiplying postsynaptic potentials tuned to ITD and ILD. So the subthreshold responses of these neurons to ITD-ILD pairs have a multiplicative rather than an additive behavior. Owls were anesthetized, and postsynaptic potentials generated by ICx neurons in response to different combinations of ITDs and ILDs were recorded with intracellular electrodes. Acoustic stimuli were digitally synthesized with a personal computer and delivered to both ears by calibrated earphone assemblies, giving rise to the various ITD-ILD pairs [23]. The researchers discovered that a model based on the product of the ITD and ILD inputs could account for most of the observed responses. An additive model was also tested, but it was not efficient and could not reconstruct the original data matrix as well as the multiplicative model. In Figure 3.2 we can see the success of the multiplicative model in reconstructing the measured membrane potential for different ITD-ILD pairs.

Figure 3.1 Space-specific neurons in the barn owl's auditory system respond maximally to sounds coming from a particular direction in space. (A) A drawing of an ICx neuron and its axon projecting to the optic tectum (OT). (B) The same neuron labeled with neurobiotin. (C) Postsynaptic potentials in response to different ITD-ILD pairs. Dotted lines indicate the mean resting potential. (D) Spiking responses of the same neuron to different ITD-ILD pairs. The large peak is the excitatory center and the flat area around it is the inhibitory surround [compare (C) and (D)]. Negative (−)ITD and negative (−)ILD mean, respectively, sound in the ipsilateral ear leading and louder. (Figure and caption taken from [22].)

3.3.2 The Lobula Giant Movement Detector (LGMD) of Locusts

Gabbiani et al. [11] mention that there is evidence for the existence of a multiplicative operation in the processing of looming stimuli. They experimented with a neuron in the locust visual system (the LGMD neuron) that responds well to objects looming on a collision course towards the animal. Multiplication could be used for the computation of an angular threshold that could trigger avoidance of looming objects. The firing rate of the LGMD neuron was monitored by recording the action potentials of its postsynaptic target neuron (DCMD). The insect was presented with black squares or disks on a bright background, which simulated an approaching object. The monitored firing rate initially increased up to a peak, and finally decayed as the approach ended.

If we denote by l the looming object's half-size, by t the time to collision and by v its approach velocity, then the angular size is given by [1]:

θ(t) = 2 arctan( l / (v t) )    (3.2)

In a beautiful analysis, the researchers suggest that an angular threshold might be the image-based retinal variable used to trigger escape responses in the face of an impending collision. Indeed, a leg flexion (presumably in preparation for an escape jump) has been shown to follow the peak LGMD firing rate with a fixed delay [11]. The researchers tried to figure out how the angular threshold is calculated by the insect's nervous system. They tried different models, based on the size of the approaching object and its velocity, that could describe the recorded responses of the LGMD. One input was excitatory and the other one inhibitory. By using selective activation and inactivation of pre- and postsynaptic inhibition, they found that postsynaptic inhibition played a very important role, suggesting that multiplication is implemented within the neuron itself [1]. Experimental and theoretical results are consistent with multiplication being implemented by subtraction of two logarithmic terms followed by exponentiation via active membrane conductances, according to a · (1/b) = exp(ln(a) − ln(b)). In Figure 3.3 we can see some of their results.

Figure 3.2 Multiplicative combination of ILD and ITD inputs. (A) Raw data matrix. (B) Reconstruction of the matrix from the computed left and right singular vectors and the first singular value. Addition of V [DC offset (blue area)] that minimizes the second singular value almost restores the original matrix. (C) ITD curve. (D) ILD curve. (E) Computed left singular vector. (F) Computed right singular vector. (Figure and caption taken from [22].)
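The two ingredients of this scheme can be sketched numerically. The code below is illustrative only (the parameter values are arbitrary, not taken from [11]): it implements the angular size of Eq. 3.2 and the log-subtract-exp identity used to explain how the LGMD multiplies by an inverse:

```python
import math

def angular_size(l, v, t):
    """Angular size theta(t) = 2*arctan(l/(v*t)) of a looming object,
    with half-size l, approach speed v, and time to collision t (Eq. 3.2)."""
    return 2.0 * math.atan(l / (v * t))

def log_exp_divide(a, b):
    """Multiplication by an inverse, a * (1/b) = exp(ln a - ln b),
    implemented the way the LGMD is hypothesised to: subtract two
    logarithmic terms, then exponentiate. Valid for positive a, b."""
    return math.exp(math.log(a) - math.log(b))

# The identity reproduces ordinary division...
assert abs(log_exp_divide(6.0, 3.0) - 2.0) < 1e-12
# ...and the angular size grows as the time to collision shrinks.
print(angular_size(0.05, 2.0, 1.0), angular_size(0.05, 2.0, 0.1))
```

In the biological model the logarithm and exponential are attributed to active membrane conductances rather than explicit arithmetic; the sketch only shows that the algebra works out.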

Figure 3.3 Transformation between membrane potential (V_m) and firing rate at the spike initiation zone. (a) Approaching stimulus (top), recordings from the DCMD (middle, extracellular) and from the LGMD (bottom, intracellular) close to its spike initiation zone (b, inset). The orange trace is the membrane potential after median filtering (V_m). Inset, bracketed portion of V_m expanded 3 times. (b) Top panel presents the median-filtered membrane potential (orange line is the same trace as in a; several repetitions). Bottom traces were recorded after TTX application to the axon (inset). (c) Mean traces in control and TTX (from b) were fitted with a third-order polynomial (black) and used to compute the mean temporal difference (32 ms) in membrane potential over the response rising phase. (d) Fit of the mean instantaneous firing rate, g, as a function of the mean, median-filtered membrane potential (mean ± s.d.; solid and dotted black lines) with linear, third-power and exponential models. (Figure and caption taken from [11].)

3.3.3 Other Evidence

Multiplicative evidence can also be found in the optomotor control of the fly [12], [14]. The fly's optomotor response to transient stimuli was studied under open-loop conditions. The stimuli used were moving edges and stripes. A comparison of the fly's responses to these stimuli led the researchers to the conclusion that progressively moving patterns (from front to back) elicit stronger responses than regressively moving ones (from back to front). The existence of such deviations in the fly's responses to different moving edges suggests the existence of nonlinearities in the insect's perceptual system.

Research done by Reichardt [24] has also suggested that the optomotor response of insects to moving stimuli is mediated by a correlation-like operation, which can be seen abstractly as a form of multiplication. Similar experimental observations exist for pigeons as well [28]. Finally, Andersen [1] reviews research papers which indicate that sensory signals from many modalities converge in the posterior parietal cortex in order to code the spatial locations of goals for movement. These signals are combined using a specific gain mechanism.

3.4 Existing Models

In the literature there are some papers which propose models for multiplicative neural operations. Most of these models are single-cell biophysical mechanisms which could give rise to a multiplicative-like operation. However, we should note that research in this field is limited, despite the importance of understanding how multiplicative-like operations are neurally implemented. In our proposal we are not concerned with single-cell models, but with small feed-forward networks of Integrate-and-Fire neurons. In the following paragraphs we briefly explain some of the multiplicative models found in the literature.

3.4.1 Multiplication via Silent Inhibition

Silent inhibition can, in some special cases, give rise to a multiplicative behavior. We have seen in the previous chapter that the synaptic current is given by:

I_syn(t) = g_syn(t) ( V_syn^rev − V_m(t) ).    (3.3)

If we suppose that the synaptic input changes slowly [18], then we can assume that the synaptic conductance g_syn(t) changes slowly with time. As a result there will be a stationary current, and g_syn will be the constant synaptic input. If R_syn is the synaptic resistance, then using Ohm's law, V = R·I, we obtain the following equation for the membrane voltage:

V_m = ( g_syn R_syn V_syn^rev ) / ( 1 + g_syn R_syn ).    (3.4)

If the synaptic reversal potential is close to the resting potential of the cell (shunting inhibition), then the action of this synapse on V_m remains invisible. From the previous equation we obtain a multiplicative relation if we assume that the product of the synaptic resistance and synaptic conductance is small, g_syn R_syn ≪ 1:

V_m ≈ g_syn R_syn V_syn^rev.    (3.5)

If we also have an excitatory synaptic input with an associated conductance change g_e and a reversal potential V_exc^rev, then using a Taylor expansion we obtain [18]:

V_m ≈ V_exc^rev R_syn ( g_e − g_e^2 R_syn − g_e g_syn R_syn + … )    (3.6)

which includes quadratic contributions from the excitatory synaptic terms and higher-order terms from combinations of the excitatory and inhibitory inputs.

3.4.2 Spike Coincidence Detector

Srinivasan and Bernard [27] used an input spike coincidence detector in order to model multiplication-like responses. The main aim of the authors was not to model exact multiplication

but to describe a scheme by which a neuron can produce a response proportional to the product of the input signals it receives from two other neurons. They investigated a neuronal model in which the neuron produces a spike only if it receives two spikes, from the two external neurons, that are coincident in time or nearly so. In Figure 3.4 we can see how such a neuron operates.

Figure 3.4 Neuron C receives input from two neurons A, B. Cell C fires a spike only if two input action potentials arrive within Δ ms of each other. Only in this case does the membrane voltage reach the threshold. As a result, the output firing rate of neuron C is proportional to the firing rates of A, B. (Figure taken from [27].)

In order to model coincidence detection, the proposed neuron spikes when its membrane voltage V_m is above a certain threshold V_thr. The presence of only one presynaptic spike cannot cause a large enough EPSP to discharge the cell, but if two spikes arrive within Δ ms, then the voltage threshold is reached and an output spike is generated. If V_max is the maximum membrane potential the neuron can reach from a single input spike, then there is an exponentially decaying relation between membrane voltage and time,

V(t) = V_max e^(−t/τ)    (3.7)

where τ is the neuron's time constant. The authors make the assumption that V_max < V_thr < 2V_max, so a single spike cannot initiate a postsynaptic action potential. If we have an input spike at time t, there should be another spike within an interval of Δ ms before (or after) t, i.e. in [t − Δ, t + Δ], in order to have a postsynaptic action potential. Given two spikes at t and t + Δ that make the neuron just reach threshold, Δ can easily be determined from the equation:

V_max e^(−Δ/τ) + V_max = V_thr.    (3.8)

The authors assumed statistical independence of the two input firing rates (a natural assumption in most cases, for example when the stimuli causing activation of the two presynaptic cells

are independent) and showed that the output firing rate is proportional to the product of the two input firing frequencies [27]:

f_out = 2Δ f_A f_B    (3.9)
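Eq. 3.9 can be checked directly by simulation. The sketch below is illustrative (the rates, window and duration are assumed, not from [27]): it drives a coincidence detector with two independent Poisson spike trains and compares the measured output rate with the prediction 2Δ f_A f_B:

```python
import bisect
import random

def poisson_train(rate, duration, rng):
    """Spike times of a homogeneous Poisson process with the given rate (Hz)."""
    t, spikes = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > duration:
            return spikes
        spikes.append(t)

rng = random.Random(0)
f_a, f_b, delta, duration = 40.0, 30.0, 0.002, 200.0  # Hz, Hz, s, s
a = poisson_train(f_a, duration, rng)
b = poisson_train(f_b, duration, rng)

# Each A spike with at least one B spike within +-delta yields one
# output spike of the coincidence-detecting neuron C.
coincidences = sum(
    1 for t in a
    if (i := bisect.bisect_left(b, t - delta)) < len(b) and b[i] <= t + delta
)
f_out = coincidences / duration
print(f_out, 2 * delta * f_a * f_b)  # simulated rate vs. predicted ~4.8 Hz
```

For small Δ the simulated rate lands close to the prediction; the small residual bias comes from the first-order approximation behind Eq. 3.9.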


Chapter 4

Multiplication with Networks of I&F Neurons

4.1 Introduction

In the previous chapter we presented evidence of multiplicative behavior in neural cells. We also argued for the importance of this simple nonlinear operation. Despite its simplicity, it is unclear how biological neural networks implement multiplication. The research done in this field is limited, and the models found in the literature (we presented some of them in the previous chapter) are complex single-cell biophysical mechanisms. We try to approach multiplication using very simple networks of Integrate-and-Fire neurons and a combination of excitatory and inhibitory synapses. In this chapter we are going to present the underlying theory and the proposed models. We also analyze in depth the main idea behind this dissertation, which is the use of the minimum function for implementing a neural multiplicative operator.

4.2 Aim of the Thesis

The aim of this thesis is to find feed-forward networks of Integrate-and-Fire neurons which multiply the input firing rates. The problem can be defined as follows:

Problem. Given two firing rates ρ_1, ρ_2 [in Hz], find a network of Integrate-and-Fire neurons whose output spike train has a firing rate ρ_out where

ρ_out = ρ_1 · ρ_2    (4.1)

In the next sections we will see that exact multiplication is not possible, so we will try to approximate it. Before presenting the proposed networks we give the definitions of firing rates and rate coding.

4.3 Firing Rates and Rate Coding

The way information is transmitted in the brain is a central problem in neuroscience. Neurons respond to a certain stimulus by generating action potentials, which form spike trains. Spike trains are stochastic, and repeated presentation of the same stimulus does not cause identical firing patterns. It is believed that information is encoded in the spatiotemporal pattern of these

trains of action potentials. There is a debate between those who support that information is embedded in temporal codes and researchers in favor of rate coding. In the rate coding hypothesis, one of the oldest ideas about neural coding [3], information is embedded in the mean firing rates of a population of neurons. On the other hand, temporal coding relies on the precise timing of action potentials and inter-spike intervals.

4.3.1 Firing Rates

Suppose that we record the output of an integrate-and-fire neuron for a specific time interval of duration T. In total n spikes are observed, which occur at times t_i, i = 1, …, n. Then the neural response r(t) can be represented as a sum of Dirac functions:

r(t) = Σ_{i=1}^{n} δ(t − t_i)    (4.2)

The specific timing of each action potential is useful only if we use temporal coding. In this thesis we study the multiplication of firing rates, so the times t_i are not useful. Due to their stochastic nature, neural responses can be characterized by firing rates instead of specific spike trains [].

Figure 4.1 Firing rates approximated by different procedures. (A) A spike train from a neuron in the inferotemporal cortex of a monkey, recorded while the animal watched a video on a monitor under free viewing conditions. (B) Discrete-time firing rate obtained by binning time and counting spikes. (C) Approximate firing rate determined by sliding a rectangular window function along the spike train. (D) Approximate firing rate computed using a Gaussian window function. (E) Approximate firing rate using the window function w(τ) = [α^2 τ exp(−ατ)]_+. (Figure and caption taken from [].)

If there is low variability in the spiking activity, then the firing rate can be accurately approximated by the spike count rate, which is nothing more than the frequency of the n action potentials during time T:

ρ = n/T = (1/T) ∫_0^T r(t) dt    (4.3)

Of course, if there is variability in the frequency of spikes this approximation is not sufficient, and a time-dependent firing rate should be used. The firing rate at time t can in this case be defined from the number of spikes between t and t + Δt, where Δt is a small time interval. So mathematically we can express ρ(t) as:

ρ(t) = (1/Δt) ∫_t^{t+Δt} ⟨r(τ)⟩ dτ    (4.4)

where the trial-averaged neural response ⟨r(t)⟩ is the sum of spike occurrences over the interval [t, t + Δt] for K trials, divided by the number of trials K.

4.3.2 Rate Coding

According to the rate coding hypothesis, information is encoded in the mean firing rate of a population of neurons and not in the exact timing of the action potentials. It has been observed that different stimuli cause spike trains of different frequencies in the same neurons. Generally, as the intensity of a stimulus increases, so does the frequency or rate of action potentials. These experimental observations have led to the formulation of the rate coding hypothesis. Theunissen [29] defines rate coding as a coding scheme in which there is significant correlation between the relevant stimulus parameter and the mean number of spikes in the elicited response within the encoding window, and no additional correlation between the stimulus parameter and any higher-order moments (or higher-order principal components) of the elicited spike pattern. Unlike rate coding [32], temporal coding relies on the precise timing of action potentials or inter-spike intervals. Combined with traditional rate coding models, temporal coding can provide additional information with the same rate. There is strong evidence that temporal codes are used for cortical information transmission, especially after the discovery of spike-timing-dependent plasticity [26].
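The spike-count rate of Eq. 4.3 and a binned, time-dependent estimate in the spirit of Eq. 4.4 can be sketched in a few lines. The code is illustrative (single trial; the spike times are made up for the example):

```python
import numpy as np

def spike_count_rate(spike_times, duration):
    """Overall rate rho = n/T of Eq. 4.3, in Hz."""
    return len(spike_times) / duration

def windowed_rate(spike_times, duration, dt):
    """Time-dependent rate: spikes counted in bins of width dt,
    divided by dt (single-trial version of Eq. 4.4), in Hz."""
    edges = np.arange(0.0, duration + dt, dt)
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts / dt

spikes = [0.013, 0.152, 0.161, 0.410, 0.420, 0.431, 0.870]  # seconds
print(spike_count_rate(spikes, 1.0))     # 7 spikes in 1 s -> 7.0 Hz
print(windowed_rate(spikes, 1.0, 0.25))  # counts [3,3,0,1] -> [12,12,0,4] Hz
```

Averaging `windowed_rate` over K repeated trials gives the trial-averaged estimate of Eq. 4.4.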
Theunissen [29] defines temporal coding as a coding scheme in which there is significant additional correlation between the relevant stimulus parameter and any moments of the elicited spike pattern having higher order than the mean.

4.4 Excitation vs. Inhibition

We have seen in Chapter 2 that the main difference between an excitatory and an inhibitory synapse is the reversal synaptic potential. In this section we analyze how the type of the synapse affects the response of the cell to an input spike train. The synaptic current I_syn(t) is given by the following equation and depends on the synaptic conductance g(t) and the difference between the reversal potential V_syn^rev and the membrane voltage V_m:

I_syn(t) = g(t) ( V_syn^rev − V_m(t) )    (4.5)

where g(t) changes with time, as we have already seen. Consider a time moment t_0 at which the conductance g(t_0) is fixed. For this time moment we plot the synaptic current for different values of the reversal potential and the membrane voltage. For an inhibitory synapse the reversal synaptic potential is close to the resting potential of the neuron, V_rest, which in most cases is about −70 mV. On the other hand, the reversal potential for an excitatory synapse is close to 0 mV. In Figure 4.2 we can see that as V_syn^rev increases, so does I_syn. For values of the reversal potential close to the ones observed in inhibitory synapses we

have negative synaptic current, while a reversal potential of 0 mV can produce a synaptic current of about 3 pA.

Figure 4.2 Plot of the synaptic current I_syn(t) for different values of the reversal synaptic potential V_syn^rev and the membrane voltage V_m. The synaptic conductance was assumed constant.

The positive synaptic current of an excitatory synapse has the following effect: an action potential in the presynaptic cell increases the probability of an action potential occurring in the postsynaptic cell. On the other hand, the minor synaptic currents caused by inhibitory synapses result in inconsiderable changes in the membrane voltage V_m and, as a result, in a low firing probability. Of course, the existence of neural noise can cause the firing of a spike even if there is no input current. In the case of both excitatory and inhibitory inputs to a neuron, the negative inhibitory current acts subtractively on the positive current of excitation. In Figure 4.3 we can see how the firing sequence of a neuron depends on the kind of synapse, for the same spike train input.

Figure 4.3 Output spike train of an I&F neuron after stimulation with the same sequence of presynaptic action potentials through an excitatory or an inhibitory synapse. We can see how the output depends on the kind of the synapse.

4.4.1 Subtractive Effects of Inhibitory Synapses

A synapse is called inhibitory when its reversal potential is less than the threshold for action potential generation. If it is close to the resting potential of the cell, then the inhibition is called shunting and has a divisive effect on subthreshold EPSP amplitudes [7]. However, this divisive effect is not observed in firing rates. Holt and Koch [16] have shown that there is a subtractive effect on the firing rate. This happens because the current that passes

through the shunting conductance is independent of the firing rate. The voltage at the shunting site cannot take a larger value than the spiking threshold, and as a result the inhibitory synaptic current is limited for different firing rates. Under these circumstances a linear subtractive operation is implemented. We simulated an Integrate-and-Fire neuron which had an excitatory and an inhibitory synapse. The reversal potential of the inhibitory one was close to the resting potential of the neuron, making it shunting. In Figure 4.4 we can see that there is indeed a subtractive effect on firing rates. The firing rate of the spike train used as input to the excitatory synapse is 80 Hz, while the inhibitory firing rate changes from 0 to 80 Hz. The firing rates of the output spike train are plotted, along with the actual difference of the two input firing rates ρ_exc − ρ_inh.

Figure 4.4 A noisy I&F neuron with two synaptic inputs (one excitatory and one inhibitory) was stimulated and the output firing rates were recorded. The firing rate of the excitatory input was kept constant at 80 Hz while the firing rate of the inhibitory input was increased from 0 to 80 Hz in fixed steps. Black circles show the recorded output firing rates, while the red triangles are the difference between the excitatory and inhibitory firing rates ρ_exc − ρ_inh. We can see that there is a subtractive effect, as expected.

4.5 Rectification

Consider an Integrate-and-Fire (I&F) neuron which is stimulated with excitatory presynaptic action potentials whose average firing rate is ρ_in^exc, and with inhibitory ones having a firing rate ρ_in^inh. The output spike train depends on both the excitatory and inhibitory inputs, and its firing rate is ρ_out. If ρ_in^inh = 0 Hz, then the output firing rate will be close to the firing rate of the spike train arriving at the excitatory synapse.
On the other hand, the absence of excitatory input, or the presence of only inhibitory presynaptic spikes, will deter the neuron from spiking.

If we have both excitatory and inhibitory synapses, then, as we have seen, the inhibition has a subtractive effect on firing rates. Since the firing rate of a neuron cannot take a negative value, the output will be a rectified copy of the input:

ρ_out = max( 0, ρ_in^exc − ρ_in^inh ) = [ ρ_in^exc − ρ_in^inh ]_+    (4.6)

where [·]_+ stands for rectification. Since in our model we care only about firing rates (and not membrane voltage dynamics), we should note that rectification will be the only nonlinearity present in the approximation of multiplication. A noisy integrate-and-fire neuron with two synaptic inputs, one excitatory and one inhibitory, was simulated for different values of ρ_in^exc and ρ_in^inh. The output firing rate ρ_out was in all cases close to the rectified difference of the two inputs, ρ_in^exc − ρ_in^inh, as suggested by theory. The expected output firing rates and the recorded ones are presented in the following figure, for both simple rectification and a power-law nonlinearity.

Figure 4.5 An I&F neuron with two synaptic inputs was simulated for different input firing rates in order to examine if there is rectifying behavior. Both the excitatory and inhibitory input rates ρ_in^exc, ρ_in^inh were gradually increased from 0 to 90 Hz in fixed steps. (a) Recorded output firing rates. (b) Expected input-output relation according to the equation ρ_out = [ρ_in^exc − ρ_in^inh]_+. (c) Expected input-output relation according to the equation ρ_out = [ρ_in^exc − ρ_in^inh]_+^1.4.
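At the level of firing rates, the two input-output relations used in Figure 4.5 reduce to one line each. The sketch below is illustrative (it is a rate model, not the thesis's spiking simulation): the output is the rectified difference of the input rates, optionally raised to an exponent n close to 1:

```python
import numpy as np

def rectified_rate(rho_exc, rho_inh, n=1.0):
    """Output rate of the rectifying neuron: [rho_exc - rho_inh]_+ ** n.
    n=1.0 gives simple rectification; n>1 gives the power-law variant."""
    return np.maximum(0.0, rho_exc - rho_inh) ** n

exc = np.arange(0.0, 100.0, 10.0)  # excitatory input rates in Hz
print(rectified_rate(exc, 40.0))          # zero until exc exceeds 40 Hz
print(rectified_rate(exc, 40.0, n=1.4))   # supra-linear above threshold
```

In the spiking simulation this clean relation is blurred by noise and by synaptic temporal summation, which is exactly the error structure discussed next.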

We can see that the neuron responds as expected, but there are some errors. In Figure 4.6 we plot the error surface, which is nothing more than the difference between the observed output firing rates and the expected ones. The maximum error is 18 Hz, and it was recorded for large firing rates of the excitatory input. This is natural, considering that for large excitatory input firing rates the inter-spike intervals are smaller than the synaptic time constant, giving a synaptic current capable of generating postsynaptic spikes. This, in combination with the noisy background current, can explain the error. For smaller input firing rates the observed error is only a few Hz in most cases.

Figure 4.6 Error surfaces for a rectifying I&F neuron for different values of the input firing rates. If ρ_exp is the expected output firing rate and ρ_obs the observed one for two given values of the input firing rates ρ_in^exc, ρ_in^inh, then the error is defined as ρ_obs − ρ_exp. (a) Error for simple rectification. (b) Error for the power law. For both cases the error is similar.

4.5.1 Power-law Nonlinearities

The approximation of multiplication using networks that compute the minimum function can be improved if the input-output relation is slightly nonlinear. Generally, a nonlinear input-output relation of a rectifying neuron will have the form

ρ_out = ( [ ρ_in^exc − ρ_in^inh ]_+ )^n    (4.7)

where n is close to 1. It can be found [3] that for n = 1.4 the error between the approximation and exact multiplication is minimal (see also Figure 4.7). Supra-linear relations of this kind have been observed in cat visual cortex [2]. Anderson et al. compared the orientation tuning of spikes and membrane potential responses in single cells.
They showed that noise can smooth the relation between membrane potential and spike rate, since even subthreshold responses are capable of generating spikes. Miller and Troyer [2] extended Anderson's work, proving analytically that a power-law nonlinearity is the only input-output function that converts contrast-invariant voltage tuning into contrast-invariant spiking tuning. But the most important observation they made is that the addition of Gaussian noise to the cell causes a relationship between membrane voltage and firing rate that is well approximated by an expansive power law.

4.6 Approximating Multiplication

Ideally we want to find a network of Integrate-and-Fire neurons whose output is the product of two input firing rates ρ_1, ρ_2. But exact multiplication is not possible, since the only nonlinear

operator we have is the rectification, so we will try to approximate multiplication using the available functionalities. One might ask how we define the abstract term "approximation of multiplication". What we actually want to approximate is the landscape of output firing rates. Imagine that we have a first population of N neurons, each responding with a specific firing rate f_i, 1 ≤ i ≤ N, to a stimulus s1, and a second population of M neurons which respond with firing rates g_j, 1 ≤ j ≤ M, to a second stimulus s2. Consider a set of N × M neurons arranged in matrix form, where the (i, j) element is selective to the i-th neuron of the first population and the j-th neuron of the second population. The output firing rates of these N × M neurons can be visualized as a three-dimensional plot, where the x and y axes correspond to the f and g firing rates and the z axis to the output firing rate. If the network approximates multiplication, this three-dimensional plot should resemble the landscape obtained by multiplying every pair f_i, g_j and plotting the corresponding products. Since exact multiplication is not possible, in order to see how well a certain network approaches the desired output we normalize the output firing rates and check which network minimizes the error surface. The only tools we have for constructing networks that perform a multiplication-like operation are excitation, subtractive inhibition and rectification. Multiplication is a close relative of another, more fundamental operation: the logical AND (∧), which is the binary equivalent of ×. Logical AND (like other logical operators) is fundamental in perceptive tasks, which is one more reason for the importance of multiplication. In fact, the logical AND is nothing more than the minimum of two binary digits, so we can abstractly approximate the multiplication of two firing rates by their minimum.
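The equivalence between AND, minimum, and multiplication on binary values can be checked directly. A minimal Python sketch (the thesis itself uses MATLAB/Simulink; Python is used here only for illustration):

```python
# On binary digits, logical AND, the minimum, and multiplication coincide:
# this is the intuition behind approximating rate multiplication by min.
for a in (0, 1):
    for b in (0, 1):
        assert min(a, b) == a * b == int(a and b)
print("AND == min == * on binary digits")
```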
Hypothesis The multiplication of two firing rates ρ1, ρ2 can be approximated using the minimum function:

ρ1 · ρ2 ≈ min(ρ1, ρ2)    (4.8)

Indeed, multiplication is accurately approximated by the minimum function [3]. Using rectifying neurons it is very simple to create small networks which theoretically output a spike train whose firing rate is the minimum of the two input rates ρ1, ρ2. For example,

min(ρ1, ρ2) = [ρ1 − [ρ1 − ρ2]_+]_+

Let ρ1(x) = 4/(1 + e^−x) and ρ2(y) = sin(y) + 1, where x and y take values from predefined intervals. In the following figure we can see the resemblance between the actual multiplication and the landscape obtained using the minimum function. If a power-law nonlinearity is present, the error is minimal.

4.7 Proposed Networks

Since it is very easy to find networks of integrate-and-fire neurons that implement operations like the minimum function, and given that multiplication can be accurately approximated by the minimum, we can restate our initial problem:

Problem Given two firing rates ρ1, ρ2 [in Hz], find a network of integrate-and-fire neurons whose output spike train has a firing rate ρ_out such that

ρ_out = min(ρ1, ρ2)    (4.9)
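The rectification identity above, and the supra-linear variant of Eq. 4.7, can be sketched numerically. This is an illustrative Python translation, not the thesis's Simulink implementation; the exponent n = 1.4 is the value cited from [3]:

```python
import numpy as np

def relu(x):
    """Rectification [x]_+ = max(x, 0)."""
    return np.maximum(x, 0.0)

def min_via_rectification(r1, r2):
    """min(r1, r2) expressed with rectifications only: [r1 - [r1 - r2]_+]_+."""
    return relu(r1 - relu(r1 - r2))

def powerlaw_min(r1, r2, n=1.4):
    """The minimum passed through the supra-linear relation of Eq. 4.7."""
    return min_via_rectification(r1, r2) ** n

# The rectified expression reproduces the true minimum exactly.
r1 = np.linspace(0.0, 100.0, 201)
r2 = np.linspace(100.0, 0.0, 201)
assert np.allclose(min_via_rectification(r1, r2), np.minimum(r1, r2))
```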

Figure 4.7 Multiplication of the firing rates ρ1(x) = 4/(1 + e^−x) and ρ2(y) = sin(y) + 1. (a) Exact multiplication. (b) Approximation using the minimum function. (c) Approximation when there is a supra-linear input-output relation.

In the following sections we present two networks that compute the minimum of the two input firing rates and were used for the simulations. We should note that these networks are not unique; one could find many other networks that implement the same function. However, their simplicity and the ease with which they can be implemented computationally made us select them. Only feed-forward connections are used, although the computation could also be implemented using feedback connections [3].

Network 1

The first proposed network can be seen in Figure 4.8. Arrows represent excitatory synapses, while circles stand for inhibitory synapses. The minimum function is obtained using the rectification function and a combination of excitatory and inhibitory synapses. In this network a lateral connection is used. If f and g are the input firing rates, then the firing rate of the output is their minimum. The following rectifying function is implemented:

min(f, g) = [f − [f − g]_+]_+    (4.10)

Network 2

The second network (Figure 4.9) computes twice the minimum, using four integrate-and-fire neurons in a two-layer feed-forward architecture. The rectifying function is:

2 min(f, g) = [[f + g]_+ − [f − g]_+ − [g − f]_+]_+    (4.11)

Figure 4.8 A simple feed-forward network of I&F neurons which implements the minimum function. Excitatory synapses are represented as arrows, inhibitory synapses as circles. The lateral neuron computes [f − g]_+ (f − g if f > g, else 0); the output neuron then computes f minus this quantity: if f > g, f − (f − g) = g, otherwise f − 0 = f, i.e. min(f, g).

Figure 4.9 A second feed-forward network of I&F neurons which implements twice the minimum function. Excitatory synapses are represented as arrows, inhibitory synapses as circles. Two intermediate neurons compute [f − g]_+ and [g − f]_+, which inhibit an output neuron excited by f + g: if f > g the output is f + g − (f − g) − 0 = 2g, otherwise f + g − 0 − (g − f) = 2f, i.e. 2 min(f, g).
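At the level of firing rates (ignoring spiking dynamics and noise), both networks reduce to compositions of rectifications. A minimal Python sketch of Eqs. 4.10 and 4.11, purely illustrative of the arithmetic the networks implement:

```python
import numpy as np

def relu(x):
    """Rectification [x]_+."""
    return np.maximum(x, 0.0)

def network1_rate(f, g):
    """Figure 4.8 as rate arithmetic: min(f, g) = [f - [f - g]_+]_+."""
    lateral = relu(f - g)        # lateral neuron: active only when f > g
    return relu(f - lateral)     # output neuron: f excites, lateral inhibits

def network2_rate(f, g):
    """Figure 4.9 as rate arithmetic:
    2*min(f, g) = [[f + g]_+ - [f - g]_+ - [g - f]_+]_+."""
    return relu(relu(f + g) - relu(f - g) - relu(g - f))

f, g = 8.0, 7.0
print(network1_rate(f, g))   # 7.0
print(network2_rate(f, g))   # 14.0
```

Network 2's doubled output is what later makes it saturate sooner in the simulations.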

Chapter 5

Simulation Results

5.1 Introduction

The networks presented in the previous chapter were used for our simulations. In this chapter we present the experimental results and discuss the performance of the two networks. We will see that in most cases our networks manage to approximate multiplication. Before presenting the results we show how we adjusted the parameters of the integrate-and-fire neurons in order to obtain the desired input-output relation. Another very important observation we made, analyzed later in this chapter, is the importance of spike timing. We will see that the output of the networks depends not only on the input firing rates but also on the exact timing of the spikes. This may be a clue that temporal coding is present even in seemingly pure rate-coding functionalities: the spatiotemporal pattern of the input spike trains may play a minor role alongside the crucial role of their firing rates. All computational simulations presented here were done using Simulink, an environment for multidomain simulation and model-based design of dynamic and embedded systems. It offers tight integration with the rest of the MATLAB environment and is very simple to use. We developed a library for the needs of this dissertation which can be used for simulations of networks of integrate-and-fire neurons; the Appendix presents this library in detail.

5.2 Neuron's Behavior

Theoretically, when an integrate-and-fire neuron has only one excitatory input, the firing rate of the output spike train should be equal to the input rate. But what happens in practice? We used Poisson spike generators to create spike trains spanning a range of firing rates in equal steps. These spike trains were used as input to a noisy integrate-and-fire neuron with the following parameters: V_rest = V_reset = −70 mV, τ_m = 2 ms and τ_syn = 1 ms.
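The Poisson spike generators follow the scheme used by the BNNSL S-function in the Appendix: in each time step of width dt, a spike is emitted if rate·dt/1000 exceeds a uniform random number. A Python sketch of that Bernoulli approximation (function name and defaults are illustrative, not from the thesis):

```python
import random

def poisson_spike_train(rate_hz, duration_ms, dt_ms=1.0, seed=0):
    """Bernoulli approximation of a Poisson spike generator, mirroring the
    BNNSL S-function: a spike occurs in a step of width dt if
    rate * dt / 1000 exceeds a uniform random number in [0, 1)."""
    rng = random.Random(seed)
    p = rate_hz * dt_ms / 1000.0
    return [step * dt_ms for step in range(int(duration_ms / dt_ms))
            if rng.random() < p]

spikes = poisson_spike_train(50.0, 10_000.0)   # 50 Hz target over 10 s
empirical_rate = len(spikes) / 10.0            # spikes per second
```

For long durations the empirical rate fluctuates around the target rate, which is why the experiments average over repeated runs.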
In order to obtain statistically sound results, each experiment was repeated many times and the mean output firing rate was calculated. We plotted the input-output firing rate relation (Figure 5.1a). Surprisingly, we observe that while at low input rates the relation is linear, at higher input firing rates ρ_out falls below ρ_in. The best fit is obtained with a cubic curve. For the linear fit (red curve) significant errors are observed (Figure 5.1b).

Figure 5.1 (a) The input-output relation for different input firing rates, together with polynomial fits (linear, quadratic and cubic). (b) Errors between the observed relation and the three polynomials. While theoretically the relation should be linear, a cubic curve fits the recorded input-output firing rates better.

44 Section.3 - Adjusting the Parameters 33.3 Adjusting the Parameters Before simulating the proposed networks, we adjusted the parameters of the integrate and fire units. We remind that given an excitatory synaptic input with firing rate ρ exc and an inhibitory one with rate ρ inh then the firing rate of the output spike train ρ out should be: ρ out = max (, ρ exc ρ inh ) = [ρ exc ρ inh ] + Given that parameters like the resting potential of the neuron or the threshold voltage are constant and cannot be modified, we will adjust the two parameters of the inhibitory synapse : its reversal potential Vrev inh and the synaptic time constant τ syn. In order to find the best pair ( ) τ syn, Vrev inh we used an error minimization criterion. For two predetermined input firing rates ρ exc, ρ inh, the absolute error between the expected output firing rate ρ expected out and the observed one ρ recorded out is: error = ρ expected out ρ recorded out (.1) In order to take a more statistically accurate result we repeat the experiment with the same pair of parameters ( ) τ syn, Vrev inh P times and take the average error: error = P i=1 ρexpected out P ρ recorded out = P i=1 [ρ exc ρ inh ] + ρ recorded P out. (.2) We varied the synaptic time constant τ syn from to 2 ms and the reversal potential of the inhibitory synapse from 1 to 6 mv. Both these value ranges are realistic and such parameter values have been observed in biological neurons. For every pair of parameters ( ) τ syn, Vrev inh we presented the integrate and fire neuron with many different combinations of ρ exc, ρ inh and averaged the error. Figure.2 is a plot of the mean error for different values of τ syn and Vrev inh. We can see that for τ syn = 1 ms and Vrev inh = 9 mv the error is minimal. 
These were the synaptic parameters used in the simulations of the proposed networks.

5.4 Multiplication of Firing Rates

Using the parameters that give the smallest error, we simulated the two networks proposed in the previous chapter. In this section we present the experimental results for different input firing rates. All of these experiments were conducted using the first proposed network (Figure 5.3a). In a later section we will see that the second network does not perform as well, and the two architectures will be compared. Before presenting the results, we first describe the experimental procedure.

5.4.1 Experimental Procedure

The input firing rates f, g take values from two predefined vectors f_values, g_values. Let N be the number of elements of f_values and M the number of elements of g_values. Poisson spike generators are used to produce spike trains with the desired firing rates. These vectors can be seen as population responses to a certain input. The simulation is run for every combination f_values(i), g_values(j), 1 ≤ i ≤ N, 1 ≤ j ≤ M, giving an N × M array of output firing rates. This can be seen as the output of a set of N × M neurons where the (i, j)-th element is selective to the i-th neuron of population f and the j-th neuron of population g.
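The two reference landscapes against which the network output is judged can be built directly from the population vectors. A Python sketch with hypothetical population responses (the thesis used Gaussian-like profiles; the specific values below are invented for illustration):

```python
import numpy as np

# Hypothetical population responses in Hz (illustrative values only).
f_values = np.array([10.0, 30.0, 60.0, 30.0, 10.0])
g_values = np.array([5.0, 40.0, 80.0, 40.0, 5.0])

# Exact multiplication landscape: rho_out[i, j] = f[i] * g[j].
product_landscape = np.outer(f_values, g_values)

# Expected landscape under the minimum hypothesis.
expected_landscape = np.minimum.outer(f_values, g_values)

print(product_landscape.shape)   # (5, 5)
```

The simulated N × M output array is compared to the second landscape, since the minimum is the best the rectifying networks can theoretically achieve.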

Figure 5.2 Mean error (in Hz) for different pairs of synaptic time constant τ_syn and inhibitory reversal potential V_rev^inh. For every pair of parameters an integrate-and-fire neuron was simulated for various inputs ρ_exc, ρ_inh and the average error was taken. For τ_syn = 1 ms and V_rev^inh = −90 mV the error surface is minimized and the neuron best fits the desired rectifying behavior.

5.4.2 Experiment 1

In this experiment two identical Gaussian-like population responses are multiplied (Figure 5.4). In order to understand the results better we make some three-dimensional plots. The x axis can be considered as the index of the neuron in population f and the y axis as the index in population g. So, for example, the point (i, j) in the x−y plane corresponds to the neuron in the set of N × M neurons that is selective to the i-th neuron of population f and the j-th neuron of population g; the corresponding z-axis value is the recorded firing rate. Three plots are created each time. The lower left subplot (c) corresponds to the real multiplication of the two firing-rate vectors:

ρ_out(i, j) = f_values(i) · g_values(j), 1 ≤ i ≤ N and 1 ≤ j ≤ M.    (5.3)

Since Poisson spike generators are used, there is a deviation between the mean firing rates as defined in the two vectors and the spikes that are actually generated. If f_values^real(i), g_values^real(j) are the real input firing rates, then in the upper right subplot (b) we draw the expected result:

ρ_out(i, j) = min(f_values^real(i), g_values^real(j)), 1 ≤ i ≤ N and 1 ≤ j ≤ M.    (5.4)

Finally, in the lower right subplot (d) we plot the firing rate recorded at the output of the network. In order to evaluate the performance of the network, this actual result (d) should be compared to the expected one (b).

46 Section.4 - Multiplication of Firing Rates 3 (a) Network 1 (b) Network 2 Figure.3 The two proposed networks that implement the minimum function. 7 Expected Result Firing Rate [in Hz] Neuron (a) Firing Rates 1 g f (b) Expected Output 1 1 Actual Result x g f (c) Exact Multiplication g f (d) Recorded Output Figure.4 Two identical Gaussian-like firing rates (a) are used to stimulate the network of Figure.3- a. The exact multiplication (c), the expected one according to the recorded Poisson generators output firing rates (b) and the recorded firing rates at the output of the network (d) are plotted.

5.4.3 Experiment 2

A Gaussian-like population response f (red curve in Figure 5.5a) and a two-peaked sinusoid-like population response g (blue curve in Figure 5.5a) were used as inputs. In Figure 5.5 we can see the expected and recorded outputs. Once again the network performs a multiplication-like operation.

Figure 5.5 The population responses f (red trace) and g (blue trace) plotted in (a) are used to stimulate the network of Figure 5.3(a). The exact multiplication (c), the expected output according to the recorded Poisson-generator firing rates (b), and the recorded firing rates at the output of the network (d) are plotted.

5.4.4 Experiment 3

We now examine whether, under some circumstances, the network fails to show the desired behavior. There is indeed a case where the minimum function does not approximate multiplication. Imagine that one of the two input population responses (say f) is constant at some value c which is smaller than every value in the other population response g. A multiplicative operation would produce a proportional projection of the g population response along the axis corresponding to the neurons of population f, but the minimum always gives the value c: if g_j are the firing rates of population g, with g_j > c for all j, then min(g_j, c) = c, whereas c · g_j ∝ g_j. So theoretically we obtain a horizontal plane of height c instead of the desired projection.
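This failure mode is easy to reproduce numerically. A short Python sketch with invented values (c and the Gaussian profile are illustrative, chosen so that g everywhere exceeds c):

```python
import numpy as np

c = 5.0                                   # constant response of population f (Hz)
x = np.linspace(-2.0, 2.0, 9)
g = 10.0 + 60.0 * np.exp(-x ** 2)         # Gaussian-like g, everywhere above c

product = c * g                           # scaled copy of the g profile
minimum = np.minimum(c, g)                # flat plane of height c

assert np.all(minimum == c)
# The normalized shapes differ: the minimum cannot reproduce the projection.
assert not np.allclose(product / product.max(), minimum / minimum.max())
```

The minimum discards all structure in g once the constant input is the smaller of the two, which is exactly what the experiment demonstrates.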

48 Section. - Comparison of the Two Networks 37 This is demonstrated with an example. All g firing rates are Hz while the population response g is a Gaussian with a minimum firing rate of 1 Hz (Figure.6-a) We can see how different is the observed output than the desired result (c). Even in this case network implements multiplication and the recorded output firing rates are similar to the expected ones (compare subplots b and d). 8 Expected Result Firing Rate [in Hz] 4 3 Firing Rate [in Hz] Neuron (a) Firing Rates Neuron (g) Neuron (f) (b) Expected Output Multiplication Network Output Firing Rate [in Hz] Firing Rate [in Hz] Neuron (g) Neuron (f) (c) Exact Multiplication Neuron (g) Neuron (f) (d) Recorded Output Figure.6 The population responses f (red trace), g (blue trace) plotted in (a) are used to stimulate the network of Figure.3-a. The exact multiplication (c), the expected one according to the recorded Poisson generators output firing rates (b) and the recorded firing rates at the output of the network (d) are plotted. While the network finds the minimum between the two input firing rates, in such an extreme case the minimum function is not able to approach the actual multiplication.. Comparison of the Two Networks The first network as we have seen gives accurate results according to the ones expected. On the other hand the second network didn t manage to approach multiplication so well. We will now present some examples trying to explain the different behavior of the two networks despite theoretically they should both have similar results. In Figure.7 we see the network outputs of the two networks when both stimulated with the same Gaussian-like firing rates of Figure.4 a. On the left column we can see the output of

Network 1 and the expected output (a and c respectively), while in the right column the respective plots for Network 2 are presented.

Figure 5.7 The population responses plotted in Figure 5.4(a) are used to stimulate the networks of Figure 5.3. The recorded and expected outputs are shown for both network architectures. It is obvious that the first network has the better performance.

To verify this, we conducted the same experiment with the input firing rates plotted in Figure 5.5a. Again (Figure 5.8) we find that Network 1 performs much better. The difference in the experimental results between the two architectures stems from the fact that the second network does not return the minimum but twice the minimum. As we observe in both experiments, for small expected firing rates (< 7 Hz) the second network performs as desired, but for larger expected output firing rates it does not. This can be explained by the single neuron's behavior. Recall from Section 5.2 (Figure 5.1) that for large input firing rates the relation between input and output is no longer linear. Suppose we have as inputs a firing rate f = 8 Hz and a firing rate g = 7 Hz. Theoretically we would expect an output firing rate of ρ_out = 2 min(8, 7) = 14 Hz, which means that a total firing rate of 14 Hz (ρ_exc − ρ_inh) drives the output neuron of the second network. Due to the nonlinear relation between input and output, such a large input firing rate will not be

50 Section. - Comparison of the Two Networks 39 Network Output Network Output Firing Rate [in Hz] Firing Rate [in Hz] Neuron (g) Neuron (f) (a) Network 1 - Recorded Output Neuron (g) Neuron (f) (b) Network 2 - Recorded Output Expected Result Expected Result Firing Rate [in Hz] Firing Rate [in Hz] Neuron (g) Neuron (f) (c) Network 1 - Expected Output Neuron (g) Neuron (f) (d) Network 2 - Expected Output Figure.8 The population responses f (red trace), g (blue trace) plotted in. (a) are used to stimulate the networks of Figure.3. We can see the recorded and expected outputs for both network architectures. It is obvious that the first network has a better performance.

able to produce a spike train with a proportionally high firing rate. This, combined with the neural noise, the losses of the previous layers and the refractory period of the integrate-and-fire neurons, explains the observed output.

5.6 Spike Timing is Important

We will now examine how spike timing affects the behavior and the performance of our networks. Our models are based on the rectifying hypothesis: the output of an integrate-and-fire neuron (with excitatory and inhibitory synaptic inputs) has a firing rate of max(0, ρ_exc − ρ_inh). We have shown in the previous chapter that this hypothesis is correct and our neurons have the desired behavior. But this correctness depends critically on the stochasticity of the Poisson input spike trains. Imagine a very simple scenario, in which an integrate-and-fire neuron is stimulated with excitatory and inhibitory synaptic inputs of equal firing rates. Theoretically no spikes should be generated and the postsynaptic firing rate should be 0 Hz. If, however, we simulate the neuron for 1 s and all excitatory input spikes arrive at the beginning of the simulation while all inhibitory ones arrive afterwards, then many output spikes will be generated. This happens because at the start of the simulation there is no inhibitory current to counteract the excitatory one, so the excitatory presynaptic spikes are able to trigger postsynaptic firing. To verify the importance of spike timing we cannot use Poisson spike trains as inputs, due to their stochasticity. We therefore eliminated all randomness: we used spike generators with constant firing rates which fired at exact times. Given a firing rate r and the time of the first spike t_start, the inter-spike interval χ is:

χ = 1000 / r ms.    (5.5)

The first spike is fired at t_start, and a new spike is generated every χ ms thereafter.
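The deterministic generator of Eq. (5.5) can be sketched in a few lines of Python (the 10 Hz rate and 1 s duration below are illustrative, not from the thesis; the 1 ms offset corresponds to Case 1):

```python
def regular_spike_times(rate_hz, t_start_ms, duration_ms):
    """Deterministic generator of Eq. (5.5): first spike at t_start, then a
    new spike every chi = 1000 / rate milliseconds."""
    chi = 1000.0 / rate_hz
    times, t = [], t_start_ms
    while t < duration_ms:
        times.append(t)
        t += chi
    return times

# Case 1: g leads f by exactly 1 ms (hypothetical 10 Hz trains over 1 s).
g_spikes = regular_spike_times(10.0, t_start_ms=0.0, duration_ms=1000.0)
f_spikes = regular_spike_times(10.0, t_start_ms=1.0, duration_ms=1000.0)
print(f_spikes[:3])   # [1.0, 101.0, 201.0]
```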
Figure 5.9 Importance of spike timing. An integrate-and-fire neuron was simulated for different input firing rates. The spike generators produce action potentials with predetermined inter-spike intervals. In Case 1, g spikes occur at least 1 ms before the f ones (t_start^f = t_start^g + 1). (a) Expected output. (b) Observed output. In this case the recorded output firing rates approach the desired ones.

52 Section.6 - Spike Timing is Important 41 For our simulations we used the first network while the experimentation methodology is identical to the one used in the previous Sections. We managed to demonstrate with a very simple experiment that spike timing is crucial. Case 1 The second input g is activated 1 ms before the first one (t f start = t g start + 1). So for small firing rates g spikes occur at least 1 ms before the f spikes. In this case the result approached the desired one as we can see in Figure.9. Case 2 The second input g is activated 2 ms before f (t f start = t g start + 2)). So for small firing rates g spikes occur at least 2 ms before the f spikes. In this case the output is does not approach the desired one, the network does not perform multiplication and there is substantial difference with the result of Case 1, as we can see in Figure.1. Expected Result Network Output 7 6 Firing Rate [in Hz] Firing Rate [in Hz] Neuron (g) (a) Exact Neuron (f) Neuron (g) (b) Observed Neuron (f) Figure.1 Importance of spike timing. An integrate-and-fire neuron was simulated for different input and output firing rates. The spike generators generate action potentials with predetermined inter-spike intervals. In Case 2 g spikes occur at least 2 ms before the f ones (t f start = tg start + 2). (a) Expected output. (b) Observed output. The importance of spike timing is obvious compared to Figure.9. Explanation Why is spike timing so important, and there is such difference between Case 1 and Case 2? Remember that the synaptic time constant is 1 ms. In the second case, at the lower left neuron of the network (Figure 4.8), excitatory input spikes occur at least 2 ms after the inhibitory ones. The synaptic current is decaying exponentially and τ syn determines how fast it goes back to zero after a presynaptic spike. 
By then the synaptic current due to the inhibitory synapse has decayed to a very small value, which cannot restrain the effect of the excitatory inputs. As a result, most of the time the presynaptic spikes of the excitatory synapse manage to raise the membrane voltage V_m above threshold and an action potential is generated. The neurons in this case therefore behave differently from expectation, and the whole network cannot perform multiplication as theory predicts. Of course, in real neurons spike trains are stochastic, and such an extreme case with fixed inter-spike intervals is not realistic.
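The quantitative heart of this explanation is the exponential decay of the synaptic current. A small illustrative sketch (assuming, as in the simulations, an exponential synapse with τ_syn = 1 ms):

```python
import math

def synaptic_current_fraction(t_ms, tau_syn_ms=1.0):
    """Fraction of peak synaptic current remaining t_ms after a presynaptic
    spike, for an exponentially decaying synapse: exp(-t / tau_syn)."""
    return math.exp(-t_ms / tau_syn_ms)

# With tau_syn = 1 ms the inhibitory current retains ~37% of its peak after
# 1 ms (Case 1) but only ~13.5% after 2 ms (Case 2), too weak to cancel a
# freshly arriving excitatory input.
print(round(synaptic_current_fraction(1.0), 3))   # 0.368
print(round(synaptic_current_fraction(2.0), 3))   # 0.135
```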


Chapter 6

Discussion

6.1 Introduction

This chapter summarizes the project, including its achievements and limitations, and places it in a wider context.

6.2 Achievements and Limitations

The project was successful in achieving its outlined aims and objectives. The proposed networks managed to approximate multiplication, as shown through a series of experiments. However, the performance of the two networks was not the same (as expected), and in some extreme cases the result is not proportional to a multiplicative operation. We also showed that inhibition has a subtractive effect in rate codes under certain circumstances, which makes the proposed networks more realistic. Compared to the other models for multiplication of neural signals found in the literature, our two networks are very simple and straightforward to understand. Finally, one other achievement of this thesis was the development of the Biological Neural Networks Library for Simulink. It is a tool that can be used for the simulation of any neural network, and one can extend it by adding models of other cells.

6.3 Future Work

Had time permitted, several issues could have been explored further and more tests conducted. First of all, we could study in detail the role of noise. There is considerable evidence that noise plays a crucial part in neural information processing and signal transmission; in the cortex, noise is not just noise at all, but contains information reflecting the activity in other parts of the brain [8]. We could also simulate other networks which implement the minimum function; the two proposed here are not unique, merely the simplest. It would be interesting to see how scaling the number of neurons affects performance, despite the fact that theoretically the result should be the same. We could also adjust the parameters in larger networks in order to obtain better results.
We could also implement other neuronal models for the BNNS Library, such as the Hodgkin-Huxley model, and use them in place of the integrate-and-fire neuron. A comparison of the results would be interesting, especially if the performance were not much better given the high complexity of the latter models.

6.4 Final Remarks

The author's experience of undertaking this project was overall quite positive. Despite the limited research in this field, a multiplicative network of I&F neurons was successfully developed that addressed all the aims and objectives outlined for the project. Solutions were found to all the difficulties that occurred during the project.

Appendix A

Simulating Biological Neural Networks using SIMULINK

A.1 Introduction

In order to simulate the networks proposed in this dissertation we created a Biological Neural Networks Library for Simulink (BNNSL). This library can be used for simulating any kind of neural network and is very easy to extend by adding models of other neurons. It offers the advantage that no coding is needed: through a graphical interface one can create any network and simulate it. In the following paragraphs we describe the BNNSL and give some examples.

A.2 SIMULINK

Simulink, developed by The MathWorks, is a commercial tool for modeling, simulating and analyzing multidomain dynamic systems. Its primary interface is a graphical block-diagramming tool with a customizable set of block libraries. It offers tight integration with the rest of the MATLAB environment and can either drive MATLAB or be scripted from it.

A.2.1 Advantages of Simulink

Easy-to-use graphical user interface. No coding is needed: one can create any network by dragging and dropping items from the library. Double-clicking on any model lets you change its parameters, and the simulation starts at the press of a button.

Convenient for rapid development, and efficient. Simulink models can be written in the MATLAB scripting language but also in more efficient programming languages such as C or Ada, which improves the efficiency of the models. At the same time the built-in real-time simulation algorithms are effective.

Inbuilt visualization (scopes). By connecting the output of any model (for example the spikes of an I&F neuron) to a scope you can watch this output during the simulation. This is convenient when simulating for long periods of time and we want to check that the behavior of our model is the desired one.

Parameters are easily changed through dialog boxes. No programming is needed: anyone can use Simulink and the BNNSL without knowing how to program, since everything can be modified through dialog boxes.

Vectorization for simulating large numbers of neurons. Vectors of neurons can be created in order to simulate large populations. There is effectively no limit to the number of neurons, since MATLAB is the MATrix LAnguage and it is very simple to work with vectors and matrices.

Easy extraction of variables for later processing. Any variable can be stored in a file or in the workspace through the corresponding output blocks. Variables are treated as ordinary MATLAB variables and can easily be processed.

A simple MATLAB script can automate the simulation for different parameter values. The close relation between Simulink and MATLAB makes automating the experimental procedure very easy. With simple MATLAB scripts one can simulate the model for different variable values and then analyze the results; a MATLAB script could even be used to build a model.

A.2.2 S-functions

To add blocks to a Simulink library one must write a special type of function called an S-function. S-functions (system functions) provide a powerful mechanism for extending the capabilities of the Simulink environment. An S-function is nothing more than a computer-language description of a Simulink block, written in MATLAB, C, C++, Ada, or Fortran. S-functions follow a general form and can accommodate continuous, discrete, and hybrid systems. By following a set of simple rules, one can implement an algorithm in an S-function, then create blocks that correspond to this S-function and use them within models. Below we describe the general form of an S-function.

Structure of S-functions

An M-file S-function consists of a MATLAB function of the following form:

    [sys, x0, str, ts] = f(t, x, u, flag, p1, p2, ...)

where f is the name of the S-function.
During simulation of a model, Simulink repeatedly invokes f, using the flag argument to indicate the task (or tasks) to be performed for a particular invocation. Each time, the S-function performs the task and returns the results in an output vector. Simulink passes the following arguments to an S-function:

    t      Current time
    x      State vector
    u      Input vector
    flag   Integer value that indicates the task to be performed by the S-function

The following table describes the values that flag can assume and lists the corresponding S-function method for each value. Table taken from [4].

    Flag  S-Function Routine      Description
    0     mdlInitializeSizes      Defines basic S-function block characteristics, including sample times, initial conditions of continuous and discrete states, and the sizes array.
    1     mdlDerivatives          Calculates the derivatives of the continuous state variables.
    2     mdlUpdate               Updates discrete states, sample times, and major time step requirements.
    3     mdlOutputs              Calculates the outputs of the S-function.
    4     mdlGetTimeOfNextVarHit  Calculates the time of the next hit in absolute time. This routine is used only when you specify a variable discrete-time sample time in mdlInitializeSizes.
    9     mdlTerminate            Performs any necessary end-of-simulation tasks.

An S-function Example

The following code corresponds to the S-function that implements a Poisson spike generator.

    % neuronPoissonSpGen: S-function that implements a simple
    % Poisson spike generator given an estimate of the firing rate
    %
    % The parameters of the Poisson process are:
    %
    %   dt    : spike duration
    %   r_est : estimate of the firing rate
    %
    % A spike occurs at a moment (time step) t if
    %
    %   r_est * dt > x_rand
    %
    % where x_rand is a number chosen uniformly in the range [0,1]

    switch flag,

      % Initialization
      case 0,
        [sys, x0, str, ts] = mdlInitializeSizes(dt);

      % Outputs
      case 3,
        sys = mdlOutputs(t, x, u, dt, r_est);

      % Unused flags
      case {1, 2, 4, 9}
        sys = [];

      % Unexpected flags
      otherwise
        error(['Unhandled flag = ', num2str(flag)]);

    end % end neuronPoissonSpGen

    %=============================================================================
    % mdlInitializeSizes
    % Return the sizes, initial conditions, and sample times for the S-function.
    %=============================================================================
    function [sys, x0, str, ts] = mdlInitializeSizes(dt)

    % Call simsizes for a sizes structure, fill it in, and convert it to a
    % sizes array.
    %
    % Note that in this example the values are hard-coded. This is not a
    % recommended practice, as the characteristics of the block are typically
    % defined by the S-function parameters.

    sizes = simsizes;

    sizes.NumContStates  = 0;  % no continuous states
    sizes.NumDiscStates  = 0;  % no discrete states
    sizes.NumOutputs     = 1;  % 1 output: spike occurrence
    sizes.NumInputs      = 0;  % no inputs; the estimate of the firing rate
                               % is given as a parameter
    sizes.DirFeedthrough = 0;  % no direct feedthrough
    sizes.NumSampleTimes = 1;  % at least one sample time is needed

    sys = simsizes(sizes);

    % initialize the initial conditions
    x0 = [];

    % str is always an empty matrix
    str = [];

    % initialize the array of sample times
    ts = [dt 0];

    % end mdlInitializeSizes

    %=============================================================================
    % mdlOutputs
    % Return the block outputs.
    %=============================================================================
    function sys = mdlOutputs(t, x, u, dt, r_est)

    x_rand = rand;

    % r_est is given in Hz, so the expected number of spikes in the
    % interval dt (in ms) is r_est * dt / 1000
    if (r_est * dt / 1000 > x_rand)
        sys = 1;
    else
        sys = 0;
    end

    % end mdlOutputs
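The spiking rule in mdlOutputs (emit a spike whenever r_est·dt/1000 exceeds a uniform random number) can be checked outside Simulink. The following Python sketch, with names chosen to mirror the S-function, confirms that over many steps the empirical rate approaches r_est:

```python
import random

def poisson_spike(r_est, dt, rng=random.random):
    """Return 1 if a spike occurs in a step of dt ms at rate r_est Hz, else 0."""
    # Expected number of spikes per step: r_est (Hz) * dt (ms) / 1000
    return 1 if r_est * dt / 1000.0 > rng() else 0

random.seed(0)
dt = 1.0            # ms per time step
r_est = 100.0       # target firing rate in Hz
n_steps = 100_000   # 100 s of simulated time
spikes = sum(poisson_spike(r_est, dt) for _ in range(n_steps))
empirical_rate = spikes / (n_steps * dt / 1000.0)  # Hz; close to r_est
```

Note that this is only a Bernoulli approximation of a Poisson process; it is accurate as long as r_est·dt/1000 is much smaller than 1, i.e. the step dt is short relative to the mean interspike interval.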

A.3 The Biological Neural Networks SIMULINK Library (BNNSL)

The Biological Neural Networks SIMULINK Library (BNNSL) is a library we created for the needs of this dissertation. However, it can easily be extended and used for the simulation of any architecture. The creation of new blocks is a very simple procedure: one has only to write the S-function for a neuronal model, create a corresponding block, and add it to the library. Any new block can be used together with the existing ones in any simulation. When calling the BNNS Library, the window of Figure A.1 appears. There are three main categories of blocks, which are described in the following sections.

Figure A.1 The main window of the BNNS Library.

A.3.1 Current Sources

This category includes blocks used to inject current into neurons (Figure A.2), such as pulse generators, noisy current sources, and constant current sources.

Figure A.2 The input current sources of the BNNS Library.

A.3.2 Output Devices

Blocks used to visualize and/or save the desired variables (Figure A.3), such as Scopes, Save to File, and Save to Workspace.

A.3.3 Neuronal Models

Blocks that implement some basic neuronal models (Figure A.4), such as the Poisson Spike Generator and I&F Neurons with or without Synaptic Input.

Figure A.3 The output devices of the BNNS Library.

Figure A.4 The neuronal models of the BNNS Library.

A.3.4 BNNSL in Action

The simplicity of a SIMULINK library lies in the fact that one can create a model by dragging objects onto an empty sheet and connecting them with lines. Any parameter of a specific block can be modified by changing the values in a dialog box. In Figure A.5 we can see the modification of the parameters of a simple model in action.

(a) The Model (b) The Dialog Box

Figure A.5 (a) A simple model created with BNNSL. (b) Changing the parameters of a neuron with a dialog box after double clicking on the neuron block.

Another interesting feature is the visualization capability of SIMULINK. We illustrate it with a very simple example (Figure A.6). The action potentials produced by a Poisson spike generator (yellow) with firing rate ρ (in Hz) are the inputs at the excitatory synapse of an I&F neuron. The membrane voltage and the output spikes (purple) can be seen on the scopes in real time during the simulation.

(a) The Model (b) Input and Output Spikes (c) Membrane Voltage

Figure A.6 (a) A simple model created with BNNSL. (b) Plot of input and output spikes. (c) Plot of membrane voltage.
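The model of Figure A.6, a Poisson spike train feeding an I&F neuron, can also be sketched directly in code. The Python fragment below is a minimal leaky integrate-and-fire simulation; all parameter values (membrane time constant, threshold, synaptic kick w, and the function name lif_run) are illustrative assumptions, not the settings used in the thesis:

```python
import random

def lif_run(rate_hz, t_total_ms, dt=0.1, tau=20.0, v_rest=-70.0,
            v_thresh=-54.0, v_reset=-70.0, w=2.0, seed=1):
    """Leaky integrate-and-fire neuron driven by a Poisson spike train.
    Returns (input spike count, output spike count)."""
    rng = random.Random(seed)
    v = v_rest
    n_in = n_out = 0
    for _ in range(int(t_total_ms / dt)):
        # Poisson input: same rule as the spike-generator S-function
        if rate_hz * dt / 1000.0 > rng.random():
            n_in += 1
            v += w                       # each input spike depolarizes by w mV
        v += (v_rest - v) * dt / tau     # leaky integration toward rest
        if v >= v_thresh:                # threshold crossing -> output spike
            n_out += 1
            v = v_reset
    return n_in, n_out

n_in, n_out = lif_run(rate_hz=500.0, t_total_ms=1000.0)
```

With these assumed parameters, the mean depolarization (rate · w · τ ≈ 20 mV) sits above the 16 mV threshold distance, so the neuron fires regularly, but each output spike still requires the integration of several input spikes; the output count is therefore well below the input count, mirroring what the scopes in Figure A.6 display.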


Step Response of RC Circuits Step Response of RC Circuits 1. OBJECTIVES...2 2. REFERENCE...2 3. CIRCUITS...2 4. COMPONENTS AND SPECIFICATIONS...3 QUANTITY...3 DESCRIPTION...3 COMMENTS...3 5. DISCUSSION...3 5.1 SOURCE RESISTANCE...3

More information

Bayesian probability theory

Bayesian probability theory Bayesian probability theory Bruno A. Olshausen arch 1, 2004 Abstract Bayesian probability theory provides a mathematical framework for peforming inference, or reasoning, using probability. The foundations

More information

Nervous System: Spinal Cord and Spinal Nerves (Chapter 13) Lecture Materials for Amy Warenda Czura, Ph.D. Suffolk County Community College

Nervous System: Spinal Cord and Spinal Nerves (Chapter 13) Lecture Materials for Amy Warenda Czura, Ph.D. Suffolk County Community College Nervous System: Spinal Cord and Spinal Nerves (Chapter 13) Lecture Materials for Amy Warenda Czura, Ph.D. Suffolk County Community College Primary Sources for figures and content: Eastern Campus Marieb,

More information

Solving Simultaneous Equations and Matrices

Solving Simultaneous Equations and Matrices Solving Simultaneous Equations and Matrices The following represents a systematic investigation for the steps used to solve two simultaneous linear equations in two unknowns. The motivation for considering

More information

Human Physiology Study Questions-2

Human Physiology Study Questions-2 Human Physiology Study Questions-2 Action potentials: Handout-8, Chapter 8 1. Explain the positive feedback component of an action potential that is, how the opening of one voltage-gated sodium (or calcium)

More information

Electrophysiological Recording Techniques

Electrophysiological Recording Techniques Electrophysiological Recording Techniques Wen-Jun Gao, PH.D. Drexel University College of Medicine Goal of Physiological Recording To detect the communication signals between neurons in real time (μs to

More information

ANN Based Fault Classifier and Fault Locator for Double Circuit Transmission Line

ANN Based Fault Classifier and Fault Locator for Double Circuit Transmission Line International Journal of Computer Sciences and Engineering Open Access Research Paper Volume-4, Special Issue-2, April 2016 E-ISSN: 2347-2693 ANN Based Fault Classifier and Fault Locator for Double Circuit

More information

7 Network Models. 7.1 Introduction

7 Network Models. 7.1 Introduction 7 Network Models 7.1 Introduction Extensive synaptic connectivity is a hallmark of neural circuitry. For example, a typical neuron in the mammalian neocortex receives thousands of synaptic inputs. Network

More information

The Lipid Bilayer Is a Two-Dimensional Fluid

The Lipid Bilayer Is a Two-Dimensional Fluid The Lipid Bilayer Is a Two-Dimensional Fluid The aqueous environment inside and outside a cell prevents membrane lipids from escaping from bilayer, but nothing stops these molecules from moving about and

More information

Precision Diode Rectifiers

Precision Diode Rectifiers by Kenneth A. Kuhn March 21, 2013 Precision half-wave rectifiers An operational amplifier can be used to linearize a non-linear function such as the transfer function of a semiconductor diode. The classic

More information

QUANTAL ANALYSIS AT THE NEUROMUSCULAR JUNCTION

QUANTAL ANALYSIS AT THE NEUROMUSCULAR JUNCTION Hons Neuroscience Professor R.R. Ribchester QUANTAL ANALYSIS AT THE NEUROMUSCULAR JUNCTION Our present understanding of the fundamental physiological mechanism of transmitter release at synapses is mainly

More information

S-Parameters and Related Quantities Sam Wetterlin 10/20/09

S-Parameters and Related Quantities Sam Wetterlin 10/20/09 S-Parameters and Related Quantities Sam Wetterlin 10/20/09 Basic Concept of S-Parameters S-Parameters are a type of network parameter, based on the concept of scattering. The more familiar network parameters

More information

Analecta Vol. 8, No. 2 ISSN 2064-7964

Analecta Vol. 8, No. 2 ISSN 2064-7964 EXPERIMENTAL APPLICATIONS OF ARTIFICIAL NEURAL NETWORKS IN ENGINEERING PROCESSING SYSTEM S. Dadvandipour Institute of Information Engineering, University of Miskolc, Egyetemváros, 3515, Miskolc, Hungary,

More information

Brain-in-a-bag: creating an artificial brain

Brain-in-a-bag: creating an artificial brain Activity 2 Brain-in-a-bag: creating an artificial brain Age group successfully used with: Abilities assumed: Time: Size of group: 8 adult answering general questions, 20-30 minutes as lecture format, 1

More information

Cellular Calcium Dynamics. Jussi Koivumäki, Glenn Lines & Joakim Sundnes

Cellular Calcium Dynamics. Jussi Koivumäki, Glenn Lines & Joakim Sundnes Cellular Calcium Dynamics Jussi Koivumäki, Glenn Lines & Joakim Sundnes Cellular calcium dynamics A real cardiomyocyte is obviously not an empty cylinder, where Ca 2+ just diffuses freely......instead

More information

Brain & Mind. Bicester Community College Science Department

Brain & Mind. Bicester Community College Science Department B6 Brain & Mind B6 Key Questions How do animals respond to changes in their environment? How is information passed through the nervous system? What can we learn through conditioning? How do humans develop

More information

MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question.

MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question. Chapter 2 The Neural Impulse Name Period Date MULTIPLE CHOICE. Choose the one alternative that best completes the statement or answers the question. 1) The cell body is enclosed by the. A) cell membrane

More information

Yrd. Doç. Dr. Aytaç Gören

Yrd. Doç. Dr. Aytaç Gören H2 - AC to DC Yrd. Doç. Dr. Aytaç Gören ELK 2018 - Contents W01 Basic Concepts in Electronics W02 AC to DC Conversion W03 Analysis of DC Circuits W04 Transistors and Applications (H-Bridge) W05 Op Amps

More information