On a phase diagram for random neural networks with embedded spike timing dependent plasticity

BioSystems 89 (2007)

Tatyana S. Turova (a,*), Alessandro E.P. Villa (b, c)

(a) Mathematical Center, Lund University, Sweden
(b) Laboratoire de Neurobiophysique, University Joseph Fourier, Inserm U318, Grenoble, France
(c) Neuroheuristic Research Group, Institute of Computer Science and Organization INFORGE, University of Lausanne, Switzerland

Received 28 November 2005; accepted 26 May 2006

(*) Corresponding author. E-mail addresses: [email protected] (T.S. Turova); [email protected], [email protected] (A.E.P. Villa).

Abstract

This paper presents an original mathematical framework based on graph theory which is a first attempt to investigate the dynamics of a model of neural networks with embedded spike timing dependent plasticity. The neurons correspond to integrate-and-fire units located at the vertices of a finite subset of a 2D lattice. There are two types of vertices, corresponding to the inhibitory and the excitatory neurons. The edges are directed and labelled by the discrete values of the synaptic strength. We assume that there is an initial firing pattern corresponding to a subset of units that generate a spike. The number of externally activated vertices is a small fraction of the entire network. The model presented here describes how such a pattern propagates throughout the network as a random walk on a graph. Several results are compared with computational simulations and new data are presented for identifying critical parameters of the model.
© 2006 Elsevier Ireland Ltd. All rights reserved.

Keywords: Random network; Spike timing dependent synaptic plasticity; Spiking neural network; Graph theory

1. Introduction

The dynamics of connectivity patterns within large networks of integrate-and-fire neuromimes with embedded spike timing dependent plasticity (STDP) rules represents one of the most intriguing questions under investigation in recent years. STDP is a change in the synaptic strength based on the ordering of pre- and post-synaptic spikes (Bell et al., 1997). This mechanism has been proposed to explain the reinforcement of synapses repeatedly activated shortly before the occurrence of a post-synaptic spike (potentiation) and the weakening of synaptic strength whenever the pre-synaptic cell is repeatedly activated shortly after the occurrence of a post-synaptic spike (depression). There is evidence that the synaptic weights may follow a discrete distribution, in particular with respect to the glutamatergic NMDA receptor-mediated excitatory synapses (Montgomery and Madison, 2004). This mechanism might also drive a selective synaptic pruning that shapes the newborn, densely interconnected neural networks through the removal of a large proportion of the initial connections (Huttenlocher, 1979; Huttenlocher et al., 1982; Bourgeois and Rakic, 1993). What is the role of the firing patterns presented at the beginning of such a process? What kind of topology can be obtained when the pruning process has stabilized? These are just a few among the many questions that can be raised on this topic (Song and Abbott, 2001).

Experimental investigations in animal models are limited to specific aspects of these questions with the techniques available nowadays, but cannot address the global issues. The complexity of this problem has led investigations to focus mainly on computational results of large-scale neural network modeling (Eriksson et al., 2003; Izhikevich et al., 2004; Iglesias et al., 2005a). This paper presents an original mathematical framework based on graph theory which is a first attempt to investigate this problem from a theoretical viewpoint. In order to keep the problem mathematically tractable it has been necessary to oversimplify a number of assumptions, but the richness of the main questions was preserved. We present several results that can be compared with results obtained from computational simulations. In particular, we show that phase transitions (along different parameters) take place in this type of network, and we make first steps in the description of the phase diagram. Our results should be useful for further simulation analyses.

2. Network topology

2.1. Spatial distribution

The network is a 2D lattice where each unit is assumed to be at the center of a relative map with coordinates x = 0, y = 0. Two types of units, excitatory and inhibitory, are laid down according to the following spatial distribution. The excitatory units are located at every point (vertex) of the graph $\Lambda^+ = [-N, N]^2$. The inhibitory units are more sparse because they are placed at the nodes of the lattice enlarged by a factor of 2 and shifted by $(1/2, 1/2)$, i.e. $\Lambda^- = (1/2, 1/2) + 2\Lambda^+ = \{(1/2, 1/2) + (x, y) : x = 2k \text{ and } y = 2n\}$. For large values of $N$ the proportion of inhibitory units relative to the number of excitatory units tends to 1:4. The entire network is then represented by the graph $\Lambda = \Lambda^+ \cup \Lambda^-$.

2.2. Connectivity

For any pair of units belonging to the network $\Lambda$, there is a synapse $(u, v)$ between $u \in \Lambda$, the pre-synaptic neuron, and $v \in \Lambda$, the post-synaptic neuron, characterized by its synaptic weight. The set of the network connections at time $t$ is denoted $L(t)$. In other words, let us define a (random) graph $G(t) = (\Lambda, L(t))$ with a set of vertices $\Lambda$ and a set of directed edges $L(t)$ at time $t \ge 0$. For the initial conditions, at $t = 0$, let $G(0)$ be a random graph on $\Lambda$ such that any edge $(u, v)$ is present with probability
$$p_0(u, v) = \begin{cases} q, & \text{if } |u - v| \le d, \\ c, & \text{otherwise,} \end{cases}$$
independently of the other edges. Here $c$, $q$, and $d$ are parameters of the model. We assume that $c$ is much smaller than $q$, i.e., the units establish a dense net of short-range connections and a sparse net of long-range connections. Then $L(0)$ is a random set of edges chosen with the above probabilities.
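The construction of Sections 2.1 and 2.2 can be sketched directly in code. The snippet below is a minimal illustration, not the authors' implementation: it assumes Euclidean distance for $|u - v|$, clips the shifted inhibitory lattice to the same square so that the stated 1:4 proportion holds, and uses illustrative values for $N$, $q$, $c$ and $d$.

```python
import itertools
import math
import random

def build_lattice(N):
    """Excitatory units on every vertex of [-N, N]^2; inhibitory units on the
    doubled, (1/2, 1/2)-shifted lattice, clipped to the same square."""
    exc = [(float(x), float(y)) for x in range(-N, N + 1) for y in range(-N, N + 1)]
    half = N // 2
    inh = [(2.0 * k + 0.5, 2.0 * n + 0.5)
           for k in range(-half, half + 1) for n in range(-half, half + 1)]
    return exc, inh

def sample_initial_edges(vertices, q, c, d, rng):
    """Directed edge set L(0): each ordered pair (u, v) is an edge with probability
    q if |u - v| <= d (short-range) and c otherwise (long-range), independently."""
    edges = []
    for u, v in itertools.permutations(vertices, 2):
        p = q if math.dist(u, v) <= d else c
        if rng.random() < p:
            edges.append((u, v))
    return edges

if __name__ == "__main__":
    rng = random.Random(0)
    exc, inh = build_lattice(N=5)
    edges = sample_initial_edges(exc + inh, q=0.6, c=0.001, d=2.0, rng=rng)
    print(len(exc), "excitatory,", len(inh), "inhibitory,", len(edges), "edges in G(0)")
```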
2.3. Synaptic weights

At time $t$ a synapse between $u$, the pre-synaptic neuron, and $v$, the post-synaptic neuron, is characterized by its synaptic weight, which we denote $w_t(u, v)$. We set $w_t(u, v) \equiv 0$ for all $t \ge 0$ if $(u, v) \notin L(0)$, and denote by $W(t) = \{w_t(u, v) : (u, v) \in L(0)\}$, $t \ge 0$, the set of synaptic weights at time $t$, which depend on the types of the pre- and post-synaptic units. The strength of the excitatory connections can take only four discrete states: zero (no connection), weak ($\alpha_0$), medium ($\alpha_1$) or strong ($\alpha_2$). Then the set of excitatory connections at time $t$ is described by $w_t(u, v) \in \{0, \alpha_0, \alpha_1, \alpha_2\}$, $u \in \Lambda^+$, $v \in \Lambda$, where $0 < \alpha_0 < \alpha_1 < \alpha_2 \le 1$. The strength of the inhibitory connections can take only two values: $w_t(u, v) \in \{0, -\gamma\}$, $u \in \Lambda^-$, $v \in \Lambda$, where $0 < \gamma \le 1$.

For the initial conditions, at $t = 0$, it is assumed that any excitatory connection in $W(0) = \{w_0(u, v) : (u, v) \in L(0)\}$ has medium strength, i.e.:
$$w_0(u, v) = \begin{cases} \alpha_1, & \text{if } u \in \Lambda^+, \\ -\gamma, & \text{if } u \in \Lambda^-. \end{cases}$$
Notice that the post-synaptic potentials may assume values in the range $[0, M]$, corresponding to the synaptic strengths $w_t(u, v)$ multiplied by some positive constant $M$.
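The initial weight assignment of $W(0)$ amounts to a single pass over the edges of $G(0)$. The sketch below is only an illustration: the inhibitory weight is stored with a negative sign so that it can later simply be added to the membrane potential, and the values of $\alpha_1$ and $\gamma$ are arbitrary.

```python
def initial_weights(edges, excitatory, alpha_1=0.5, gamma=0.5):
    """W(0): every connection starts at medium strength alpha_1 if the pre-synaptic
    unit is excitatory, and at -gamma if it is inhibitory."""
    return {(u, v): (alpha_1 if u in excitatory else -gamma) for (u, v) in edges}

if __name__ == "__main__":
    # Toy example: two excitatory units e1, e2 and one inhibitory unit i1.
    excitatory = {"e1", "e2"}
    edges = [("e1", "e2"), ("i1", "e1"), ("e2", "e1")]
    print(initial_weights(edges, excitatory))
    # {('e1', 'e2'): 0.5, ('i1', 'e1'): -0.5, ('e2', 'e1'): 0.5}
```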

3. Neuromimetic modeling

The units of the graph correspond to neuromimes (oversimplified neuronal models) whose membrane potential $X_v(t)$ varies in the subthreshold range $[0, 1]$. A unit $v$ fires a spike at time $t$ if $X_v(t) = 1$. Thus, the state of all network units at time $t$ is defined by $X(t) = \{X_v(t) \in [0, 1] : v \in \Lambda\}$, $t \ge 0$, with the initial state $X_v(0) = \bar{X}_v$ for all $v$, where the $\bar{X}_v$ are independent random variables uniformly distributed on $[0, 1]$. Immediately after firing, the state of a unit $v$ is reset to a random state $X_v(t+)$, which is an independent copy of the random variable $\bar{X}_v$. Notice that the chance that $X_v(t+) = 1$ is zero, thus there is always some positive period $\tau > 0$ between two consecutive spikes of one unit. Hence, we may consider that the refractory period here is at least $\tau > 0$.

4. Network dynamics

The dynamics of the network is described by the Markov process $(X(t), G(t), W(t))$, $t \ge 0$.

4.1. Decay of the excitatory synaptic weights

To make our model mathematically tractable it is assumed that in the absence of any input the states of the neurons are kept constant. However, the weights of the excitatory connections may vary in time between the states weak ($\alpha_0$), medium ($\alpha_1$) and strong ($\alpha_2$). In the absence of any activity of the pre- and post-synaptic units it is assumed that the excitatory synapses tend to disappear, i.e., $w_t(u, v)$ is a random process with decreasing trajectories: from state $\alpha_i$ it jumps to state $\alpha_{i-1}$ with intensity $\mu_i$, for $i = 2, 1$. This means that (in the absence of any activity) the synaptic weight remains at state $\alpha_i$ for a random time, which is distributed exponentially with mean $1/\mu_i$. In the presence of activity the synaptic strengths can decrease or increase following a spike timing dependent plasticity (STDP) rule described later, and vary between the states $\alpha_0$ and $\alpha_2$. State zero is absorbing: the process $w_t(u, v)$ jumps from state $\alpha_0$ to 0 with intensity $\mu_0$, but once a synapse has reached level 0 it cannot recover any other state and the connection is lost. Hence we define $L(t) = \{(u, v) \in L(0) : w_t(u, v) \ne 0\} \subseteq L(0)$, with the probabilities of the edges as in the initial graph $G(0)$.
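Between input events an excitatory weight therefore performs a pure-death jump process over the levels $\alpha_2 \to \alpha_1 \to \alpha_0 \to 0$. The following sketch samples this decay directly from the exponential holding times; the level and rate values are illustrative, not taken from the paper.

```python
import random

def decayed_weight(w0, elapsed, levels, rates, rng):
    """Weight after `elapsed` time without any pre- or post-synaptic activity.
    levels = (alpha_0, alpha_1, alpha_2), rates = (mu_0, mu_1, mu_2): the weight
    stays at alpha_i for an Exp(mu_i) time, then drops to alpha_{i-1}; from
    alpha_0 it drops to the absorbing state 0 and the connection is lost."""
    if w0 not in levels:            # inhibitory weights and 0 do not decay
        return w0
    i = levels.index(w0)
    t = 0.0
    while i >= 0:
        t += rng.expovariate(rates[i])
        if t > elapsed:
            return levels[i]        # still sitting at level alpha_i
        i -= 1
    return 0.0                      # absorbed: the synapse is pruned

if __name__ == "__main__":
    rng = random.Random(1)
    levels, rates = (0.1, 0.5, 0.9), (0.2, 0.5, 1.0)   # illustrative values only
    print([decayed_weight(0.9, 3.0, levels, rates, rng) for _ in range(5)])
```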
4.2. Evoked activity

Let us consider a time sequence $T = \{T_1, T_2, \ldots\}$. At each time $t \in T$ a set of units $A_0 \subset \Lambda$ receives a suprathreshold input such that those units generate a spike. We assume that $T_{n+1} - T_n \ge \tau$. Note that $\tau$ should not be too large compared to $1/\mu_0 + 1/\mu_1 + 1/\mu_2$, since otherwise, during the period without any activity, the system can reach with a high probability the state where all the synaptic connections are zero. Thus, let us set $A_0 = \{v \in \Lambda : X_v(t) = 1\}$. This evoked activity is propagated to the target units along paths that follow the edges of the (random) graph $G(t) = (\Lambda, L(t))$. Let us examine some simple cases.

4.2.1. Firing of 1 inhibitory unit

In this case $A_0$ contains only one vertex $u \in \Lambda^-$, i.e., only one inhibitory neuron is at state 1. For all $v \in \Lambda$ such that there is an edge $(u, v)$ in the graph $G(t)$ and $w_t(u, v) \ne 0$, the states of the units are updated and we set $X_v(t+) = \max\{0, X_v(t) + w_t(u, v)\}$. Thus, the inhibitory impulse may affect only the units that receive a direct input from the active unit $u$ in the graph $G(t)$, and it cannot propagate further in the network.

4.2.2. Firing of 1 excitatory unit

In this case $A_0$ contains only one vertex $u \in \Lambda^+$, i.e., only one excitatory neuron is at state 1. For all $v \in \Lambda$ such that there is an edge $(u, v)$ in the graph $G(t)$ and $w_t(u, v) \ne 0$, the states of the units are updated and we set $X_v(t+) = \min\{1, X_v(t) + w_t(u, v)\}$. Let us denote by $A_1$ the set of units which fire due to the input from unit $u$:
$$A_1 = \{v \in \Lambda : X_v(t) + w_t(u, v) \ge 1\} \setminus A_0. \qquad (1)$$
The algorithm stops if the set $A_1 = \emptyset$. Otherwise, the algorithm continues recursively, assuming the existence of sets $A_0, A_1, \ldots, A_n$ for some $n \ge 1$. Let us denote by $A_{n+1}$ the set formed by all the units which fire due to the impulses generated by the units with indices in $\bigcup_{i=0}^{n} A_i$:
$$A_{n+1} = \Bigl\{ v \in \Lambda : X_v(t) + \sum_{v' \in \bigcup_{i=0}^{n} A_i} w_t(v', v) \ge 1 \Bigr\} \setminus \bigcup_{i=0}^{n} A_i. \qquad (2)$$
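The recursion of Eqs. (1) and (2) translates into a breadth-first cascade over the directed edges: each newly fired layer adds its contribution to the total input of the units that have not yet fired, and the process stops as soon as a layer is empty (the stopping index $k$ is formalized next). The sketch below is a minimal illustration with a hypothetical toy network; inhibitory inputs are handled by the same sum because they are stored with a negative sign.

```python
def propagate(A0, X, w, out_edges, threshold=1.0):
    """Cascade of firings evoked by the externally activated set A0 (Eqs. (1)-(2)).
    X: membrane potentials in [0, 1]; w: weights over directed edges (pre, post);
    out_edges: adjacency dict mapping u to its post-synaptic units.
    Returns the list of layers [A_0, A_1, ..., A_k]."""
    layers = [set(A0)]
    fired = set(A0)
    drive = {}                      # total input received from all fired units so far
    while True:
        for u in layers[-1]:        # add the contribution of the newest layer
            for v in out_edges.get(u, ()):
                if w.get((u, v), 0.0) != 0.0:
                    drive[v] = drive.get(v, 0.0) + w[(u, v)]
        nxt = {v for v, s in drive.items()
               if v not in fired and X[v] + s >= threshold}
        if not nxt:                 # A_{n+1} is empty: the cascade stops
            return layers
        layers.append(nxt)
        fired |= nxt

if __name__ == "__main__":
    X = {"a": 1.0, "b": 0.7, "c": 0.9, "d": 0.2}
    w = {("a", "b"): 0.5, ("a", "c"): 0.5, ("b", "c"): 0.5, ("c", "d"): 0.5}
    out = {"a": ["b", "c"], "b": ["c"], "c": ["d"]}
    print(propagate({"a"}, X, w, out))   # two layers: A_0 = {a}, then A_1 = {b, c}
```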

The algorithm continues until the minimal $k$ such that $A_{k+1} = \emptyset$. With this $k$ it is possible to define the set
$$A(t) = \bigcup_{i=0}^{k} A_i. \qquad (3)$$
This means that at time $t+$ all the units have a state defined by
$$X_v(t+) = \begin{cases} \bar{X}_v, & \text{if } v \in A(t), \\ \max\Bigl\{0, X_v(t) + \sum_{v' \in A(t)} w_t(v', v)\Bigr\}, & \text{if } v \notin A(t). \end{cases}$$
Notice that, as stated here, the propagation of the impulses occurs instantaneously throughout the network and all the units with indices in $A(t)$ fire simultaneously at time $t$. However, the iterations may also be viewed as discrete steps at some micro-scale of time, where the time steps correspond to the indices $0, 1, \ldots, k$.

4.2.3. Firing of a set of units

In general, if at time $t$ only the units of some set $A_0$ have been activated, it is possible to proceed recursively as in the previous case, by replacing Eq. (1) with the following equation:
$$A_1 = \Bigl\{ v \in \Lambda : X_v(t) + \sum_{v' \in A_0} w_t(v', v) \ge 1 \Bigr\} \setminus A_0. \qquad (4)$$

4.3. Spike timing dependent plasticity rule

Spike timing dependent plasticity (STDP) is a change in the synaptic strength based on the ordering of pre- and post-synaptic spikes. At this stage we present a very rough STDP rule that will be improved in future extensions of this work. Following the previous notation, if $v' \in A_i$ and $v \in A_j$ for some $i < j$, it means that a post-synaptic unit $v$ generates a spike immediately after the occurrence of a pre-synaptic spike of unit $v'$. In this case it is assumed that the strength of the synapse is increased to the strong level, $\alpha_2$. Conversely, if there are no post-synaptic spikes generated after a pre-synaptic spike, the strength of the synapse is decreased to the weak level, $\alpha_0$. In the current model the inhibitory connections are not modified by the STDP rule. Hence, the set of the synaptic weights is dynamically defined by the following equation:
$$w_{t+}(v', v) = \begin{cases} \alpha_2, & \text{if } v' \in A_i,\ v \in A_j \text{ for } i < j \text{ and } w_t(v', v) > 0, \\ \alpha_0, & \text{if only } v' \in A(t) \text{ and } w_t(v', v) > 0, \\ w_t(v', v), & \text{otherwise.} \end{cases} \qquad (5)$$
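Equation (5) can be applied edge by edge once the layers $A_0, \ldots, A_k$ of a cascade are known. The sketch below is only an illustration of the rule: the depression case is read as "the pre-synaptic unit fired but no post-synaptic spike followed it", and the values of $\alpha_2$ and $\alpha_0$ are arbitrary.

```python
def stdp_update(w, layers, alpha_2=0.9, alpha_0=0.1):
    """One application of the STDP rule of Eq. (5) to the excitatory weights.
    layers = [A_0, ..., A_k] from a cascade; w maps directed edges (pre, post)
    to weights. Inhibitory (negative) and pruned (0) connections are untouched."""
    layer_of = {v: i for i, layer in enumerate(layers) for v in layer}
    new_w = {}
    for (pre, post), weight in w.items():
        if weight <= 0.0:                      # inhibitory or already pruned: unchanged
            new_w[(pre, post)] = weight
        elif pre in layer_of and post in layer_of and layer_of[pre] < layer_of[post]:
            new_w[(pre, post)] = alpha_2       # post spike right after pre spike: potentiation
        elif pre in layer_of:
            new_w[(pre, post)] = alpha_0       # pre fired, no later post spike: depression
        else:
            new_w[(pre, post)] = weight        # pre-synaptic unit did not fire: unchanged
    return new_w

if __name__ == "__main__":
    layers = [{"a"}, {"b"}]                    # unit a fired first, then unit b
    w = {("a", "b"): 0.5, ("b", "a"): 0.5, ("a", "c"): 0.5, ("i", "a"): -0.5}
    print(stdp_update(w, layers))
    # (a,b) is potentiated to 0.9; (b,a) and (a,c) are depressed to 0.1;
    # the inhibitory edge (i,a) keeps its weight -0.5.
```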

5. Results

Our aim is to study the dynamics of the connectivity, i.e., the qualitative and quantitative features of the modifications of the graph $G(t) = (\Lambda, L(t))$ due to the STDP rule and to specific input patterns of activity. Firstly, we derive the graph induced by the strong excitatory connections (i.e., synaptic weight $\alpha_2$) after the first application of the STDP rule, and describe the architecture of this graph. Then we provide a (partial) description of the phase diagram on the space of parameters $(c, q, \alpha_1, \gamma)$, which describes the size of the graph induced on $G(0)$ by the STDP rule. At this stage of the study we do not present computational results, which are currently under investigation.

5.1. Choice of the initial random graph $G(0)$: parameters $q$ and $c$

Let us recall that $G(t) \subseteq G(0)$ for all $t$, following the construction rule presented above (see Section 2.2). The graph $G(0)$ is sparse if, with a high probability, any connected component has a size bounded uniformly in $N$. Then, even after the changes due to STDP, the graph will be formed by bounded connected parts of the network. In order to study a dynamical phenomenon that propagates through a large part of the network, the parameters $q$ and $c$ should not be too small. Let us recall the basic facts from percolation theory and the theory of random graphs which one should take into account here.

If $c = 0$, then there is a constant $0 < q_0 = q_0(d) < 1$ such that if $q > q_0$ then, with a positive probability (independent of $N$), there is a connected path of short-range connections in $G(0)$ from the origin (or any other unit) to the boundary of $\Lambda$. If $q < q_0$ then, with probability one, any connected path of short-range connections in $G(0)$ is uniformly bounded in $N$. The exact value of $q_0(d)$ is known only for $d = 1$: $q_0(1) = 1/2$.

Let $q = 0$. According to the theory of random graphs (Bollobás, 1985), if $c = c_1/(2N)^2$ with $c_1 > 1$, then $G(0)$ has, with a high probability, a connected component which spans a positive fraction of all the vertices in $\Lambda^+$ (even when $N \to \infty$). If $c = c_1/(2N)^2$ with $c_1 < 1$, then, with a high probability, the largest connected component in $\Lambda^+$ is at most of order $\log N$ when $N \to \infty$.

Hence, if $q < q_0$ and $c < 1/(2N)^2$ then, no matter what the other parameters are, the induced network after the STDP rule will consist of only a very small fraction of units (exponentially small compared to the initial size). In the following results we shall consider parameters $q$ and $c$ outside of this area.

5.2. Propagation of a firing pattern

A firing pattern is defined by a fixed set $A_0$ firing spikes at time $T$. The strength of the inhibitory connections is not modified by any STDP rule here, and for the sake of simplicity we assume that at time $T$ all the synaptic weights still have medium strength, i.e., for any $(u, v) \in L(0)$, $w_T(u, v) = w_0(u, v) \in \{\alpha_1, -\gamma\}$, as in the initial state. In this case Eq. (2) becomes
$$A_{n+1} = \Bigl\{ v \in \Lambda : X_v(t) + \alpha_1 \bigl|\{v' \in \textstyle\bigcup_{i=0}^{n} A_i \cap \Lambda^+ : (v', v) \in L(0)\}\bigr| - \gamma \bigl|\{v' \in \textstyle\bigcup_{i=0}^{n} A_i \cap \Lambda^- : (v', v) \in L(0)\}\bigr| \ge 1 \Bigr\} \setminus \bigcup_{i=0}^{n} A_i,$$
where we simply have to count the number of excitatory and inhibitory impulses for each unit, since the strengths of the connections of each type are equal. (Note that here and elsewhere, for any set $A$, we denote by $|A|$ the number of elements in $A$.) Let us now define a random set $A$ of units activated by $A_0$ according to Eq. (3):
$$A = A(T) = \bigcup_{i=0}^{k} A_i, \qquad (6)$$
where $k$ is the minimal value for which the set $A_{k+1}$ defined above is empty. Denote by $A^+ = A \cap \Lambda^+$ and $A_i^+ = A_i \cap \Lambda^+$, $0 \le i \le k$, the subsets of the excitatory units in $A$. The sets $A_i^+$ for all $0 \le i < k$ are not empty, because the firing pattern can be propagated only through the excitatory connections. The set $A_k^+$ can be empty, because $A_k$ may consist of only inhibitory neurons. Notice that in the present scenario of propagation of firing from one single source, no (directed) cycles of edges can be found in the final structure. Cycles can be constructed, for example, in the course of an alternating activation of (at least) two different subsets of units.

5.3. Graph of strongly interconnected units

Let us define $G^+$ as the graph on the set $A^+$, after application of the STDP rule (Eq. (5)) between its vertices, induced only by the strong connections, i.e., by the connections $(u, v)$ for which $w_{T+}(u, v) = \alpha_2$. Consider the set $A^+$ as an abstract set of vertices, not taking into account their coordinates in $\Lambda$. The sets $A_0^+, \ldots, A_k^+$ may be viewed as $k + 1$ parallel layers. Each unit may be viewed as the target of converging inputs from the previous layer, whose number is named here the in-degree, and as the source of diverging outputs projecting to the next layer, whose number is named here the out-degree. By construction, the graph $G^+$ has the following basic properties:
- there is at least one path from the initial set $A_0^+$, and at least one edge from the set $A_{i-1}^+$, to any vertex of any set $A_i^+$, $i \ge 1$;
- the edges may go from the vertices of a set $A_i^+$ to the vertices of a set $A_j^+$ if and only if $i < j$;
- there are no edges between vertices of the same set $A_i^+$ for any $i$.

Let us take into account the metric properties of the original space $\Lambda$ and distinguish the two following subgraphs of $G^+$:
- $G_S^+$ is the subgraph that contains all the short-range edges in $A^+$, including their endpoints, and $V_S$ is its set of vertices;
- $G_L^+$ is the subgraph that contains only the long-range edges in $A^+$, including their endpoints, and $V_L$ is its set of vertices.

Notice that $V_S$ and $V_L$ may have common vertices. The parameters of the model determine very different structures and relations between the subgraphs $G_S^+$ and $G_L^+$.
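The layered structure of $G^+$ and its in- and out-degrees can be tabulated directly from the updated weights. The sketch below is an illustration only, assuming the strong level is stored as the value $\alpha_2 = 0.9$ and using a hypothetical set of layers and weights; the assertion checks the by-construction property that strong edges only go to strictly later layers.

```python
def strong_graph_degrees(w, layers, alpha_2=0.9):
    """Collect the edges of G+ (weights at the strong level alpha_2 after STDP) and
    return, for every vertex of G+, its layer index, in-degree and out-degree."""
    layer_of = {v: i for i, layer in enumerate(layers) for v in layer}
    strong = [(u, v) for (u, v), weight in w.items() if weight == alpha_2]
    stats = {}                                   # vertex -> [layer, in_degree, out_degree]
    for u, v in strong:
        assert layer_of[u] < layer_of[v], "edges of G+ go to strictly later layers"
        stats.setdefault(u, [layer_of[u], 0, 0])[2] += 1   # out-degree of the source
        stats.setdefault(v, [layer_of[v], 0, 0])[1] += 1   # in-degree of the target
    return stats

if __name__ == "__main__":
    layers = [{"a"}, {"b", "c"}, {"d"}]
    w = {("a", "b"): 0.9, ("a", "c"): 0.9, ("b", "d"): 0.9, ("c", "d"): 0.9, ("a", "d"): 0.1}
    for v, (layer, k_in, k_out) in sorted(strong_graph_degrees(w, layers).items()):
        print(v, "layer", layer, "in-degree", k_in, "out-degree", k_out)
```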
5.4. Effect of connection strengths $\alpha_1$ and $\gamma$

We refer here to the propagation of a firing pattern as defined in Section 5.2. We assume that $A_0$ is small.

5.4.1. Strong excitatory connections: $\alpha_1 \approx 1$ and $\gamma = 0$

In the marginal case, if $\alpha_1 = 1$ and $\gamma = 0$, the graph $G^+$ is the subgraph of $G(0)$ that consists of $A_0$ and all the units of $\Lambda^+$ (and the edges between them) which are connected to $A_0$ in $G(0)$. If $\gamma = 0$, the probability that a unit will fire due to the firing of a single neighbour is $q\alpha_1$ if a short-range neighbour fires, and $c\alpha_1$ if a long-range neighbour fires. This analysis can be extended to consider the probability of firing due to the simultaneous firing of several units, which leads to two different phases:
- if the parameters $(\alpha_1, \gamma)$ are in the vicinity of the point $(1, 0)$, the structure of $G^+$ is basically the same as that of the subgraph of $G(0)$ connected to $A_0$;
- if $0 \le \alpha_1 \le \alpha_1^0$ (for some $0 < \alpha_1^0 < 1$) then, no matter what the values of the other parameters are, the graph $G^+$ will never span a positive fraction of the entire network.

5.4.2. Weak excitatory connections: $\alpha_1 \approx 0$

Let us assume that the synaptic weight $\alpha_1$ is weak. In this case a unit must receive enough inputs to reach the threshold of discharge in order to generate a spike, i.e., a unit must have a sufficiently high in-degree. Denote by $\nu_{in}$ the minimal in-degree which yields, with a high probability, the firing of a unit. Obviously, $\nu_{in}$ increases when $\alpha_1$ decreases. By the assumptions on the probabilities $p_0(u, v)$, we have $c \ll q$, where $c$ is the probability of a long-range and $q$ the probability of a short-range connection (see Section 2.2). This implies that the main contribution to $\nu_{in}$ will come from the short-range connections. Furthermore, for small values of $\alpha_1$ the subgraph $G_L^+$ will not be generated at all. The graph $G^+$ can be further described as follows: the only vertices with in-degree zero can be found in $A_0$; with a high probability, the in-degree of any other vertex is at least $\nu_{in}$. Hence, for weak synaptic weights $\alpha_1$, the first firing pattern propagates through a forest-like (converging) graph with the maximal length of a connected path of order $\log |A_0|$, which is also the order of the value $k$.

5.4.3. Critical excitatory connections

Based on our preliminary analysis above, we can derive that for any fixed values $c_1 > 1$, $q > q_0$ and $\gamma = 0$ there is a critical value $0 < \alpha_1^{cr}(0) < 1$ such that if $\alpha_1$ is above this value, then the graph $G^+$ spans a positive fraction of all the units in the network. We conjecture that for all $0 < \gamma < 1$ there is a critical value $\alpha_1^{cr}(0) \le \alpha_1^{cr}(\gamma) \le 1$ above which $G^+$ spans a positive fraction of all the units in the network. It is plausible to think that $\alpha_1^{cr}(\gamma)$ is an increasing function of $\gamma$. It is an interesting open question whether the strict inequality $\alpha_1^{cr}(\gamma) < 1$ holds for all $0 < \gamma < 1$, in other words, whether indeed for any $\gamma$ there is a range of parameters which yields a positive fraction of connected units in the network.

5.5. Effect of the initial firing pattern $A_0$

The interesting result here is that the geometry of the set $A_0$ directly affects only the subgraph $G_S^+$. Let the number of excitatory vertices $|A_0^+| = m \ll N^2$ be fixed. Let us consider two marginal examples of different configurations of the set $A_0^+$ in $\Lambda$:
- The distance between any two vertices in $A_0^+$ is greater than $d$. In this case $A_0^+$ most likely does not generate a subgraph $G_S^+$ at all.
- The set $A_0^+$ covers a circle around some vertex. In this case, for large values of $m$, the vertices in $A_0^+$ at distance greater than $d$ from the boundary of the set $A_0^+$ do not contribute to the formation of $G_S^+$.

In both cases the subgraph $G_L^+$ remains the same, because it depends only on the size $m$ of $A_0^+$.

6. Discussion

This paper has presented an original mathematical framework to study the dynamics of a model of neural networks with embedded spike timing dependent plasticity. The pre-synaptic signals that lead to a spike enhance the strength of an excitatory synapse by STDP, thereby reinforcing a specific path of the connectivity graph. A recent study suggests that some inhibitory synapses may be modified in a spike timing dependent manner, somewhat similar to what is observed for the excitatory synapses (Harkany et al., 2004). In the case of inhibitory STDP, the increase in synaptic strength would lead to an effect opposite to the storage or reinforcement of the activity patterns that caused it.
We do not rule out that inhibitory STDP may play an important role in shaping preferred paths through the graph, but at this stage of our model it is premature to introduce this additional parameter. We have emphasized the importance of the initial parameters, in particular of the initial topology that describes the patterns of short- and long-range excitatory connections. This is in agreement with well-known data on cerebral cortex anatomy (Szentágothai, 1975; Braitenberg and Schüz, 1988). The cortico-cortical connections are characterized by short-range dense columnar and areal projections and long-range sparse interareal and interhemispheric connections (Greilich, 1984). In the current framework, with relatively weak excitatory connections, our results show that the degree of the diverging projections (out-degree) is smaller than the degree of the converging inputs (in-degree) for both graphs of short-range ($G_S^+$) and long-range ($G_L^+$) connections. Indeed, we should consider that the smaller the medium synaptic weight $\alpha_1$, the larger the in-degree and the smaller the out-degree of any vertex in $A^+ \setminus A_0^+$ should be. This means that a unit may successfully contribute only to the firings of a small number of other units, but it requires many more inputs to become activated. This is in good agreement with the computational results obtained with similar models (Iglesias et al., 2005b).

The initial firing pattern $A_0$ also plays an important role. To get more excitation in the network it is appropriate to have $A_0^+$ composed of assemblies of units grouped in patches of some diameter $d$ placed next to each other at certain distances $e$. This parameter $e$ should be chosen such that the excitation may propagate from one group to the other throughout the entire network. We suggest that the diameter $d$ of $A_0^+$ plays a key role for the study of $G(t)$ knowing the structure of the graph $G^+$. In conclusion, the study of neural network dynamics with the help of graph theory may offer new perspectives for the identification of critical parameters of the models. Future simulations should consider these results to guide the build-up of more accurate models and to focus the exploration of the parameter space on targeted areas.

Acknowledgments

The authors thank the referees for their helpful remarks. The research of T.T. was supported by the Swedish Natural Science Research Council and the Mathematical Science Research Institute, Berkeley, USA.

References

Bell, C.C., Han, V.Z., Sugawara, Y., Grant, K., 1997. Synaptic plasticity in a cerebellum-like structure depends on temporal order. Nature 387.
Bollobás, B., 1985. Random Graphs. Academic Press, London.
Bourgeois, J., Rakic, P., 1993. Changes of synaptic density in the primary visual cortex of the macaque monkey from fetal to adult stage. J. Neurosci. 13.
Braitenberg, V., Schüz, A., 1988. Cortex: Statistics and Geometry of Neuronal Connectivity, 2nd ed. Springer Verlag, Berlin.
Eriksson, J., Torres, O., Mitchell, A., Tucker, G., Lindsay, K., Rosenberg, J., Moreno, J.M., Villa, A.E.P., 2003. Spiking neural networks for reconfigurable POEtic tissue. Lecture Notes Comput. Sci. 2606.
Greilich, H., 1984. Quantitative Analyse der cortico-corticalen Fernverbindungen bei der Maus. PhD Thesis, University of Tübingen.
Harkany, T., Holmgren, C., Hartig, W., Qureshi, T., Chaudhry, F.A., Storm-Mathisen, J., Dobszay, M.B., Berghuis, P., Schulte, G., Sousa, K.M., Fremeau, R.T.J., Edwards, R.H., Mackie, K., Ernfors, P., Zilberter, Y., 2004. Endocannabinoid-independent retrograde signaling at inhibitory synapses in layer 2/3 of neocortex: involvement of vesicular glutamate transporter 3. J. Neurosci. 24.
Huttenlocher, P.R., 1979. Synaptic density in human frontal cortex: developmental changes and effects of aging. Brain Res. 163.
Huttenlocher, P.R., de Courten, C., Garey, L.J., Van der Loos, H., 1982. Synaptogenesis in human visual cortex: evidence for synapse elimination during normal development. Neurosci. Lett. 33.
Iglesias, J., Eriksson, J., Grize, F., Tomassini, M., Villa, A.E.P., 2005a. Dynamics of pruning in simulated large-scale spiking neural networks. BioSystems 79.
Iglesias, J., Eriksson, J., Pardo, B., Tomassini, M., Villa, A.E.P., 2005b. Emergence of oriented cell assemblies associated with spike-timing-dependent plasticity. Lecture Notes Comput. Sci. 3704.
Izhikevich, E.M., Gally, J.A., Edelman, G.M., 2004. Spike-timing dynamics of neuronal groups. Cerebral Cortex 14.
Montgomery, J., Madison, D., 2004. Discrete synaptic states define a major mechanism of synapse plasticity. Trends Neurosci. 27.
Song, S., Abbott, L.F., 2001. Cortical development and remapping through spike timing-dependent plasticity. Neuron 32.
Szentágothai, J., 1975. The module-concept in cerebral cortex architecture. Brain Res. 95.
