Mathematical Neuroscience Seminar
Department of Mathematical Sciences
Indiana University Purdue University Indianapolis
The seminar is organized by Leonid Rubchinsky. Please send him an e-mail if you want to be on our mailing list.
February 19, Friday, 11:00am, SL 012 (math department seminar)
Sachin Talathi (Qualcomm Inc.)
Perspectives in computational modeling of the brain
In the first half of my presentation, I will offer a biophysical perspective on computational modeling of the brain. I will present findings from my academic work on using biophysically realistic mathematical models of functional units of the nervous system to explore the dynamical principles underlying certain aspects of brain function and brain pathology. In particular, I will present results from my work on developing quantitative models to study the influence of circadian rhythms on epilepsy, a neurological disease affecting around 50 million people worldwide. I will then present findings from my work on neural synchronization, focusing in particular on the phenomenon of synchronization in a network of mutually coupled inhibitory interneurons. These studies highlight the importance of mathematical models for analyzing low-level brain phenomena.
The second half of my presentation will focus on brain-inspired computational models for artificial intelligence systems. I will draw a connection between visual neuroscience and the fields of computer vision and machine learning, with particular attention to visual scene understanding under the umbrella of the emerging field of deep learning. I will present results from my recent work at Qualcomm Inc. on designing computationally efficient deep learning models for real-time scene understanding on mobile platforms. I will conclude with a discussion of my ideas for future research at the intersection of computational neuroscience and deep learning.
February 12, Friday, 10:30am, SL 108 (math department seminar)
Calvin Zhang (New York University)
Two applied math stories in neuroscience and locomotion
The first story is about the role of stochasticity in synaptic vesicle release. The release of neurotransmitter vesicles at synapses is an unreliable process, especially in the central nervous system. Here we show that the probabilistic nature of neurotransmitter release directly influences the functional role of a synapse, and that a small probability of release per docked vesicle helps reduce the error in the reconstruction of desired signals from the time series of vesicle release events. Thus, noise is not only a source of disturbance but can also be beneficial for neuronal information processing.
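For readers who want to experiment with this idea, here is a minimal, hypothetical sketch (not the speaker's actual model) of binomial vesicle release: each docked vesicle is released independently with a small probability per presynaptic spike, and empty docking sites slowly refill. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_release(spike_times, n_sites=5, p_release=0.2, tau_refill=0.5):
    """Binomial vesicle-release model: at each presynaptic spike, every
    docked vesicle is released independently with probability p_release;
    empty docking sites refill between spikes with time constant
    tau_refill (seconds)."""
    docked = n_sites
    last_t = 0.0
    releases = []
    for t in spike_times:
        # each empty site refills independently since the last spike
        p_refill = 1.0 - np.exp(-(t - last_t) / tau_refill)
        docked += rng.binomial(n_sites - docked, p_refill)
        k = rng.binomial(docked, p_release)  # vesicles released by this spike
        docked -= k
        releases.append(k)
        last_t = t
    return np.array(releases)

spikes = np.cumsum(rng.exponential(0.05, size=500))  # ~20 Hz Poisson train
releases = simulate_release(spikes)
```

Sweeping `p_release` against a reconstruction error for the release time series is one way to probe the trade-off described in the abstract.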
The second story is about the distinct metachronal limb coordination in crustacean swimming. We show that the interlimb coordination in crustacean swimming is biomechanically optimal, and that the structure of the underlying neural circuit provides a robust neural mechanism for generating this coordination. Thus, we provide a concrete example of how an optimal behavior arises from the anatomical structure of a neural circuit.
November 13, Friday, 1:00pm, LD 265
Ehren Newman (IU Bloomington)
Rhythms of neural information processing and their cholinergic conductor
The hippocampus and entorhinal cortex are widely recognized as brain areas required for memory encoding. Yet the mechanisms and dynamics of mnemonic encoding are poorly understood. An emerging view is that rhythmic neural activity supports neural information processing in the entorhinal-hippocampal circuit. I will review recent findings highlighting the importance of low-frequency (theta band) and high-frequency (gamma band) rhythms in particular. A key observation in these data is that rhythmic activity occurs in phasic bursts. Why and how these bursts of rhythmic activity occur remains unknown. I will present evidence that acetylcholine is a key modulator of these rhythmic neural dynamics. This evidence includes an analysis of theta-gamma phase-amplitude coupling, speed modulation of theta frequency, phasic coding by hippocampal place cells, and spatial coding by entorhinal grid cells. I hope to convince you that acetylcholine is a key player, but moreover I hope to open the door to new discussions regarding future projects and potential collaborations.
October 9, Friday, 1:00pm, LD 265
Wondimu Teka (IUPUI)
Modeling the primate motor control system for simulation studies of Huntington's disease
The motor cortex plays a major role in controlling voluntary movements by sending signals to the muscles through the spinal cord. We have developed a model of a neuronal controller that governs a two-joint arm actuated by six muscles (three flexors and three extensors). This model describes the dynamics of voluntary arm movement controlled by a spinal cord circuit with afferent feedback. A thalamo-cerebellar-cortical controller in the model provides the neuronal motor program necessary to perform reaching tasks based on a bell-shaped velocity profile and a straight-line trajectory. To do so, the controller first calculates the joint torques necessary to move the arm along the specified trajectory and then generates these torques with agonist and antagonist muscles via specific activation of the low-level spinal circuit. The model demonstrates that patterns of activity in the motor cortex may strongly correlate with both geometric and dynamical parameters of the movement. The correlation is caused by the afferent feedback signals and the movement geometry. We conclude that the dynamical paradigm of cortical representation of movement implies coding of both the dynamical movement parameters and their geometrical characteristics.
October 2, Friday, 1:00pm, LD 265
William Barnett (IUPUI)
Chemoreception and neuroplasticity in respiratory circuits
September 18, Friday, 1:00pm, LD 265
Gregory Dumont (IUPUI)
Synchronization in a Network of Excitatory Leaky Integrate-and-Fire Neurons
In a neural network, when a cell emits an action potential, it activates its downstream synapses, triggering the release of neurotransmitters. The flow of positively charged ions into the postsynaptic cell raises (depolarizes) the membrane potential. As a result, the firing of a neuron may bring the postsynaptic cell to the threshold; the postsynaptic neuron may then fire and bring yet another cell to the threshold, and so on. There is then the possibility of an avalanche in which many neurons fire in unison. We study under which circumstances such an avalanche occurs.
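A toy cascade model (an illustration only, not the model analyzed in the talk) shows how a single forced spike may or may not recruit an avalanche depending on the excitatory coupling strength w. Parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def avalanche_size(n=100, w=0.012, v_th=1.0):
    """Instantaneous-cascade caricature of an all-to-all excitatory
    LIF network: start from random subthreshold voltages, force one
    neuron to fire, and let each spike add w to every other neuron's
    voltage.  Returns the total number of neurons that fire."""
    v = rng.uniform(0.0, v_th, size=n)
    fired = np.zeros(n, dtype=bool)
    seed = int(np.argmax(v))          # seed the cascade at the most excitable cell
    fired[seed] = True
    queue = [seed]
    while queue:
        queue.pop()
        v += w                        # excitatory kick from this spike
        new = np.flatnonzero((v >= v_th) & ~fired)
        fired[new] = True
        queue.extend(new.tolist())
    return int(fired.sum())
```

With `w = 0` the forced spike recruits no one; as `w` grows, cascades become larger and eventually system-spanning, which is the regime the abstract asks about.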
August 28, Friday, 1:00pm, LD 265
William Barnett (IUPUI)
Cellular mechanisms governing dynamics of Central Pattern Generators
The dynamics of individual neurons are crucial for producing functional activity in neuronal circuits. Dynamical systems theory provides a framework to describe the activity of neuronal systems. We describe a family of mechanisms that control transient and oscillatory neuronal activity. These mechanisms are organized around the cornerstone bifurcation, which satisfies the criteria for both the saddle-node bifurcation on an invariant circle (SNIC) and the blue sky catastrophe. The first mechanism describes control over the burst duration and interburst interval of an endogenously bursting neuron. The second mechanism provides control of the duration of an evoked burst in an endogenously silent neuron. The third mechanism determines the delay to spiking after inhibition in an endogenously spiking neuron. We describe how these mechanisms could explain basic dynamics of a central pattern generator (CPG) that controls six-legged locomotion [Barnett and Cymbalyuk, PLOS ONE 2014]. The network generated travelling waves of activity that propagated from posterior to anterior segments. We applied the mechanism that controls the duration of an evoked burst in a silent neuron to control gait in this CPG. By controlling the duration of evoked bursts in interneurons that governed retraction, we demonstrated control over the period and duration of retraction in the network. These mechanisms were used to control smooth transitions between tripod and metachronal-wave gaits. The described mechanisms are generic and could be applied to a wide range of problems in motor control.
April 17, Friday, 3:30pm, LD229 (math department colloquium, refreshments will be served in LD 259 at 3:00 p.m.)
Robert Rosenbaum (University of Notre Dame)
Stochastic dynamics of neural synapses in health and disease
Neurons communicate through chemical synapses: a "spike" in one neuron causes the release of neurotransmitter molecules that induce a current across another neuron's membrane. Synapses are temporarily weakened by periods of fast spiking due to a depletion of neurotransmitter resources. This effect, known as short-term synaptic depression, modulates the transfer of signals between neurons. Most computational studies of dynamic synapses use a deterministic mean-field model of synaptic depression despite the fact that neurotransmitter release and recovery are fundamentally stochastic processes. We use stochastic calculus and linear response techniques to show that synaptic stochasticity fundamentally alters the transmission of information between neurons. We then combine our theoretical results with experimentally recorded data to show that the filtering properties of stochastic synapses contribute to the therapeutic efficacy of deep brain stimulation as a treatment for Parkinson's disease.
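The deterministic mean-field description mentioned above is often written in a Tsodyks-Markram-style form; a minimal per-spike sketch (with illustrative parameters, not those used in the talk) shows how fast firing depresses synaptic efficacy:

```python
import numpy as np

def depression_filter(spike_times, u=0.5, tau_rec=0.8):
    """Mean-field short-term depression: a fraction u of the available
    resources x is consumed by each spike, and x recovers toward 1
    with time constant tau_rec (seconds) between spikes.  Returns the
    synaptic efficacy u*x at each spike time."""
    x, last_t = 1.0, -np.inf
    eff = []
    for t in spike_times:
        # recovery since the previous spike
        x = 1.0 - (1.0 - x) * np.exp(-(t - last_t) / tau_rec)
        eff.append(u * x)             # efficacy of this spike
        x -= u * x                    # resources consumed by the spike
        last_t = t
    return np.array(eff)

train = np.arange(20) * 0.05          # regular 20 Hz train
eff = depression_filter(train)        # efficacy decays toward a steady state
```

The stochastic version discussed in the abstract replaces the deterministic resource variable with random vesicle release and recovery, which changes the synapse's filtering properties.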
January 16, Friday, 1:00pm, LD 265
Wondimu Teka (IUPUI)
Models with Fractional Order Differentiation Produce Adapting Spiking Activities
The electrical activities of neurons can follow multiple-timescale dynamics that arise from intrinsic membrane conductances or synaptic inputs. These multiple-timescale dynamics constitute a non-Markovian process that can result in power-law behavior, in which the membrane voltage cannot be characterized by a single time constant. Such processes with power-law dynamics are scale free and can be modeled with a fractional-order derivative. A fractional-order derivative is a non-local operator in which the value of the variable is determined by integrating a temporally weighted voltage trace, also called the memory trace. We developed and analyzed a fractional-order model in which the order of the fractional derivative can vary from 0 to 1, with 1 representing the normal (classical) derivative. The spiking activity of the fractional-order model deviates from that of the Markovian model and reflects the temporally accumulated intrinsic membrane dynamics that affect the response of the neuron to external stimulation. When the order of the fractional derivative decreases from 1, the weights of the memory trace increase. By varying only the fractional order, the model can produce the upward and downward spike adaptations found experimentally in neocortical pyramidal cells and tectal neurons in vitro. The model also produces spikes with longer first-spike latency and high inter-spike-interval variability with a power-law distribution. The fractional model generates reliable spike patterns in response to noisy input.
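A minimal sketch of such a model, using the Grunwald-Letnikov discretization of the fractional derivative for a leaky integrate-and-fire neuron (the parameters and the reset rule here are illustrative assumptions, not the paper's exact model):

```python
import numpy as np

def fractional_lif(alpha=0.6, dt=0.1, t_max=100.0,
                   i_in=3.0, tau=10.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron with a Grunwald-Letnikov
    fractional derivative of order alpha; alpha = 1 recovers the
    classical (Markovian) model.  The weighted sum over all past
    voltages is the 'memory trace'; its weights grow as alpha drops
    below 1.  Returns the spike times."""
    n = int(t_max / dt)
    # GL binomial weights c_k = (-1)^k * C(alpha, k), by recurrence
    c = np.empty(n)
    c[0] = 1.0
    for k in range(1, n):
        c[k] = (1.0 - (1.0 + alpha) / k) * c[k - 1]
    v = np.zeros(n)
    spikes = []
    for i in range(1, n):
        drift = (-v[i - 1] + i_in) / tau
        memory = np.dot(c[1:i + 1], v[i - 1::-1])   # memory trace
        v[i] = dt**alpha * drift - memory
        if v[i] >= v_th:
            spikes.append(i * dt)
            v[i] = v_reset
    return np.array(spikes)
```

For `alpha = 1` the weights beyond the first vanish and the scheme reduces to the ordinary Euler update, producing perfectly regular spiking; for `alpha < 1` the memory trace reshapes the inter-spike intervals, which is the adaptation effect described above.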
October 31, Friday, 1:00pm, LD 265
Shivakeshavan Ratnadurai Giridharan (IUPUI)
Synchronization of all-to-all coupled fast spiking inhibitory neurons via spike time dependent plasticity
We investigate the emergence of in-phase synchronization in a heterogeneous network of coupled inhibitory interneurons in the presence of spike-timing-dependent plasticity (STDP). Using a simple network of two mutually coupled interneurons (2-MCI), we first study the effects of STDP on in-phase synchronization. We demonstrate that, with STDP, the 2-MCI network can evolve to a single state of stable 1:1 in-phase synchronization or exhibit multiple regimes of higher-order synchronization states. We show that the emergence of synchronization induces a structural asymmetry in the 2-MCI network such that the synapses onto the high-frequency-firing neurons are de-potentiated, while those onto the low-frequency-firing neurons are potentiated, resulting in a directed flow of information from low-frequency-firing neurons to high-frequency-firing neurons. Finally, we demonstrate that the principal findings from our analysis of the 2-MCI network contribute to the emergence of robust synchronization in the Wang-Buzsaki network (Wang and Buzsaki, 1996) of all-to-all coupled inhibitory interneurons (100-MCI) for a significantly larger range of heterogeneity in the intrinsic firing rates of the neurons in the network.
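The STDP window underlying such simulations is commonly modeled as a pair of exponentials; a generic sketch (parameters are illustrative, not those of this study) makes the asymmetry mechanism concrete:

```python
import numpy as np

def stdp_dw(dt_spike, a_plus=0.01, a_minus=0.012,
            tau_plus=20.0, tau_minus=20.0):
    """Additive STDP window.  dt_spike = t_post - t_pre in ms:
    pre-before-post pairs (dt_spike > 0) potentiate the synapse,
    post-before-pre pairs (dt_spike < 0) depress it."""
    if dt_spike >= 0:
        return a_plus * np.exp(-dt_spike / tau_plus)
    return -a_minus * np.exp(dt_spike / tau_minus)

# A synapse onto a faster-firing postsynaptic cell sees mostly
# post-before-pre pairings and is therefore depressed on average,
# mirroring the structural asymmetry described in the abstract.
pairings = [-8.0, -3.0, 5.0, -6.0]            # ms, mostly negative
net_change = sum(stdp_dw(d) for d in pairings)
```
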
September 26, Friday, 1:00pm, LD 265
Abolhassan Behrouzvaziri (IUPUI)
Mathematical Modeling of Orexinergic Neurotransmission in Temperature Responses to Methamphetamine and Stress
Derivatives of amphetamines are widely abused all over the world. Cognitive, neurophysiological, and neuroanatomical deficits have been reported after long-term use. Neurophysiological deficits are exacerbated by hyperthermia, which is itself a major mortality factor in drug abusers. Temperature responses to injections of methamphetamine (Meth) are multiphasic and include both hypothermic and hyperthermic phases, which are highly dependent on ambient temperature and previous exposure to the drug. Also, amphetamine derivatives differentially affect various neuromediator systems, such as the dopaminergic, noradrenergic, and serotonergic systems. Mathematical modeling has recently been used to explain the complexity of temperature responses to Meth. We have extended the previously developed model to include the stress of manipulations and the effects of SB. Our analysis confirmed that stress is synergistic with Meth in its action on the excitatory node. Orexin receptors mediate the effect of Meth on both the excitatory and inhibitory nodes, but not on the high-dose node. We suggest defining an excitatory high-dose (HD) component that is inhibited by the inhibitory (Inh) population. This component is activated only by a high dose of Meth; its activation increases the temperature response and delays the second response. The dose dependence of HD activation is noteworthy: HD is activated at 10 mg/kg of Meth with 0 mg/kg of SB but at 5 mg/kg of Meth with 10 mg/kg of SB, which demonstrates disinhibition of HD by the Inh component. The exaggeration of early responses to high doses of Meth by SB has two components: a low dose of SB increases the sensitivity of the HD node to Meth (most likely through a decrease of tonic inhibition), while a higher dose of SB suppresses the inhibitory component of the response. The modeling approach to data assimilation proved efficient in separating individual components of a complex response, with a statistical analysis unachievable by traditional data-processing methods.
September 19, Friday, 3:30pm, LD229 (math department colloquium, refreshments will be served in LD 259 at 3:00 p.m.)
Horacio Rotstein (NJIT)
Frequency preference response to oscillatory inputs in neuronal models: a geometric approach to subthreshold resonance
The electrical properties of isolated neurons are typically studied by measuring their voltage response to current inputs with various temporal properties. Many neuron types exhibit preferred frequency responses to subthreshold oscillatory input currents, reflected in a voltage amplitude peak (resonance) and a zero phase shift (phase-resonance). This phenomenon may occur in the absence of intrinsic oscillations in response to constant input currents. The dynamical principles that govern the generation of resonance, and the effect of the biophysical parameters on the resonant properties, are not well understood. We propose a framework to analyze the role of different ionic currents and their interactions in shaping the properties of the impedance amplitude and phase profiles (graphs of these quantities as a function of the input frequency) in linearized and quadratic biophysical models. We adapt the classical phase-plane analysis approach to account for the dynamic effects of oscillatory inputs and develop a tool, the envelope-plane diagram, that captures the role conductances and time scales play in amplifying the voltage response in the resonant frequency band as compared to smaller and larger frequencies. We further explain why an increase in the time-scale separation causes an amplification of the voltage response in addition to shifting the resonant and phase-resonant frequencies. The method we develop provides a framework for the investigation of preferred frequency responses in three-dimensional and nonlinear neuronal models as well as in simple models of coupled neurons.
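For a linearized model with one slow resonant current, the impedance amplitude profile can be computed in closed form; a small sketch (with illustrative parameter values, not taken from the talk) exhibits a resonance peak at an intermediate input frequency:

```python
import numpy as np

def impedance(f_hz, c=1.0, g_l=0.1, g_1=0.5, tau_1=100.0):
    """Impedance of a linearized neuron with one slow current w:
        c dv/dt = -g_l v - g_1 w + I(t),   tau_1 dw/dt = v - w.
    Units are illustrative: ms, uF/cm^2, mS/cm^2."""
    w = 2.0 * np.pi * f_hz / 1000.0     # angular frequency in rad/ms
    return 1.0 / (1j * w * c + g_l + g_1 / (1.0 + 1j * w * tau_1))

f = np.linspace(0.1, 50.0, 500)         # input frequency in Hz
amp = np.abs(impedance(f))              # impedance amplitude profile
f_res = f[np.argmax(amp)]               # resonant frequency: interior peak
```

At low frequency the slow current tracks the voltage and opposes it; at high frequency the capacitance shunts the input; in between neither does, producing the amplitude peak (resonance) discussed in the abstract.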
September 12, Friday, 3:30pm, LD229 (math department colloquium, refreshments will be served in LD 259 at 3:00 p.m.)
Yixin Guo (Drexel University)
Mathematical Modeling of Deep Brain Stimulation
Deep Brain Stimulation (DBS) is a neurosurgical intervention that sends electrical signals to the brain to alleviate the symptoms of neurological disorders such as Parkinson's disease. I will introduce some interesting mathematical models of DBS and further explore a closed-loop DBS paradigm that may potentially overcome the drawbacks of constant high-frequency DBS. We then evaluate the outcome of closed-loop DBS applied to a parkinsonian network by examining both quantitative measures of neurons in the basal ganglia and the relay error of thalamocortical (TC) neurons. Our computational results show that closed-loop DBS significantly diminishes TC relay error by breaking the bursting pattern and desynchronizing the synchronized clusters in the basal ganglia. The design of closed-loop DBS suggests that it is superior to open-loop stimulation: not only is the stimulation signal guided by changes in neuronal activity specific to the disorder being treated, but closed-loop DBS also consumes much less energy than conventional high-frequency DBS. To support the computational results and the feasibility of closed-loop DBS, we further review previous work that validates the TC relay error as an evaluation measure and recent experimental studies that validate on-demand DBS.
May 2, Friday, noon, SL148
Sungwoo Ahn (IUPUI)
Fine temporal structure of intermittent neural synchronization
January 17, Friday, 3:30pm, LD229 (math department colloquium, refreshments will be served in LD 259 at 3:00 p.m.)
Duane Nykamp (University of Minnesota)
Capturing effective neuronal dynamics in random networks with complex topologies
We introduce a random network model in which one can prescribe the frequency of second order edge motifs. We derive effective equations for the activity of spiking neuron models coupled via such networks. A key consequence of the motif-induced edge correlations is that one cannot derive closed equations for the average activity of the nodes (the average firing rate of neurons) but instead must develop the equations in terms of the average activity of the edges (the synaptic drives). As a result, the network topology increases the dimension of the effective dynamics and allows for a larger repertoire of behavior. We demonstrate this behavior through simulations of spiking neuronal networks.
December 6, Friday, 3:30pm, LD229 (math department colloquium, refreshments will be served in LD 259 at 3:00 p.m.)
Kresimir Josic (University of Houston)
The patterns and impact of correlations in neuronal networks
Connectivity in biological networks is not completely random. Certain subgraphs, called motifs, occur more frequently than expected by chance. Such patterns of connectivity may in turn affect network dynamics. In this talk I will examine the impact of network structure on the level of correlated, or synchronized, spiking activity in a neuronal network. I will first describe how the patterns of correlation are related to the prevalence of certain network motifs. I will next describe the impact of such correlations on performance in a visual search task where the goal is to detect whether a predefined target object is present among multiple objects. Correlations between sensory measurements can have a large effect on performance in such categorical, global perceptual judgments.
September 27, Friday, 1pm, LD229
John Beggs (Indiana University, Bloomington)
The impact of network structure on criticality in cortical circuits
Cortical circuits have been hypothesized to operate near a critical point for optimality. Previous evidence supporting this came from bulk signals that did not show individual neuron activity. Using a 512 electrode array, we recorded hundreds of spiking neurons and found two main things: (1) Avalanche shapes can be collapsed onto a universal scaling function, a key indicator of criticality; (2) The network structure of effective connections strongly influences the critical exponents of the system. Work done in collaboration with Karin Dahmen and Alan Litke.
September 13, Friday, 1pm, LD229
Andrey Dovzhenok (University of Cincinnati)
Mathematical model of glucose-compensated Neurospora circadian clock
Circadian (daily) rhythms are vital for an organism's functions and its anticipation of changes in the environment. Autonomous circadian oscillations have been shown to arise from interlocked negative and positive transcription-translation feedback loops. Recently, the csp-1 gene was identified as one of the core components of the circadian clock in the fungus Neurospora crassa, playing a major role in maintaining a robust period length over a wide range of glucose concentrations in the growth medium. In this talk, we will study a mathematical model of the Neurospora circadian clock that incorporates glucose-dependent negative feedback of csp-1 on the core oscillator. Our model reproduces glucose compensation of circadian period length, predicts the loss of glucose compensation in a short-period mutant, and provides valuable insights for further experimental studies of the Neurospora circadian clock.
February 22, Friday, 1pm, HS4055
Sebastien Helie (Purdue University)
Simulating cognitive deficits in Parkinson's disease using a computational cognitive neuroscience model
Parkinson's disease is caused by the accelerated death of dopamine-producing neurons. Numerous studies documenting the cognitive deficits of Parkinson's disease patients have revealed impairments in a variety of tasks related to memory, learning, visuospatial skills, and attention. Although these deficits are well documented, very few computational models of them have been proposed. In this presentation, we show how the COVIS model of category learning can be used to simulate dopamine depletion, and we show that the model suffers from cognitive symptoms similar to those of human participants affected by Parkinson's disease. This suggests that COVIS may be an adequate model not only of the simulated tasks and phenomena but also, more generally, of the role of dopamine in these tasks and phenomena.
February 1, Friday, 3:30pm, LD229 (math department colloquium, refreshments will be served in LD 259 at 3:00 p.m.)
Alla Borisyuk (University of Utah)
Periodically driven noisy neuronal models: a spectral approach
Neurons are often driven by periodic or periodically modulated inputs. The response of the neurons is often periodic and phase-locked to the stimulus. However, this is not always the case. It is well known that even simple deterministic systems can exhibit quasiperiodic behavior, with no clear relationship between the phases of the stimulus and the response. Moreover, the biological situation is further complicated by the presence of noise in the periodic inputs and by intrinsic cellular properties, e.g., firing-rate adaptation. In an effort to develop a mathematical theory applicable to the above biologically motivated questions about phase locking and related phenomena, we have developed a spectral approach to stochastic circle maps. A stochastic circle map is defined as a Markov chain on the circle. This abstract class of objects includes a wide range of models for the firing times of periodically forced noisy neurons. We analyze path-wise dynamic properties of the Markov chain, such as stochastic periodicity (or phase locking) and stochastic quasiperiodicity, and show how these properties can be read off from the geometry of the spectrum of the transition operator.
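A minimal numerical sketch of this object (with an illustrative map and parameters, not those of the talk): discretize the circle into bins, build the Markov transition matrix of a noisy circle map, and inspect the spectrum of the transition operator. The modulus of the subdominant eigenvalue then distinguishes sharp stochastic phase locking (modulus near 1) from rapid phase diffusion.

```python
import numpy as np

def circle_map_operator(omega=0.25, kappa=0.05, sigma=0.05, n=200):
    """Transition matrix of the noisy circle map
        x -> x + omega + kappa * sin(2 pi x) + noise   (mod 1),
    discretized on n bins, with wrapped-Gaussian noise of width sigma."""
    x = (np.arange(n) + 0.5) / n                      # bin centers
    y = (x + omega + kappa * np.sin(2 * np.pi * x)) % 1.0
    p = np.zeros((n, n))
    for i in range(n):
        d = (x - y[i] + 0.5) % 1.0 - 0.5              # circular distance
        w = np.exp(-0.5 * (d / sigma) ** 2)
        p[i] = w / w.sum()                            # normalized row
    return p

p = circle_map_operator()
ev = np.linalg.eigvals(p)
ev = ev[np.argsort(-np.abs(ev))]
# ev[0] = 1 (stationarity); the modulus of ev[1] measures how slowly
# phase coherence decays, and its argument tracks the rotation number.
```
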