HOW DOES BIOLOGICAL COMPLEXITY SUPPORT EFFICIENT COMPUTATIONS?
My objective is to understand how evolutionary pressure has produced neuronal complexity across biological scales of organization that supports efficient coding of sensory stimuli. For this purpose, I use a combination of experimental, computational, and theoretical techniques.
My recent theoretical work has centered on developing a unified framework to study history-dependent activity in the nervous system, from molecules to behavior. We have shown that fractional-order differential equations are the natural mathematical tools to describe history dependence, and we have applied them to study the diffusion of synaptic receptors in the membrane and the generation of complex spiking patterns in whole neurons. We have recently extended these efforts to understand how history dependence affects information rates in neuronal networks. We have gone a step further by building circuits that model spiking neurons with history-dependent components, in our case capacitors that retain memory (memcapacitors). This work falls within the re-invigorated field of neuromorphic engineering, which is starting to use memristors to implement history-dependent learning rules.
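To make the idea concrete, below is a minimal sketch of a fractional-order leaky integrate-and-fire neuron, discretized with the Grünwald-Letnikov scheme (one standard numerical treatment of fractional derivatives; the specific parameter values are illustrative, not fits to data). The voltage is measured relative to rest so the initial condition is zero, where the Grünwald-Letnikov and Caputo formulations coincide. The key feature is that each update depends on a power-law-weighted sum over the entire voltage history, which is what makes the dynamics history dependent.

```python
import numpy as np

# Sketch of a fractional leaky integrate-and-fire neuron (illustrative values only).
alpha = 0.8            # fractional order; alpha = 1.0 recovers the classical LIF
dt, T = 0.1, 500.0     # time step and duration (ms)
n_steps = int(T / dt)
g_L, C_m = 0.1, 1.0    # leak conductance and capacitance, hypothetical units
u_th, u_reset = 20.0, 0.0   # spike threshold and reset, in mV above rest
I_ext = 3.0            # constant input current, hypothetical units

# Grunwald-Letnikov binomial weights: w_0 = 1, w_j = w_{j-1} * (1 - (1 + alpha)/j).
# For alpha = 1 all weights beyond w_1 vanish and the scheme reduces to forward Euler.
w = np.empty(n_steps + 1)
w[0] = 1.0
for j in range(1, n_steps + 1):
    w[j] = w[j - 1] * (1.0 - (1.0 + alpha) / j)

u = np.zeros(n_steps + 1)   # membrane potential relative to rest
spike_times = []
for n in range(n_steps):
    drift = (-g_L * u[n] + I_ext) / C_m          # instantaneous LIF dynamics
    # Memory term: a weighted sum over the full voltage trace. For alpha < 1 the
    # weights decay as a power law, the source of the history dependence.
    memory = np.dot(w[1 : n + 2], u[n::-1])
    u[n + 1] = dt**alpha * drift - memory
    if u[n + 1] >= u_th:
        spike_times.append((n + 1) * dt)
        u[n + 1] = u_reset   # the voltage resets, but the trace keeps its history

print(f"number of spikes: {len(spike_times)}")
if len(spike_times) > 2:
    isi = np.diff(spike_times)
    print(f"first ISI: {isi[0]:.1f} ms, last ISI: {isi[-1]:.1f} ms")
```

Comparing the first and last interspike intervals for alpha below 1 illustrates how the memory term shapes spike timing under constant input, in contrast to the memoryless classical model at alpha = 1.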
My experimental work also addresses history dependence, but from the perspective of intrinsic excitability. In this case, we combine electrophysiology, multi-photon imaging, and biophysical modeling to understand the computational function of changes in intrinsic excitability in cerebellar Purkinje cells. We have shown that both intrinsic excitability and intracellular calcium dynamics change after induction of synaptic plasticity. We suspect that the combination of synaptic and intrinsic plasticity in Purkinje cells implements a mechanism of pattern separation. More recently, we have started to study cerebellar intrinsic excitability in an animal model of autism.
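The pattern-separation intuition can be shown with a toy linear-threshold unit; this is not a biophysical Purkinje cell model, and every number below is hypothetical. Synaptic depression of the inputs active during one pattern reduces the cell's response to both overlapping patterns; an accompanying intrinsic excitability increase, modeled here simply as a lowered threshold, restores overall responsiveness while leaving the two responses further apart.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two overlapping binary input patterns on 100 parallel-fiber-like inputs.
n_inputs = 100
a = rng.random(n_inputs) < 0.4           # pattern A: ~40 active inputs
b = a.copy()
flip = rng.choice(n_inputs, 15, replace=False)
b[flip] = ~b[flip]                        # pattern B: shares most of A's inputs

w = np.ones(n_inputs)    # uniform synaptic weights before plasticity
theta = 25.0             # intrinsic threshold, arbitrary units

def drive(pattern, w, theta):
    """Net suprathreshold drive of the model cell to an input pattern."""
    return float(w @ pattern) - theta

print("before plasticity:", drive(a, w, theta), drive(b, w, theta))

# LTD-like synaptic plasticity: depress synapses active during pattern A.
w[a] *= 0.6
# Intrinsic plasticity: an excitability increase modeled as a lowered threshold.
theta -= 5.0

print("after plasticity: ", drive(a, w, theta), drive(b, w, theta))
```

Before plasticity the two drives are nearly identical; afterward they diverge, a minimal illustration of how synaptic and intrinsic changes could jointly decorrelate similar inputs.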