#neuraldynamics — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #neuraldynamics, aggregated by home.social.
-
RE: https://mathstodon.xyz/@DurstewitzLab/116549716016889895
🧠 New preprint by Brändle et al./ @DurstewitzLab: Continuous-Time Piecewise-Linear #RecurrentNeuralNetworks introduces continuous-time #PLRNNs for #DynamicalSystems reconstruction.
The model combines interpretability and analytical tractability of pw-linear #RNN with cont.-time dynamics, allowing semi-analytic analysis of equilibria and limit cycles while handling irregularly sampled data better than standard Neural #ODEs.
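To make the model class concrete, here is a minimal numpy sketch of a continuous-time piecewise-linear RNN of the generic form dz/dt = A z + W ReLU(z) + h, integrated with forward Euler on an irregular sampling grid. The parameters and variable names are illustrative only, not the preprint's formulation or training method.

```python
import numpy as np

# Minimal continuous-time piecewise-linear RNN sketch (illustrative only):
# dz/dt = A z + W relu(z) + h, integrated with forward Euler.
# A, W, h are random here; the preprint's parameterization and training differ.

rng = np.random.default_rng(0)
d = 5                                   # latent dimension
A = -0.5 * np.eye(d)                    # linear (decay) part
W = 0.3 * rng.standard_normal((d, d))   # piecewise-linear coupling
h = 0.1 * rng.standard_normal(d)        # bias

def f(z):
    """Vector field of the continuous-time PLRNN."""
    return A @ z + W @ np.maximum(z, 0.0) + h

# Irregularly sampled observation times: a continuous-time model handles these
# naturally, because we can integrate between arbitrary time points.
t_obs = np.sort(rng.uniform(0.0, 10.0, size=50))
z = rng.standard_normal(d)
dt = 1e-3
trajectory, t_now = [], 0.0
for t_next in t_obs:
    while t_now < t_next:               # Euler steps up to the next sample
        z = z + dt * f(z)
        t_now += dt
    trajectory.append(z.copy())
trajectory = np.array(trajectory)       # latent states at the observation times
print(trajectory.shape)                 # (50, 5)
```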
-
🧠 New preprint by Lu et al: Recordings from the human #hippocampus and anterior cingulate #cortex during three distinct tasks reveal that #NeuralPopulation activity is not fully task-specific. About half of the low-dimensional #NeuralSubspace structure was shared across tasks, suggesting a stable population geometry that may support flexible #cognition across different #behaviors.
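To make the "shared subspace" idea concrete, here is a toy numpy sketch of one standard way such overlap is quantified: estimate each task's top principal components and measure how much of the other task's population variance falls inside that subspace. The synthetic data and numbers are mine, not the preprint's analysis pipeline.

```python
import numpy as np

# Toy illustration of shared subspace structure across two "tasks".
rng = np.random.default_rng(1)
n_neurons, n_time = 100, 1000

# Synthetic population activity: 3 neuron-space directions shared across tasks,
# 2 task-specific directions each, plus noise.
M_shared = rng.standard_normal((n_neurons, 3))
M_a = rng.standard_normal((n_neurons, 2))
M_b = rng.standard_normal((n_neurons, 2))
X_a = (M_shared @ rng.standard_normal((3, n_time))
       + M_a @ rng.standard_normal((2, n_time))
       + 0.5 * rng.standard_normal((n_neurons, n_time)))
X_b = (M_shared @ rng.standard_normal((3, n_time))
       + M_b @ rng.standard_normal((2, n_time))
       + 0.5 * rng.standard_normal((n_neurons, n_time)))

def top_pcs(X, k):
    """Orthonormal basis of the top-k principal components (neurons x k)."""
    X = X - X.mean(axis=1, keepdims=True)
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k]

U_a = top_pcs(X_a, k=5)
X_b_c = X_b - X_b.mean(axis=1, keepdims=True)
shared_fraction = np.sum((U_a.T @ X_b_c) ** 2) / np.sum(X_b_c ** 2)
print(f"fraction of task-B variance inside task-A's top-5 subspace: {shared_fraction:.2f}")
```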
-
RE: https://mastodon.social/@appassionato/116493374179009767
Indeed, an excellent recommendation: Tristram D. Wyatt’s “#AnimalBehaviour: A Very Short Introduction” is a useful reminder for #NaturalisticNeuroscience: #Behavior is not just output, but evolved action in ecological and social context. Tinbergen’s questions, costs, signals, conflict, cooperation. This is exactly the conceptual bridge we need between e.g. #NeuralDynamics and real-world behavior.
🌍 https://global.oup.com/academic/product/animal-behaviour-9780198712152
-
🧠 New preprint by Garcia-Garcia et al.: The authors show that cerebellar #GranuleCells do not simply expand #cortical activity into a high-dimensional code. Instead, they preserve low-dimensional cortical #manifold geometry while reorienting it across contexts. This rotation separates similar tasks, reduces interference, and supports flexible dual-task learning.
📄 https://www.biorxiv.org/content/10.64898/2026.03.03.709240v1.full.pdf
-
🧠 New paper by Pezon, Schmutz & Gerstner: Linking #NeuralManifolds to circuit structure in recurrent networks.
The study connects two common views of neural activity: low-dimensional #PopulationDynamics (“neural manifolds”) and single-neuron selectivity. Using recurrent network models, the authors show how circuit connectivity constrains both the geometry of neural #manifolds and the tuning of individual neurons.
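For intuition about how connectivity can pin down a manifold, here is an illustrative rank-two rate network in numpy. This is my own toy, not the paper's model: because the recurrent drive lives in the span of the connectivity vectors, any self-sustained state is confined to a plane whose orientation is set by the circuit, and each neuron's selectivity is simply its loading on those vectors.

```python
import numpy as np

# Rank-two recurrent rate network: tau * dx/dt = -x + J tanh(x),
# with J = (m1 n1^T + m2 n2^T) / N, so J's range is span{m1, m2}.
rng = np.random.default_rng(2)
N, tau, dt, T = 500, 0.02, 1e-3, 2.0
m1, m2 = rng.standard_normal(N), rng.standard_normal(N)
n1, n2 = 2.0 * m1, 2.0 * m2            # aligned in/out vectors -> supercritical overlap
J = (np.outer(m1, n1) + np.outer(m2, n2)) / N

x = rng.standard_normal(N)
for _ in range(int(T / dt)):
    x = x + dt / tau * (-x + J @ np.tanh(x))

# Check that the settled state lies in the connectivity-defined plane.
Q, _ = np.linalg.qr(np.stack([m1, m2], axis=1))   # orthonormal basis of span{m1, m2}
in_plane = np.linalg.norm(Q.T @ x) ** 2 / np.linalg.norm(x) ** 2
print(f"fraction of steady-state activity in span(m1, m2): {in_plane:.3f}")
```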
-
🧠 New preprint by Guardamagna et al.: Using large-scale recordings in #rat pups, the authors show that toroidal #manifolds in #MEC emerge by P10, before eye and ear opening, upright gait, and active exploration. Ring-like manifolds appear even earlier, by P9. External spatial experience seems to align these preconfigured internal maps only later, as pups begin to navigate.
-
Cool work on conserved #MotorCortex dynamics across species. #Behavior differs mainly through different trajectories on shared #NeuralManifolds. #NeuralDynamics #CompNeuro #Neuroscience 🧪
RE: https://bsky.app/profile/did:plc:tfffyrbltg3reliv5wq35on3/post/3mgpw73yhac2q
-
🧠 New work by Codol et al. who show that #MotorCortex dynamics are remarkably conserved across #mice, #monkeys, and #humans. Despite very different #behaviors, #NeuralPopulation activity follows similar dynamical rules on low-dimensional #manifolds. Species differences arise mainly from the geometry of trajectories within this shared #DynamicalSystem.
-
🧠 New preprint by Chericoni et al: #NeuralPopulation activity in the #hippocampus encodes spatial information for different agents (self, prey, gaze) in distinct but related low-dimensional subspaces. The study shows that these representations can be linearly transformed between each other, suggesting a shared geometric code supporting flexible spatial #cognition and multi-agent navigation.
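A minimal illustration of what "linearly transformable representations" means in practice, on synthetic data of my own (variable names and numbers are hypothetical, not the preprint's analysis): fit an ordinary least-squares map from one agent's population code to another's and check how much variance it explains.

```python
import numpy as np

# Two agent-specific population codes for the same 2D positions, built so that
# one is (up to noise) a linear transform of the other, then recovered by OLS.
rng = np.random.default_rng(3)
n_neurons, n_samples = 80, 400
positions = rng.uniform(-1, 1, size=(n_samples, 2))        # 2D locations

E_self = rng.standard_normal((2, n_neurons))                # "self" embedding
E_prey = rng.standard_normal((2, n_neurons))                # "prey" embedding
R_self = positions @ E_self + 0.1 * rng.standard_normal((n_samples, n_neurons))
R_prey = positions @ E_prey + 0.1 * rng.standard_normal((n_samples, n_neurons))

# Fit a linear map from the "self" representation to the "prey" representation.
W, *_ = np.linalg.lstsq(R_self, R_prey, rcond=None)
pred = R_self @ W
r2 = 1 - np.sum((R_prey - pred) ** 2) / np.sum((R_prey - R_prey.mean(0)) ** 2)
print(f"prey-code variance explained by a linear map from the self code: {r2:.2f}")
```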
-
Spike-timing-dependent #plasticity (#STDP) is a core rule in #ComputationalNeuroscience that adjusts #synaptic strength based on precise pre- vs. postsynaptic #spike timing, enabling #TemporalCoding and #learning in #SNN. In this post, I summarize its mathematical formulation, functional consequences for learning and #memory along with a simple #Python example:
🌍 https://www.fabriziomusacchio.com/blog/2026-02-12-stdp/
#CompNeuro #Neuroscience #SNN #NeuralDynamics #NeuralPlasticity
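As a taste of the rule itself, here is a minimal pair-based STDP window in Python. This is my own sketch, not the code from the linked post; the post has the full formulation and a worked example.

```python
import numpy as np

# Pair-based STDP: Δw = A_plus * exp(-Δt/τ_plus) when the presynaptic spike
# precedes the postsynaptic one (Δt = t_post - t_pre > 0, potentiation), and
# Δw = -A_minus * exp(Δt/τ_minus) when it follows (depression).
A_plus, A_minus = 0.01, 0.012
tau_plus, tau_minus = 20.0, 20.0          # ms

def stdp_dw(delta_t):
    """Weight change for a single pre/post spike pair separated by delta_t (ms)."""
    if delta_t > 0:                        # pre before post -> potentiation
        return A_plus * np.exp(-delta_t / tau_plus)
    else:                                  # post before pre -> depression
        return -A_minus * np.exp(delta_t / tau_minus)

# The asymmetric learning window: causal pairings strengthen, acausal weaken.
for delta in (-40, -20, -5, 5, 20, 40):
    print(f"Δt = {delta:+4d} ms -> Δw = {stdp_dw(delta):+.4f}")
```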
-
Finally published with David Angulo Garcia:
A theory for self-sustained balanced states in the absence of strong external currents
@PLOS
Dynamical balance can be obtained via nonlinear mechanisms without the need for strong external drive: short-term depression acts as a new balancing mechanism. The complete theory is reported here.
#neuraldynamics #compneuroscience
#neuroscience #compneuro #brain #neuralnetwork
#neurodon @hnp_geneva
-
#NeuralDynamics is a central subfield of #ComputationalNeuroscience studying time-dependent #NeuralActivity and its governing #mathematics. It examines how #NeuralStates evolve, how stable or unstable patterns arise, and how #learning reshapes them. Neural dynamics forms the backbone for how #neurons & #NeuralNetworks generate complex activity over time. This post gives a brief overview of the field & its historical milestones:
🌍 https://www.fabriziomusacchio.com/blog/2026-02-04-neural_dynamics/
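As a minimal, self-contained illustration of those ideas (my own toy example, not taken from the post): a single rate neuron with self-excitation whose state evolves toward one of two stable fixed points, i.e. the simplest bistable dynamical system.

```python
import numpy as np

# One rate neuron with self-excitation: tau * dr/dt = -r + sigmoid(w * r + b).
# For these w, b there are two stable fixed points (low/high activity)
# separated by an unstable one, so the initial condition decides the outcome.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

tau, w, b, dt = 0.02, 10.0, -5.0, 1e-4

def simulate(r0, steps=20000):
    r = r0
    for _ in range(steps):
        r = r + dt / tau * (-r + sigmoid(w * r + b))
    return r

print(simulate(0.1))   # decays toward the low-activity fixed point (~0.007)
print(simulate(0.9))   # settles at the high-activity fixed point (~0.993)
```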
-
🧠 New preprint by Shervani-Tabar, Brincat & @ekmiller on emergent #TravelingWaves in #RNN.
By aligning RNN dynamics to an empirically measured #NeuralManifold, they show that task-relevant TW can emerge through #learning, w/o hard-coding wave dynamics or connectivity. The cool thing here is that the waves are not imposed or engineered, but emerge naturally from learning under #BiologicallyPlausible constraints:
-
🧠 New preprint by Behrad et al. introducing #fastDSA, a much faster way to compare neural systems at the level of their dynamics, not just geometry or task performance.
What’s cool here: similarity is defined by shared #VectorFields, i.e. by the computational mechanism itself. This provides the first tool for mechanistic comparison of neural computations (to my knowledge).
🌍 https://arxiv.org/abs/2511.22828
💻 https://github.com/CMC-lab/fastDSA
#Neuroscience #CompNeuro #NeuralDynamics #Manifolds #DynamicalSystems
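To make "similarity of vector fields rather than geometry" concrete, here is a toy numpy comparison of two linear systems. This is not the fastDSA algorithm (see the repo for that): it only illustrates that systems related by a change of coordinates share vector-field invariants such as the eigenvalue spectrum, even when their trajectories look different.

```python
import numpy as np

# Two linear systems dx/dt = A x implement the same mechanism if their
# matrices are related by a change of coordinates, which leaves the
# eigenvalue spectrum of the vector field unchanged.
rng = np.random.default_rng(4)

A = np.array([[-0.1, -1.0],
              [ 1.0, -0.1]])             # slowly decaying rotation (spiral)

Q, _ = np.linalg.qr(rng.standard_normal((2, 2)))
A_same = Q @ A @ Q.T                      # same dynamics, different coordinates
A_diff = np.array([[-1.0, 0.0],
                   [ 0.0, -0.5]])         # genuinely different, non-oscillatory system

for name, M in [("same mechanism", A_same), ("different mechanism", A_diff)]:
    d = np.sort_complex(np.linalg.eigvals(A)) - np.sort_complex(np.linalg.eigvals(M))
    print(name, "spectral distance:", np.abs(d).sum().round(3))
```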
-
🧠 New preprint by Lee et al.: Fast dendritic excitations primarily mediate #backpropagation in #CA1 pyramidal #neurons during #behavior
Using kHz #VoltageImaging across the full #dendritic tree, they show that fast dendritic spikes are usually driven by somatic #bAPs, not independently initiated. #bAP propagation into apical dendrites is contin. modulated by pre-spike dendritic voltage & can trigger slower plateau potentials linked to complex spikes.
-
🧠 New paper by Deistler et al: #JAXLEY: differentiable #simulation for large-scale training of detailed #biophysical #models of #NeuralDynamics.
They present a #differentiable #GPU accelerated #simulator that trains #morphologically detailed biophysical #neuron models with #GradientDescent. JAXLEY fits intracellular #voltage and #calcium data, scales to 1000s of compartments, trains biophys. #RNNs on #WorkingMemory tasks & even solves #MNIST.
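The core idea can be sketched in a few lines of JAX. This is a schematic of differentiable simulation plus gradient descent, not JAXLEY's actual API or model classes: write the forward simulation in a differentiable framework, define a loss against a target voltage trace, and let autodiff supply the gradient.

```python
import jax
import jax.numpy as jnp

# Fit the leak conductance of a single compartment to a target voltage trace.
dt, n_steps = 0.1, 200               # ms, number of Euler steps
e_leak, i_ext = -65.0, 0.5           # fixed reversal potential and input current

def simulate(g_leak):
    """Forward-Euler voltage trace of a single leaky compartment."""
    def step(v, _):
        v = v + dt * (-g_leak * (v - e_leak) + i_ext)
        return v, v
    _, vs = jax.lax.scan(step, -70.0, None, length=n_steps)
    return vs

target = simulate(0.1)               # synthetic "data" from a known conductance

def loss(g_leak):
    return jnp.mean((simulate(g_leak) - target) ** 2)

grad_loss = jax.jit(jax.grad(loss))
g = 0.05                              # initial guess
for _ in range(500):
    g = g - 1e-4 * grad_loss(g)       # plain gradient descent on the conductance
print(g)                              # converges toward the true value 0.1
```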