home.social

#neuraldynamics — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #neuraldynamics, aggregated by home.social.

  1. RE: mathstodon.xyz/@DurstewitzLab/

    🧠 New preprint by Brändle et al. / @DurstewitzLab: Continuous-Time Piecewise-Linear #RecurrentNeuralNetworks introduces continuous-time #PLRNNs for #DynamicalSystems reconstruction.

    The model combines the interpretability and analytical tractability of piecewise-linear #RNNs with continuous-time dynamics, allowing semi-analytic analysis of equilibria and limit cycles while handling irregularly sampled data better than standard Neural #ODEs.

    #NeuralDynamics #Neuroscience #NeuralODE
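
The appeal of the piecewise-linear structure can be sketched in a few lines. This is a toy illustration of the general idea only (my own parameter choices, not the authors' model): within each linear region of a continuous-time PLRNN, equilibria can be solved for in closed form.

```python
import numpy as np

# Toy continuous-time piecewise-linear RNN: dz/dt = A z + W relu(z) + h,
# integrated with a simple Euler scheme. Parameters are illustrative.
rng = np.random.default_rng(0)
n = 5
A = -np.eye(n)                        # stable linear part
W = 0.3 * rng.standard_normal((n, n))
h = rng.standard_normal(n)

def f(z):
    return A @ z + W @ np.maximum(z, 0.0) + h

def integrate(z0, dt=0.01, steps=2000):
    z, traj = z0.copy(), []
    for _ in range(steps):
        z = z + dt * f(z)
        traj.append(z.copy())
    return np.array(traj)

traj = integrate(np.zeros(n))

# Within each linear region (fixed sign pattern of z) the dynamics are
# linear, so candidate equilibria come from solving (A + W D) z = -h,
# where D is the diagonal 0/1 matrix selecting the active units.
D = np.diag((traj[-1] > 0).astype(float))
z_star = np.linalg.solve(A + W @ D, -h)
```

This semi-analytic access to fixed points is what a generic Neural ODE, with a smooth black-box vector field, does not offer.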

  2. 🧠 New preprint by Lu et al.: Recordings from the human #hippocampus and anterior cingulate #cortex during three distinct tasks reveal that #NeuralPopulation activity is not fully task-specific. About half of the low-dimensional #NeuralSubspace structure was shared across tasks, suggesting a stable population geometry that may support flexible #cognition across different #behaviors.

    📝 doi.org/10.64898/2026.04.24.72

    #Neuroscience #NeuralDynamics #cogsci #Behavior

  3. RE: mastodon.social/@appassionato/

    Indeed, an excellent recommendation: Tristram D. Wyatt’s “#AnimalBehaviour: A Very Short Introduction” is a useful reminder for #NaturalisticNeuroscience: #Behavior is not just output, but evolved action in ecological and social context. Tinbergen’s questions, costs, signals, conflict, cooperation. This is exactly the conceptual bridge we need between eg #NeuralDynamics and real-world behavior.

    🌍 global.oup.com/academic/produc

    #Neuroscience #CompNeuro

  4. 🧠 New preprint by Garcia-Garcia et al.: The authors show that cerebellar #GranuleCells do not simply expand #cortical activity into a high-dimensional code. Instead, they preserve low-dimensional cortical #manifold geometry while reorienting it across contexts. This rotation separates similar tasks, reduces interference, and supports flexible dual-task learning.

    📄 biorxiv.org/content/10.64898/2

    #Neuroscience #NeuralDynamics #CompNeuro

  5. 🧠 New paper by Pezon, Schmutz & Gerstner: Linking #NeuralManifolds to circuit structure in recurrent networks.

    The study connects two common views of neural activity: low-dimensional #PopulationDynamics (“neural manifolds”) and single-neuron selectivity. Using recurrent network models, the authors show how circuit connectivity constrains both the geometry of neural #manifolds and the tuning of individual neurons.

    📄 doi.org/10.1016/j.neuron.2025.

    #Neuroscience #NeuralDynamics #CompNeuro #RNN
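
The core intuition, that connectivity pins down manifold geometry, shows up already in the simplest low-rank network. The sketch below is my own illustration of that generic point, not the paper's model: in x' = -x + J tanh(x) with rank-2 connectivity J = (g/N) M Mᵀ, the recurrent input always lies in span(M), so population activity collapses onto the 2D subspace set by the connectivity vectors.

```python
import numpy as np

# Rank-2 recurrent network: the connectivity vectors (columns of M) define
# the embedding of the activity manifold. Parameters are illustrative.
rng = np.random.default_rng(0)
N, g, dt = 300, 2.5, 0.05
M = rng.standard_normal((N, 2))       # connectivity vectors m1, m2
J = g * (M @ M.T) / N                 # rank-2, symmetric

x = rng.standard_normal(N)
for _ in range(1000):                 # Euler integration to t = 50
    x = x + dt * (-x + J @ np.tanh(x))

# Steady-state activity projected onto span{m1, m2}: the residual is
# negligible, i.e. the population lives on the connectivity-defined subspace.
proj, *_ = np.linalg.lstsq(M, x, rcond=None)
residual = np.linalg.norm(x - M @ proj) / np.linalg.norm(x)
```

Each neuron's "tuning" is just its pair of loadings in M, which is the sense in which single-neuron selectivity and manifold geometry are two views of the same connectivity.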

  6. 🧠 New preprint by Guardamagna et al.: Using large-scale recordings in #rat pups, the authors show that toroidal #manifolds in #MEC emerge by P10, before eye and ear opening, upright gait, and active exploration. Ring-like manifolds appear even earlier, by P9. External spatial experience seems to align these preconfigured internal maps only later, as pups begin to navigate.

    📄 doi.org/10.64898/2026.03.10.71

    #Neuroscience #GridCells #NeuralDynamics

  7. 🧠 New work by Codol et al., who show that #MotorCortex dynamics are remarkably conserved across #mice, #monkeys, and #humans. Despite very different #behaviors, #NeuralPopulation activity follows similar dynamical rules on low-dimensional #manifolds. Species differences arise mainly from the geometry of trajectories within this shared #DynamicalSystem.

    📄 doi.org/10.64898/2026.03.06.70

    #Neuroscience #CompNeuro #NeuralDynamics

  8. 🧠 New preprint by Chericoni et al.: #NeuralPopulation activity in the #hippocampus encodes spatial information for different agents (self, prey, gaze) in distinct but related low-dimensional subspaces. The study shows that these representations can be linearly transformed into one another, suggesting a shared geometric code supporting flexible spatial #cognition and multi-agent navigation.

    📄 arxiv.org/abs/2603.04747

    #Neuroscience #CompNeuro #NeuralDynamics

  9. Spike-timing-dependent #plasticity (#STDP) is a core rule in #ComputationalNeuroscience that adjusts #synaptic strength based on precise pre- vs. postsynaptic #spike timing, enabling #TemporalCoding and #learning in #SNN. In this post, I summarize its mathematical formulation and its functional consequences for learning and #memory, along with a simple #Python example:

    🌍 fabriziomusacchio.com/blog/202

    #CompNeuro #Neuroscience #SNN #NeuralDynamics #NeuralPlasticity
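
The classic pair-based form of the rule fits in a few lines; the constants below are illustrative textbook values, not taken from the linked post. The weight change depends only on the pre/post spike-time difference Δt = t_post − t_pre: potentiation decaying with exp(−Δt/τ₊) when the presynaptic spike comes first, depression with exp(Δt/τ₋) otherwise.

```python
import numpy as np

# Pair-based STDP window (illustrative parameters):
#   Δw = +A_plus  * exp(-Δt / tau_plus)   if Δt > 0 (pre before post)
#   Δw = -A_minus * exp( Δt / tau_minus)  if Δt < 0 (post before pre)
A_plus, A_minus = 0.01, 0.012
tau_plus = tau_minus = 20.0  # ms

def stdp_dw(dt_ms):
    """Weight change for one pre/post spike pair, dt_ms = t_post - t_pre."""
    if dt_ms > 0:
        return A_plus * np.exp(-dt_ms / tau_plus)
    return -A_minus * np.exp(dt_ms / tau_minus)

# Pre-before-post pairings potentiate; post-before-pre pairings depress.
w = 0.5
for t_pre, t_post in [(10.0, 15.0), (40.0, 45.0), (80.0, 75.0)]:
    w += stdp_dw(t_post - t_pre)
```

With A_minus slightly larger than A_plus, uncorrelated spiking depresses on average, which is one common way the rule is kept stable.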

  10. Finally published with David Angulo Garcia:

    A theory for self-sustained balanced states in absence of strong external currents
    @PLOS

    Dynamical balance can be obtained via nonlinear mechanisms without the need for strong external drive: short-term depression acts as a new balancing mechanism. The complete theory is reported here.

    #neuraldynamics #compneuroscience

    #neuroscience #compneuro #brain #neuralnetwork
    #neurodon @hnp_geneva

  11. #NeuralDynamics is a central subfield of #ComputationalNeuroscience studying time-dependent #NeuralActivity and its governing #mathematics. It examines how #NeuralStates evolve, how stable or unstable patterns arise, and how #learning reshapes them. Neural dynamics forms the backbone for how #neurons & #NeuralNetworks generate complex activity over time. This post gives a brief overview of the field & its historical milestones:

    🌍 fabriziomusacchio.com/blog/202

    #CompNeuro #Neuroscience #DynamicalSystems

  12. 🧠 New preprint by Shervani-Tabar, Brincat & @ekmiller on emergent #TravelingWaves in #RNN.

    By aligning RNN dynamics to an empirically measured #NeuralManifold, they show that task-relevant TW can emerge through #learning, w/o hard-coding wave dynamics or connectivity. The cool thing here is that the waves are not imposed or engineered, but emerge naturally from learning under #BiologicallyPlausible constraints:

    🌍 doi.org/10.64898/2026.01.08.69

    #Neuroscience #CompNeuro #NeuralDynamics #WorkingMemory

  13. 🧠 New preprint by Behrad et al. introducing #fastDSA, a much faster way to compare neural systems at the level of their dynamics, not just geometry or task performance.

    What’s cool here: similarity is defined by shared #VectorFields, i.e. by the computational mechanism itself. This provides the first tool for mechanistic comparison of neural computations (to my knowledge).

    🌍 arxiv.org/abs/2511.22828
    💻 github.com/CMC-lab/fastDSA

    #Neuroscience #CompNeuro #NeuralDynamics #Manifolds #DynamicalSystems
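
To see why comparing dynamics differs from comparing geometry, here is a minimal sketch of the flavor of the idea (not fastDSA's actual algorithm): fit a linear one-step propagator to each system's trajectories and compare the fitted dynamics. Two systems with the same vector field in rotated coordinates have differently oriented trajectories but identical propagator spectra.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_propagator(X):
    """Least-squares A with X[t+1] ≈ A X[t]; X has shape (T, n)."""
    A, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)
    return A.T

def simulate(A, T=100):
    x, out = np.array([1.0, 0.0]), [np.array([1.0, 0.0])]
    for _ in range(T):
        x = A @ x
        out.append(x)
    return np.array(out)

A_true = np.array([[0.9, -0.2], [0.2, 0.9]])       # decaying rotation
Q, _ = np.linalg.qr(rng.standard_normal((2, 2)))   # random orthogonal basis

X = simulate(A_true)
Y = X @ Q.T                     # same dynamics, rotated coordinates
spec_x = np.sort_complex(np.linalg.eigvals(fit_propagator(X)))
spec_y = np.sort_complex(np.linalg.eigvals(fit_propagator(Y)))
# spec_x and spec_y agree, even though X and Y look geometrically different.
```

A purely geometric comparison (e.g. Procrustes on the trajectories themselves) would have to undo the rotation first; a dynamics-level comparison is invariant to it by construction.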

  14. 🧠 New preprint by Lee et al.: Fast dendritic excitations primarily mediate #backpropagation in #CA1 pyramidal #neurons during #behavior

    Using kHz #VoltageImaging across the full #dendritic tree, they show that fast dendritic spikes are usually driven by somatic #bAPs, not independently initiated. #bAP propagation into apical dendrites is continuously modulated by pre-spike dendritic voltage & can trigger slower plateau potentials linked to complex spikes.

    🌍 doi.org/10.64898/2026.01.03.69

    #NeuralDynamics

  46. 🧠 New preprint by Lee et al.: Fast dendritic excitations primarily mediate #backpropagation in #CA1 pyramidal #neurons during #behavior

    Using kHz #VoltageImaging across the full #dendritic tree, they show that fast dendritic spikes are usually driven by somatic #bAPs, not independently initiated. #bAP propagation into apical dendrites is contin. modulated by pre-spike dendritic voltage & can trigger slower plateau potentials linked to complex spikes.

    🌍doi.org/10.64898/2026.01.03.69

    #NeuralDynamics

  15. 🧠 New paper by Deistler et al.: #JAXLEY: differentiable #simulation for large-scale training of detailed #biophysical #models of #NeuralDynamics.

    They present a #differentiable #GPU accelerated #simulator that trains #morphologically detailed biophysical #neuron models with #GradientDescent. JAXLEY fits intracellular #voltage and #calcium data, scales to 1000s of compartments, trains biophys. #RNNs on #WorkingMemory tasks & even solves #MNIST.

    🌍 doi.org/10.1038/s41592-025-028

    #Neuroscience #CompNeuro
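
The core idea of fitting biophysical parameters by gradient descent through a simulation can be illustrated without Jaxley itself. The toy below is my own plain-NumPy sketch (finite differences stand in for the automatic differentiation that a JAX-based simulator provides exactly): recover the membrane time constant of a leaky integrator from a "recorded" voltage trace.

```python
import numpy as np

# Leaky integrator V' = (-(V - E) + R*I) / tau, Euler-integrated.
# All parameters are illustrative.
dt, T, E, R, I = 0.1, 200, -70.0, 10.0, 1.5

def simulate(tau):
    V = np.full(T, E)
    for t in range(T - 1):
        V[t + 1] = V[t] + dt * (-(V[t] - E) + R * I) / tau
    return V

target = simulate(12.0)          # "data" generated with tau = 12 ms

def loss(tau):
    return np.mean((simulate(tau) - target) ** 2)

# Gradient descent on tau; central finite differences approximate the
# gradient that an AD framework would compute by differentiating
# through the integration loop itself.
tau = 5.0
for _ in range(300):
    g = (loss(tau + 1e-4) - loss(tau - 1e-4)) / 2e-4
    tau -= 0.5 * g               # illustrative learning rate
```

Scaling this idea to thousands of coupled compartments with many conductances per compartment is where a GPU-accelerated differentiable simulator earns its keep.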
