home.social

#rnn — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #rnn, aggregated by home.social.

  1. RE: mathstodon.xyz/@DurstewitzLab/

    🧠 New preprint by Brändle et al. (@DurstewitzLab): Continuous-Time Piecewise-Linear #RecurrentNeuralNetworks introduces continuous-time #PLRNNs for #DynamicalSystems reconstruction.

    The model combines the interpretability and analytical tractability of piecewise-linear #RNNs with continuous-time dynamics, allowing semi-analytic analysis of equilibria and limit cycles while handling irregularly sampled data better than standard Neural #ODEs.

    #NeuralDynamics #Neuroscience #NeuralODE
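The preprint's continuous-time formulation isn't reproduced in the excerpt above. As a rough orientation only, here is a minimal sketch of the discrete-time piecewise-linear RNN update that #PLRNNs build on (dimensions, scales, and seed are invented for the sketch):

```python
import numpy as np

def plrnn_step(z, A, W, h):
    """One discrete-time PLRNN update: a diagonal linear part A,
    a piecewise-linear (ReLU) recurrent coupling W, and a bias h."""
    return A @ z + W @ np.maximum(z, 0.0) + h

rng = np.random.default_rng(0)
d = 5
A = np.diag(rng.uniform(0.5, 0.9, size=d))  # leaky linear dynamics
W = 0.1 * rng.standard_normal((d, d))       # weak piecewise-linear coupling
h = rng.standard_normal(d)

z = np.zeros(d)
for _ in range(100):
    z = plrnn_step(z, A, W, h)
```

Within each linear region of state space the dynamics are affine, which is what makes equilibria and limit cycles semi-analytically tractable; the preprint's contribution is lifting this picture to continuous time.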

  2. Implementing a Recurrent Neural Network (RNN) from scratch involves building a neural network capable of processing sequential data by retaining information across time steps. Unlike feedforward networks, RNNs have[..]

    #python #rnn #neural #network #ai

    ml-nn.eu/a1/61.html
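The article behind the link is only excerpted above; as a hedged illustration of "retaining information across time steps", a minimal Elman-style RNN forward pass (all names, sizes, and scales here are invented):

```python
import numpy as np

def rnn_forward(xs, Wxh, Whh, bh):
    """Run an Elman RNN over a sequence, carrying the hidden state h
    across time steps; reusing h is what gives the network a memory
    of earlier inputs, unlike a feedforward net."""
    h = np.zeros(Whh.shape[0])
    hs = []
    for x in xs:
        h = np.tanh(Wxh @ x + Whh @ h + bh)
        hs.append(h)
    return np.stack(hs)

rng = np.random.default_rng(1)
hidden, inp, T = 8, 3, 10
Wxh = 0.1 * rng.standard_normal((hidden, inp))
Whh = 0.1 * rng.standard_normal((hidden, hidden))
bh = np.zeros(hidden)

xs = rng.standard_normal((T, inp))   # a toy input sequence
hs = rnn_forward(xs, Wxh, Whh, bh)   # one hidden state per time step
```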

  3. AI ARCHITECTURES FOR LESS ENERGY CONSUMPTION(b)

    (being continued from 1/05/24) We did the math on AI’s energy footprint. Here’s the story you haven’t heard. The emissions from individual AI text, image, and video queries seem small—until you add up what the industry isn’t tracking and consider where it’s heading next. AI’s integration into our lives is the most significant shift in online life in more than a decade. Hundreds of millions of people now regularly turn to chatbots for help with homework, research, coding, […]

    spacezilotes.wordpress.com/202

  4. 🧠 New paper by Pezon, Schmutz & Gerstner: Linking #NeuralManifolds to circuit structure in recurrent networks.

    The study connects two common views of neural activity: low-dimensional #PopulationDynamics (“neural manifolds”) and single-neuron selectivity. Using recurrent network models, the authors show how circuit connectivity constrains both the geometry of neural #manifolds and the tuning of individual neurons.

    📄 doi.org/10.1016/j.neuron.2025.

    #Neuroscience #NeuralDynamics #CompNeuro #RNN
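The paper's models are not reproduced here; purely as a toy version of the "low-dimensional population dynamics" picture, here is a network whose low-rank connectivity and low-dimensional input confine its activity to a few principal components (all parameters invented):

```python
import numpy as np

rng = np.random.default_rng(2)
n, T, rank = 100, 500, 2

# Low-rank recurrent connectivity plus a low-dimensional input drive:
# the pre-activation at every step lives in a subspace of dimension
# at most 2 * rank, so population activity is nearly confined to a
# low-dimensional "manifold" despite there being n = 100 neurons.
W = rng.standard_normal((n, rank)) @ rng.standard_normal((rank, n)) / n
B = 0.2 * rng.standard_normal((n, rank))

x = np.zeros(n)
X = np.empty((T, n))
for t in range(T):
    x = np.tanh(W @ x + B @ rng.standard_normal(rank))
    X[t] = x

# PCA via SVD: variance captured by the leading components.
s = np.linalg.svd(X - X.mean(axis=0), compute_uv=False)
explained = s**2 / (s**2).sum()
top4 = explained[:4].sum()  # nearly all variance sits in ~4 dimensions
```

The paper's point is subtler (how connectivity shapes both manifold geometry and single-neuron tuning), but the sketch shows the basic link between low-rank structure and low-dimensional activity.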

  5. Aviation weather for Bornholm airport in Rønne area (Denmark) is “EKRN 121050Z AUTO 07016KT 6000 -RA OVC008/// 01/00 Q0984” : See what it means on bigorre.org/aero/meteo/ekrn/en #bornholmairport #airport #ronne #denmark #ekrn #rnn #metar #aviation #aviationweather #avgeek vl
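As a toy version of what the linked decoder does, here is a small parser for three groups of the quoted report (wind dddffKT, temperature/dew point, QNH Qpppp); this is not a complete METAR implementation:

```python
import re

def decode_basic(metar):
    """Pull a few groups out of a METAR string: wind direction/speed,
    temperature and dew point (M prefix = negative), and QNH pressure.
    A toy decoder for illustration, not a full METAR parser."""
    out = {}
    m = re.search(r"\b(\d{3})(\d{2,3})KT\b", metar)
    if m:
        out["wind_dir_deg"] = int(m.group(1))
        out["wind_speed_kt"] = int(m.group(2))
    m = re.search(r"\b(M?\d{2})/(M?\d{2})\b", metar)
    if m:
        to_c = lambda s: -int(s[1:]) if s.startswith("M") else int(s)
        out["temp_c"], out["dewpoint_c"] = to_c(m.group(1)), to_c(m.group(2))
    m = re.search(r"\bQ(\d{4})\b", metar)
    if m:
        out["qnh_hpa"] = int(m.group(1))
    return out

report = "EKRN 121050Z AUTO 07016KT 6000 -RA OVC008/// 01/00 Q0984"
decoded = decode_basic(report)
# wind 070° at 16 kt, temperature 1 °C / dew point 0 °C, QNH 984 hPa
```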

  6. Mathematical foundations of recurrent neural networks (children’s questions and answers that nobody likes to talk about)

    A detailed look at the mathematics of recurrent neural networks, built on the simplest network by one of the founders of OpenAI, raising questions along the way that the textbooks skip but that children would certainly ask. We find out how hard it is to differentiate a vector with respect to a matrix, what is wrong with backpropagation, and how neural networks awakened the author’s childhood memories.

    habr.com/ru/articles/993824/

    #rnn #нейросеть #искусственный_интеллект #градиентный_спуск #обратное_распространение_ошибки
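The Habr article itself sits behind the link; the "what is wrong with backpropagation" it teases is presumably the classic vanishing gradient of backprop-through-time, which a short numpy sketch can demonstrate (sizes and scales invented):

```python
import numpy as np

rng = np.random.default_rng(3)
n, T = 20, 50

# In backprop-through-time, the gradient reaching step 0 is hit by a
# product of T Jacobians dh_{t+1}/dh_t = diag(1 - h_{t+1}^2) @ Whh.
# For a contractive Whh this product shrinks roughly geometrically:
# the vanishing-gradient problem of plain RNNs.
Whh = 0.5 * rng.standard_normal((n, n)) / np.sqrt(n)

h = 0.1 * rng.standard_normal(n)
grad = np.eye(n)
norms = []
for _ in range(T):
    h = np.tanh(Whh @ h)
    J = np.diag(1 - h**2) @ Whh   # Jacobian of this step
    grad = J @ grad               # accumulated product of Jacobians
    norms.append(np.linalg.norm(grad))
# norms decays toward zero: distant time steps barely influence the loss.
```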

  7. Aviation weather for Bornholm airport in Rønne area (Denmark) is “EKRN 261050Z AUTO 07019KT 4800 -DZ BR OVC004/// 01/00 Q1001” : See what it means on bigorre.org/aero/meteo/ekrn/en #bornholmairport #airport #ronne #denmark #ekrn #rnn #metar #aviation #aviationweather #avgeek vl

  8. I would like a single source of truth for the #RNN. The information that service is suspended today currently appears only on the #KRN website.

  9. 🧠 New preprint by Shervani-Tabar, Brincat & @ekmiller on emergent #TravelingWaves in #RNN.

    By aligning RNN dynamics to an empirically measured #NeuralManifold, they show that task-relevant traveling waves can emerge through #learning, without hard-coding wave dynamics or connectivity. The cool thing here is that the waves are not imposed or engineered, but emerge naturally from learning under #BiologicallyPlausible constraints:

    🌍 doi.org/10.64898/2026.01.08.69

    #Neuroscience #CompNeuro #NeuralDynamics #WorkingMemory

  10. "Compared to Gaussian networks, finite heavy-tailed RNNs exhibit a broader gain regime near the edge of chaos, namely, a slow transition to chaos. However, this robustness comes with a tradeoff: heavier tails reduce the Lyapunov dimension of the attractor, indicating lower effective dimensionality. Our results reveal a biologically aligned tradeoff between the robustness of dynamics near the edge of chaos and the richness of high-dimensional neural activity."

    "Slow Transition to Low-Dimensional Chaos in Heavy-Tailed Recurrent Neural Networks", Xie et al. 2025
    openreview.net/forum?id=J0SbYY

    Code: github.com/AllenInstitute/Heav

    Why anyone would model biological neural circuits with a Gaussian distribution of synaptic weights is beyond me, but it's great to know what the brain gets from having a different distribution.

    #neuroscience #RNN
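The authors' code is at the (truncated) GitHub link above. Purely as an illustration of what "heavy-tailed" means for synaptic couplings, and not the paper's setup, compare Gaussian weight samples with Cauchy ones:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1000

# Two coupling ensembles: Gaussian vs heavy-tailed (Cauchy).
# Heavy-tailed weights put most entries near zero while a few strong
# synapses dominate, closer to measured synaptic weight distributions
# than a Gaussian.
gauss = rng.standard_normal(n * n)
cauchy = rng.standard_cauchy(n * n)

def tail_fraction(w, k=5.0):
    """Fraction of weights farther than k median absolute deviations
    from zero, a simple proxy for tail heaviness."""
    mad = np.median(np.abs(w))
    return np.mean(np.abs(w) > k * mad)

g, c = tail_fraction(gauss), tail_fraction(cauchy)
# The Cauchy ensemble has orders of magnitude more extreme weights.
```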