#rnns — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #rnns, aggregated by home.social.
-
🧠 New paper by Deistler et al: #JAXLEY: differentiable #simulation for large-scale training of detailed #biophysical #models of #NeuralDynamics.
They present a #differentiable #GPU-accelerated #simulator that trains #morphologically detailed biophysical #neuron models with #GradientDescent. JAXLEY fits intracellular #voltage and #calcium data, scales to thousands of compartments, trains biophysical #RNNs on #WorkingMemory tasks, and even solves #MNIST.
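The post's core pitch (differentiate through a neuron simulation, then fit biophysical parameters by gradient descent) can be sketched in miniature. The snippet below is not JAXLEY's API, just a toy pure-Python leaky-integrator model with a single made-up parameter `g`, whose gradient is propagated by hand through the unrolled simulation:

```python
# Toy sketch of differentiable simulation (NOT JAXLEY's API):
# Euler-discretized leaky integrator v' = -g*v + i(t), with the
# sensitivity dv/dg carried forward alongside the state.

def simulate(g, inputs, dt=0.1):
    """Return the voltage trace and its sensitivity dv/dg at each step."""
    v, s = 0.0, 0.0          # state and forward-mode sensitivity dv/dg
    vs, ss = [], []
    for i in inputs:
        v_new = v + dt * (-g * v + i)
        s = s * (1.0 - dt * g) - dt * v   # chain rule through the Euler update
        v = v_new
        vs.append(v)
        ss.append(s)
    return vs, ss

def fit(target, inputs, g=0.1, lr=1e-4, steps=2000):
    """Gradient descent on g for the squared-error loss against target."""
    for _ in range(steps):
        vs, ss = simulate(g, inputs)
        grad = sum(2.0 * (v - t) * s for v, t, s in zip(vs, target, ss))
        g -= lr * grad
    return g

inputs = [1.0] * 50
target, _ = simulate(0.5, inputs)   # synthetic "data" from ground truth g = 0.5
g_fit = fit(target, inputs)
print(g_fit)  # should recover a value near the ground-truth g = 0.5
```

JAXLEY does this at scale with automatic differentiation on GPUs; the toy version only shows why a differentiable simulator makes gradient-based fitting possible at all.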
-
🧠 New preprint by Codol et al. (2025): Brain-like #NeuralDynamics for #behavioral control develop through #ReinforcementLearning. They show that only #RL, not #SupervisedLearning, yields neural activity geometries & dynamics matching monkey #MotorCortex recordings. RL-trained #RNNs operate at the edge of #chaos, reproduce adaptive reorganization under #visuomotor rotation, and require realistic limb #biomechanics to achieve brain-like control.
-
#ITByte: #MachineLearning models that take sequential data as input or output are called #SequenceModels.
Sequence data includes text streams, video clips, audio clips, time-series data, etc. Recurrent Neural Networks (#RNNs) and Long Short-Term Memory (#LSTM) networks are popular architectures used in sequence models.
https://knowledgezone.co.in/trends/explorer?topic=Sequence-Model
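What makes an RNN a sequence model is its hidden state, which carries context from earlier elements forward. A minimal Elman-style cell in plain Python (an illustrative sketch with arbitrary weights, not a trained model) makes this concrete:

```python
# Minimal Elman-style RNN cell: the hidden state h carries context
# from earlier sequence elements to later ones. Weights are arbitrary
# illustrative values, not learned.
import math

def rnn_forward(seq, w_in=0.8, w_rec=0.5, b=0.0):
    """Process a sequence of numbers; return the hidden state at each step."""
    h, states = 0.0, []
    for x in seq:
        h = math.tanh(w_in * x + w_rec * h + b)  # h_t depends on h_{t-1}
        states.append(h)
    return states

# The same final input (0.0) yields different hidden states
# depending on what came before it:
with_context = rnn_forward([1.0, 0.0])[-1]  # nonzero: remembers the 1.0
no_context = rnn_forward([0.0, 0.0])[-1]    # zero: nothing to remember
```

LSTMs extend this cell with gated memory so that context can survive over much longer sequences without vanishing.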
-
1997, with the advent of Long Short-Term Memory recurrent #neuralnetworks, marks the next step in our brief history of (large) #languagemodels from last week's #ise2023 lecture. Introduced by Sepp Hochreiter and Jürgen Schmidhuber, #LSTM #RNNs enabled efficient processing of sequences of data.
Slides: https://drive.google.com/file/d/1atNvMYNkeKDwXP3olHXzloa09S5pzjXb/view?usp=drive_link
#nlp #llm #llms #ai #artificialintelligence #lecture @fizise