#metalearning — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #metalearning, aggregated by home.social.
-
Bayesian Meta-Learning Is All You Need
— Why is the deterministic view of meta-learning not sufficient?
— What is variational inference?
— How can we design neural-based Bayesian meta-learning algorithms?
https://jameskle.com/writes/bayesian-meta-learning-is-all-you-need
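The second question above, variational inference, can be illustrated with a toy example of my own (not taken from the linked article): approximate a posterior by maximizing the evidence lower bound (ELBO) over a family of Gaussians. The model, data, and grid search below are all illustrative assumptions; real Bayesian meta-learning amortizes this over tasks with neural networks.

```python
import numpy as np

# Toy variational inference sketch (illustrative, not from the linked post):
# infer the mean mu of a Gaussian with known unit variance.
# Prior: mu ~ N(0, 1).  Variational family: q(mu) = N(m, s^2).
rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=20)   # observed data
n, sum_x, sum_sq = len(x), x.sum(), (x ** 2).sum()

def elbo(m, s):
    """Closed-form ELBO = E_q[log p(x|mu)] + E_q[log p(mu)] + H[q]."""
    exp_loglik = (-0.5 * n * np.log(2 * np.pi)
                  - 0.5 * (sum_sq - 2 * m * sum_x + n * m ** 2)
                  - 0.5 * n * s ** 2)
    exp_logprior = -0.5 * np.log(2 * np.pi) - 0.5 * (m ** 2 + s ** 2)
    entropy = 0.5 * np.log(2 * np.pi * np.e * s ** 2)
    return exp_loglik + exp_logprior + entropy

# Maximize the ELBO over a grid of variational parameters (m, s).
ms = np.linspace(0.0, 4.0, 801)
ss = np.linspace(0.05, 1.0, 951)
M, S = np.meshgrid(ms, ss, indexing="ij")
i, j = np.unravel_index(np.argmax(elbo(M, S)), M.shape)
best_m, best_s = ms[i], ss[j]

# This model is conjugate, so the exact posterior is known:
# N(sum(x) / (n + 1), 1 / (n + 1)); the ELBO optimum should match it.
true_mean = sum_x / (n + 1)
true_std = (1.0 / (n + 1)) ** 0.5
```

Because the variational family here contains the exact posterior, the ELBO maximum recovers it; in general the family is too small and the optimum is only the closest approximation in KL divergence.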
-
Very interesting work by Confavreux and colleagues on #MetaLearning families of #plasticity rules in #RecurrentSpikingNetworks using simulation-based inference 👌
🌍 https://openreview.net/forum?id=FLFasCFJNo
#RSN #CompNeuro #Neuroscience #NeurIPS2023 #SynapticPlasticity #SpikingNeuronalNetwork #SNN
-
Rethinking the #hippocampal #cognitivemap as a #metalearning computational module – New publication by Luca Ambrogioni & H. Freyja Ólafsdóttir (2023)
-
Deepen your knowledge with these 5 little-known resources I use every day:
- https://fmhy.pages.dev
- https://github.com/JushBJJ/Mr.-Ranedeer-AI-Tutor
- https://github.com/sindresorhus/awesome
- https://learn-anything.xyz
- https://singlelogin.re
-
Me:
What would be the best hashtags to use for that post?

ChatGPT:
Based on the content of the post, the following hashtags may be appropriate:
#AGI #AI #ArtificialIntelligence #MachineLearning #DeepLearning #DataScience #AlphaZero #AlphaGo #MetaLearning #SelfCompetition #MultiModality #Robotics #PhysicalInteractions #Research
-
Through scaling #DeepNeuralNetworks, we have found in two different domains, #ReinforcementLearning and #LanguageModels, that these models learn to learn (#MetaLearning).
They spontaneously develop internal models with memory and learning capability, which exhibit #InContextLearning much faster and more effectively than any of our standard #backpropagation-based deep neural networks can.
These rather alien #LearningModels embedded inside the deep learning models are emulated by #neuron layers, but aren't necessarily deep learning models themselves.
I believe it is possible to extract these internal models which have learned to learn, out of the scaled up #DeepLearning #substrate they run on, and run them natively and directly on #hardware.
This would allow those much more efficient learning models to be used either as #LearningAgents themselves, or as a substrate for further meta-learning.
I have ongoing #embodiment #research with a related goal, focused specifically on extracting (or distilling) these models out of the meta-models, here:
https://github.com/keskival/embodied-emulated-personas
It is of course an open research problem how to do this, but I have a lot of ideas!
If you're inspired by this, or if you think the same, let's chat!
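The "extraction (or distillation)" idea in the post above is, at its core, knowledge distillation: train a small student to reproduce a teacher's input-output behavior rather than ground-truth labels. The sketch below is entirely hypothetical; the teacher is a stand-in function, and nothing here comes from the linked repository.

```python
import numpy as np

# Minimal distillation sketch: a linear "student" fit to reproduce a
# "teacher" function's outputs. The teacher is a stand-in for an embedded
# in-context learner; no claim is made about the linked repo's method.
rng = np.random.default_rng(1)

def teacher(X):
    # Stand-in teacher: a fixed nonlinear map (tanh features + linear head).
    W = np.array([[1.0, -0.5], [0.3, 0.8]])
    v = np.array([0.7, -1.2])
    return np.tanh(X @ W) @ v

X = rng.normal(size=(500, 2))
y_teacher = teacher(X)

# Distill: fit the student by least squares on the teacher's outputs
# (not on ground-truth labels -- that is what makes it distillation).
Phi = np.hstack([X, np.ones((len(X), 1))])   # linear student with bias
w, *_ = np.linalg.lstsq(Phi, y_teacher, rcond=None)
student_pred = Phi @ w
fit_error = np.mean((student_pred - y_teacher) ** 2)
```

Here the student is deliberately too weak to match the teacher exactly; distilling an embedded learner, as the post proposes, would additionally require identifying where in the large model that learner lives, which is the open problem the author notes.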