#stdp — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #stdp, aggregated by home.social.
-
When synaptic #plasticity depends on more than spike timing alone, the #ClopathRule offers a biologically plausible model incorporating postsynaptic voltage dynamics. This voltage-based #STDP model captures features of #synaptic change such as frequency dependence and homeostatic stabilization, making it useful for simulating #learning and #memory in #NeuralNetworks. Here's a brief introduction to that rule and its applications in #CompNeuro:
🌍 https://www.fabriziomusacchio.com/blog/2026-04-14-clopath_rule/
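The gist of the voltage dependence can be sketched in a few lines. Below is a minimal, illustrative Python sketch of one Euler step of a Clopath-style weight update; the function name `clopath_update` and all parameter values are my own illustrative choices, not taken from the linked post:

```python
import numpy as np

def clopath_update(w, pre_spike, u, u_minus, u_plus, x_bar,
                   A_LTD=1e-4, A_LTP=1e-4,
                   theta_minus=-70.0, theta_plus=-45.0, dt=1.0):
    """One Euler step of a voltage-based (Clopath-style) plasticity rule.

    u        : instantaneous postsynaptic voltage (mV)
    u_minus  : slow low-pass-filtered voltage trace (mV), gates LTD
    u_plus   : fast low-pass-filtered voltage trace (mV), gates LTP
    x_bar    : low-pass-filtered presynaptic spike trace
    """
    # LTD: a presynaptic spike arriving while the slow voltage trace
    # exceeds theta_minus depresses the synapse.
    ltd = A_LTD * pre_spike * max(u_minus - theta_minus, 0.0)
    # LTP: depolarization above theta_plus, gated by both the presynaptic
    # trace and the fast voltage trace, potentiates the synapse.
    ltp = A_LTP * x_bar * max(u - theta_plus, 0.0) \
          * max(u_plus - theta_minus, 0.0) * dt
    # Hard bounds keep the weight in [0, 1].
    return float(np.clip(w - ltd + ltp, 0.0, 1.0))
```

Because LTP depends on the instantaneous and filtered voltages rather than spike times alone, the same rule reproduces frequency dependence: at high postsynaptic rates the voltage traces stay elevated, so potentiation dominates.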
-
Just came across an elegant new #SNN framework called #nervos by Maskeen and Lashkare, which implements a two-layer SNN with local #STDP #learning to classify, e.g., #MNIST digits. Here is an example where I apply it to a 6-class subset of MNIST. The model reaches around 85% accuracy & the learned synapses show digit-like patterns. Quite impressive in my view, given the simplicity of the architecture & the local learning rule:
🌍https://www.fabriziomusacchio.com/blog/2026-02-16-nervos_stdp_snn_simulation_on_mnist/
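For context, the kind of local rule such frameworks rely on can be sketched as a generic pair-based STDP update with exponential eligibility traces. This is a hedged sketch of the general technique, not the nervos API; all names and parameter values here are my own:

```python
import numpy as np

def stdp_step(w, pre_spikes, post_spikes, trace_pre, trace_post,
              a_plus=0.01, a_minus=0.012, tau=20.0, dt=1.0):
    """One time step of pair-based STDP between a pre- and a post-layer.

    w           : weight matrix, shape (n_pre, n_post)
    pre_spikes  : 0/1 spike vector of the input layer, shape (n_pre,)
    post_spikes : 0/1 spike vector of the output layer, shape (n_post,)
    trace_pre, trace_post : exponential spike traces of each layer
    """
    # Decay the eligibility traces, then add the current spikes.
    trace_pre = trace_pre * np.exp(-dt / tau) + pre_spikes
    trace_post = trace_post * np.exp(-dt / tau) + post_spikes
    # Potentiate where a postsynaptic spike follows presynaptic activity...
    w = w + a_plus * np.outer(trace_pre, post_spikes)
    # ...and depress where a presynaptic spike follows postsynaptic activity.
    w = w - a_minus * np.outer(pre_spikes, trace_post)
    # Clip weights to [0, 1]; the update uses only locally available signals.
    return np.clip(w, 0.0, 1.0), trace_pre, trace_post
```

The appeal of rules like this is that each synapse updates from its own pre/post traces only, with no global error signal, which is what makes the MNIST result with such a small architecture notable.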
-
How brain-like chips work: from neuromorphic architectures to real-world applications
In this article I cover neuromorphic chips: hardware solutions inspired by the biological brain. No dry theory and no links to other people's publications, only my own observations, experiments on FPGAs, and ready-to-run examples in Python and C++. An almost detective-style story about transistors that behave like neurons, and about how they help robots and "smart" sensors respond within milliseconds.
https://habr.com/ru/articles/930248/
#нейроморфный_чип #SNN #intel_loihi #fpga #stdp #edge_computing #DVS #spinnaker