home.social

#stdp — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #stdp, aggregated by home.social.

  1. When synaptic #plasticity depends on more than spike timing alone, the #ClopathRule offers a biologically plausible model incorporating postsynaptic voltage dynamics. This voltage-based #STDP model captures features of #synaptic change such as frequency dependence and homeostatic stabilization, making it useful for simulating #learning and #memory in #NeuralNetworks. Here's a brief introduction to that rule and its applications in #CompNeuro:

    🌍 fabriziomusacchio.com/blog/202

    #Neuroscience
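
The gist of the rule can be sketched in a few lines: depression is triggered by presynaptic spikes when a slow low-pass filter of the postsynaptic voltage is above a threshold, while potentiation gates a presynaptic spike trace by the instantaneous and filtered voltage. The sketch below is heavily simplified (one synapse, a prescribed voltage bump instead of a real neuron model), and all parameter values are illustrative assumptions, not fitted constants from the Clopath et al. paper.

```python
import numpy as np

dt = 0.1          # time step, ms
T = 200.0         # total simulated time, ms
steps = int(T / dt)

# illustrative parameters (assumptions, not fitted values)
A_LTD, A_LTP = 1e-4, 1e-4                # depression / potentiation amplitudes
theta_minus, theta_plus = -70.0, -45.0   # voltage thresholds, mV
tau_minus, tau_plus = 10.0, 7.0          # low-pass filter time constants, ms
tau_x = 15.0                             # presynaptic spike trace constant, ms

# regular presynaptic spikes, one every 20 ms
pre_spikes = np.zeros(steps, dtype=bool)
pre_spikes[:: int(20.0 / dt)] = True

# prescribed postsynaptic voltage: rest with one depolarized bump at t = 100 ms
t_axis = np.arange(steps) * dt
u = -75.0 + 40.0 * np.exp(-((t_axis - 100.0) / 10.0) ** 2)

w = 0.5
u_minus = u_plus = u[0]   # low-pass filtered voltages
x_pre = 0.0               # filtered presynaptic spike trace

for t in range(steps):
    u_minus += dt / tau_minus * (u[t] - u_minus)
    u_plus += dt / tau_plus * (u[t] - u_plus)
    x_pre += dt / tau_x * (-x_pre) + (1.0 if pre_spikes[t] else 0.0)

    # LTD: presynaptic spike while the slow filtered voltage exceeds theta_minus
    if pre_spikes[t]:
        w -= A_LTD * max(u_minus - theta_minus, 0.0)
    # LTP: depolarization above theta_plus gates the presynaptic trace
    w += dt * A_LTP * x_pre * max(u[t] - theta_plus, 0.0) * max(u_plus - theta_minus, 0.0)

print(f"final weight: {w:.4f}")
```

Because both plasticity terms depend on filtered voltages rather than spike pairs alone, the same code reproduces the frequency dependence the post mentions: faster or stronger depolarization pushes the filtered voltages over their thresholds more often, shifting the balance toward potentiation.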

  5. Just came across an elegant new #SNN framework called #nervos by Maskeen and Lashkare, which implements a two-layer SNN with local #STDP #learning to classify, e.g., #MNIST digits. Here is an example where I apply it to a 6-class subset of MNIST. The model reaches around 85% accuracy, and the learned synapses show digit-like patterns. Quite impressive, in my view, given the simplicity of the architecture and the local learning rule:

    🌍 fabriziomusacchio.com/blog/202

    #CompNeuro #Neuroscience #NeuralPlasticity
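
To make the idea concrete without reproducing the nervos API (whose function names I won't guess at here), the sketch below implements the same generic recipe: input spikes drive leaky integrate-and-fire output neurons, and weights are updated with a local pair-based STDP rule. The inputs are synthetic binary "digit-like" patterns rather than actual MNIST, and the layer sizes and learning constants are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 64, 6                  # assumed sizes: 8x8 inputs, one unit per class
w = rng.random((n_out, n_in)) * 0.3  # random initial weights

tau_trace = 20.0                 # STDP trace time constant, in time steps
a_plus, a_minus = 0.01, 0.012    # potentiation / depression amplitudes
v_thresh, v_decay = 1.0, 0.9     # LIF threshold and per-step leak

patterns = (rng.random((6, n_in)) < 0.3).astype(float)  # fake "digits"

for epoch in range(30):
    for p in patterns:
        v = np.zeros(n_out)
        pre_trace = np.zeros(n_in)
        post_trace = np.zeros(n_out)
        for step in range(50):                        # 50 time steps per pattern
            in_spikes = rng.random(n_in) < 0.1 * p    # rate-coded input spikes
            pre_trace = pre_trace * np.exp(-1 / tau_trace) + in_spikes
            v = v_decay * v + w @ in_spikes           # leaky integration
            out_spikes = v >= v_thresh
            v[out_spikes] = 0.0                       # reset after spiking
            post_trace = post_trace * np.exp(-1 / tau_trace) + out_spikes
            # local STDP: pre-before-post potentiates, post-before-pre depresses
            w += a_plus * np.outer(out_spikes, pre_trace)
            w -= a_minus * np.outer(post_trace, in_spikes)
            w = np.clip(w, 0.0, 1.0)

print("weight range:", w.min(), w.max())
```

Since every weight update uses only the traces of its own pre- and postsynaptic neuron, the rule is fully local, which is exactly why learned weight vectors tend to come out looking like the input patterns themselves.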

  10. How brain-like chips work: from neuromorphic architectures to real-world applications

    In this article I cover neuromorphic chips, hardware inspired by the biological brain. No dry theory and no links to other people's publications: only my own observations, experiments on FPGAs, and ready-to-run examples in Python and C++. An almost detective story about transistors that behave like neurons, and about how they help robots and "smart" sensors respond within milliseconds.

    habr.com/ru/articles/930248/

    #нейроморфный_чип #SNN #intel_loihi #fpga #stdp #edge_computing #DVS #spinnaker
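
The basic unit such chips implement in silicon is a leaky integrate-and-fire (LIF) neuron: a capacitor-like state that charges with input current, leaks toward rest, and emits a spike and resets when it crosses a threshold. A minimal software model of that behavior, with illustrative parameter values not taken from the article or from any particular chip:

```python
import numpy as np

dt = 1.0              # time step, ms
tau_m = 20.0          # membrane time constant, ms
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0

v = v_rest
spikes = []
# step input: silence for 50 ms, then a constant drive for 150 ms
current = np.concatenate([np.zeros(50), 0.08 * np.ones(150)])

for t, i_in in enumerate(current):
    v += dt / tau_m * (v_rest - v) + i_in   # leaky integration
    if v >= v_thresh:                       # threshold crossing -> spike
        spikes.append(t)
        v = v_reset                         # reset, as the silicon circuit does

print(f"{len(spikes)} spikes, first at t={spikes[0]} ms")
```

The event-driven nature of this loop is the point: between spikes nothing needs to be computed or transmitted, which is where the millisecond-latency, low-power behavior of neuromorphic sensors and robots comes from.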