#mnist — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #mnist, aggregated by home.social.
-
Taking this one step further, I also looked at #ConditionalGANs: Extending #GANs by conditioning both generator and discriminator on labels, so you can explicitly control what is generated. In the #MNIST case, this means generating specific digits and even smoothly interpolating between them:
🌍 https://www.fabriziomusacchio.com/blog/2023-07-30-cgan/
#MachineLearning #GenerativeAI #CGAN #GAN
(The attached GIF shows the interpolation between the digits 6 and 1)
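For readers curious what "conditioning on labels" looks like in code, here is a minimal PyTorch sketch of the mechanism. The architecture, dimensions, and embedding scheme are illustrative assumptions, not the code from the linked post:

```python
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    """Generator conditioned on a digit label via a learned embedding."""
    def __init__(self, latent_dim=100, n_classes=10, img_dim=28 * 28):
        super().__init__()
        self.label_emb = nn.Embedding(n_classes, n_classes)
        self.net = nn.Sequential(
            nn.Linear(latent_dim + n_classes, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 512), nn.LeakyReLU(0.2),
            nn.Linear(512, img_dim), nn.Tanh(),
        )

    def forward(self, z, labels):
        # Concatenate the label embedding to the noise vector: this is the
        # "conditioning" -- the same trick is applied to the discriminator.
        return self.net(torch.cat([z, self.label_emb(labels)], dim=1))

# Interpolation between two digits (e.g. 6 -> 1): blend both the noise
# vectors and the label embeddings, then decode each blend to a frame.
gen = ConditionalGenerator()  # untrained here; weights come from GAN training
z_a, z_b = torch.randn(1, 100), torch.randn(1, 100)
e_a = gen.label_emb(torch.tensor([6]))
e_b = gen.label_emb(torch.tensor([1]))
frames = [
    gen.net(torch.cat([(1 - a) * z_a + a * z_b,
                       (1 - a) * e_a + a * e_b], dim=1)).view(28, 28)
    for a in torch.linspace(0, 1, 10)
]
```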
-
Just came across an elegant new #SNN framework called #nervos by Maskeen and Lashkare, which implements a two-layer SNN w/ local #STDP #learning to classify, e.g., #MNIST digits. Here is an example where I apply it to a 6-class subset of MNIST. The model reaches around 85% accuracy & the learned synapses show digit-like patterns. Quite impressive in my view, given the simplicity of the architecture & the local learning rule:
🌍 https://www.fabriziomusacchio.com/blog/2026-02-16-nervos_stdp_snn_simulation_on_mnist/
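I haven't verified the nervos API itself, so as a generic illustration of the kind of local rule it relies on, here is a pair-based STDP weight update in plain NumPy; the time constants and learning rates are arbitrary assumptions:

```python
import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Update one synaptic weight from a pre/post spike-time pair (ms).

    The rule is purely local: it only needs the spike times of the two
    neurons the synapse connects, no global error signal.
    """
    dt = t_post - t_pre
    if dt > 0:       # pre fires before post -> causal pairing, potentiate
        w += a_plus * np.exp(-dt / tau_plus)
    elif dt < 0:     # post fires before pre -> anti-causal, depress
        w -= a_minus * np.exp(dt / tau_minus)
    return float(np.clip(w, w_min, w_max))

# Example: a causal pairing (pre 5 ms before post) strengthens the synapse.
w = stdp_update(0.5, t_pre=10.0, t_post=15.0)
print(w)  # slightly above 0.5
```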
-
🧠 New paper by Deistler et al.: #JAXLEY: differentiable #simulation for large-scale training of detailed #biophysical #models of #NeuralDynamics.
They present a #differentiable, #GPU accelerated #simulator that trains #morphologically detailed biophysical #neuron models with #GradientDescent. JAXLEY fits intracellular #voltage and #calcium data, scales to thousands of compartments, trains biophysical #RNNs on #WorkingMemory tasks & even solves #MNIST.
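As a toy illustration of the core idea (gradient descent *through* a differentiable neuron simulation), here is a raw-JAX sketch that fits a single leak conductance to a "recorded" voltage trace. It deliberately uses a bare passive-membrane ODE, not Jaxley's actual API:

```python
import jax
import jax.numpy as jnp

def simulate(g_leak, n_steps=200, dt=0.1, v_rest=-65.0, i_ext=1.5):
    """Euler-integrate a passive membrane; returns the voltage trace (mV)."""
    def step(v, _):
        v = v + (-g_leak * (v - v_rest) + i_ext) * dt
        return v, v
    _, trace = jax.lax.scan(step, v_rest, None, length=n_steps)
    return trace

target = simulate(0.3)  # stand-in for a recorded intracellular trace

def loss(g_leak):
    # Mean-squared error between simulated and "recorded" voltage.
    return jnp.mean((simulate(g_leak) - target) ** 2)

g = 0.1
for _ in range(300):
    g = g - 2e-4 * jax.grad(loss)(g)  # gradients flow through the ODE solver
print(g)  # moves toward the true value 0.3
```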
-
Going Digital: Teaching a TI-84 Handwriting Recognition - You wouldn’t typically associate graphing calculators with artificial intelligence... - https://hackaday.com/2024/12/24/going-digital-teaching-a-ti-84-handwriting-recognition/ #convolutionalneuralnetwork #artificialintelligence #graphicscalculator #texasinstruments #machinelearning #handheldshacks #neuralnetwork #handwriting #ti-84plusce #calculator #mnist #ti-84 #news
-
The #Wasserstein #metric (#EMD) can be used to train #GenerativeAdversarialNetworks (#GANs) more effectively. This tutorial compares a default GAN with a #WassersteinGAN (#WGAN) trained on the #MNIST dataset.
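The crux of the comparison is the loss: a vanilla GAN discriminator outputs probabilities trained with binary cross-entropy, while the WGAN critic outputs unconstrained scores whose difference in means approximates the Wasserstein distance. A minimal PyTorch sketch of just that difference (illustrative, not the tutorial's code):

```python
import torch

def gan_d_loss(d_real, d_fake):
    # Vanilla GAN: d_* are sigmoid probabilities in (0, 1).
    bce = torch.nn.functional.binary_cross_entropy
    return (bce(d_real, torch.ones_like(d_real)) +
            bce(d_fake, torch.zeros_like(d_fake)))

def wgan_critic_loss(c_real, c_fake):
    # WGAN: c_* are raw scores; the critic maximizes E[c(real)] - E[c(fake)],
    # so we minimize the negation.
    return c_fake.mean() - c_real.mean()

def clip_weights(critic, c=0.01):
    # Original WGAN Lipschitz constraint: clamp all weights to [-c, c]
    # after each critic update (later variants use a gradient penalty).
    for p in critic.parameters():
        p.data.clamp_(-c, c)
```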
-
Nothing like the #Kaggle #fashion #MNIST variant to make me feel like a real Elle Woods over here doing t-SNE on purses and saliency maps on ankle boots 😅
https://github.com/janeadams/fashion_model_analysis/
#MachineLearning #WomeninSTEM #AI #ML #tsne #pca #WiDS #Python
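For anyone who wants to reproduce the t-SNE part, a minimal scikit-learn sketch (my own, not the repo's code; the OpenML dataset name is an assumption):

```python
import matplotlib.pyplot as plt
from sklearn.datasets import fetch_openml
from sklearn.manifold import TSNE

# Fashion-MNIST from OpenML; subsample because t-SNE scales poorly with n.
X, y = fetch_openml("Fashion-MNIST", version=1, return_X_y=True, as_frame=False)
X, y = X[:2000] / 255.0, y[:2000].astype(int)

# Embed the 784-pixel images into 2D and color points by clothing class.
emb = TSNE(n_components=2, perplexity=30, init="pca").fit_transform(X)
plt.scatter(emb[:, 0], emb[:, 1], c=y, s=4, cmap="tab10")
plt.title("t-SNE of Fashion-MNIST (2k samples)")
plt.show()
```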