home.social

#mixture — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #mixture, aggregated by home.social.

  1. 📡 New post:

    The twenty-five-brain architecture inside Mickai, and why it is structurally not a Mixture of Experts. SENTINEL, CORTEX, HIPPOCAMPUS, AMYGDALA, and twenty-one specialists, each in its own process, with its own queue, signing its own audit.

    Readers familiar with the Mixture of Experts literature sometimes assume Mickai…

    mickai.co.uk/articles/the-twen

    #mickai #multi-brain #architecture #mixture-of-experts #moe
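The structural distinction the first post draws can be made concrete. In an MoE, the "experts" are weight blocks inside a single network, selected per token by a learned gate; in the layout described above, every specialist is a separate OS process with its own queue and its own signing key. A minimal sketch of that layout, purely illustrative and not Mickai's actual code (the brain names come from the post; the queues, keys, and HMAC audit scheme are assumptions):

```python
# Illustrative process-per-brain layout: one process, one queue, one
# signing key per specialist. Not Mickai's real implementation.
import hashlib
import hmac
import json
import multiprocessing as mp

def brain(name: str, key: bytes, inbox: mp.Queue, outbox: mp.Queue) -> None:
    """Drain this brain's own queue and sign its own audit records."""
    while (msg := inbox.get()) is not None:
        entry = {"brain": name, "handled": msg}
        sig = hmac.new(key, json.dumps(entry).encode(), hashlib.sha256)
        outbox.put({**entry, "signature": sig.hexdigest()})

if __name__ == "__main__":
    audit = mp.Queue()
    brains = {}
    for name in ("SENTINEL", "CORTEX", "HIPPOCAMPUS", "AMYGDALA"):
        q = mp.Queue()                                   # one queue per brain
        p = mp.Process(target=brain, args=(name, name.encode(), q, audit))
        p.start()
        brains[name] = (q, p)

    brains["SENTINEL"][0].put("inspect incoming request")
    print(audit.get())                                   # signed audit entry

    for q, p in brains.values():
        q.put(None)                                      # shutdown signal
        p.join()
```

There is no gating network and no shared weights here; the boundary between specialists is a process boundary, which is exactly why the post argues the design is not an MoE.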

  2. [Translation] DeepSeek V4 is out. Why is this very bad for the US? DeepSeek V4 Pro is 1.6 trillion parameters, mixture of experts (MoE), ...

    #DeepSeek #V4 #mixture #of #experts #open #source #LLM #frontier #models #SWE-bench

  3. Why NVLink Is Nvidia’s Secret Sauce Driving a 10x Performance Boost in MoEs

    We’ve seen a significant metamorphosis occur in AI in the past year, thanks to the emergence of large, capable Mixtur...

    #Features #AI #for #Science #DeepSeek-R1 #extreme #co-design #HPC #Ian #Buck #mixture

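The interconnect angle in the third post has a concrete mechanism: under expert parallelism, experts are sharded across GPUs, so every MoE layer must ship each token's activations to the devices holding its top-k experts and gather the results back, an all-to-all exchange whose volume grows with tokens × k × hidden size. A minimal top-2 gate in PyTorch showing where those dispatch targets come from (an illustrative sketch, not any specific model's router):

```python
# Illustrative top-k MoE gate: the `experts` indices it emits are the
# routing table an all-to-all exchange (over NVLink or similar) must satisfy.
import torch
import torch.nn as nn

class TopKGate(nn.Module):
    def __init__(self, d_model: int, n_experts: int, k: int = 2):
        super().__init__()
        self.proj = nn.Linear(d_model, n_experts, bias=False)
        self.k = k

    def forward(self, x: torch.Tensor):
        # x: (tokens, d_model); pick the k most probable experts per token
        probs = self.proj(x).softmax(dim=-1)
        weights, experts = torch.topk(probs, self.k, dim=-1)
        weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalize
        return experts, weights

gate = TopKGate(d_model=16, n_experts=8)
experts, weights = gate(torch.randn(4, 16))
print(experts)  # per-token expert ids, i.e. which shard each token visits
```

Every token's activations must reach the shards named in `experts` and return within the same layer, which is why interconnect bandwidth, not just FLOPs, gates MoE throughput at scale.
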
  4. Contents:

    - Mixture-of-Models vs Mixture-of-Experts
    - The MoM Design Philosophy
    - Live Demo on AMD GPUs
    - Signal-Based Routing
    - Deploy Your Own
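The "Signal-Based Routing" entry names the key difference from MoE: a mixture of models routes whole requests between separate, independently served models, rather than routing tokens between weight blocks inside one network. A minimal sketch of request-level routing on cheap signals (the signals and model names below are invented for illustration):

```python
# Hypothetical mixture-of-models router: inspect cheap signals on the
# request and choose a whole model. All names here are illustrative.
from dataclasses import dataclass

@dataclass
class Route:
    model: str
    reason: str

def route(prompt: str) -> Route:
    # Real systems might use trained classifiers, token counts, or
    # language detection; these string checks are stand-ins.
    if "def " in prompt or "import " in prompt:
        return Route("code-model", "code-like signal")
    if len(prompt.split()) > 500:
        return Route("long-context-model", "length signal")
    return Route("general-model", "default")

print(route("def fib(n): return n if n < 2 else fib(n-1) + fib(n-2)"))
# Route(model='code-model', reason='code-like signal')
```

Because the decision happens before inference, each model can live on different hardware (the post's AMD demo fits this shape), with no per-layer communication between the models.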

  5. 'Gaussian Mixture Models with Rare Events', by Xuetong Li, Jing Zhou, Hansheng Wang.

    jmlr.org/papers/v25/23-1245.ht

    #mixture #gaussian #empirical
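For context on the baseline the paper extends: a standard Gaussian mixture is fit by EM, for instance via scikit-learn, and the difficulty with rare events is that one component's mixing weight is tiny. A minimal sketch with a deliberately imbalanced two-component mixture (the 99/1 split is an invented illustration of the rare-events regime, not the paper's experimental setup):

```python
# Fit a two-component Gaussian mixture where one component is rare (~1%).
# The data and split are invented to illustrate the regime, nothing more.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
common = rng.normal(0.0, 1.0, size=(9900, 1))   # 99% background component
rare = rng.normal(5.0, 0.5, size=(100, 1))      # 1% rare-event component
X = np.vstack([common, rare])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
print(gmm.weights_)         # estimated mixing proportions, one near 0.01
print(gmm.means_.ravel())   # estimated component means, near 0.0 and 5.0
```

In this skewed regime the rare component's parameters are estimated from very few effective samples, which is the setting the paper's title points at.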