home.social

#mamba — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #mamba, aggregated by home.social.

  1. kpopnsfw.com/247302/260410-win 260410 Winter on the album/title track: “What is Kwangya Style? Is Kwangya style like…Black Mamba, Savage, Next Level, Girls, smth like that? If that what it means, then it’s not going to be Kwangya style. Is Supernova considered to be Kwangya style? Would Whiplash be considered Kwangya style?” #aespa #albumtitle #considered #girls #Kwangya #Level #LikeBlack #Mamba #Means #savage #smth #style #supernova #track #Whiplash #winter

  2. My day job is all about #Python (which I love). Here are some personal rules, specific to working with Python projects:

    * Do **not** install or modify global tools, especially Python itself or any packages. This means a given system might not even **have** a global Python
    * Always use virtual environments (`uv` agrees with me and doesn't even need this, but I always set the global environment variable `PIP_REQUIRE_VIRTUALENV` anyway)
    * The two rules above mean my virtual environment really contains Python itself (not via a link; it's physically there), and of course at the right version
    * Virtual environments always live **inside** a project directory. Never global.
    * Activate virtual environments only **inside** the project directory (`direnv` #direnv makes this easy)
    * Don't install (let alone use) #Anaconda, #Miniconda, or #Mamba, because those violate all the rules above (but see the next rule)
    * Needing Anaconda-based packages implies a `pixi` #Pixi project (it's from the same people, but a better answer, and you still get what you want -- the correct packages)
    * Needing no Anaconda-based packages implies a `uv` #UV project
    * Always use `pyproject.toml` #pyprojecttoml over any other config file (e.g., `requirements.txt` #requirementstxt), except where things just don't work, such as needing `pyrefly.toml`
    * `uv`, `pixi`, and `direnv` must exist outside of any project, so install them at the user level, or else globally if and only if that is appropriate and compelling enough to override rule one

    That was a wall of text, but in practice doing it this way is trivial. It's probably **less** work than you have been doing. This post is just about managing your Python versions, environments, and projects. Not about, e.g., using `pre-commit` #precommit, or doing type checking, etc. But if you follow these rules, your work will be easier, faster, more adaptable, and encounter fewer obstacles.
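    As a minimal, runnable illustration of the "always in a venv" rule (the helper name here is my own, not from any library): Python itself can report whether it is running inside a virtual environment, which is handy for scripts that should refuse to touch a global interpreter.

```python
import sys

def in_virtualenv() -> bool:
    """True when running inside a venv/virtualenv.

    Inside a virtual environment, sys.prefix points at the environment
    directory while sys.base_prefix still points at the base interpreter.
    Outside one, the two are equal.
    """
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

print("inside venv:", in_virtualenv())
```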

    #HowTo

  7. Titans and MIROS: Google teaches AI to remember like a human, from surprise to infinite memory

    2026 is in full swing: we have survived the New Year framework updates, fresh AI model releases, and perhaps the first AGI experiments in labs. But after the holiday code and coffee, it is time to dive into something fundamental: how to build an AI that does not just generate text but evolves in real time. About a month ago, Google Research announced the HOPE architecture, with nested learning for continual memory. And now, a fresh post about earlier but potentially revolutionary work: Titans and the MIROS framework. It is a hybrid of recurrent networks and transformers in which memory is updated on the fly through "surprise".

    habr.com/ru/articles/988100/

    #titans #титаны #долговременная_память #longterm_memory #hybrid_architecture #оптимизация #google_research #mamba #stateful #ии

  8. It's the first Thursday tomorrow, which means it is Michigan Python night at 7pm ET! Antonio Cavallo will be giving a talk comparing uv and micromamba!

    He'll compare these two modern tools—their strengths, use cases, and when to use each. Whether you're managing pure Python environments or multi-language stacks, you'll gain practical insights for choosing the right tool. Looking forward to seeing everyone, all are welcome!

    meetup.com/michigan-python/eve

  9. So with a local #SSM State Space Model (#Mamba, #RWKV) I can snapshot contexts.

    I've now got a general purpose pipe/filter that can feed and read from any saved context, for stable static or dynamic (by re-saving the checkpoint) sessions.

    Any of which can become an always ready CLI filter I can pipe stuff through.

    *Could* make a local SSM model useful providing instantly ready small named natural language filters that work in the shell just like grep, awk, etc.

    Useful?

    Don't know yet.
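    The snapshot idea can be sketched with a toy stand-in (this is not the llama.cpp API, just an illustration of a constant-size recurrent state being saved, advanced, and rewound):

```python
import pickle

class ToyRecurrentFilter:
    """Stand-in for an SSM: a constant-size state updated once per token."""

    def __init__(self) -> None:
        self.state = 0

    def feed(self, text: str) -> None:
        # A rolling hash plays the role of the model's recurrent state.
        for ch in text:
            self.state = (self.state * 31 + ord(ch)) % (1 << 32)

    def snapshot(self) -> bytes:
        return pickle.dumps(self.state)

    def restore(self, blob: bytes) -> None:
        self.state = pickle.loads(blob)

f = ToyRecurrentFilter()
f.feed("primer: you are a terse shell filter")
ckpt = f.snapshot()      # a named, always-ready starting context
f.feed("one-off query")  # the state advances...
f.restore(ckpt)          # ...and rewinds, ready for the next pipe invocation
```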

  10. Turns out my llama_cpp code needed minimal changes to *just work* swapping Falcon-Mamba 7B in for RWKV7 7B.

    The Mamba is both faster, and based on a brief test with the same primer text - smarter.

    I still haven't figured out if either can actually be useful in any way, but State-Space Models in general are pretty awesome.

    huggingface.co/tiiuae/Falcon3-

    #ssm #mamba #falcon

  11. Thing I've been experimenting with the past few days -- "diegetic role based prompting" for a local State Space Model ( #RWKV7 currently).

    Tiny llama.cpp Python runner for the model and "composer" GUI for stepping and half-stepping through input only or input and generated role specified output, with saving and restoring of KV checkpoints.

    Planning to write runners for #XLSTM 7B & #Falcon #MAMBA 7B to compare.

    Started 'cause no actual #SSM saving, resuming examples.

    github.com/stevenaleach/ssmpro

  12. Finally! After I don't even know how much head-banging, I've got a State Space Model (RWKV7) running using llama.cpp with GPU offloading and memory states successfully saved and restored. And I *think* the same code can be minimally modified for Mamba (Mistral, Falcon).

    Yay for infinite context length!

    And for massive time and energy savings.

    llama.cpp: github.com/ggml-org/llama.cpp

    Llama CPP Python: pypi.org/project/llama-cpp-pyt

    RWKV7 live demo: huggingface.co/spaces/BlinkDL/

    #SSM #llama #rwkv #mamba

  13. Part 4: Mamba, State Space Models vs transformers

    Mamba is a revolution in processing long sequences! Mamba, State Space Models vs transformers: which is better?!

    habr.com/ru/articles/925416/

    #mamba #transformer #nlp #ssm
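    For context, the core of a state space layer is a linear recurrence over a fixed-size hidden state (a generic discretized form; Mamba additionally makes the matrices input-dependent):

```latex
h_t = \bar{A}\, h_{t-1} + \bar{B}\, x_t, \qquad y_t = C\, h_t
```

    Because each step reads only the previous state, per-token cost is constant in sequence length, whereas attention's per-token cost grows with the context; that is the long-sequence advantage the article refers to.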

  14. After recent license changes at #Anaconda, Inc., #Miniforge has become the recommended, fully open solution for institutional #Python environments. Now bundling both #conda and #mamba, it replaces #Miniconda/ #Mambaforge, ensuring high performance and broad package access via #CondaForge. Here I summarized the key changes and migration steps:

    🌍 fabriziomusacchio.com/blog/202

    #Python #DataScience #OpenSource

  19. #Conda is still driving me crazy.🙈
    The confusion is actually perfect...
    Conda is actually a package manager.
    But then there is also #Anaconda and #Miniconda as package managers.

    Both Miniconda and Anaconda install conda.
    Then there is #mamba, a reimplementation of Conda in C++. There is also #Micromamba... but no #Minimamba...

    #bioinformatics

    🥴 🐍

  24. Oh yeeeeeeeeeeeah buddy! It’s time!

  25. & 1.2: general (and compatible) software package managers for any kind of software and all operating systems prefix.dev/blog/mamba_release_

  26. Just missing this dude right now. Hopefully LeBron can keep the Lakers increasing their win totals until AD is back in action 🙏💜💛

    #lakers #lakersnation #nba #kobe #kobebryant #lebron #lebronjames #losangeles #la #basketball #ballislife #hoops #mamba instagr.am/p/CnBDq3ELkco/