home.social

#pixi — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #pixi, aggregated by home.social.

  1. Yesterday was just a fantastic day for me. I planned to post on it. Just overjoyed. Strongly tempered today by the acquisition of Astral, and therefore the uncertain future of #uv and #ty (and transitively, #pixi).

    I’m gonna focus on yesterday.

  5. My day job is all about #Python (which I love). Here are some personal rules, specific to working with Python projects:

    * Do **not** install or modify global tools, especially Python itself or any packages. This means a given system might not even **have** a global Python
    * Always use virtual environments (`uv` agrees with me and doesn't strictly need this, but I always set the global environment variable `PIP_REQUIRE_VIRTUALENV` anyway)
    * The two rules above mean my virtual environment contains Python itself, really there rather than via a link (and of course at the right version)
    * Virtual environments always live **inside** a project directory. Never global.
    * Activate virtual environments only **inside** the project directory (`direnv` #direnv makes this easy)
    * Don't install (let alone use) #Anaconda, #Miniconda, or #Mamba, because those violate all the rules above (but see the next rule)
    * Needing Anaconda-based packages implies a `pixi` #Pixi project (it's the same people, but a better answer, and you still get what you want -- the correct packages)
    * Needing no Anaconda-based packages implies a `uv` #UV project
    * Always use `pyproject.toml` #pyprojecttoml over any other config file (e.g., `requirements.txt` #requirementstxt), except where things just don't work, such as needing `pyrefly.toml`
    * `uv`, `pixi`, and `direnv` must exist outside of any project, so install them at the user level, or else globally if and only if that is appropriate and compelling enough to override rule one

    That was a wall of text, but in practice doing it this way is trivial. It's probably **less** work than you have been doing. This post is just about managing your Python versions, environments, and projects. Not about, e.g., using `pre-commit` #precommit, or doing type checking, etc. But if you follow these rules, your work will be easier, faster, more adaptable, and encounter fewer obstacles.

    #HowTo
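    The rules above amount to only a few commands per project. A minimal sketch, assuming `uv` and `direnv` are already installed at the user level (the project directory name is hypothetical):

    ```shell
    # One-time, global: refuse pip installs outside a virtual environment.
    # Add to ~/.bashrc, ~/.zshrc, or your shell's equivalent.
    export PIP_REQUIRE_VIRTUALENV=true

    # Per project: the virtual environment lives inside the project directory.
    cd my-project                # hypothetical project directory
    uv python pin 3.12           # record the interpreter version in .python-version
    uv venv                      # create .venv/ inside the project

    # Activate only while inside the directory, via direnv.
    echo 'source .venv/bin/activate' > .envrc
    direnv allow
    ```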

  10. @bfdi
    Oh no. Why can private individuals no longer order your great Pixi books? Well, that explains why I've been waiting in vain for delivery for three months. 😥

    #Pixi #BfDI #kostenlos

  13. My first article for Towards Data Science!

    How sharded repodata (CEP-16) makes conda and Pixi 10x faster with 90% less bandwidth on conda-forge.

    Thanks to Bas Zalmstra, @dholth, and the teams at @prefix, Anaconda, and Quansight.
    towardsdatascience.com/why-pac

    #PackageManagement #OpenSource #Python #Conda #Pixi #DataScience

  18. The more I learn about #Direnv, the happier I become. It **already** works with #Pixi. It **already** helps you with secrets (ignore `.envrc` in your `.gitignore` or equivalent). I grabbed a `layout_uv` from the direnv wiki (made some small modifications), and it works basically everywhere I want it.

    If you're not using `direnv` yet, you are doing yourself a disservice.
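    For the curious, a minimal `.envrc` sketch. The `layout_uv` function below is a hypothetical adaptation of the one on the direnv wiki, not a direnv built-in; `PATH_add` is direnv's stdlib:

    ```shell
    # .envrc -- loaded by direnv on entering the directory, unloaded on leaving.

    # Hypothetical layout function adapted from the direnv wiki:
    # create and activate a uv-managed .venv for this project.
    layout_uv() {
        if [[ ! -d .venv ]]; then
            uv venv
        fi
        # "Activating" just means exporting VIRTUAL_ENV and
        # putting the venv's bin directory first on PATH.
        export VIRTUAL_ENV="$PWD/.venv"
        PATH_add "$VIRTUAL_ENV/bin"
    }

    layout_uv

    # Secrets stay local: .envrc is listed in .gitignore,
    # so these never reach the repository.
    export MY_API_TOKEN="..."   # placeholder, not a real secret
    ```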

  22. Here’s your regular #CommandLine #PSA: #Starship helps you every time you hit return, in every shell. #Atuin makes history 10x more useful. #Direnv is becoming my friend, but I need to understand better how to use it, and it needs additions to work in more situations (e.g., #uv layout, #pixi layout, better path handling in #GitBashForWindows).

  27. RE: mastodon.social/@digiresacadem

    I attended the dry run of this course* recently and it was pretty mind-blowing - I'd seen before how #pixi can be used for Python packaging, but this course introduced me to a few new example use cases:

    - Reproducible data analysis in #R
    - A new application in #C
    - Image processing with #Python
    - #Jupyter Notebook for exploratory analysis

  32. Coming soon on the back cover of a Pixi book:
    Hello, your #Pixi is here today to show you what to watch out for when you give a text a headline:
    Always stick to the truth. That means you may use the headline to entice the reader, but only within the bounds of the facts!
    #PixiBuch #literaturfuerkinder #literaturfueralle #wahrheit #echteWahrheit #nichtsalsdiewahrheit #diewirklichechtewahrheit #journalismus #journalismusAlteSchule #teaser #bildzeitung #blödzeitung

  33. I am really enjoying the Pixi package manager, pixi.sh, made by @prefix. We have been using conda at my work to manage the dependencies of our Python application. It involves scientific data analysis, so there are lots of dependencies, and it has been a challenge to keep things up to date. Pixi has nice support for cleanly defining the direct dependencies in the `pixi.toml` file, and then it automatically generates a lock file. There is a command to upgrade all the dependencies too. It's amazing! I'm just starting to use it, but it has been helpful so far.

    #conda
    #packageManagement
    #pixi
    #dependencyManagement
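    The workflow described above, roughly sketched for anyone who wants to try it (assuming a recent `pixi`; project and package names are only examples):

    ```shell
    # Start a new pixi project; this creates pixi.toml and, once
    # dependencies are resolved, a pixi.lock lock file alongside it.
    pixi init my-analysis        # hypothetical project name
    cd my-analysis

    # Declare direct dependencies; pixi records them in pixi.toml
    # and resolves the full dependency tree into pixi.lock.
    pixi add python numpy pandas

    # Upgrade all direct dependencies to their latest compatible versions.
    pixi upgrade

    # Run commands inside the locked environment.
    pixi run python -c "import numpy; print(numpy.__version__)"
    ```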
