#pixi — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #pixi, aggregated by home.social.
-
My day job is all about #Python (which I love). Here are some personal rules, specific to working with Python projects:
* Do **not** install or modify global tools, especially Python itself or any packages. This means a given system might not even **have** a global Python
* Always use virtual environments (`uv` agrees with me and doesn't strictly need this, but I always set the global environment variable `PIP_REQUIRE_VIRTUALENV` anyway)
* The two rules above mean my virtual environment contains Python itself (a real copy, not a link -- and of course the right version)
* Virtual environments always live **inside** a project directory. Never global.
* Activate virtual environments only **inside** the project directory (`direnv` #direnv makes this easy)
* Don't install (let alone use) #Anaconda, #Miniconda, or #Mamba, because those violate all the rules above (but see the next rule)
* Needing Anaconda-based packages implies a `pixi` #Pixi project (it's the same people, but a better answer, and you still get what you want -- the correct packages)
* Not needing Anaconda-based packages implies a `uv` #UV project
* Always use `pyproject.toml` #pyprojecttoml over any other config file (e.g., `requirements.txt` #requirementstxt), except where things just don't work, such as needing `pyrefly.toml`
* `uv`, `pixi`, and `direnv` must exist outside of any project, so install them at the user level, or else globally if and only if that is appropriate and compelling enough to override rule one

That was a wall of text, but in practice doing it this way is trivial. It's probably **less** work than you have been doing. This post is just about managing your Python versions, environments, and projects -- not about, e.g., using `pre-commit` #precommit, doing type checking, etc. But if you follow these rules, your work will be easier, faster, and more adaptable, and you'll encounter fewer obstacles.
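The rules above can be made concrete with a small setup sketch. This is hypothetical (the project name `myproj` and Python version are placeholders) and assumes `uv` and `direnv` are already installed at the user level; `PIP_REQUIRE_VIRTUALENV` is a real pip environment variable:

```shell
# In your shell profile (user level, outside any project):
export PIP_REQUIRE_VIRTUALENV=true   # pip refuses to install outside a venv

# Inside a new project (hypothetical name "myproj"):
uv init myproj && cd myproj          # uv creates pyproject.toml for you
uv venv --python 3.12                # .venv/ lives inside the project,
                                     # with a real Python of the right version in it

# An .envrc so direnv activates the venv only inside this directory:
echo 'source .venv/bin/activate' > .envrc
direnv allow
```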
-
@bfdi
Oh no. Why can private individuals no longer order your great Pixi books? Well, that probably explains why I've been waiting in vain for delivery for three months. 😥 -
My first article for Towards Data Science!
How sharded repodata (CEP-16) makes conda and Pixi 10x faster with 90% less bandwidth on conda-forge.
Thanks to Bas Zalmstra, @dholth, and the teams at @prefix, Anaconda, and Quansight.
https://towardsdatascience.com/why-package-installs-are-slow-and-how-to-fix-it/
#PackageManagement #OpenSource #Python #Conda #Pixi #DataScience
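The core idea behind sharded repodata can be sketched in a toy model. This is a simplification of the CEP-16 concept, not its actual on-disk format -- the hashing scheme and metadata here are illustrative only: instead of downloading one monolithic repodata file, the solver fetches only the small per-package shards it actually needs.

```python
import hashlib

def shard_key(package_name: str) -> str:
    # Hypothetical: a shard addressed by a hash of the package name
    return hashlib.sha1(package_name.encode()).hexdigest()

# Pretend monolithic repodata: metadata for every package on the channel
MONOLITHIC = {
    "numpy": {"depends": ["python"]},
    "scipy": {"depends": ["numpy"]},
    "pandas": {"depends": ["numpy", "python"]},
}

# Sharded form: one small, independently fetchable entry per package
SHARDS = {shard_key(name): meta for name, meta in MONOLITHIC.items()}

def fetch_metadata(needed):
    # Download only the shards for the packages we actually need,
    # instead of the whole MONOLITHIC blob
    return {name: SHARDS[shard_key(name)] for name in needed}

print(fetch_metadata(["scipy"]))  # one shard fetched, not three
```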
-
The more I learn about #Direnv, the happier I become. It **already** works with #Pixi. It **already** helps you with secrets (add `.envrc` to your `.gitignore` or equivalent). I grabbed a `layout_uv` from the direnv wiki (made some small modifications), and it works basically everywhere I want it.
If you're not using `direnv` yet, you are doing yourself a disservice.
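A minimal `layout_uv` along these lines might look like the following. This is a sketch, not the exact wiki code; it assumes `uv` is on your PATH, and `PATH_add` / `dotenv_if_exists` are direnv's own stdlib helpers:

```shell
# ~/.config/direnv/direnvrc -- sketch of a layout_uv helper
layout_uv() {
    if [[ ! -d .venv ]]; then
        uv venv            # create the project-local virtual environment
    fi
    # Prepend .venv/bin to PATH and export VIRTUAL_ENV, i.e.
    # "activate" the venv whenever you are inside this directory
    VIRTUAL_ENV="$PWD/.venv"
    PATH_add "$VIRTUAL_ENV/bin"
    export VIRTUAL_ENV
}

# A project's .envrc then reduces to:
#   layout uv
#   dotenv_if_exists .env   # secrets stay in .env, out of git
```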
-
Here’s your regular #CommandLine #PSA: #Starship helps you every time you hit return, in every shell. #Atuin makes history 10x more useful. #Direnv is becoming my friend, but I need to understand better how to use it, and it needs additions to work in more situations (e.g., #uv layout, #pixi layout, better path handling in #GitBashForWindows).
-
RE: https://mastodon.social/@digiresacademy/115702812560956273
I attended the dry-run of this course* recently and it was pretty mind-blowing - I'd seen before how #pixi can be used for Python packaging, but this course introduced me to a few new example use cases:
- Reproducible data analysis in #R
- A new application in #C
- Image processing with #Python
- #Jupyter Notebook for exploratory analysis -
Coming soon on the back of the Pixi book:
Hello, your #Pixi shows you today what to watch out for when you give a #Text a headline:
Always stick to the truth. That means you may use the headline to lure the reader in, but only within the bounds of the facts!
#PixiBuch #literaturfuerkinder #literaturfueralle #wahrheit #echteWahrheit #nichtsalsdiewahrheit #diewirklichechtewahrheit #journalismus #journalismusAlteSchule #teaser #bildzeitung #blödzeitung -
I am really enjoying the Pixi package manager, https://pixi.sh , made by @prefix. We have been using conda at my work for managing the dependencies of our Python application. It involves scientific data analysis, so there are lots of dependencies, and it has been a challenge to keep things up to date. Pixi has nice support for cleanly defining the direct dependencies in the pixi.toml file, and it automatically generates a lock file. There is a command to upgrade all the dependencies too. It's amazing! I'm just starting to use it, but it is helpful so far.
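The workflow described above might look roughly like this (a sketch: the project and package names are placeholders, and the generated `pixi.toml` will differ in detail):

```shell
pixi init myproject && cd myproject   # creates pixi.toml
pixi add numpy scipy                  # records direct deps in pixi.toml,
                                      # resolves and writes pixi.lock
pixi upgrade                          # the "upgrade all dependencies" command
```

Only the direct dependencies you name land in `pixi.toml`; the full resolved graph is pinned in `pixi.lock`, which is what makes the environment reproducible across machines.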