home.social

#direnv — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #direnv, aggregated by home.social.

  1. Set up my first #nixflake for use with #direnv (used from #nushell) -- and once it all clicks, it's a thing of beauty.

  3. So tonight I found out about direnv. How did I not know about this handy tool before? So, so unbelievably useful

  4. My day job is all about #Python (which I love). Here are some personal rules, specific to working with Python projects:

    * Do **not** install or modify global tools, especially Python itself or any packages. This means a given system might not even **have** a global Python
    * Always use virtual environments. (`uv` agrees with me and doesn't need this, but I always set the global environment variable `PIP_REQUIRE_VIRTUALENV` anyway.)
    * The two rules above mean my virtual environment really contains Python itself (not via a link), and of course the right version
    * Virtual environments always live **inside** a project directory. Never global.
    * Activate virtual environments only **inside** the project directory (`direnv` #direnv makes this easy)
    * Don't install (let alone use) #Anaconda, #Miniconda, or #Mamba, because those violate all the rules above (but see the next rule)
    * Needing Anaconda-based packages implies a `pixi` #Pixi project (it's the same people, but a better answer, and you still get what you want -- the correct packages)
    * Needing no Anaconda-based packages implies a `uv` #UV project
    * Always use `pyproject.toml` #pyprojecttoml over any other config file (e.g., `requirements.txt` #requirementstxt), except where things just don't work, such as needing `pyrefly.toml`
    * `uv`, `pixi`, and `direnv` must exist outside of any project, so install them at the user level, or else globally if and only if that is appropriate and compelling enough to override rule one

    That was a wall of text, but in practice doing it this way is trivial. It's probably **less** work than you have been doing. This post is just about managing your Python versions, environments, and projects. Not about, e.g., using `pre-commit` #precommit, or doing type checking, etc. But if you follow these rules, your work will be easier, faster, more adaptable, and encounter fewer obstacles.

    #HowTo
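
    The rules above boil down to very little actual setup. Here is a minimal sketch of rules two and five: the exported variable is real pip behavior, and the `.envrc` line uses direnv's built-in `layout python` (the `layout_uv` function mentioned elsewhere in this feed is a community extra from the direnv wiki, not shown here).

    ```shell
    # Enforce rule two globally, e.g. from ~/.profile:
    # pip now refuses to install outside a virtual environment.
    export PIP_REQUIRE_VIRTUALENV=true

    # Rule five: a per-project .envrc for direnv needs only one line,
    # such as the built-in layout that creates/activates a local venv:
    #   layout python
    ```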

  9. I use #Git. A feature of Git I leverage heavily is #Worktree. I usually have at least four around at a time. For small tasks, sure, a simple branch and then switch back, but bigger things: a worktree.

    Making a worktree is actually annoying for me: not just the upfront decisions about branches, start points, and where to put the new directory (and immediately `cd`ing there), but also getting all the #submodules (submodules suck, by the way) and hooking up `.envrc` if you use #Direnv (and you should), which should then set up your virtual environment, path, and so on. Clone isn’t quite as bad, but has some of the same problems.

    I do this so often, I wrote a script. It might be useful to others with this workflow. It’s opinionated, and therefore I could really use some feedback! What did I do right? What did I do that’s only right for me? What is totally missing?

    The script is stand-alone, though you do need #UV. (You don’t even need Python! `uv` will transparently get you everything!) Just download this one Python file and get it on your `$PATH`. If you want the additional `cd` behavior, add the shell function too, as described in the `README`. Everything is tested. The tests are right there, too.

    github.com/wolf/dotfiles/blob/

    The `README.md` is right next to it.

    I **do** see one thing I’m missing: I need to provide a way to automatically copy in your custom stuff. I’ll add that today.
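
    For context, here is a hedged sketch of the manual steps such a script automates, run against a throwaway repo; the repo, branch, and `.envrc` contents are all placeholders, not the script's actual behavior.

    ```shell
    set -e
    # Scratch repo so the sketch is self-contained.
    tmp=$(mktemp -d); cd "$tmp"
    git init -q main && cd main
    git -c user.name=demo -c user.email=demo@example.com commit --allow-empty -q -m init
    # New worktree on a new branch, in a sibling directory:
    git worktree add -q ../feature -b feature
    cd ../feature
    # Pull in any submodules the repo declares (no-op here):
    git submodule update --init --recursive
    # Wire up direnv; copying in a real .envrc template is left out:
    printf 'export IN_WORKTREE=1\n' > .envrc
    # direnv allow .   # uncomment where direnv is installed
    ```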

  10. @csepp #direnv is like table salt: sprinkle a little on anything to enhance the taste. It is especially good with #guix.

  11. #Direnv + #Nix is a really amazing combo. :flan_awe:

  12. The more I learn about #Direnv, the happier I become. It **already** works with #Pixi. It **already** helps you with secrets (ignore `.envrc` in your `.gitignore` or equivalent). I grabbed a `layout_uv` from the direnv wiki (made some small modifications), and it works basically everywhere I want it.

    If you're not using `direnv` yet, you are doing yourself a disservice.
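
    The secrets pattern described above can be sketched in a couple of lines; the variable name is a placeholder and the sketch runs in a scratch directory.

    ```shell
    set -e
    proj=$(mktemp -d); cd "$proj"
    # Keep .envrc out of version control so secrets set there stay local:
    printf '.envrc\n' >> .gitignore
    # A minimal .envrc exporting a secret for this directory only:
    printf 'export MY_API_TOKEN=placeholder\n' > .envrc
    # direnv allow .   # uncomment where direnv is installed
    ```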

  16. Here’s your regular #CommandLine #PSA: #Starship helps you every time you hit return, in every shell. #Atuin makes history 10x more useful. #Direnv is becoming my friend, but I need to understand better how to use it, and it needs additions to work in more situations (e.g., #uv layout, #pixi layout, better path handling in #GitBashForWindows).

  21. I’ve made progress with #Direnv in #GitBashForWindows. It’s automatically setting environment variables and activating virtual environments in #uv projects. Not yet working with #Pixi. Not yet getting my `$PYTHONPATH` right in these uv projects.

  25. #Direnv is such a disappointment to me. I want it to work in #GitBashForWindows. I want it to automatically activate my chosen #Pixi virtual environment (and set up PYTHONPATH). I’ve put over two hours into it and it still isn’t functioning. I assume this all works fine everywhere that is **not** Windows. But it seems like a great idea!

  29. RE: hachyderm.io/@miketheman/11561

    Use Trusted Publishing instead of long-lived PyPI tokens. For other things, here's how to use 1Password with direnv to set secrets in env vars.
    hugovk.dev/blog/2025/secrets-i
    #security #1Password #direnv #cli #PyPI

  30. A minimal, declarative setup for productive Rust 🦀 hacking on Emacs + Guix

    jointhefreeworld.org/blog/arti

    I noticed there was a blatant lack of resources and documentation on this particular setup.

    With a tiny manifest and a small Emacs configuration, you get a powerful, reproducible, elegant Rust development environment.

    #rust #guix #emacs #dev #manifest #shell #development #environment #ide #clippy #lsp #gnu #reproducible #direnv #eglot

  31. @calum

    Recently I added direnv to my terminal. Now when I cd to a :python: repo its .venv is automatically sourced. Including picked up by :neovim: when running it there. All that's needed is an `.envrc` with "layout uv" in the repo root.

    And, as seen in the screenshot, is also picking it up.

    Adding this kind of cleverness on system level, instead of a GUI monolith, takes using and developing to completely new levels.

  32. @gaborudvari "automatically setup an environment with direnv and a manifest.scm file" - I should use this more, it's such a great productivity trick. Then you can just change into the right directory and have the whole set-up automated. Nice!

    I'm not sure if we have a full worked example of using `guix.scm` and `manifest.scm`. There is some coverage of using #direnv

    guix.gnu.org/cookbook/en/guix-

  33. Ok, #direnv to the rescue. Add this to your .envrc to fix the #conda environment on your terminal and on #vscode.

    ```
    conda activate whatever-environment

    mkdir -p .vscode
    cat > .vscode/settings.json << EOF
    {
      "python.defaultInterpreterPath": "$(which python)",
      "python.terminal.activateEnvironment": false
    }
    EOF
    ```

    #python

  34. quick reminder to myself on how to run a uv project with systemd via direnv

    ```
    [Unit]
    Description=Python Bot

    [Service]
    ExecStart=direnv exec . uv run main.py
    WorkingDirectory=/home/user/project
    Restart=always

    [Install]
    WantedBy=default.target
    ```