home.social

Search

58 results for “YesJustWolf”

  1. @FritzAdalis @RuntimeArguments @jammcq @YesJustWolf

    Thanks. I did look this up after I wrote the post; I should have looked it up before. Still, without knowing that history, it sounded like the speaker was either confused about the two or conflating them. It wasn't obvious to me that the OpenBSD team *wrote* OpenSSH. That's the way I heard it; I might have misinterpreted what was said.

  2. @RuntimeArguments @jammcq @YesJustWolf

    You touched on the -R flag briefly. I've used it, but I don't recall using it for the purpose you mentioned.

    I need to check out using certificates.

    I didn't know about password managers being ssh key agents. Another thing to check out, as I use a few 😀

    I also didn't know about ssh-import-id-gh, which doesn't appear to be part of any package in the Fedora repos.

    A better episode than I expected given my long use of .

    2/2

  3. @RuntimeArguments @jammcq @YesJustWolf

    I've been a user since 1984, and spent my working life developing flavors of Unix and now . I listened to this episode over the past couple of days. I'm a long-time user of . One point of confusion, and a few points that I learned.

    When talking about the origins of you talked about but didn't explain how it related to OpenSSH. Was OpenBSD involved in the creation of OpenSSH? That could have used explanation.

    1/2

  4. I’ve fully switched to OmniFocus (The Omni Group) from Things (Cultured Code). The two driving factors were that OF can handle some more complex relationships than can Things; and better external access into OF’s data (in this case through an MCP).

    Both are great. Things is perfect for most people. If not for my new needs, I would probably still be there.

    #OmniFocus #Things3 #MCP #Productivity #TaskManager

  8. Yesterday was just a fantastic day for me. I planned to post on it. Just overjoyed. Strongly tempered today by the acquisition of Astral, and therefore the uncertain future of #uv and #ty (and transitively, #pixi).

    I’m gonna focus on yesterday.

  12. @kevin @Techmeme OMG! I’m pretty sure this is horrible and frightening news; though (counter example) GitHub is still doing okay. I’m a superfan of #uv and #ty. Please don’t make me switch!

  13. Today my work machine graduated from Windows 11 to Ubuntu 24.04. The hardware hasn’t changed. I would have preferred Kubuntu. They gave me ext4 instead of btrfs. But my wishes there are nice-to-haves. What they gave me is a huge step!

    #Linux #Windows #Ubuntu #Ext4 #Btrfs #Upgrade

  14. I love Bash. I used to write tons of Bash. There is a lot of Bash in my life, even to this day.

    But here's my life now:

    * Bash holds some stuff together (small stuff: usually setting variables, aliases, and/or piping together a few CLI tools. See github.com/wolf/dotfiles/tree/ for examples)

    * Zsh is good at doing stuff when I type, so that's my login shell

    * If I have to do something interesting, why not just a Python script? In modern times, with a `uv` shebang line and self-specified dependencies ... the only externally visible additional requirement is `uv` itself (you don't even need Python). Just like a shell-based answer: you end up with a single stand-alone file

    I'm not going to argue about "but you have to install `uv`". You do you.
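
    The shape of such a script, as a sketch (contents are illustrative; the `# /// script` block is PEP 723 inline metadata that `uv run` reads):

```python
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = []  # third-party packages would be listed here, e.g. ["requests"]
# ///
"""A stand-alone script: `uv run` reads the metadata block above and builds
an isolated environment on the fly -- no manual venv, no pip."""


def greet(name: str) -> str:
    # trivial placeholder body so the file does something visible
    return f"Hello, {name}!"


if __name__ == "__main__":
    print(greet("world"))
```

    Make it executable and run it directly, or use `uv run` on the file; with real dependencies listed, `uv` fetches them on first run.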

    #Bash #Zsh #Python #uv #Dotfiles

  15. My day job is all about #Python (which I love). Here are some personal rules, specific to working with Python projects:

    * Do **not** install or modify global tools, especially Python itself or any packages. This means a given system might not even **have** a global Python
    * Always use virtual environments (`uv` agrees with me, and doesn't even need this). I always set the global environment variable `PIP_REQUIRE_VIRTUALENV` anyway.
    * The two rules above mean my virtual environment contains Python itself, really there, not via a link (and of course, at the right version)
    * Virtual environments always live **inside** a project directory. Never global.
    * Activate virtual environments only **inside** the project directory (`direnv` #direnv makes this easy)
    * Don't install (let alone use) #Anaconda, #Miniconda, or #Mamba, because those violate all the rules above (but see the next rule)
    * Needing Anaconda-based packages implies a `pixi` #Pixi project (it's the same people, but a better answer, and you still get what you want -- the correct packages)
    * Needing no Anaconda-based packages implies a `uv` #UV project
    * Always use `pyproject.toml` #pyprojecttoml over any other config file (e.g., `requirements.txt` #requirementstxt), except where things just don't work, such as needing `pyrefly.toml`
    * `uv`, `pixi`, and `direnv` must exist outside of any project, so install them at the user level, or else globally if and only if that is appropriate and compelling enough to override rule one
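
    To make the `pyproject.toml` rule concrete, here is a minimal sketch for a `uv`-managed project (the name and dependencies are hypothetical; in practice `uv init` and `uv add` generate and maintain this file for you):

```toml
[project]
name = "example-project"          # hypothetical
version = "0.1.0"
requires-python = ">=3.12"
dependencies = ["requests"]       # runtime deps, added via `uv add requests`

[dependency-groups]
dev = ["pytest", "ty"]            # dev-only tools, added via `uv add --dev pytest ty`
```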

    That was a wall of text, but in practice doing it this way is trivial. It's probably **less** work than you have been doing. This post is just about managing your Python versions, environments, and projects. Not about, e.g., using `pre-commit` #precommit, or doing type checking, etc. But if you follow these rules, your work will be easier, faster, more adaptable, and encounter fewer obstacles.

    #HowTo

  20. My port of the Alabaster theme (family) for Helix was approved, merged to `main`, further modified as per requests, and modifications merged to `main`, too.

    The time from creating the PRs to having them merged was really short. Now how long it takes to get from `main` into an actual **release**, that I don’t know.

    My original repo (just the theme family) lives at github.com/wolf/alabaster-for-. The `README` there has screenshots and points to the work of Alabaster’s creator (Nikita Prokopov, “tonsky”), including the original themes, his repo, and an article that explains it all.

    I use this theme all the time and will continue to maintain it. I see further suggestions in Issues on my repo. I take suggestions, issues, and PRs seriously.

    #HelixEditor #OpenSource #Themes #Alabaster

  21. Published my port of the Alabaster theme family for Helix.

    Alabaster is a minimal syntax highlighting approach by Nikita Prokopov (tonsky) - only 4 semantic colors: strings, constants, comments, and definitions. Everything else stays plain text because code structure is already clear from formatting.

    I've ported all 6 variants (light/dark × standard/BG/mono) from the original Sublime theme (staying as close as possible to the original). Also submitted a PR to ship these with Helix upstream!

    Original theme: github.com/tonsky/sublime-sche
    Read tonsky's essay: tonsky.me/blog/syntax-highligh
    My port: github.com/wolf/alabaster-for-

    I tried to duplicate the original exactly; however, Helix has multiple selections, so I made the colors distinct between "selection" and "primary-selection".

    #HelixEditor #Helix #Alabaster #MinimalDesign #SyntaxHighlighting #TextEditor #Rust

  22. I use #Git. A feature of Git I leverage heavily is #Worktree. I usually have at least four around at a time. For small tasks, sure, a simple branch and then switch back, but bigger things: a worktree.

    Making a worktree is actually annoying for me: not just the upfront decisions about branches, start points, and where to put the new directory (and also immediately `cd`ing there), but getting all the #submodules (submodules suck, by the way), and hooking up `.envrc` if you use #Direnv (and you should be), which should then set up your virtual environment and path and such. Clone isn’t quite as bad, but has some of the same problems.

    I do this so often, I wrote a script. It might be useful to others with this workflow. It’s opinionated, and therefore I could really use some feedback! What did I do right? What did I do that’s only right for me? What is totally missing?

    The script is stand-alone, though you do need #UV. (You don’t even need Python! `uv` will transparently get you everything!) Just download this one Python file and get it on your `$PATH`. If you want the additional `cd` behavior, then add the shell function too, as described in the `README`. Everything is tested; the tests are right there, too.
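
    For reference, roughly the manual dance the script automates, sketched in a throwaway repo so it stands alone (paths and branch names are illustrative, and the `direnv`/venv hookup is elided):

```shell
# create a scratch repo so the sketch is self-contained
set -eu
tmp=$(mktemp -d)
git init -q -b main "$tmp/repo"
cd "$tmp/repo"
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "init"

# the annoying part: new worktree + new branch + cd + submodules
git worktree add ../repo-feature -b feature
cd ../repo-feature
git submodule update --init --recursive   # worktrees don't bring submodules along
echo "worktree ready on branch $(git branch --show-current)"
```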

    github.com/wolf/dotfiles/blob/

    The `README.md` is right next to it.

    I **do** see one thing I’m missing: I need to provide a way to automatically copy in your custom stuff. I’ll add that today.

  23. The more I learn about #Direnv, the happier I become. It **already** works with #Pixi. It **already** helps you with secrets (ignore `.envrc` in your `.gitignore` or equivalent). I grabbed a `layout_uv` from the direnv wiki (made some small modifications), and it works basically everywhere I want it.

    If you're not using `direnv` yet, you are doing yourself a disservice.
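
    For illustration, a minimal `.envrc` along those lines (this assumes you've copied a `layout_uv` function from the direnv wiki into your `~/.config/direnv/direnvrc`; the exported variable is a hypothetical secret):

```shell
# .envrc -- direnv runs this when you cd into the directory
layout uv                                      # activate the uv-managed venv (needs the wiki's layout_uv)
export EXAMPLE_API_TOKEN="not-a-real-token"    # hypothetical; keep .envrc out of git if it holds secrets
```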

    My brand-new #AirPods Pro 3’s were having charging troubles I couldn’t resolve on my own, so I made a #GeniusBar appointment at my local #AppleStore (Briarwood Mall, Ann Arbor). No trouble getting the time or day I wanted. No trouble on arrival.

    I was assigned a Genius named D'Air (spelling may be incorrect). I was actually insensitive (accidentally) about his name and I’m still kicking myself about that. He was deserving of respect and I let him down there, and also I value kindness in myself and I let us **both** down there (and if anyone should know better about names, it’s me). I apologized immediately (well, as soon as I understood), but I remain disappointed in myself.

    Especially since he was also incredibly helpful: solving not just one problem, but three! AirPods fixed (though they had to be sent out); critical notifications enabled for my continuous glucose monitor; and he helped me understand how I will deal with my iPhone battery in my 15 Pro Max as it ages.

    He was knowledgeable, competent, helpful, and kind, even when I fucked up. He solved all my problems. I hope he gets better treatment from others, and I’m sorry he didn’t get it from me.

  28. By the way: switching to #Fastmail, including making my email address use my own domain and importing all my old email from #HeyEmail #Hey was quick, easy, and in the end cost something like 40% less. Of course I don’t know yet how much I will enjoy using the interface; but everything I’ve done so far is promising.

  29. I really liked #HeyEmail #Hey. It fit me very well. But at a fundamental level #DHH’s values differ from mine; and in a way that pretty much means I have to switch. So, #Fastmail, here I come.

    Let's say you want to do good type-checking for the #Python project you're working on. You pick a tool; maybe you use it as an #LSP also (so your editor can show you errors, too). As an example, I'm using #Ty at the moment. There are three places it might be installed: globally (e.g., `brew install ty`), as a dev-only dependency inside your project (e.g., `uv add --dev ty`), or -- and this one might surprise you -- it might only be installed and used by `pre-commit`, which builds a separate environment for each needed tool (great, for instance, when I use `codespell` as a `pre-commit` check, since it seems to need a higher version of Python than my actual project).

    Where should you install it?

    If you're the only one on your team running it, globally is fine. If more than just you, then absolutely as a dev-only dependency inside your project ... and **maybe** globally as well.

    The only real problem is updates. If you use a reasonable global install scheme, updates will be easy. They're less easy inside your project or in `pre-commit`. And you might care one way or the other! I **don't** want updates! I **do** want updates!

    As for Python type-checking, `ty` seems good so far, but I don't have enough experience with it yet. `basedpyright`, `pyrefly`, and `ruff` are all good. These four are my favorites.
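
    A sketch of that third option -- letting `pre-commit` own the tool's environment, independent of your project's Python (the `rev` pin is illustrative; take the current tag from the project):

```yaml
# .pre-commit-config.yaml (fragment)
repos:
  - repo: https://github.com/codespell-project/codespell
    rev: v2.3.0                # illustrative pin
    hooks:
      - id: codespell          # runs in its own pre-commit-managed environment
```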

    #BasedPyright #Pyrefly #Ruff #PreCommit #CodeSpell #Homebrew