So, I have a Python script, residing inside a venv, that I’d like to run from time to time from the CLI (on Linux). What’s the recommended/intended way to do this?
Write a wrapper shell script, put it in a directory on $PATH, and have it activate the virtual environment, run the Python script, and deactivate the venv again? This seems a bit convoluted, but I can’t think of a better way.
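
Something like this sketch is what I have in mind (every name and path below is just a placeholder for wherever the venv and script actually live):

    #!/usr/bin/env bash
    # ~/.local/bin/run-myscript: wrapper placed somewhere on $PATH (placeholder name)
    set -euo pipefail
    source "$HOME/myproject/venv/bin/activate"   # activate the venv
    python "$HOME/myproject/script.py" "$@"      # run the script, forwarding any arguments
    deactivate                                   # drop back out of the venv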

  • Andy@programming.dev
    5 months ago

    I use my own Zsh project (zpy) to manage venvs stored like ~/.local/share/venvs/HASH-OF-PROJECT-PATH/venv, so I use zpy’s vpy function to launch a script with its associated Python executable ad hoc, or add a full-path shebang to the script with zpy’s vpyshebang function.
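
    The end result of the shebang route is just a first line like this in the script itself (the path below only illustrates the venv layout described above):

        #!/home/you/.local/share/venvs/HASH-OF-PROJECT-PATH/venv/bin/python
        # ^ first line of the script; everything after it is ordinary Python

    After a chmod +x on the script, running it directly uses that venv’s interpreter with no activation step.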

    vpy and vpyshebang in the docs

    If anyone else is a Zsh fan and has any questions, I’m more than happy to answer or demo.

    • Faulkmore@mastodon.social
      5 months ago

      @Andy The convention is to place the venv in a .venv/ subfolder. Follow the convention!
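
      In practice that just means something like this from the project root (path is illustrative):

          cd ~/myproject                # project root (placeholder)
          python -m venv .venv          # the venv lives in the conventional .venv/ subfolder
          .venv/bin/python script.py    # run the script without activating anything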

      This is shell agnostic

      Learn pyenv and minimize shell scripts (any shell should live only within a Makefile).

      Shell scripts within Python packages are deprecated

      • Andy@programming.dev
        5 months ago

        The convention

        That’s one convention. I don’t like it; I prefer to keep my venvs elsewhere. One reason is that it makes it simpler to maintain multiple venvs for a single project, using a different Python version for each, if I ever want to. It shouldn’t matter to anyone else, as it’s my environment, not some aspect of the shared repo. If I ever needed it there for some reason, I could always ln -s $VIRTUAL_ENV .venv.
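
        For instance (the paths and versions below are only an illustration of that kind of setup, not literally what zpy creates):

            # two venvs for the same project, kept outside the repo, one per interpreter
            python3.12 -m venv ~/.local/share/venvs/myproject-py312
            python3.13 -m venv ~/.local/share/venvs/myproject-py313

            # and if some tool insists on an in-repo .venv, a symlink covers it
            ln -s ~/.local/share/venvs/myproject-py312 ~/myproject/.venv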

        Learn pyenv

        I have used pyenv. It’s fine. These days I use mise instead, which I prefer. But neither of them dictates how I create and store venvs.

        Shell scripts within Python packages are deprecated

        I don’t see how what you’re referencing relates to my comment.

        • logging_strict@programming.dev
          5 months ago

          Multiple venvs for different Python versions sounds exactly like what tox does

          Then set up a GitHub Action that does nightly builds, which will catch issues caused by changes that were only tested against one Python version or on one platform

          py313 is a good version to test against because many modules were removed or deprecated, and APIs changed
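
          A minimal tox setup for that could look roughly like this (the env list, deps, and test command are all placeholders):

              # tox.ini (minimal sketch):
              #   [tox]
              #   env_list = py311, py312, py313
              #   [testenv]
              #   deps = pytest
              #   commands = pytest
              tox run            # builds one venv per listed Python and runs the tests in each
              tox run -e py313   # or just the 3.13 environment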

          Good luck. Hope some of my advice is helpful

          • Andy@programming.dev
            5 months ago

            Thanks, yes, I use nox and GitHub Actions for automated environments and testing in my own projects, and tox instead of nox when it’s someone else’s project. But for ad hoc, local, interactive use of multiple environments, I don’t.