Looking for ways to install #Python programs on #Debian 11 and 12 systems, globally (as would normally go into /usr/local/bin/ for executables, elsewhere in /usr/local/ for additional files), to run periodically or as daemons (later setting up systemd services for those). easy_install is deprecated, pip points at the externally managed environment and wants a virtual environment, and pipx also installs into a venv in the user's home directory. Executable scripts are then available to other users, but they sit in a user directory, which is unconventional and contrary to the FHS. Packaging into a .deb must be possible, apparently using dh-python, though so far I have not found a complete guide to follow; possibly I will have to dig deeper into it. Single-file scripts could simply be copied into /usr/local/bin/, but there can be multi-file programs as well. I wonder whether I am missing something: this looks surprisingly tricky for such a task, with a very popular language and a popular Linux distribution. How do you package and install custom Python programs on Debian?
@defanor
> How do you package and install custom Python programs on Debian?
With difficulty.
Like pretty much every language-specific packaging system, each of the #Python packaging systems assumes it is the centre of the universe, and makes only grudging accommodation to the operating system's package manager.
In such package systems, yes, the assumption is that you're installing into an isolated environment that ignores the operating system's installed libraries and standard file locations.
This is, in my opinion, partly caused by MS Windows's utter hostility to the concept of a standard operating system package manager with a standard API for fetch + install + configure.
Decades of that have taught developers that to distribute a program is to bundle the entire thing and install it in a way that insulates it from the operating system, rather than working with it.
And those conventions carry through to language-specific package systems: The operating system is chaos, ignore it.
@defanor
> Packaging [a #Python application for #Debian] must be possible, apparently using dh-python, though so far I have not found a complete guide to follow, possibly will have to dig deeper into it.
You're right. This is partly because the landscape of Python packaging has never settled, and partly because that system makes it so very difficult to install system-wide while working with the operating system's existing installed libraries.
Also partly because volunteers lack time.
The best I think we can offer today @defanor is to look at existing #Python applications already packaged in #Debian.
But you'll find there a bunch of hacks to work around the misfit of the tool to the job: the tooling insists on a virtualenv and assumes you're packaging a library (which doesn't work for an executable application), and the corollary is a lack of support in the packaging tools for installing an application (as opposed to a library).
Here are some example #Python applications I have packaged in #Debian that you can use for inspiration or guidance.
But, again, be aware this is not necessarily *good* packaging: these all have hacks that try to work around the inadequacies of the Python packaging system's support for installing an application.
'dput'
'python-adventure'
'python-coverage'
'towncrier'
'xkcdpass'
When I say that volunteers lack time, I'm describing two camps:
The #Debian volunteers who work to support Python lack the time to keep up with the ever-shifting "standard way" that is different every few years. Tools get outdated; interfaces never get properly supported.
The #Python stewards wisely don't want to officially crown any one system until it's stable and comprehensively solves the problem. They don't do the work themselves, so they defer to the community, which hasn't solved it.
@bignose Thanks! I built a two-module test package now, based on xkcdpass's packaging. The modules are in /usr/lib/python3/dist-packages/, the executable file in /usr/bin/, and systemd service files can be added easily into it. Despite the issues, at least there is a path to a nicely installed program.
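In case it helps anyone else following the thread, the core of such a dh-python setup is small. All names below are hypothetical, and this assumes a setuptools-style project that dh-python's pybuild backend can detect. A minimal debian/rules:

```
#!/usr/bin/make -f
# Build with debhelper's pybuild backend via dh-python.
# (The recipe line must be indented with a tab.)
export PYBUILD_NAME = mytool

%:
	dh $@ --with python3 --buildsystem=pybuild
```

and a debian/control along these lines:

```
Source: mytool
Section: utils
Priority: optional
Maintainer: Example Maintainer <me@example.org>
Build-Depends: debhelper-compat (= 13), dh-python, python3-all, python3-setuptools
Standards-Version: 4.6.2

Package: mytool
Architecture: all
Depends: ${python3:Depends}, ${misc:Depends}
Description: example multi-module Python tool
 A longer description of the hypothetical tool would go here.
```

With a debian/changelog alongside, `dpkg-buildpackage` then produces the layout described above: modules in /usr/lib/python3/dist-packages/, the entry point in /usr/bin/.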
I'm glad you got a package working!
@defanor
> The modules are in /usr/lib/python3/dist-packages/
Yeah, that's a huge point of pain. For many applications the internal modules should *not* be in the general #Python import path, polluting the import namespace. These are intended only for the application.
There was a way to specify "install Python modules here, data files there, documentation hither", etc., at install time. That option is gone now, in an explicit snub to application packagers.
@defanor Don't you package and install Python scripts exactly the same as you package and install every other kind of script? I don't see any reason why there should be a difference.
@wbpeckham It would be easy for single-file scripts, if you only install executable files (so they can be simply copied into /usr/bin/), but if you have a modular Python program, the additional modules should go into /usr/lib/python3/dist-packages/<package-name>/. Unsure if there is more to it, so I would rather rely on packaging tools to follow the proper process, not to simply set those paths manually (although that seems doable, too).
@defanor For that the package requirements might serve, but I do not use Python myself. Generally, if libraries are installed, the library location and path are taken care of by the installer. Since I have not tested that, I cannot verify it for Python scripts.
@defanor dpkg is and always was the right tool. The problem is that the dh_python documentation varies between obsolete and nonexistent. But I've recently learned that one does not need to go full-blown dh_python: just creating a directory structure and a control file, then running dpkg with the proper arguments, is good enough for simple use cases.
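A minimal sketch of that hand-rolled approach (package name, paths, and maintainer are hypothetical; the final build step only runs where `dpkg-deb` is available):

```shell
# Build a staging tree mirroring the final filesystem layout.
BUILD="$(mktemp -d)"
PKG="$BUILD/mytool-deb"
mkdir -p "$PKG/DEBIAN" "$PKG/usr/bin" \
         "$PKG/usr/lib/python3/dist-packages/mytool"

# Minimal control file; dpkg needs at least these fields.
cat > "$PKG/DEBIAN/control" <<'EOF'
Package: mytool
Version: 0.1
Architecture: all
Maintainer: Example Maintainer <me@example.org>
Depends: python3
Description: Hand-built example package for a small Python tool
EOF

# Entry point in /usr/bin, modules under dist-packages.
printf '#!/usr/bin/python3\nimport mytool\n' > "$PKG/usr/bin/mytool"
chmod 0755 "$PKG/usr/bin/mytool"
touch "$PKG/usr/lib/python3/dist-packages/mytool/__init__.py"

# Build the .deb (skipped gracefully if dpkg-deb is unavailable here).
if command -v dpkg-deb >/dev/null; then
    dpkg-deb --build --root-owner-group "$PKG" "$BUILD/mytool_0.1_all.deb"
fi
```

No maintainer scripts, no dependency resolution beyond the Depends line — but for simple use cases the result installs cleanly with `apt install ./mytool_0.1_all.deb`.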
@radon Indeed, that can work: I usually do that for Haskell (after `cabal install --prefix=/usr --install-method=copy --installdir=deb/usr/bin/`) to simply install a binary with dependencies pulled from Hackage (though I would like to stick just to those from Debian, but no luck there so far). Those tend to be single binaries, though, while for Python you may also have to install some files into /usr/lib/python3/dist-packages/, and possibly something I am not aware of, so it feels more comfortable to use the more proper process. Fortunately that is also doable, and worked now, as mentioned in a sibling toot.
@defanor My preference is to create a virtualenv under /opt and install the Python package(s) inside that. This follows the FHS and plays nicely with systemd's security options. And since systemd units require an absolute executable path, saying /opt/yourvenv/bin/tool is no big loss compared to /usr/local/bin/tool.
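As a sketch of what that pairing looks like (unit and paths are hypothetical examples), a service unit pointing into such a venv, with some of the hardening options that work well when everything lives under one /opt prefix:

```
[Unit]
Description=mytool daemon (hypothetical example)
After=network.target

[Service]
# Absolute path into the venv; no $PATH involvement.
ExecStart=/opt/mytool/bin/mytool
# Hardening that a self-contained /opt prefix tolerates well:
DynamicUser=yes
ProtectSystem=strict
ProtectHome=yes

[Install]
WantedBy=multi-user.target
```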
@CyrikCroc Thanks, /opt/ does look more appropriate than someone's $HOME, indeed, though not quite as conventional and unsurprising as /usr/bin/ and /usr/lib/. Fortunately (as mentioned in a nearby toot) I managed to build a Debian package with dh-python now, but I am going to aim for /opt/ in case there are issues with that in the future.
@defanor pipx with global options (sudo PIPX_HOME=/opt/pipx PIPX_BIN_DIR=/usr/local/bin pipx install cowsay) would be easiest.
(I'd swap out /opt for /usr/local/lib.)
But yea, putting in the extra work to package a deb is nice if you're going to install on multiple systems. Like others suggested: apt-get source some other python package and adapt as needed.
@sedje Thanks, tried it out as well now. Fortunately packaging into a deb worked, but going to use this as a backup option.
@sedje @defanor Yeah this is what I would do too.
Or, honestly, for things that are supposed to run as services rather than being available for users to execute, I'd probably create venvs (separate for each program) under /opt because I'm pretty familiar how to manage venvs, plus that way the programs don't get added to $PATH. But if you really don't want to do that work and you want something to manage venvs for you, pipx with these global options seems like a good way to go.
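For reference, the per-service layout amounts to very little (tool name is hypothetical; a temporary directory stands in for /opt here so the sketch runs without root):

```shell
# Stand-in for /opt/mytool; on a real system: sudo python3 -m venv /opt/mytool
DEST="$(mktemp -d)"
python3 -m venv "$DEST"

# Installing the application would go through the venv's own pip, e.g.:
#   "$DEST/bin/pip" install /path/to/mytool
# Nothing lands on $PATH; a service unit would use the absolute path:
#   ExecStart=/opt/mytool/bin/mytool
ls "$DEST/bin"
```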
@defanor what I ended up doing is creating a directory in `/opt`, install as many deps as possible using the OS' packages (for better security coverage), create a venv there that uses the system packages, and run stuff from there.
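The flag that enables that combination is `--system-site-packages` (directory name is hypothetical; a temp dir stands in for /opt so this runs unprivileged):

```shell
DEST="$(mktemp -d)"   # stand-in for e.g. /opt/myapp/venv
python3 -m venv --system-site-packages "$DEST"
# The venv records the setting: apt-installed modules (python3-requests
# etc.) stay importable inside the venv, while anything pip-installed
# into the venv itself shadows them.
grep include-system-site-packages "$DEST/pyvenv.cfg"
```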
@mdione Combining those with system packages sounds like a potentially nice compromise; the lack of centralized updates is one of my concerns about use of venvs, too. Going to try packaging into deb (which worked for a test package now, as mentioned in nearby replies), but this combination of /opt/ with system packages looks like an okay backup option. Thanks.
@defanor @mdione It's worth noting that if you rely on the system package manager to install your Python dependencies, you might be getting pretty out-of-date versions for some of them. Depends on the distro and the package of course, but I've seen Ubuntu packages months out of date in apt, even on the latest Ubuntu release. So you gotta consider the tradeoff between convenience and security, as always.