
The 'distro' model is broken

I have nothing but the utmost respect for the distribution maintainers slaving away packaging thousands of Linux tools and applications. Incredible projects like Debian, with decades of experience behind them, should be commended for achieving a usable operating system where everything is meant to fit together.

However, the way we go about distributing, shipping, and installing software on Linux is fundamentally broken.

How it's done on Windows

For contrast, consider Windows. On Windows, software is generally obtained straight from the developer's website, and its libraries and files are dumped into their own folder. This lets users grab the latest version of whatever software they need and have it run in isolation from the other programs on their system.

This is immensely beneficial when siloed-off proprietary software is the bread and butter of your ecosystem. Everyone is working on their own closed-off projects and can ship exactly the libraries they need to make them work.

You can immediately see why this appeals to organizations that rely on decade-old software to run their business, and why it suits older or unmaintained software like video games. Games by their nature stop receiving updates after a certain point, and they're generally not security-critical applications that need constant library refreshes.

When all of your users are running the program YOU shipped, you can establish a clear "chain of trust" between you, the developer, and them, the user. That's something security folks are big on these days.

How it's done on Linux

Linux, on the other hand, has oriented itself around the distribution packaging model. Open source developers publish software, often as Git repositories, and maintainers do the work of compiling it, potentially patching it for per-distro functionality, and hosting binary builds. Packages tend to share libraries rather than duplicating them. There are some advantages to this model: a shared library only has to be patched once to fix every package that uses it, users get all of their software from a single signed source, and maintainers can integrate programs consistently with the rest of the system.

Both systems are inadequate

The Windows system is not ideal and leads to its own problems and headaches. But, at the same time, the Linux model is arguably even less appealing. In my experience, most of the Linux software I end up needing is out of date in the default repositories, and I have to append a third-party repository to get what I need. "Stable" distributions like Debian, Ubuntu, and RHEL often ship incredibly old versions of the packages I rely on, or don't carry them at all. As a user, it's unpleasant to have to fiddle around with extra repositories just to get current software. Even rolling-release distros that try to stick close to upstream can lag slightly behind.
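To make that drift concrete, here's a minimal sketch, assuming a Debian-based system with Python 3 and network access, that compares whatever the distro repositories ship against the latest release tag upstream has published. The package and repository names are only illustrative; swap in whatever you actually care about.

    # Sketch: compare the distro-packaged version of a program against the
    # latest release its upstream developers have published on GitHub.
    # Assumes a Debian-based system (dpkg) and network access.
    import json
    import subprocess
    import urllib.request

    PACKAGE = "keepassxc"                  # example distro package name
    UPSTREAM = "keepassxreboot/keepassxc"  # example upstream GitHub repository

    # Version installed from the distro repositories, if any.
    result = subprocess.run(
        ["dpkg-query", "-W", "-f=${Version}", PACKAGE],
        capture_output=True, text=True,
    )
    distro_version = result.stdout.strip() or "not installed"

    # Latest release tag published by the upstream developers.
    url = f"https://api.github.com/repos/{UPSTREAM}/releases/latest"
    with urllib.request.urlopen(url) as response:
        upstream_version = json.load(response)["tag_name"]

    print(f"distro ships:      {distro_version}")
    print(f"upstream released: {upstream_version}")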

As a developer, there's no guarantee that the software I developed and shipped is the same as what sits in Debian's repositories. In fact, a Debian developer recently came under fire for gutting useful features out of the KeePassXC password manager.

Mastodon link

Whether or not that was a good decision, it was a drastic one that introduces behavior the application's developers clearly never intended: network functionality was gutted out of the program entirely. Now the KPXC developers have to worry about Debian users running versions of their software that exhibit weird error states.

Not only is KeePassXC security-critical by the very nature of being a password manager, it's also the kind of software that should stay close to what its developers intended. Windows and Mac users receive KPXC updates straight from the developers, with a clear developer-to-user chain and every security fix arriving directly. On Debian, there's an opinionated middleman who feels the need to make decisions on the users' behalf.

This is one of the primary challenges of developing anything for Linux -- you have to implicitly support all the well-meaning users who are being shipped outdated or broken versions of the software you worked on. This doesn't just apply to desktop systems; servers deal with it as well.

Another issue is that you end up implicitly trusting every package your distro ships on the strength of a handful of GPG keys. When malware does find its way into a Linux repository, people are generally not inclined to dig into it; after all, no individual library or package carries its own verification that it matches what upstream actually released.
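For a sense of how short that chain is, here's a tiny sketch, assuming a Debian-based system, that lists the places apt typically draws repository signing keys from; every package on the machine was let in because its repository's metadata checked out against one of these files.

    # Sketch: enumerate the keyring locations an apt-based system draws
    # repository signing keys from. Paths assume Debian/Ubuntu conventions.
    from pathlib import Path

    KEYRING_DIRS = [
        Path("/etc/apt/trusted.gpg.d"),  # globally trusted repository keys
        Path("/etc/apt/keyrings"),       # keys referenced via signed-by in sources lists
        Path("/usr/share/keyrings"),     # keyrings shipped by keyring packages
    ]

    for directory in KEYRING_DIRS:
        if directory.is_dir():
            for keyring in sorted(directory.iterdir()):
                print(keyring)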

We need a different model. Maybe core system components can be packaged by maintainers, but LibreOffice, Firefox, GIMP, and the like SHOULD NOT be left to the whims of distro maintainers.

Solutions?

I have mentioned Flatpak, Snap, and AppImage as ways for desktop users to obtain software straight from the source. All of them have their flaws, but they're a start at rethinking the situation. The benefit is that they all encourage a straight-from-upstream model and (in my opinion) correctly put the burden of packaging on the application developers.

But what about on servers?

The industry standard is pretty much to push the problem down into some kind of container or VM. "Dockerize everything" is the current philosophy: slam distro-packaged software into an image-based container. Ultimately the problem remains, but at least everyone is experiencing the same kind of "works on my machine". For that reason, I feel containerization has somewhat resolved the "straight from upstream" concept on servers. However, not everything can be comfortably containerized. As an example, what about the container software itself? How are you getting fresh updates for that?

/foss/ /linux/