More flatpak grumbling... so even after I went through the process of installing some utterly massive base packages to get just one or two packages to run, there happen to be updates to those base packages (as expected!). This means I'm looking at downloading up to 2.9 GB just to get system updates. There _has_ to be a better way to do this.
Again - I want things like Flatpak and Snap to succeed! But I have to wonder if developers realize that a broad majority of us consumers may sometimes (or semi-permanently) be on slow (< 10 Mbps) internet connections.
I can deal with running updates overnight every once in a while when there's a massive batch of them, but the fact that they're now duplicated (or triplicated) by flatpak / snap is just frustrating.
@urusan Yeah it's basically how flatpak / flathub work.
I think you're onto something, but I expect that to be part of the tooling (not some hack I have to do on my own).
I'm just shooting from the hip here, but we HAVE things that do binary deltas and catchup: zsync is a thing. Are we just getting lazy with all of this containerization nowadays? Why not put a bit of thought into how much effort it is for clients to update containers / layers?
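To make the zsync point concrete: the core idea is block matching, where the client hashes fixed-size blocks of the file it already has and only downloads the blocks that changed. Here's a toy sketch of that idea in Python (real zsync/rsync additionally use a rolling checksum so matches can land at any byte offset, and use much larger blocks; `BLOCK = 4` and all the names here are just for the demo):

```python
import hashlib

BLOCK = 4  # tiny block size for the demo; real tools use KiB-sized blocks


def block_hashes(data: bytes) -> dict:
    """Map the hash of each fixed-size block to its offset in the old file."""
    return {
        hashlib.sha1(data[i:i + BLOCK]).digest(): i
        for i in range(0, len(data), BLOCK)
    }


def make_delta(old: bytes, new: bytes) -> list:
    """Encode `new` as ops: ('copy', offset) reuses a block the client
    already has on disk; ('data', bytes) is what must be downloaded."""
    known = block_hashes(old)
    ops, i = [], 0
    while i < len(new):
        block = new[i:i + BLOCK]
        h = hashlib.sha1(block).digest()
        if len(block) == BLOCK and h in known:
            ops.append(('copy', known[h]))
        else:
            ops.append(('data', block))
        i += BLOCK
    return ops


def apply_delta(old: bytes, ops: list) -> bytes:
    """Reconstruct the new file from local blocks plus downloaded data."""
    out = bytearray()
    for kind, arg in ops:
        out += old[arg:arg + BLOCK] if kind == 'copy' else arg
    return bytes(out)


old = b"aaaabbbbccccdddd"
new = b"aaaabbbbXXXXdddd"
delta = make_delta(old, new)
downloaded = sum(len(arg) for kind, arg in delta if kind == 'data')
print(downloaded)  # only 4 of 16 bytes need to come over the wire
assert apply_delta(old, delta) == new
```

Scaled up, that's exactly why a 2.9 GB runtime update could shrink to a fraction of itself: most blocks of a rebuilt runtime are identical to the previous version.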
@funnylookinhat Yeah, it's basically just lazy not to have this built in. It's probably because the developers all have fast Internet connections, so the problem never came up for them.
Container layers help somewhat, but they're much less effective than real binary deltas, and other organizational concerns usually dominate the decision about where to split the layers anyway.