R. L. Dane :debian: :openbsd:

Dear Programming Language dev teams,

If your language has a package manager,

it NEEDS to be as fully-featured and easy-to-use as apt.

No more "use the website to search for packages"
No more "it installs, but not upgrades or uninstalls"
No more "Ok, it installs, but it isn't necessarily fully functional"
No more "It installs, but you need arcane options for the upgrade to function."

MAKE IT AS EASY AS APT,
Or don't release a package manager.

</rant>, a.k.a. Fin.

Merci.

@RL_Dane Also stop letting every dipshit with an email address upload a package. You're safer piping curl to bash than installing from npm or pypi these days.

@timjclevenger

Oh man, I steer clear of npm XD

@timjclevenger @RL_Dane how else could that possibly work though? You have a committee that approves packages? There's lots of software I want to install that isn't in my distribution.

@immibis @timjclevenger

I'm guessing you'd do it like native packages: require a package maintainer to answer emails, run tests, and such.

@RL_Dane @timjclevenger if the developer nominates themselves as maintainer, isn't it the same as what we have now?

@immibis @timjclevenger

I'm not super familiar with all the work a package maintainer has to do, but I suppose they'd be more responsible for it and for things like resolving conflicts.

But yeah, I don't quite know how you'd do QA on packages, except maybe to have some kind of election or rating system.

@RL_Dane @timjclevenger The package maintainer's role is to do whatever is needed to bridge the gap between the package and the packaging system. So in Debian they download the source code and write the scripts to make Deb files (I don't think they have to build them). In Gentoo they do something similar but the scripts are shipped to the end customer instead of running on a Debian build machine. Gentoo build scripts may be less opinionated than Debian build scripts.

There are usually also patches to fix bugs in the resulting system. E.g. Debian systems may have something in a different path than the package author looked for it.

If an author uploads their own package to pip they are taking on both author and maintainer roles.
@RL_Dane @timjclevenger Traditionally, dependency QA was done by not having many dependencies and evaluating how much you trust them. When you have 3000 dependencies you can't do that. Reducing the number of dependencies would go a long way towards reducing their surface area.

@immibis @timjclevenger

Yeah, this trend really worries me. It's like the approach to dependency management is just "lol wtf yolo"

@immibis @timjclevenger

Probably so. I've just become aware of it more recently as I've been compiling more of my own (niche) software.

Also, from hearing others bemoaning the dependency hell of node.js.

P.S., I've also been out of I.T. for 11 years, so that accounts for some of it as well ;)

@MxVerda @timjclevenger

It means that the standard for inclusion in the software ecosystem is basically nil.

That's why npm (IIRC) has garbage packages like is-even and is-odd (and they're hugely popular).

@RL_Dane @MxVerda And malware uploaded as isItEven and isItOdd to attract downloads.

@RL_Dane It also really bothers me when _any_ alternative package manager requires you to use the package manager itself to then _launch_ the newly installed software.

Bonus negative points if you have to do something _really_ stupid, like type the entire name, version, and install source, in a very specifically structured order and format, every single time you want to launch anything...

@OpenComputeDesign

Flatpak can be rather annoying in this regard, but at least they provide good .desktop files.
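
What I mean by "this regard", roughly: unless the exported .desktop file hides it from you, flatpak apps get launched through flatpak itself, addressed by reverse-DNS application ID (org.mozilla.firefox here is just the usual example):

    # Launching a flatpak app from a terminal goes through flatpak itself,
    # addressed by its application ID rather than a plain command name.
    flatpak run org.mozilla.firefox

The exported .desktop file is what lets menus and launchers show a plain "Firefox" entry instead.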

@RL_Dane I can't remember the last time I've had anything on the desktop. I think early Debian 9?

@OpenComputeDesign

You mean you don't run X11 or Wayland?

@RL_Dane No, I mean, I've got a task bar, a wallpaper, and that's it. No widgets, no icons, no folders.

Also X11 for life, Wayland is _unusable_, but I digress

@OpenComputeDesign

Oh, ok. The .desktop files aren't just for desktop icons or desktop environments. They describe what applications you have installed and how to launch them. It's what launchers like `rofi -show drun` use to build their application list (dmenu_run, by contrast, just lists whatever binaries are on your $PATH).

Take a looksie at /usr/local/share/applications and ~/.local/share/applications

I use and maintain .desktop files for my i3wm and sway machines, not just KDE
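
If you've never peeked inside one, a minimal entry is only a handful of lines; a rough, hypothetical example (the file name, Name, and Exec values are just placeholders):

    # Write a minimal, hypothetical launcher entry into the per-user directory.
    mkdir -p ~/.local/share/applications
    printf '%s\n' \
      '[Desktop Entry]' \
      'Type=Application' \
      'Name=Alacritty (example)' \
      'Exec=alacritty' \
      'Terminal=false' \
      'Categories=System;TerminalEmulator;' \
      > ~/.local/share/applications/alacritty-example.desktop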

@RL_Dane Oh, I just knew .desktop files as the thing that's always broken if I download a precompiled binary. Any time something doesn't show up in the start menu, I just run it from the terminal. Plus, that way I'll know why it inevitably crashes, and I can ctrl-c it at a moment's notice (works way better than the kill command). So it's honestly tempting to just run everything from the terminal :P

@RL_Dane Ok, actually, to be fair, I am running everything from the terminal right now, because I'm using stock OpenBSD with stock FVWM and that's the only way I can figure out how to launch anything

@OpenComputeDesign

I haven't run fvwm in a while, but you can always just install dmenu, or just write your own script to launch .desktop files ;)

If fvwm is like cwm, you probably just add a menu item in your config for each application you want to run from the menu.
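
A rough sketch of the "write your own script" idea, assuming dmenu is installed (the paths and the Exec-line munging are deliberately simplified):

    #!/bin/sh
    # Simplified .desktop launcher: list entries, pick one with dmenu,
    # then run that entry's Exec line with the %-field codes stripped.
    dir=${XDG_DATA_HOME:-$HOME/.local/share}/applications
    choice=$(ls "$dir" | sed 's/\.desktop$//' | dmenu -p run:) || exit 1
    cmd=$(sed -n 's/^Exec=//p' "$dir/$choice.desktop" | head -n 1 | sed 's/ *%[A-Za-z]//g')
    exec $cmd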

@RL_Dane You should know I'm _very_ lazy.

I used to run two routers because I ran out of switches and didn't feel like messing with the config on the spare router.

I still run two routers, because we moved and I didn't want to reconnect all my wifi devices to a new SSID

@OpenComputeDesign

Ummm, so do you have an open terminal for every open GUI app, then?

@RL_Dane I have 4GB of ram, so every GUI app is one firefox window with a half dozen tabs :P

@RL_Dane When I boot up in the morning, I power up, log in, start firefox, restore session. Then in the evening I hold the power button just long enough for the computer to eventually register it (whenever it starts responding again), but not so long that it hard resets.

@OpenComputeDesign

Uhh...

*surveys his own machine
*five Alacritty windows open, and nothing else

...carry on, carry on.

(My work machine is a whole 'nother ball game. But my personal machines are mostly just terminals, sometimes XLinks, seldom Firefox and Signal)

@RL_Dane I'm too burnt out to do anything other than watch youtube.

I've got a separate laptop for my homework, and that machine usually has one okular, four gwenviews, a half dozen dolphins, three firefoxes, and way too many libreoffices

And not even any wiresharks or packettracers. This class blows btw :P

@RL_Dane I only use yt-dlp for music and old children's movies (aka things I intend to watch more than once). For timewasting "content", I don't intend for it to stick around any longer than daytime TV broadcasts.

Still wouldn't help me use w3m as my main browser like I'd like to, because I've still never managed to get a mastodon client to work (I tried again recently, mostly on my pinephone, with hard-crashes-the-entire-phone results), and I still need google chat for talking to my little sister

@OpenComputeDesign

Oh, I'm not saving the videos. I'm just passing options to mpv for it to hand on to yt-dlp, to do things like restrict the resolution on slower machines (or on machines with lower-resolution displays).
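
For the curious, it's roughly this kind of thing (the 720p cap and the URL are only placeholders):

    # Have mpv hand yt-dlp a format cap so a slow or low-res machine
    # doesn't pull the full-resolution stream. The URL is a placeholder.
    mpv --ytdl-format='bestvideo[height<=?720]+bestaudio/best' 'https://example.invalid/some-video'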

Oh, I don't use w3m as my *main* browser, but more like my first line of defense from the crappiness of the modern web.

A lot of basic queries work in w3m, as do writingmonth.org and brutaldon. :D

@RL_Dane Well then you're way less lazy than me. The only time I really use multiple browsers in parallel is when I need to use multiple accounts for the same service (e.g. two ebay accounts)

@OpenComputeDesign

That's what firefox containers are for! :D

@RL_Dane@fosstodon.org @OpenComputeDesign@linuxrocks.online I determine which way the wind is blowing make a decision and live with it. Oh and then regret it determine I should have blown against it and gone with blah not blow then blahblow or blowblah and. …. Don’t give a fuck !!!!

@RL_Dane
I wish.
Sorry, I didn't write it clearly. I just think anything GUI is better than anything CLI.

@blaue_Fledermaus

Oh. I haven't been particularly impressed with GUI package managers. KDE Discover is ok.

I don't hate GUIs, I just hate trying to use them without a mouse (or being forced to use a mouse when I don't want to).

There was a time in the 80s and 90s when all GUIs were very keyboard navigable. (Because not *everyone* had a mouse)

@tjpcc

I mean, all of them. XD

pipx is pretty good, except it has no search.
cargo is robust, but waaaaay too complicated for ordinary mortals.
go is decent, but doesn't install manpages or .desktop files. You have to do that yourself.
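
Roughly what the day-to-day commands look like, with placeholder package names (not real packages):

    # pipx: isolated per-tool virtualenvs, but no built-in search command
    pipx install some-python-tool
    # cargo: builds the crate from source; plenty of knobs to learn
    cargo install some-rust-tool
    # go: drops a bare binary into your Go bin directory; manpages and
    # .desktop files are left for you to sort out yourself
    go install example.com/cmd/some-go-tool@latest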

@RL_Dane it's very slow, and the interface is clunky

@craftyguy

Try brew or pkg_add, and get back to me. 😄

@RL_Dane so your argument is "there are things worse than apt" ? 😅

@craftyguy

I guess 🤣

apt/apt-get could be faster.
pacman spoiled me.


@cnx I am not sure I understand the frustration with package managers. Aren't they a wonderful quality-of-life improvement for your development workflow if your language of choice has one? How is it preferable to not have one at all? Genuinely curious about your take. :blobfoxfloofcofe:

To be clear, @dynge, I am not against package managers, but against the recursive curl | sh in a trench coat that modern programming language implementations usually offer. Correct me if I'm wrong, but all of them are designed for developers to upload software, not for users (including library users) to use a collection of software together over an extended period of time. They should have been called package relays IMHO.

Package management does not concern a single piece of software and its dependencies, but a multitude of packages that have to coexist efficiently and securely on a system. Efficient use of resources means sharing common (versions of) dependencies, which is only possible when each piece of software is robust, i.e. provides a stable API and supports a range of versions of the libraries it uses. In practice, when a language-specific package "manager" exists, developers pin the version of each dependency because everyone is moving fast and breaking things (not to mention that foreign-language packages just get bundled for lack of other choices). It's a cycle of doom: eventually there is no common version of a shared indirect dependency that is compatible with all of its users.
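
A concrete (entirely hypothetical) illustration of that cycle, with made-up package names:

    # app depends on libA and libB;
    #   libA pins libfoo==1.2.3
    #   libB pins libfoo==2.0.1
    # No single libfoo satisfies both, so a strict resolver refuses the
    # combination, and the usual "fix" is for each library to bundle or
    # vendor its own copy of libfoo instead of sharing one.
    pip install libA libB   # expect a dependency-resolution conflict here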

Disk space, or even memory and processing power, can be cheap, but exploited vulnerabilities are still as expensive as ever. Let's say a security bug is fixed in libfoo version N: every package pinning a version of libfoo older than N then has to be updated by its upstream. Security is absolute, and absolute coordination between many parties is impossible. See also the modern packager's security nightmare.

Furthermore, convenience in adding dependencies means more dependencies will convene under a program. Meanwhile, meaningful trust scales neither horizontally nor vertically all that well, which means the dependency graph leaves more opportunities for bad actors to infiltrate it.

Disclaimer: I have contributed extensively to a major language-specific package "manager" and also to a few GNU/Linux distributions. In the latter case, the situation is drastically different:

  1. Only one or a very few versions of a package exist at a time
  2. Software written in any language is a first-class citizen
  3. Patching of a single vulnerability is done by a single party (the distro maintainers)
  4. Users only need to trust one single party, which vets the packaged software and also responds in a timely manner to newly arisen issues (see point 3)
Link: Michał Górny, "The modern packager's security nightmare"