
What do y'all think about static vs dynamic linking of libraries in a released application?

(I'm thinking specifically of distro issues, where static linking makes packaging easier but increases disk usage and reduces the packager's ability to update dependencies. But I'm interested in other takes too)

@codesections depends. Depends on the size of the linked stuff and on the target audience: desktop PCs with terabytes of storage don't care whether your CLI app takes 5MB or 15MB of space, but your smartphone might care about that. Also, if it's more like 100MB vs 500MB, things might change.

In general, I would like to see system libraries (like openssl, libevent, libuv, ... for example) which are _widely used_ linked dynamically, but specialized libraries for your app built in.

@codesections I wouldn't mind, though, a good story for dynamic linking of things in my favourite programming language.

Having the ability to actually say, right there in my code, "now load libfoobar with this interface or fail" is a thing I would really enjoy for some very specific use cases.
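For what it's worth, you can approximate that "load with this interface or fail" idea today with dlopen-style loading. A minimal Python sketch using ctypes — `load_or_fail` is a hypothetical helper name, and libm stands in for the imaginary libfoobar:

```python
import ctypes
import ctypes.util


def load_or_fail(name, required_symbols):
    """Load a shared library, failing loudly unless it exports the expected interface."""
    path = ctypes.util.find_library(name)
    if path is None:
        raise OSError(f"lib{name} not found")
    lib = ctypes.CDLL(path)
    for sym in required_symbols:
        # Attribute access on a CDLL performs the symbol lookup (dlsym underneath).
        if not hasattr(lib, sym):
            raise OSError(f"lib{name} is missing symbol {sym!r}")
    return lib


# "Now load libm with this interface or fail":
libm = load_or_fail("m", ["cos", "sqrt"])
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]
print(libm.cos(0.0))  # 1.0
```

If the library is absent or its interface changed, you get a clear error at the load site instead of a crash somewhere deep in the program.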

@codesections Let's say there's a bug in OpenSSL, and a fix is released. The release is 100% backwards compatible.

With dynamic linking, you only need to update OpenSSL. With static linking, you need to recompile and update everything that links to it.

It's not just more space on the disk, it's _much_ more to download when updating, even when updates are only security updates (and let's not talk about rolling releases). So dynamic linking drastically reduces network usage too.
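You can even observe this from inside a running program: a dynamically linked runtime reports whichever library version it actually resolved at load time. A small Python illustration — this assumes a typical distro build where CPython's ssl module is dynamically linked against the system libssl (some builds bundle it statically instead):

```python
import ssl

# This version string comes from the shared libssl that was actually loaded,
# so updating the distro's OpenSSL package changes it without rebuilding Python.
print(ssl.OPENSSL_VERSION)
print(hex(ssl.OPENSSL_VERSION_NUMBER))
```

With static linking, by contrast, each binary would report whatever OpenSSL it was built against until it is recompiled and reshipped.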

@algernon @codesections except you're using #nixos, where you'd have to compile all the things again anyway.

But that's also the whole point of nixos, so there's that.

(Disclaimer: I'm a NixOS fan, user, and contributor)

@musicmatze @codesections Yeah, if you need to recompile everything anyway, then dynamic linking is of little help indeed.

@algernon @codesections well, it still helps with install sizes if you can reuse a dynamically linked library in multiple applications, right?

@musicmatze @codesections Indeed, but on a typical desktop, that's negligible. Especially since with static linking, the linker might end up throwing away a whole lot of unreferenced/unreachable/etc. things that it couldn't throw away from a shared lib. So statically linking a library that's 5MB when shared might not end up adding 5MB, but maybe only 1MB.

So the size diff is even less important on desktop with plenty of space.

@algernon @codesections or on a server.

I think we agree that the answer to the question is "it depends" 😄

@musicmatze @codesections Oh, and for embedded systems with very limited space, static linking might make more sense, because that allows link-time code elimination, so you might very well end up using less space with static linking, if you can throw half the stuff away at link time.

Do measure first, though, as is the case with every optimisation :)

@algernon @codesections There's this famous quote by a German blogger about dynamic vs. static linking: we invented dynamic linking for various reasons, and now we're putting everything into containers anyway to ensure that the libs are there in the right versions, and so on ...

@codesections That said, if the app in question has good reasons to not use the shared libraries of the distribution, then do link it statically, instead of shipping a gazillion SOs and a lot of wrappers to make sure they're used (hi Steam!).

Static linking makes sense if you distribute an app outside of a particular distro and need it to run everywhere, regardless of system libs.

For distros, unless it's like NixOS where you need to recompile anyway, dynamic linking has significant benefits.

@algernon

> Let's say there's a bug in OpenSSL, and a fix is released. The release is 100% backwards compatible. With dynamic linking, you only need to update OpenSSL.

I take that point. The flip side, though, is when the release *isn't* 100% backwards compatible for all dependencies (perhaps due to bugs in those dependencies).

In that case (if I understand correctly) the distro has to decide between breaking packages or holding back the update until upstreams fix the bug, right?

(Non-nix)

@codesections

> In that case (if I understand correctly) the distro has to decide between breaking packages or holding back the update until upstreams fix the bug, right?

The third option is to recompile everything.

@codesections my opinion is that dynamic linking should only be done by distro maintainers.

If you distribute by yourself, static is a must.

@codesections when someone finds a CVE in a library, at least dynamic linking lets the library author update it, and it also makes it easy to detect the dependency.
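On the "easy to detect" point: with dynamic linking, the dependency is visible right in the binary's metadata. A rough Python sketch wrapping ldd — `shared_deps` is a hypothetical helper, and this assumes a Linux system with `ldd` on the PATH:

```python
import subprocess
import sys


def shared_deps(binary):
    """List the shared-library names a dynamically linked binary resolves (via ldd)."""
    out = subprocess.run(["ldd", binary], capture_output=True, text=True, check=True)
    deps = []
    for line in out.stdout.splitlines():
        parts = line.strip().split()
        if parts:
            # First token on each ldd line is the soname (or the loader path).
            deps.append(parts[0])
    return deps


# e.g. which libraries does the Python interpreter itself pull in?
for dep in shared_deps(sys.executable):
    print(dep)
```

A distro security team can scan every installed binary this way to find who links a vulnerable lib; with static linking, that information is baked into the object code and much harder to recover.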

@penguin42 @codesections
If you link statically, you need to care about ALL updates yourself.
If you link dynamically, the distribution takes care of all library updates; you only care about your program's updates.

@codesections
I would say for common popular libs, yes, dynamic linking. But for obscure libs only a few projects use, you should definitely statically link them and include them with your source distribution.

Ten years later the library may be gone. Maybe they got frustrated and gave up on the project. Maybe they consider it obsolete and deleted it. Maybe they didn't want to pay for hosting. Maybe they had it on GitHub, and Microsoft suspended their account.

@sowth

> but for obscure libs only a few projects use, you should definitely statically link and include with your source distribution. ten years later the library may be gone.

That is … a point I honestly hadn't considered, thanks. I've gotten used to working in languages with built-in/consensus package repositories (like Cargo, pip, CPAN, npm, Bundler, etc.).

Central repos like that *largely* avoid the risk you're talking about (though also introduce a single point of failure)

Fosstodon