The diversity of Linux distributions is one of its strengths, but it can also be challenging for app and game development. Where do we need more standards? For example, package management, graphics APIs, or other aspects of the ecosystem? Would such increased standards encourage broader adoption of the Linux ecosystem by developers?

  • dosse91@lemmy.trippy.pizza
    4 days ago

    Generally speaking, Linux needs better binary compatibility.

    Currently, if you compile something, it’s usually dynamically linked against dozens of libraries that are present on your system, but if you give the executable to someone else with a different distro, they may not have those libraries or their version may be too old or incompatible.

    Statically linking programs is often impossible and generally discouraged, making software distribution a nightmare. Flatpak and similar systems have made things easier, but they're a crap solution that basically involves having an entire separate OS installed in parallel, with its own problems, like shipping a version of Mesa that's too old for a new GPU. Applications should be able to be packaged with everything they need; there is no reason for dynamic linking to be so central on Linux these days.
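
    The mismatch described above is easy to see with ldd, which prints the shared libraries a dynamically linked executable expects the target system to provide (the binary used here is just an example):

    ```shell
    # List the shared-library dependencies the dynamic loader must
    # resolve before this program can start:
    ldd /bin/ls
    # Each output line maps a needed soname (e.g. libc.so.6) to the
    # path the loader found. On a system where one is missing or too
    # old, the program instead dies at startup with an error like
    # "error while loading shared libraries: ... cannot open shared object file".
    ```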

    I’m not in favor of proprietary software, but better binary compatibility is a necessity for Linux to succeed, and I’m saying this as someone who’s been using Linux for over a decade and who refuses to install any proprietary software. Sometimes I find myself running apps and games in Wine even when a native version is available, just to avoid the hassle of finding and probably compiling libobsoletecrap-5.so.

    • lumony@lemmings.world
      1 day ago

      Static linking is a good thing and should be respected as such for programs we don’t expect to be updated constantly.

      • lumony@lemmings.world
        1 day ago

        That’s a fair disagreement to have, and a sign that you’re fighting bigger battles than just getting software to work.

        Static linking really is only an issue for proprietary software. Free software will always give users the option to fix programs that break due to updated dependencies.

    • pr06lefs@lemmy.ml
      3 days ago

      Nix can deal with this kind of problem. It does take disk space if you’re going to have radically different deps for different apps, but you can 100% install Firefox from 4 years ago and new Firefox on the same system, and they each get the deps they need.
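
      The coexistence described above works because nix addresses every package and its full dependency closure by hash under /nix/store, so two builds never collide. A hypothetical sketch in the Nix language (the <old-rev>/<new-rev> placeholders are not real commit hashes):

      ```nix
      # Evaluate two nixpkgs snapshots side by side; each firefox
      # carries its own dependency closure in /nix/store.
      # <old-rev> and <new-rev> are placeholders, not real revisions.
      let
        old = import (fetchTarball
          "https://github.com/NixOS/nixpkgs/archive/<old-rev>.tar.gz") { };
        new = import (fetchTarball
          "https://github.com/NixOS/nixpkgs/archive/<new-rev>.tar.gz") { };
      in {
        firefoxOld = old.firefox; # e.g. a 4-year-old release
        firefoxNew = new.firefox; # current release
      }
      ```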

    • catloaf@lemm.ee
      3 days ago

      I don’t think static linking is that difficult. But it’s certainly discouraged, because I can’t easily replace a statically linked library, in case of vulnerabilities, for example.

      You can always bundle the dynamic libs in your package and put the whole thing under /opt, if you don’t play well with others.
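
      That bundling approach can be sketched as follows; every file and library name here (libfoo, myapp) is made up for illustration. The key piece is the $ORIGIN-relative rpath, which makes the dynamic loader search a directory next to the binary before the system ones:

      ```shell
      # Create a tiny shared library and a program that uses it
      # (all names are examples):
      mkdir -p myapp/lib
      printf 'int foo(void) { return 7; }\n' > libfoo.c
      printf 'int foo(void);\nint main(void) { return foo() == 7 ? 0 : 1; }\n' > main.c
      gcc -shared -fPIC libfoo.c -o myapp/lib/libfoo.so
      # Link with an $ORIGIN-relative rpath so the loader looks in
      # the lib/ directory shipped next to the executable:
      gcc main.c -o myapp/run -Lmyapp/lib -lfoo -Wl,-rpath,'$ORIGIN/lib'
      ./myapp/run          # runs against the bundled libfoo.so
      ldd myapp/run | grep libfoo
      ```

      The whole myapp/ directory can then be dropped under /opt as-is. The same trick is what patchelf --set-rpath applies to prebuilt binaries.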

    • MyNameIsRichard@lemmy.ml
      4 days ago

      You’ll never get perfect binary compatibility because different distros use different versions of libraries. Consider Debian and Arch, which are at opposite ends of the scale.

      • 2xsaiko@discuss.tchncs.de
        3 days ago

        And yet, ancient Windows binaries will still (mostly) run, and macOS lets you compile against an older system version compatibility level to some extent (something glibc alone desperately needs!). This is definitely a solvable problem.
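
        The glibc part of this is visible in the version tags on the symbols a binary imports; a sketch (the binary path is just an example) of finding the minimum glibc a program needs:

        ```shell
        # Every glibc symbol an executable imports carries a version
        # tag; the highest GLIBC_x.y present is the minimum glibc that
        # can run the binary. There is no supported way to tell the
        # toolchain "target an older glibc", so binaries built on new
        # distros routinely fail to load on older ones.
        objdump -T /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -Vu | tail -n 1
        ```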

        Linus keeps saying “you never break userspace” wrt the kernel, but userspace breaks userspace all the time and all people say is that there’s no other way.

        • Magiilaro@feddit.org
          3 days ago

          It works under Windows because the Windows binaries come with all their dependency DLLs (and/or they need some ancient Visual C++ runtime installed).

          This is more or less the Flatpak way: bundling all dependencies into the package.

          Just use Linux the Linux way: install your program via the package manager (including Flatpak) and let that handle the dependencies.

          I’ve run Linux for over 25 years now and have had maybe a handful of cases where userland broke, and that was because I didn’t follow what I was told during a package upgrade.

          The amount of time I’ve had to spend getting out of DLL hell on Windows, on the other hand… The Linux way is better and way more stable.

          • 2xsaiko@discuss.tchncs.de
            3 days ago

            I’m primarily talking about Win32 API when I talk about Windows, and for Mac primarily Foundation/AppKit (Cocoa) and other system frameworks. What third-party libraries do or don’t do is their own thing.

            There’s also nothing wrong with bundling specialized dependencies in principle if you provide precompiled binaries. If software is shipped via the system package manager, that can manage the library versions, and in fact it should do so as far as possible. Where this does become a problem is when you start shipping stuff like entire GUI toolkits (hello, bundled Qt, which breaks Plasma’s style plugins every time because those are not ABI-compatible either).

            > The amount of time I’ve had to spend getting out of DLL hell on Windows, on the other hand… The Linux way is better and way more stable.

            Try running an old precompiled Linux game (say, Unreal Tournament 2004). They can be a pain to get working. This is not just some “gotcha” case; it’s an important thing that’s missing for software preservation and cross-compatibility, because not everything can be compiled from source by distro packagers, and not every piece of unmaintained open-source software can be compiled on modern systems (and porting it might not be easy, because of the same problem).

            I suppose what Linux is severely lacking is a comprehensive upwards-compatible system API (such as Win32 or Cocoa) which reduces the churn between distros and between version releases. Something that is more than just libc.

            We could maybe have had this with GNUstep, for example (and it would have solved a bunch of other stuff too). But it looks like nobody cares about GNUstep and instead it seems like people are more interested in sidestepping the problem with questionably designed systems like Flatpak.

            • navordar@lemmy.ml
              2 days ago

              There was the Linux Standard Base project, but it had multiple issues and was eventually abandoned. Some distributions still ship an /etc/lsb-release file for compatibility.

            • Magiilaro@feddit.org
              3 days ago

              Unreal Tournament 2004 depends on SDL 1.3, if I recall correctly, and SDL is not a core system library on Linux or on any other OS.

              Binary-only programs are foreign to Linux, so yes, you will get issues integrating them. Linux works best when everyone plays by the same rules, and for Linux that means having the sources available.

              Linux at its core is highly customizable; besides the kernel (and nowadays maybe systemd), there is no core system against which an API could be defined. Linux on a home theater PC is a different system than Linux on a server, which is different again from Linux on a gaming PC or on a smartphone.

              You can boot the kernel with a tiny shell as init and have a valid, but very limited, Linux system.

              Linux has its own set of rules and its own way of doing things, and trying to force it to be something else cannot and will not work.

        • MyNameIsRichard@lemmy.ml
          3 days ago

          The difference is that most of your software is built for your distribution, the only exception being some proprietary shit that claims to support Linux but in reality only supports Ubuntu. That’s my pet peeve, just so you know!

          • 2xsaiko@discuss.tchncs.de
            3 days ago

            Distributions are not the problem. Most just package upstream libraries as-is (plus or minus some security patches). That’s why programs built for another distro will often just run as-is on a contemporary distro, provided the necessary dependencies are installed, perhaps with some patching of the library paths (as an extreme example, there are plenty of packages in nixpkgs that just use precompiled deb packages as a source, even though nixpkgs has a very different file layout).

            Try a binary built for an old enough Ubuntu version on a new Ubuntu version however…

    • CarrotsHaveEars@lemmy.ml
      3 days ago

      What you describe as a weakness is actually a strength of an open-source system. If you compile a binary for a certain system, say Debian 10, and distribute it to someone who is also running Debian 10, it is going to work flawlessly and without overhead, because the target system can get the dependencies on its own.

      The inability to run a binary built for a different system, say Alpine, is no worse than being unable to run a Windows 10 binary on Windows 98. Alpine to Debian is on the same level as 10 to 98: they are practically different systems, merely flying the same flag.

      • Ephera@lemmy.ml
        3 days ago

        The thing is, everyone would agree that it’s a strength if the Debian-specific format were provided in addition to a format that runs on all Linux distros. When I’m not on Debian, I just don’t get anything out of it…

    • iii@mander.xyz
      3 days ago

      I think WebAssembly will come out on top as the preferred runtime because of this, and because of the sandboxing.