DNS is the most neoliberal shit system, one that too many have just accepted as how computers work and have always worked, to the point where I have heard actual landlord arguments deployed to defend it

It’s administered by ICANN, which is like the ideal neoliberal email-factory NGO that justifies the de facto monopoly of a tiny handful of companies with responsible internet government-stewardship stakeholderism, etc.-type bureaucracy, while upholding the right of domain landlords to charge hundreds or even thousands of dollars in rent for like 37 bytes on a server somewhere lol

Before this it was administered by the US military-industrial complex; you can thank Bill Clinton and the US Chamber of Commerce for this version of it, along with Binky Moon for giving us cheap .shit TLDs at 3 dollars for the first year

Never forget that the architects of the internet were some of the vilest US MIC and Silicon Valley ghouls who ever lived, and they are still fundamentally in control no matter how much ICANN and IANA claim to be non-partisan, neutral, non-political, accountable, democratic, international stewardshipismists

“Nooooo we’re running out of IPv4 addresses and we still can’t get everyone to use the vastly better IPv6 cuz uhhh personal network responsibility. Whattttt??? You want to take the US Department of Defense’s multiple /8 blocks? That’s uhhhh not possible for reasons :|”

The internet is simultaneously a free-market hellscape where everyone with an ASN is free to administer it however they want, while at the same time everyone is forced into contracts with massive (usually US-based) transit providers who actually run all the cables and stuff. Ohhh you wanna run traffic across MYYYYYYY NETWORK DOMAINNNNNNN??? That’ll be… 1 cent per packet please, money please now money now please money now money please now now nwoN OWOW

  • hello_hello [comrade/them]@hexbear.net

    The Domain Name System today enables traffic amplification attacks and censorship (e.g. China).

    catgirl-disgust

    Autotools is pretty dense, but all build systems are like that. Believe me, the only good build system is the cp -rf $src $dest build system badeline-heh.

    At least autotools was one of the first (and transparent too). I don’t know what the excuse for CMakeLists.txt is

      • hello_hello [comrade/them]@hexbear.net

        Programmers aren’t taught portability in Uni at all and that’s why we have a dozen build systems for Python and JS.

        Why be portable when you can shove a huge docker container into it and forget about it?

        I wish Rust had not become the new C++

        As someone who had to write build scripts for Rust: fuck Rust (from a package-maintainer perspective) -> no dynamic linking, a compute-intensive compiler, and crates.io as virtually the single source of truth. Dependency trees are so fucked that a trivial library has the power to pull in the test framework for a GAME ENGINE (which requires compiling that engine). Slow-as-fuck compile times that I can’t cache because I write packages.

        • piggy [they/them]@hexbear.net

          Because portability has only been practical for the majority of applications since 2005ish.

          You’re not having a system where every executable has 100 MB of OS libs statically linked into it in the 90s, be fuckin for real.

          You complain a lot about static linking in Rust, but it’s the only way to actually achieve portability.
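
          To make the trade-off here concrete, a minimal C sketch (the file name and the size notes in the comments are illustrative assumptions, not measurements): the same hello-world linked dynamically versus statically with the usual GNU toolchain flags.

              /* hello_portable.c -- illustrative sketch only.
               *
               * Dynamic link (small binary, but it needs a compatible libc .so
               * on every machine it lands on; inspect with `ldd`):
               *     cc hello_portable.c -o hello_dynamic
               *
               * Static link (much bigger binary because libc gets copied in,
               * but it stops caring which libraries the target system ships):
               *     cc -static hello_portable.c -o hello_static
               */
              #include <stdio.h>

              int main(void) {
                  puts("same program, very different deployment story");
                  return 0;
              }

          Scale that per-binary overhead up from libc to a whole GUI stack and you get the kind of size figures being argued about below.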

            • piggy [they/them]@hexbear.net

              > I agree about static linking but… 100 MB of code is absolutely massive, do Rust binaries actually get that large?? Idk how you do that even, must be wild amounts of automatically generated object-oriented shit lol

              My brother in Christ, if you have to put every lib in the stack into a GUI executable you’re gonna have 100 MB of libs regardless of what system you’re using.

              > Also Plan 9 did without dynamic linking in the 90s. They actually found their approach was smaller in a lot of cases over having dynamic libraries around: https://groups.google.com/g/comp.os.plan9/c/0H3pPRIgw58/m/J3NhLtgRRsYJ

              Plan 9 was a centrally managed system without the speed of development of a modern OS. Yes, they did it better because it was less complex to manage. Plan 9 doesn’t have to cope with the fact that the Flatpak for your app needs lib features that don’t come with your distro.

              > Also wdym by this? Ppl have been writing portable programs for Unix since before we even had POSIX

              It was literally not practical to have every app be portable because of space constraints.

                • piggy [they/them]@hexbear.net

                  > You just link against the symbols you use though :/ Lemme go statically link some GTK thing I have lying around and see what the binary size is cuz the entire GTK/GLib/GNOME thing is one of the worst examples of massive overcomplication on modern Unix lol

                  If you link against symbols you are not creating something portable. In order for it to be portable the lib cannot ever change symbols. That’s a constraint you can practically only work with if you have low code movement and you control the whole system. (see below for another way but it’s more complex rather than less complex).
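
                  To make that concrete, a tiny and entirely hypothetical C sketch of why “the symbol names didn’t change” isn’t enough: the function names stay identical across versions, but the struct behind them gains a field, so a binary compiled against the old header reads the wrong bytes out of the new layout.

                      /* abi_break.c -- hypothetical example, not from any real library.
                       * Same symbol names across versions, different struct layout. */
                      #include <stdio.h>
                      #include <string.h>

                      /* v1 layout: what the already-shipped binary was compiled against */
                      struct widget_v1 { int id; int flags; };

                      /* v2 layout: the library inserted a field in the middle */
                      struct widget_v2 { int id; int generation; int flags; };

                      int main(void) {
                          struct widget_v2 real = { .id = 7, .generation = 3, .flags = 42 };

                          /* simulate the old binary reading v2 memory through the v1 layout */
                          struct widget_v1 old_view;
                          memcpy(&old_view, &real, sizeof old_view);

                          printf("flags the library stored:   %d\n", real.flags);     /* 42 */
                          printf("flags the old binary reads: %d\n", old_view.flags); /* 3  */
                          return 0;
                      }

                  No linker error, no missing symbol, just silently wrong data.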

                  > Also I’m not a brother :|

                  My bad. I apologize. I am being inconsiderate in my haste to reply.

                  > It was less complex cuz they made it that way though, we can too. Flatpaks are like the worst example too cuz they’re like dynamically linked things that bring along all the libraries they need to use anyway (unless they started keeping track of those?), so you get the worst of both static and dynamic linking. I just don’t use them lol

                  But there’s no other realistic way.

                  > You mean portable like being able to copy binaries between systems? Cuz back in the 90s you would usually just build whatever it was from source if it wasn’t in your OS, or buy a CD or smth from a vendor for your specific setup. Portable to me just means that a program can be built from source and run on other operating systems and isn’t too closely attached to wherever it was first created. Being able to copy binaries between systems isn’t something worth pursuing imo (breaking userspace is actually cool and good :3, that stable-ABI shit has meant Linux keeps around so much ancient legacy code or gets stuck with badddd APIs for the rest of time or until someone writes some awful emulation layer lol)

                  That’s a completely different usage of “portable” and is basically a non-problem in the modern era, as long as (see my response to the symbols point) you are within the same-ish compatibility time frame.

                  It’s entirely impossible to do this over a distributed ecosystem over the long term. You need symbol migrations so that if I compile code from 1995 it can upgrade to the correct representation in modern symbols. I’ve built such dependency management systems for making evergreen data in DSLs. Mistakes, deprecation, and essentially everything you have ever written has to be permanent, it’s not a simple way to program. It can only be realized in tightly and directly controlled environments like Plan 9 or if you’re the architect of an org.
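
                  For what “symbol migrations” look like in practice on the dynamic-linking side, GNU symbol versioning is probably the closest existing mechanism; the library, function, and version-node names below are invented for illustration, but the .symver technique itself is what glibc uses to keep decades-old binaries resolving the symbols they were linked against.

                      /* foo_versions.c -- sketch of GNU symbol versioning.
                       * All names (libfoo, FOO_1.0/FOO_2.0, foo_v1/foo_v2) are invented. */

                      /* Old ABI: kept so binaries linked against FOO_1.0 still resolve `foo`. */
                      int foo_v1(int x) { return x; }
                      __asm__(".symver foo_v1,foo@FOO_1.0");

                      /* New ABI: the double @@ makes this the default for newly linked programs. */
                      int foo_v2(int x, int flags) { return x + flags; }
                      __asm__(".symver foo_v2,foo@@FOO_2.0");

                      /* Built roughly as:
                       *   cc -shared -fPIC foo_versions.c -Wl,--version-script=foo.map -o libfoo.so
                       * where foo.map declares the FOO_1.0 and FOO_2.0 version nodes. */

                  Every old version node and its implementation stays in the library forever, which is exactly the kind of permanent commitment described above.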

                  Dependency management is an organization problem that is complex, temporal, and intricate. You cannot “technology” your way out of the need to manage the essential complexity here.