  • In terms of software compatibility, on Linux you have the option of making chroots. Since the kernel devs make a lot of effort to preserve backward compatibility, old software can still work fine. If I remember correctly, some kernel devs tested some really, really old versions of bash, gcc, etc. a while ago, and they still worked fine with modern kernels.
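
    As a rough sketch of that chroot approach (hedged: the Debian release, the target directory, and the mirror URL below are just illustrative placeholders):

        # build a minimal old userland in a directory (archived releases live on archive.debian.org)
        sudo debootstrap --arch=amd64 stretch /srv/old-debian http://archive.debian.org/debian
        # enter it: the old userland (bash, libc, ...) runs on top of the modern kernel
        sudo chroot /srv/old-debian /bin/bash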



  • For those wondering, it also works with a Linux VM:

    • Host: AMD Ryzen 9 3900X + Proxmox
    • PCI passthrough for an Nvidia RTX 3060 12GB
    • A Debian VM with 16 GB of RAM and as many cores as the host has (if you set fewer cores, you will have to tune CPU affinity/pinning)
    • An HDMI dummy plug
    • I stream the VM to my office using Sunshine and Moonlight

    It’s not easy to set up, but it works. I’m able to run games like Borderlands 3 at ~50 FPS at 1920x1080 with visual effects set to the max (important: disable vsync in the games!).

    The only problem is disk access: it tends to add some latency, so with sloppily coded games (e.g. Raft), the framerate drops a lot (Raft sometimes goes down to 20 FPS).
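
    For reference, the passthrough side of such a setup boils down to a few Proxmox settings. A hedged sketch: the VM ID 101 and the PCI address 0a:00 are placeholders for your own values, the host needs IOMMU enabled, and an OVMF VM also needs an EFI disk:

        # q35 machine type + OVMF firmware, which PCIe passthrough generally needs
        qm set 101 --machine q35 --bios ovmf
        # pass the whole GPU through to the VM as its primary display adapter
        qm set 101 --hostpci0 0a:00,pcie=1,x-vga=1
        # RAM and cores as described above (recent Proxmox releases also accept
        # --affinity here if you want explicit CPU pinning)
        qm set 101 --memory 16384 --cores 24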


  • Yes, I would count this game as self-hosted (as long as you don’t need a third-party service to start it). And yes, I agree it’s a pretty wide definition. But at the same time, I really think there are a lot of good reasons not to dismiss it:

    • I think it is the simplest form of self-hosting you can do, and it is doable by anybody. For people with little to no technical expertise, it’s the perfect gateway to self-hosting. All you need to start is a backup drive.
    • For a single person, it’s actually the approach that often makes the most sense.
    • And even for technical people, sometimes you just don’t want to deploy and maintain yet-another-service.
    • And finally, you can still access your data when you’re offline.

    To be honest, when it comes to self-hosting, I can’t shake the feeling that a lot of people dismiss desktop apps immediately just because they aren’t cool or hyped anymore.

    Regarding Syncthing, if I’m not mistaken, the Web UI can be opened to the network (most likely for headless servers), but by default it is only reachable on the loopback interface.
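
    For the curious, that default is the GUI listen address in Syncthing’s config.xml (127.0.0.1:8384), and it can also be overridden at startup. A rough sketch (older Syncthing versions spell it "syncthing -gui-address=..." without the serve subcommand):

        # by default the Web UI listens only on the loopback (127.0.0.1:8384);
        # overriding the GUI listen address is what opens it to the network
        syncthing serve --gui-address=0.0.0.0:8384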

    Regarding OP, for me, it wasn’t entirely clear at first whether they wanted network access or not. They clarified it later in the comments.


  • It is “hosted” on your workstation. There is no need for a client-server relationship for self-hosting.

    By requiring a client-server relationship, you’re making self-hosting needlessly hard to deploy and enforcing a very specific design when other approaches (P2P, file sync, etc.) can solve the same problems more efficiently. For example, in my specific case, with Paperwork + Nextcloud file sync, my documents are distributed across all my workstations and always available, even offline. Another example is Syncthing, which IMO fits the bill for self-hosting but doesn’t fit your definition of self-hosted.











  • Based on my experiments on my family and friends, the main problem is tech support. Most geeks seem to assume other people want the same things as they do (privacy, freedom, etc.). Well, they don’t. They want a computer that just works.

    Overall, when using Linux, people don’t actually need much tech support, but they do need some. My father put it really well: “the best OS is the one your neighbor uses.”

    I apply a few rules:

    1. The deal with my family and friends is simple: you want tech support from me? OK, then I get to pick your computer (usually old Lenovo ThinkPads bought on eBay for ~300€) and I’m going to install Linux on it.

    2. I’m not shy. I ask them whether they want me to have remote access to their computer. If they accept, I install a Meshcentral agent. The thing is, on other OSes they are already spied on by Google, Microsoft, Apple, etc., and most people think they “have nothing to hide”, so why should they worry more about a family member or a friend than about some unknown big company? Fun fact: I’ve been really surprised by how easily people accept that I keep remote access to their computer, even people who are not family! Pretty much everybody has gladly agreed so far (and God knows I’ve been really clear that I can access their computer whenever I want).

    3. I install the system for them and I do the major updates for them. Since I’m the one maintaining it through remote access, I pick the distribution I’m most at ease with (Debian). They just don’t care what actually runs on their computers.

    4. When they have a problem, they call me after 8pm. With remote access, most problems are solved in a matter of minutes. Usually, they call me a few times in the first days, and then I never hear from them again until the next major update.

    So far, everybody seems really happy with this deal. And for those wondering, I can see in Meshcentral that they really do use those computers :-P



  • I worked for a bank. When they decided to deploy Linux on their infrastructure, they chose RHEL and signed a big contract with Red Hat for tech support.

    Overall, they chose Red Hat for the same reason they chose Microsoft before: tech support. They have >10,000 engineers, and yet somehow they think they absolutely need vendor support… They pay a lot for it. In my building, they even had a Microsoft engineer on-site once a week until Covid. I don’t know about the other people working at this bank, but I asked for Microsoft support only once in 2 years. In the end, their guy sent me back an email telling me “I’ve forwarded your question to the corresponding engineering team” and… diddlysquat.

    Now, to be fair, for paying customers, RHEL and Microsoft both ensure security updates for a really long time. Red Hat pays a lot of people to backport security patches from upstream to previous versions. This allows companies like the bank I worked for to keep running completely crappy and obsolete software for an insane amount of time without having to worry too much about security vulnerabilities.

    Anyway, regarding Red Hat’s contributions, a lot of them are subtle.

    • A friend of mine works for Red Hat. He is a core Python developer and is paid full-time by Red Hat to work on Python.
    • Through this friend, I applied for a position in their company at some point (unfortunately, it didn’t happen; I don’t remember why exactly). The position was in a team dedicated to improving hardware support. They have built an infrastructure that lets computer manufacturers (Dell, Lenovo, etc.) test the compatibility of their new hardware with Linux/RHEL quickly and automatically.
    • Part of the technical support they provide to some clients is “making things work”. It may imply fixing bugs or improving drivers and then sending patches upstream.
    • If I’m not mistaken, they paid Lennart Poettering to work on systemd and PulseAudio.
    • They pay for the development of some infrastructure software like Corosync for instance.

    This list is far from exhaustive. I’m sure they have paid for a lot of other things you’re using daily without knowing it.