• 𝕽𝖚𝖆𝖎𝖉𝖍𝖗𝖎𝖌𝖍
    · 5 days ago

    Rust run times are excellent. And statically linked binaries are the superior intellect.

    Runtime performance matters to me in only a few specific cases. Many of the programs I have installed get recompiled for updates far more frequently than I actually run them, and when I do run them, performance is rarely an issue.

    But you have a good point: performance in the kernel is important, and it is run frequently, so the kernel is a good use case for Rust - where Go, perhaps, isn’t. My original comment, though, was that Zig appears to have many of the safety benefits of Rust, but with vastly better compile times.

    I really do need to write some Zig projects, because I sound like an advocate when really my opinions are uninformed. I have written Rust, though, and obviously have opinions about it, and especially how it is affecting my system update times.

    I’ll keep ripgrep, regardless of compile times. Probably fd, too.

    • GarlicToast@programming.dev
      · 5 days ago

      It is easier to safely optimize Rust than C, but that was not the point. The point was about code correctness.

      It is not unheard of for code to run for weeks or months. I need the code to be as bug-free as possible. For example, when converting one of our tools to Rust we found a bug that would lead to wrong results on big samples. It was found by the Rust compiler! Our tests didn’t cover the bug because it only happens on very big samples: we can’t create a test file of hundreds of GB by hand and calculate the expected result, but our real data would have triggered it. So without moving to Rust we would have gotten the wrong results.

      • So, a couple of thoughts. You can absolutely write safe code that produces wrong results. Rust doesn’t help - at all - with correctness. Even Rustaceans will agree on that point.

        I agree that Rust is safer than C; my point is that if correctness and safety are the deciding criteria, then why not use Haskell? Or Ada? Both are even more “safe” than Rust, and if you’re concerned about correctness, Haskell is a “provable” language, and there are even tools for performing correctness analysis on Haskell code.

        But those languages are not allowed in the kernel, and - indeed - they’re not particularly popular; certainly not in comparison to C, Go, or Rust. There are other factors than just safety and correctness; otherwise, something like OCaml would probably be a dominant language right now.

        • GarlicToast@programming.dev
          · 18 hours ago

          We didn’t get similar run times with Haskell.

          Rust lets us abstract even over file types (a path to a FASTQ file, a FASTA file, annotations, etc.) with no runtime cost. This eliminates many bugs at compile time.
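
          A minimal sketch of that pattern - with hypothetical names, not their actual code - using zero-cost newtype wrappers so the compiler rejects passing the wrong kind of path:

          ```rust
          // Hypothetical illustration of the "typed path" newtype pattern.
          // The wrappers add no runtime cost, but mixing up file kinds becomes a compile error.
          use std::path::PathBuf;

          struct FastqPath(PathBuf);
          struct FastaPath(PathBuf);

          fn count_reads(input: &FastqPath) -> usize {
              // ... parse input.0 as FASTQ and count records ...
              let _ = &input.0;
              0
          }

          fn main() {
              let reads = FastqPath(PathBuf::from("sample.fastq"));
              let genome = FastaPath(PathBuf::from("reference.fasta"));

              count_reads(&reads);      // OK
              // count_reads(&genome);  // compile error: expected `&FastqPath`, found `&FastaPath`
              let _ = genome;
          }
          ```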

          You may say that we could get this in C too, and you would be correct. But in C we spend our time herding pointers. Research gets X money for N months (more or less), so we have hard constraints on development time.

          When we do bit-wise work, the compiler checks our base types.
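
          For instance (a generic illustration, not their code), Rust has no implicit integer conversions, so bit-wise code can’t silently mix widths:

          ```rust
          fn main() {
              let mask: u8 = 0b0000_1111;
              let word: u32 = 0xDEAD_BEEF;

              // let low = word & mask;          // rejected: expected `u32`, found `u8`
              let low = word & u32::from(mask);  // the widening has to be written out
              println!("{low:#x}");
          }
          ```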

          Not to mention that multithreading just works. Even big projects like BLAST have had bugs that led to wrong results because of C/C++’s horrible multithreading. We encountered two more tools that had similar bugs.
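
          As a rough sketch of why that is (a toy example of my own, not one of those tools): in safe Rust, threads can share read-only data freely, but shared mutation has to go through something like a Mutex, and unsynchronized writes simply don’t compile:

          ```rust
          use std::sync::Mutex;
          use std::thread;

          fn main() {
              let samples = vec![3_u64, 1, 4, 1, 5, 9];
              let total = Mutex::new(0_u64);

              thread::scope(|s| {
                  for chunk in samples.chunks(2) {
                      let total = &total; // each thread gets a shared reference to the Mutex
                      s.spawn(move || {
                          let partial: u64 = chunk.iter().sum();
                          // Writing to a plain `u64` shared across threads would be rejected
                          // by the compiler; the lock makes the mutation explicit and safe.
                          *total.lock().unwrap() += partial;
                      });
                  }
              });

              println!("total = {}", total.into_inner().unwrap());
          }
          ```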

          I think that if someone ever does a meta-study of research code written in C, it may get papers retracted.