• kbotc@lemmy.world · 8 months ago

    Dynamically sized but stored contiguously makes the systems performance engineer in me weep. If the lists get big, the kernel is going to do so much churn.

    • Killing_Spark@feddit.de · 8 months ago

      Contiguous storage is very fast to iterate over, though, which often offsets the cost of allocation.

    • :3 3: :3 3: :3 3: :3@lemmy.blahaj.zone · 8 months ago

      Which is why you should:

      1. Preallocate the vector if you can guesstimate the size
      2. Use a vector implementation that won’t reallocate the entire vector on every single addition (like Rust’s Vec, which doubles its capacity each time it runs out of space)

      Memory is fairly cheap. Allocation time not so much.
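      Both points are easy to see in Rust: a sketch that counts how often the backing buffer actually moves, with and without `Vec::with_capacity` (the helper function name is mine):

      ```rust
      /// Push `n` elements into a Vec and count how many times the
      /// backing buffer was reallocated (i.e. the capacity changed).
      fn count_reallocs(n: u32, preallocate: bool) -> usize {
          let mut v: Vec<u32> = if preallocate {
              Vec::with_capacity(n as usize) // point 1: size the buffer up front
          } else {
              Vec::new() // point 2: rely on geometric (doubling) growth
          };
          let mut reallocs = 0;
          let mut last_cap = v.capacity();
          for i in 0..n {
              v.push(i);
              if v.capacity() != last_cap {
                  reallocs += 1;
                  last_cap = v.capacity();
              }
          }
          reallocs
      }

      fn main() {
          // Preallocated: the buffer is allocated once, so no push reallocates.
          println!("preallocated: {} reallocations", count_reallocs(100_000, true));
          // Growing from empty: pushes are amortized O(1), but the buffer
          // still gets moved a handful of times as the capacity doubles.
          println!("growing:      {} reallocations", count_reallocs(100_000, false));
      }
      ```

      Because growth is geometric, the number of reallocations is only logarithmic in the final length, which is exactly the "memory is cheap, allocation time isn't" trade-off.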

    • yetiftw@lemmy.world · 8 months ago

      MATLAB likes to pick the smallest available spot in memory to store a list, so for loops that grow a matrix it’s recommended to preallocate the space with a matrix full of zeros!
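      The same `zeros`-style pattern works outside MATLAB too. A minimal Rust analogue (function name is mine): allocate a zero-filled buffer once, then write by index instead of growing the container inside the loop.

      ```rust
      /// Fill a preallocated buffer with the first `n` squares,
      /// the way MATLAB code would fill a zeros(1, n) matrix.
      fn squares_preallocated(n: usize) -> Vec<u64> {
          let mut out = vec![0u64; n]; // one allocation, like zeros(1, n)
          for i in 0..n {
              out[i] = (i as u64) * (i as u64); // write in place, no growth
          }
          out
      }

      fn main() {
          println!("{:?}", squares_preallocated(5)); // [0, 1, 4, 9, 16]
      }
      ```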