Dynamically sized but stored contiguously makes the systems performance engineer in me weep. If the lists get big, the kernel is going to do so much churn.
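To make the churn concrete, here's a rough Rust sketch (the element count is arbitrary, picked just for illustration) that prints every time a growing Vec has to move its backing buffer:

```rust
fn main() {
    // Push elements into an initially empty Vec and report whenever the
    // backing buffer moves. 1_000_000 is an arbitrary size for illustration;
    // this is a demo of the growth pattern, not a benchmark.
    let mut v: Vec<u64> = Vec::new();
    let mut last_ptr = v.as_ptr();
    for i in 0..1_000_000u64 {
        v.push(i);
        if v.as_ptr() != last_ptr {
            // Note: an allocator can sometimes grow a block in place, so
            // comparing pointers may slightly undercount reallocations.
            println!("buffer moved at len {:>9}, capacity now {:>9}", v.len(), v.capacity());
            last_ptr = v.as_ptr();
        }
    }
}
```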
Contiguous storage is very fast to iterate over, though, which often offsets the cost of allocation.
Which is why you should:
Preallocate the vector if you can guesstimate the size
Use a vector library that won’t reallocate the entire vector on every single addition (like Rust, whose Vec doubles in size every time it runs out of space)
Memory is fairly cheap. Allocation time not so much.
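If it helps, here's a minimal Rust sketch of the preallocation advice above; the 1_000_000 is just a stand-in for whatever size guesstimate you actually have:

```rust
fn main() {
    let n = 1_000_000; // stand-in for your size guesstimate

    // One allocation up front; the pushes below never trigger a reallocation.
    let mut v: Vec<u64> = Vec::with_capacity(n);
    let cap_before = v.capacity();
    for i in 0..n {
        v.push(i as u64);
    }
    assert_eq!(v.capacity(), cap_before); // capacity never had to grow

    // If the Vec already exists, `reserve` does the same job.
    let mut existing: Vec<u64> = Vec::new();
    existing.reserve(n);
    for i in 0..n {
        existing.push(i as u64);
    }
}
```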
MATLAB likes to pick the smallest available spot in memory to store a list, so for loops that increase the size of a matrix, it’s recommended to preallocate the space using a matrix full of zeros!
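Same idea sketched in Rust, since that's the language that came up earlier in the thread (dimensions made up for illustration): allocate the zero-filled buffer once at its final size and write into it by index, instead of growing it inside the loop.

```rust
fn main() {
    let (rows, cols) = (1_000, 1_000); // made-up dimensions for illustration

    // Rough analogue of MATLAB's zeros(rows, cols): a single allocation at
    // the final size, stored in row-major order.
    let mut matrix = vec![0.0f64; rows * cols];

    // Fill in place by index; the buffer never has to grow mid-loop.
    for r in 0..rows {
        for c in 0..cols {
            matrix[r * cols + c] = (r + c) as f64;
        }
    }

    println!("last element: {}", matrix[rows * cols - 1]);
}
```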
Is that churn or chum? (RN or M)
Churm