I spin up a lot of Docker containers with large data sets locally.
Developer here. Completely depends on your workflow.
I went base model and the only thing I regret is not getting more RAM.
Speeds have been phenomenal when the binaries are native. Speeds have been good when the binaries are running through Rosetta.
The specs you’re wavering between are extremely workflow specific. You know if your workflow requires the 16 extra GPU cores. You know if your workflow requires another 64 GB of RAM.
Raising the standard enables new uses of technology.
Serious. I installed VSCodium today.
I didn’t realize VS Code is open source. Good to know!
Use two providers on different networks. They can fill in the gaps for each other.
Not just performance; I can’t imagine it would be good for five drives of a volume to go missing if a single cable fails.
I’m wondering if I can move my four current drives into the DX517 and save the volume. Can I just move the drives around without consequence?
I’m starting to think a second NAS full of SSDs would be the best way to host home lab applications, instead of trying to make it work with my current NAS and an expansion unit.
I added 16 GB to mine. It was recognized without me doing anything special. I run about 15 Docker containers in addition to the normal Synology suite. I only end up using about 3 GB of RAM, but I don’t mind having 17 GB available for paging.
Chris from Mr. Beast is non-binary.
Link from below:
https://edition.cnn.com/2023/04/15/entertainment/mr-beast-transphobia-chris-tyson-trnd/index.html
Yet the same people still expect meteorologists to use the same science to predict the weather for them.
I highly recommend storing your DB and pictrs directories on an SSD volume.
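If you’re running Lemmy via Docker Compose, one way to do that is to bind-mount the Postgres data directory and the pictrs store onto an SSD path. A minimal sketch, assuming a compose-based deployment; the `/mnt/ssd` mount point and image tags here are illustrative, so adapt them to your own setup:

```yaml
# Hypothetical compose fragment: keep the database and pictrs data
# on an SSD volume (here assumed to be mounted at /mnt/ssd).
services:
  postgres:
    image: postgres:15-alpine
    volumes:
      - /mnt/ssd/lemmy/postgres:/var/lib/postgresql/data
  pictrs:
    image: asonix/pictrs:0.4
    volumes:
      - /mnt/ssd/lemmy/pictrs:/mnt
```

If the instance is already running, stop the stack, copy the existing directories onto the SSD volume preserving ownership, then point the bind mounts at the new paths before starting it again.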
macOS
So far so good on my little one user instance as well.
Exactly what you said. It has always been about control.