I've read that Parallels Desktop supports eGPUs now. Does anyone know if this is an option for machine learning development? I'd like to do TensorFlow development on my MacBook Pro 16" in a Linux virtual machine that's making use of an Nvidia card connected as an eGPU. Is this possible? Thanks!
Parallels virtual machines cannot directly access an eGPU, so it isn't possible to use one from inside a VM.
I've tried something similar. Parallels does support eGPUs on the macOS host, but it doesn't do native GPU passthrough to Linux VMs, so the Nvidia card is never exposed to the guest OS and CUDA won't work inside the VM. For ML workloads, macOS with Parallels and an Nvidia eGPU isn't a good setup. If you need CUDA, it's better to run Linux natively or use a different machine with proper Nvidia drivers.
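If you want to confirm this yourself, here's a quick sanity check (assuming TensorFlow 2.x is installed in the Linux guest): if Parallels actually passed the eGPU through with working CUDA drivers, TensorFlow would list it as a physical device. Inside a Parallels VM you'll get an empty list.

```python
# Sanity check: run inside the Linux guest to see whether TensorFlow
# can find a usable CUDA device. In a Parallels VM this prints [].
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print("Visible GPUs:", gpus)  # expected inside the VM: []
```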
No -- using Parallels Desktop for Mac with an eGPU will not give you the kind of GPU access you'd need to reliably do serious machine learning (e.g. TensorFlow with an NVIDIA card) in a Linux VM.