Machine Learning with eGPU

Discussion in 'Linux Virtual Machine' started by JacobK7, Nov 5, 2020.

  1. JacobK7

    JacobK7 Bit poster

    Messages:
    1
    I've read that Parallels Desktop supports eGPUs now. Does anyone know if this is an option for machine learning development? I'd like to do TensorFlow development on my MacBook Pro 16" in a Linux virtual machine that's making use of an Nvidia card connected as an eGPU.

    Is this possible?

    Thanks!
     
  2. Ajith1

    Ajith1 Parallels Support

    Messages:
    2,534
    Parallels virtual machines cannot directly access an eGPU, so it is not possible to use one within a virtual machine.
     
  3. KritikaR1

    KritikaR1

    Messages:
    1
    I've tried something similar. Parallels does support eGPUs, but it doesn't offer GPU passthrough to Linux VMs: the guest only sees Parallels' virtual display adapter, so CUDA never sees the Nvidia card inside the VM. For ML workloads, macOS with Parallels and an Nvidia eGPU isn't a good fit. If you need CUDA, it's better to run Linux natively, or use a different machine with proper Nvidia drivers.
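    If you want to confirm this from inside the Linux guest yourself, a minimal check looks like this (assuming TensorFlow 2.x is installed in the VM):

        # If Parallels exposed a CUDA-capable GPU to the guest, TensorFlow would
        # list it here. With the Parallels virtual display adapter, the list is
        # empty and TensorFlow silently falls back to the CPU.
        import tensorflow as tf

        print("Built with CUDA:", tf.test.is_built_with_cuda())
        gpus = tf.config.list_physical_devices("GPU")
        print("Visible GPUs:", gpus if gpus else "none (CPU only)")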
     
  4. No -- using Parallels Desktop for Mac with an eGPU will not give you the kind of GPU access you'd need to reliably do serious machine learning (e.g. TensorFlow with an Nvidia card) in a Linux VM.
     
