
The server is running Proxmox VE. My goal is to use any GPU in a VM, so I blacklisted nvidia, nouveau, radeon, and amdgpu to make sure all GPUs are free to be bound to the VFIO driver. I've added all the IDs from lspci -vnn to /etc/modprobe.d/vfio-pcie.conf.
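
For reference, a minimal sketch of what those files could look like (the PCI IDs below are placeholders; substitute the vendor:device pairs from your own lspci -nn output):

In /etc/modprobe.d/blacklist.conf

blacklist nvidia
blacklist nouveau
blacklist radeon
blacklist amdgpu

In /etc/modprobe.d/vfio-pcie.conf

options vfio-pci ids=10de:1b81,10de:10f0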

In /etc/modules

vfio
vfio_iommu_type1
vfio_pci

In /etc/default/grub

#--snip--
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash amd_iommu=on"
#--snip--

I've regenerated my initramfs and GRUB configuration.
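
On a Debian-based Proxmox VE install, that usually means the following two commands (if the system boots via systemd-boot rather than GRUB, proxmox-boot-tool refresh covers both steps):

update-initramfs -u -k all
update-grub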

The OS is installed in EFI boot mode.

Now here is the problem causing my confusion: if I unplug my DVI cable before booting, all my GPUs work inside any VM; if the DVI cable stays plugged in, the host OS seems to grab the primary GPU, and afterwards I'm unable to use that GPU in any configuration.

I've tried adding video=efifb:off to /etc/default/grub, but with no success.
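
Presumably that parameter belongs on the same kernel command line as above, i.e. roughly:

#--snip--
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash amd_iommu=on video=efifb:off"
#--snip--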

I've tried unbinding the primary GPU on boot after the wait-quit.service, but this didn't resolve anything.
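
The unbind itself is a sysfs write of roughly this form (0000:0a:00.0 is a placeholder for the primary GPU's PCI address from lspci):

echo 0000:0a:00.0 > /sys/bus/pci/devices/0000:0a:00.0/driver/unbind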

Any help is appreciated.

1 Answer


While checking the hint from @NikitaKipriyanov, I noticed a load_video statement in my GRUB config.

So I removed it, along with all the echo statements, and was then able to use all GPUs in VMs.
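
For context, a generated entry in /boot/grub/grub.cfg (templated by /etc/grub.d/10_linux) looks roughly like the sketch below; the kernel version and root device are placeholders. The removed lines were load_video, which loads GRUB's EFI video modules, and the echo statements that print boot messages; both produce console output before the kernel takes over:

In /boot/grub/grub.cfg

menuentry 'Proxmox VE GNU/Linux' {
        load_video
        insmod gzio
        #--snip--
        echo    'Loading Linux 5.15.30-2-pve ...'
        linux   /boot/vmlinuz-5.15.30-2-pve root=/dev/mapper/pve-root ro quiet splash amd_iommu=on
        echo    'Loading initial ramdisk ...'
        initrd  /boot/initrd.img-5.15.30-2-pve
}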