
Workaround broken PCIe speeds on Intel Arc #344

Open · wants to merge 3 commits into master
Conversation

Steve-Tech
Contributor

@Steve-Tech Steve-Tech commented Dec 14, 2024

Hi again,

Intel Arc Alchemist has a weird (hardware?) bug where the card and one of its two PCIe bridges report their link speed as PCIe 1.0 x1. This PR implements a workaround that gets the correct speed from the card's first PCIe bridge.

e.g. lspci -tv:

-[0000:00]-+-01.1-[01-04]----00.0-[02-04]--+-01.0-[03]----00.0  Intel Corporation DG2 [Arc A770]
                                           \-04.0-[04]----00.0  Intel Corporation DG2 Audio Controller
             ^ Correct       ^ Incorrect     ^ Incorrect
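One way such a workaround can be sketched (this is a minimal illustration, not nvtop's actual code; `bridge_link_speed_path` and `read_bridge_link_speed` are hypothetical helpers, and the only assumption is the standard Linux sysfs PCI layout, where `<device>/..` resolves to the upstream bridge) is to read `current_link_speed` from a bridge above the device instead of from the device itself:

```c
/* Minimal sketch, assuming the standard Linux sysfs PCI layout
 * (/sys/bus/pci/devices/<domain:bus:dev.fn>/); not nvtop's actual
 * implementation. */
#include <stdio.h>
#include <string.h>

/* Hypothetical helper: build the path to the upstream bridge's
 * current_link_speed attribute. In sysfs, "<device>/.." resolves to
 * the bridge the device hangs off. */
static void bridge_link_speed_path(const char *dev, char *buf, size_t len) {
    snprintf(buf, len, "%s/../current_link_speed", dev);
}

/* Read a string like "16.0 GT/s PCIe" from the bridge above the
 * device; returns 0 on success, -1 on failure. */
static int read_bridge_link_speed(const char *dev, char *out, size_t len) {
    char path[512];
    bridge_link_speed_path(dev, path, sizeof path);
    FILE *f = fopen(path, "r");
    if (!f)
        return -1;
    if (!fgets(out, (int)len, f)) {
        fclose(f);
        return -1;
    }
    out[strcspn(out, "\n")] = '\0'; /* strip trailing newline */
    fclose(f);
    return 0;
}
```

For the topology shown above, the bridge reporting the correct speed sits further up the chain than the GPU's immediate parent, so a real implementation would walk more than one level.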

I have checked a few probes on the Linux Hardware Database to verify that it isn't just my system or my card. There is also a bug report on Intel's website with some info.

I have no idea if this is fixed in Battlemage.

I've also added Intel to the libdrm detection in src/CMakeLists.txt, and moved the AMDGPU and MSM support checks out of the libdrm detection block, since FATAL_ERROR causes CMake to exit anyway. I hope this is okay.

Thanks,
Steve

@Syllo
Owner

Syllo commented Dec 30, 2024

Sounds reasonable, but I'd rather only enable this for Alchemist (Arc) devices for now.

Looking at Intel's hardware table, it seems that all the Arc PCI IDs start with "0x56", so doing a check like if ((dev_id & 0xff00) == 0x5600) before entering your workaround would be better imho.

@Steve-Tech
Contributor Author

Yeah, good point. It also looks like the PCI bridges are usually '4fa0' and '4fa4', so I can check those too. I'll definitely implement this sometime after next week!
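The two gating checks discussed above could be sketched like this (a sketch only; the ID values come from the comments, and neither function name is from the actual PR):

```c
/* Sketch of the device-gating logic discussed in this thread; ID
 * values are taken from the comments (Alchemist GPU IDs 0x56xx,
 * DG2 bridge IDs 0x4fa0 and 0x4fa4). Function names are hypothetical. */
#include <stdbool.h>
#include <stdint.h>

/* All Arc Alchemist PCI device IDs start with 0x56. Note the
 * parentheses: in C, == binds tighter than &. */
static bool is_intel_alchemist(uint16_t dev_id) {
    return (dev_id & 0xff00) == 0x5600;
}

/* Upstream bridge IDs usually seen with these cards. */
static bool is_dg2_bridge(uint16_t dev_id) {
    return dev_id == 0x4fa0 || dev_id == 0x4fa4;
}
```

Gating on both the GPU ID and the bridge IDs keeps the workaround from touching unrelated hardware that happens to sit behind a misreporting bridge.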
