
Build failure: amdvlk #216294

Closed · toastal opened this issue Feb 14, 2023 · 15 comments · Fixed by #216796

Labels
0.kind: build failure A package fails to build

Comments

toastal (Contributor) commented Feb 14, 2023

Steps To Reproduce

Similar to what Hydra is seeing now, I'm having issues building amdvlk for i686.

Build log

https://hydra.nixos.org/build/209322760

Additional context

This is needed on x86_64 to get the 32-bit drivers that run Steam and other older games. I've been building from master for months while I wait for #177623 + #187303, because I need to turn off the broken-build flag. Maybe one day those will get reviewed and merged and I can go back to unstable.

Notify maintainers

@Flakebi

Metadata

Please run nix-shell -p nix-info --run "nix-info -m" and paste the result.

nix-shell -p nix-info --run "nix-info -m"
 - system: `"x86_64-linux"`
 - host os: `Linux 6.1.11, NixOS, 23.05 (Stoat), 23.05.20230213.dirty`
 - multi-user?: `yes`
 - sandbox: `yes`
 - version: `nix-env (Nix) 2.13.2`
 - channels(toastal): `""`
 - nixpkgs: `/nix/var/nix/profiles/per-user/root/channels/nixos`
toastal added the 0.kind: build failure label on Feb 14, 2023
Flakebi (Member) commented Feb 14, 2023

I bisected it to commit cb57273 ("stdenv: gcc_11 → gcc_12").
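For anyone wanting to reproduce this, a bisect like that can be run roughly as follows from a nixpkgs checkout (a sketch; <good-rev> is a placeholder for a known-good commit, and each step rebuilds a large part of the world, so it is slow):

# Mark cb57273 bad and a placeholder <good-rev> good, then let git drive nix-build;
# nix-build exits non-zero on a failed build, which `git bisect run` treats as "bad".
git bisect start cb57273 <good-rev>
git bisect run nix-build -A driversi686Linux.amdvlk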

toastal (Contributor, Author) commented Feb 14, 2023

What is "it"? I'm not sure what you mean or what I should do/try with master.

Flakebi (Member) commented Feb 14, 2023

That was just a note that cb57273 introduced that regression. driversi686Linux.amdvlk builds fine before that commit, but does not build with that commit. It would probably build if you revert that commit, but you’d have to rebuild the world, so I can’t recommend that.
So, no fix at this point. You can switch to a commit before cb57273, if you need a working 32-bit amdvlk.
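If you pin nixpkgs, that looks roughly like this (a sketch; <rev> is a placeholder for a commit before cb57273):

# default.nix sketch — import nixpkgs at a pinned pre-cb57273 revision
# and build the 32-bit amdvlk from it.
let
  pinned = import (builtins.fetchTarball
    "https://github.com/NixOS/nixpkgs/archive/<rev>.tar.gz") { };
in
pinned.driversi686Linux.amdvlk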

cc @fabianhjr

andresilva (Member) commented:

GPUOpen-Drivers/gpurt#5 (comment)

I don't know enough about amdvlk to be able to do anything with this information.

mpasternacki (Contributor) commented:

The comment in gpurt doesn't help much: LTO is disabled by default in directx-shader-compiler, and the other CMake option is already set.

I noticed that at least in my case dxc is segfaulting when building 32-bit amdvlk:

[Feb17 10:38] dxc[2850619]: segfault at 0 ip 00000000f6cefad1 sp 00000000ffff1984 error 4 in libdxcompiler.so.3.7[f614c000+184e000] likely on CPU 10 (core 13, socket 0)
[  +0.000017] Code: ff 83 ec 0c 89 46 3c 8b 46 04 89 3e 83 e0 03 09 f8 89 46 04 0f b6 85 47 fe ff ff 88 46 40 8b 85 7c fe ff ff 8d 98 9c 01 00 00 <8b> 07 57 ff 50 10 83 c4 0c 50 53 ff b5 20 fe ff ff 8b 9d 88 fe ff
[  +0.906522] dxc[2850797]: segfault at 0 ip 00000000f6cefad1 sp 00000000ffff2d54 error 4 in libdxcompiler.so.3.7[f614c000+184e000] likely on CPU 1 (core 1, socket 0)
[  +0.000016] Code: ff 83 ec 0c 89 46 3c 8b 46 04 89 3e 83 e0 03 09 f8 89 46 04 0f b6 85 47 fe ff ff 88 46 40 8b 85 7c fe ff ff 8d 98 9c 01 00 00 <8b> 07 57 ff 50 10 83 c4 0c 50 53 ff b5 20 fe ff ff 8b 9d 88 fe ff
[  +0.026606] dxc[2850803]: segfault at 0 ip 00000000f6cefad1 sp 00000000ffff2d44 error 4 in libdxcompiler.so.3.7[f614c000+184e000] likely on CPU 4 (core 5, socket 0)
[  +0.000016] Code: ff 83 ec 0c 89 46 3c 8b 46 04 89 3e 83 e0 03 09 f8 89 46 04 0f b6 85 47 fe ff ff 88 46 40 8b 85 7c fe ff ff 8d 98 9c 01 00 00 <8b> 07 57 ff 50 10 83 c4 0c 50 53 ff b5 20 fe ff ff 8b 9d 88 fe ff
[  +0.675123] dxc[2850883]: segfault at 0 ip 00000000f6cefad1 sp 00000000ffff2d24 error 4 in libdxcompiler.so.3.7[f614c000+184e000] likely on CPU 9 (core 12, socket 0)
[  +0.000017] Code: ff 83 ec 0c 89 46 3c 8b 46 04 89 3e 83 e0 03 09 f8 89 46 04 0f b6 85 47 fe ff ff 88 46 40 8b 85 7c fe ff ff 8d 98 9c 01 00 00 <8b> 07 57 ff 50 10 83 c4 0c 50 53 ff b5 20 fe ff ff 8b 9d 88 fe ff

I don't understand graphics-driver plumbing or what's going on in that build, but thanks to @Flakebi's bisect I tried building just the 32-bit directx-shader-compiler with an older GCC, and it builds now (this means building gcc11 itself as well, but that's still better than a failing build). Here's the overlay:

final: prev: {
  # Extend only the 32-bit (i686) package set: build directx-shader-compiler
  # with the GCC 11 stdenv instead of the default GCC 12 one.
  pkgsi686Linux = prev.pkgsi686Linux.extend (pfinal: pprev: {
    directx-shader-compiler = pprev.directx-shader-compiler.override { stdenv = pfinal.gcc11Stdenv; };
  });
}
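To apply it system-wide, something along these lines should work (a sketch; ./dxc-gcc11.nix is just whatever file the overlay above is saved as):

# configuration.nix sketch — register the overlay so the 32-bit package set
# picks up the gcc11-built directx-shader-compiler.
{
  nixpkgs.overlays = [ (import ./dxc-gcc11.nix) ];
}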

pshirshov (Contributor) commented:

I experience the same problem:

ninja: build stopped: subcommand failed.
error: builder for '/nix/store/cb4pmz31hb8jd30l5qh9ny14cfwdg4bj-amdvlk-2022.Q4.4.drv' failed with exit code 1;
       last 10 log lines:
       > [196/2595] Building CXX object compiler/llpc/llvm/lib/Support/CMakeFiles/LLVMSupport.dir/Error.cpp.o
       > [197/2595] Building CXX object compiler/llpc/llvm/lib/Support/CMakeFiles/LLVMSupport.dir/FileOutputBuffer.cpp.o
       > [198/2595] Building CXX object compiler/llpc/llvm/lib/Support/CMakeFiles/LLVMSupport.dir/ELFAttributeParser.cpp.o
       > [199/2595] Building CXX object compiler/llpc/llvm/lib/Support/CMakeFiles/LLVMSupport.dir/FileUtilities.cpp.o
       > [200/2595] Building CXX object compiler/llpc/llvm/lib/Support/CMakeFiles/LLVMSupport.dir/FileCollector.cpp.o
       > [201/2595] Building CXX object compiler/llpc/llvm/lib/Support/CMakeFiles/LLVMSupport.dir/APInt.cpp.o
       > [202/2595] Building CXX object compiler/llpc/llvm/lib/Demangle/CMakeFiles/LLVMDemangle.dir/ItaniumDemangle.cpp.o
       > [203/2595] Building CXX object compiler/llpc/llvm/lib/Support/CMakeFiles/LLVMSupport.dir/APFloat.cpp.o
       > [204/2595] Building CXX object compiler/llpc/llvm/lib/Support/CMakeFiles/LLVMSupport.dir/CommandLine.cpp.o
       > ninja: build stopped: subcommand failed.
       For full logs, run 'nix log /nix/store/cb4pmz31hb8jd30l5qh9ny14cfwdg4bj-amdvlk-2022.Q4.4.drv'.
error: 1 dependencies of derivation '/nix/store/cjavca5yx86wmkkjljawifh62h82lacq-opengl-drivers-32bit.drv' failed to build

Lillecarl (Contributor) commented:

amdvlk is enabled by default if you're using nixos-hardware AMD modules, and figuring that out isn't entirely obvious for newer NixOS users.

mpasternacki (Contributor) commented:

> amdvlk is enabled by default if you're using nixos-hardware AMD modules, and figuring that out isn't entirely obvious for newer NixOS users.

Not since NixOS/nixos-hardware#558

pshirshov (Contributor) commented:

> amdvlk is enabled by default

Not sure what exactly you are talking about, but this problem only happens with 32-bit amdvlk. The 64-bit one builds/works just fine.
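For context: driversi686Linux.amdvlk is normally only built when it has been added to the 32-bit driver packages, along these lines (a sketch of a typical configuration):

# NixOS configuration sketch — this combination is what triggers
# a build of the failing 32-bit package.
{ pkgs, ... }:
{
  hardware.opengl.extraPackages = [ pkgs.amdvlk ];
  hardware.opengl.extraPackages32 = [ pkgs.driversi686Linux.amdvlk ];
}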

cid-chan (Contributor) commented:

Steam enables 32-bit support here:

driSupport32Bit = true;
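On the user-facing side this means that enabling Steam is enough to pull in the 32-bit driver builds (a sketch; the comment describes the default the module sets):

# NixOS configuration sketch — programs.steam.enable defaults
# hardware.opengl.driSupport32Bit to true, so 32-bit graphics packages
# (including driversi686Linux.amdvlk when amdvlk is configured) get built.
{
  programs.steam.enable = true;
}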

SFrijters (Member) commented:

Overriding the compiler is no longer necessary as of #222072.
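To check whether the fix has reached your channel, a build along these lines should suffice:

nix-build '<nixpkgs>' -A driversi686Linux.amdvlk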

toastal (Contributor, Author) commented Mar 23, 2023

32-bit is broken again now for me.

Output log:
Compiling BuildQBVH.hlsl:BuildQBVHCollapse -> BuildQBVHCollapse_spv.h...
Subprocess invocation failed. Error code -11. Additional details follow:
Environment: {'SHELL': '/nix/store/ksdblan990svayw70nqwcbqpjih8kivy-bash-5.2-p15/bin/bash', '_PYTHON_HOST_PLATFORM': 'linux-i686', 'NIX_BUILD_CORES': '16', 'configureFlags': '', 'CTEST_OUTPUT_ON_FAILURE': '1', 'me>
Command arguments: ['dxc', '-fspv-target-env=vulkan1.1', '-spirv', '-E', 'BuildQBVHCollapse', '-DAMD_VULKAN', '-DAMD_VULKAN_DXC', '-DAMD_VULKAN_SPV', '-DGPURT_BUILD_RTIP2', '-fvk-use-scalar-layout', '-Od', '-Vd', >
Expected empty output: False
Working directory: /build/amdvlk-src/build/gpurt/src/pipelines/spv/BuildQBVHCollapse
Stdout:

Stderr:

Failed to compile Vulkan shader config ShaderConfig< Path: BuildQBVH.hlsl, EntryPoint: BuildQBVHCollapse, OutputName: None, BaseLogicalId: None, RootSignaturePath: None, Defines: None, GroupTag: BVH >
Compilation failed for shader InitScanExclusiveInt4DLB
Output log:
Compiling ScanExclusiveInt4DLB.hlsl:InitScanExclusiveInt4DLB -> InitScanExclusiveInt4DLB_spv.h...
Subprocess invocation failed. Error code -11. Additional details follow:
Environment: {'SHELL': '/nix/store/ksdblan990svayw70nqwcbqpjih8kivy-bash-5.2-p15/bin/bash', '_PYTHON_HOST_PLATFORM': 'linux-i686', 'NIX_BUILD_CORES': '16', 'configureFlags': '', 'CTEST_OUTPUT_ON_FAILURE': '1', 'me>
Command arguments: ['dxc', '-fspv-target-env=vulkan1.1', '-spirv', '-E', 'InitScanExclusiveInt4DLB', '-DAMD_VULKAN', '-DAMD_VULKAN_DXC', '-DAMD_VULKAN_SPV', '-DGPURT_BUILD_RTIP2', '-fvk-use-scalar-layout', '-Od', >
Expected empty output: False
Working directory: /build/amdvlk-src/build/gpurt/src/pipelines/spv/InitScanExclusiveInt4DLB
Stdout:

Stderr:

Failed to compile Vulkan shader config ShaderConfig< Path: RadixSort/ScanExclusiveInt4DLB.hlsl, EntryPoint: InitScanExclusiveInt4DLB, OutputName: None, BaseLogicalId: None, RootSignaturePath: None, Defines: None, >
Launching threads for raytracing shader compilation...
Raytracing shader compilation completed with no errors.
Launching threads for raytracing shader compilation...

toastal (Contributor, Author) commented Mar 23, 2023

final: prev: {
  pkgsi686Linux = prev.pkgsi686Linux.extend (pfinal: pprev: {
    directx-shader-compiler = pprev.directx-shader-compiler.override { stdenv = pfinal.gcc11Stdenv; };
  });
}

@mpasternacki's overlay is still working, however.

toastal reopened this on Mar 23, 2023
vcunat closed this as completed in 26f5517 on Mar 23, 2023
SFrijters (Member) commented:

@toastal Weird; I tested building driversi686Linux.amdvlk without the overlay and that worked for me, and nixpkgs-review passed as well. Do you have any idea where the difference might come from?

toastal (Contributor, Author) commented Mar 23, 2023 via email
