Feature Request: Allow WCR AI features on non-Copilot+ hardware (GPUs) #5126

Open

whiskhub opened this issue Feb 8, 2025 · 1 comment

whiskhub commented Feb 8, 2025

Currently most (if not all?) Windows Copilot Runtime features require a Copilot+ PC both for the developer and the user.

This means the usability of the APIs is greatly limited, as the overwhelming majority of users do not have such a device, so the incentive for developers to use these APIs is unfortunately quite low.
This, on top of the other restrictions such as OS version and packaging requirements, will hamper adoption even if the APIs have great features and usability ("UWP 2.0").

Therefore it would be a big deal if WCR also supported non-Copilot+ hardware, such as GPUs in x64 systems.

NVIDIA GPUs in particular have the appropriate power and wide adoption, and Intel and AMD GPUs are also widespread. They clearly would not provide the same power/performance as the NPUs in Copilot+ chips, but this trade-off could be made transparent to developers and users.

@BenJKuhn
Member

We appreciate the input and are paying attention to the community's feedback in planning future work. I can't commit to anything at this time but keep the upvotes coming.

It is impressive what can be done on modern high-end GPUs. If you want to get started today with capabilities similar to WCR on the GPU, I encourage you to explore the existing options for integrating the open versions of some of these models (like Phi: https://azure.microsoft.com/en-us/products/phi/). You can also find CUDA-tuned, optimized versions of these models on Hugging Face (e.g. https://huggingface.co/microsoft/Phi-3-medium-4k-instruct-onnx-cuda).
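
As a rough sketch (not an official WCR sample), assuming the `onnxruntime-genai` Python package and one of the CUDA model folders downloaded from that Hugging Face repo, local generation might look roughly like this; the folder name is a placeholder and exact API names can vary between `onnxruntime-genai` versions:

```python
# Hypothetical sketch: run a CUDA-optimized Phi-3 ONNX model locally with
# onnxruntime-genai. The model folder below is a placeholder for the directory
# downloaded from the Hugging Face repo; API details may differ by version.
import onnxruntime_genai as og

model = og.Model("Phi-3-medium-4k-instruct-onnx-cuda/cuda-int4-rtn-block-32")
tokenizer = og.Tokenizer(model)
stream = tokenizer.create_stream()

# Phi-3 instruct chat template (single turn).
prompt = "<|user|>\nSummarize this paragraph for me.<|end|>\n<|assistant|>"
input_tokens = tokenizer.encode(prompt)

params = og.GeneratorParams(model)
params.set_search_options(max_length=512)
params.input_ids = input_tokens

generator = og.Generator(model, params)
while not generator.is_done():
    generator.compute_logits()
    generator.generate_next_token()
    # Stream each new token to the console as it is produced.
    print(stream.decode(generator.get_next_tokens()[0]), end="", flush=True)
print()
```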

You won't find free versions of everything in WCR on Azure or Hugging Face, but there are some great models available for both local and cloud-based AI capabilities if you can't scope your work to Copilot+ PCs.

Thanks,

Ben
