
Update device selection explainer #824

Open · wants to merge 2 commits into main
Conversation

zolkis (Collaborator) commented Feb 26, 2025

Updated the device selection explainer with the latest discussions.
Fixed links that refer to past spec versions.

zolkis requested review from anssiko and fdwr on February 26, 2025 at 20:57
> about the likely need for a caller to know whether a particular device is supported or not, because an app may want to use a different, more performant fallback (if, say, GPU is not supported) rather than have WebNN silently fall back to CPU. For example, if GPU was unavailable (even though you preferred high performance), it might be faster to execute the model with WebGPU shaders than with WebNN CPU, or it might be fine to use CPU, but the app could load a different, more CPU-friendly model if it knew that was the case.

That sparked a discussion in Query mechanism for supported devices (#815).
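To make the quoted fallback concrete, here is a minimal sketch. It assumes an app-level probe via WebGPU's `navigator.gpu.requestAdapter()` as a rough proxy for GPU availability, since WebNN itself does not expose which device will be used (the gap under discussion). The function name `pickExecutionPath` and the model-selection strategy are illustrative only, not anything proposed in the explainer.

```js
// Sketch of the app-level fallback described above.
// Assumption: a WebGPU adapter is used as a rough proxy for "a GPU is usable";
// this is not a WebNN guarantee, which is exactly the gap under discussion.
async function pickExecutionPath() {
  const adapter = navigator.gpu ? await navigator.gpu.requestAdapter() : null;

  if (adapter) {
    // A GPU appears to be available: ask WebNN for a high-performance context
    // and use the GPU-tuned model. Note WebNN may still fall back to CPU silently.
    const context = await navigator.ml.createContext({
      powerPreference: "high-performance",
    });
    return { backend: "webnn", model: "gpu-tuned", context };
  }

  // No usable GPU: rather than letting a GPU-tuned model silently run on CPU,
  // load a different, more CPU-friendly model.
  const context = await navigator.ml.createContext({
    powerPreference: "low-power",
  });
  return { backend: "webnn", model: "cpu-friendly", context };
}
```

A query mechanism like the one discussed in #815 would let the app make this choice directly instead of relying on a WebGPU probe.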
Member

I'm not sure about the value of including this specific idea - while it garnered a lot of interest, it is (1) not clear it's actually implementable on all platforms and (2) the proposed output is very ambiguous.

Capturing the use cases from #815 makes a lot of sense, though!

Member

This use case should be added here: #815 (comment)

When we go deeper into the solution space a bit later (hoping we're able to extract some more use cases!), querySupport could go into considered alternatives along with a description of its known issues.
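For context, a possible shape of the querySupport idea is sketched below. This is hypothetical only; querySupport is an idea from #815, not a WebNN API, and the option and result names are assumptions for illustration. It also hints at why the output is considered ambiguous.

```js
// Hypothetical only — querySupport is an idea from #815, not a WebNN API.
// The option and result names below are illustrative assumptions.
const support = await navigator.ml.querySupport({ deviceType: "gpu" });
// The ambiguity noted above: does `supported: true` mean "a GPU exists",
// "this graph will run on the GPU", or "some ops may run on the GPU"?
if (support.supported) {
  // Build and dispatch the GPU-tuned graph...
}
```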

zolkis (Collaborator, Author)

OK, I will add the other use cases and reformulate the specific idea as an example of a starting point that could be explored next.

