Update device selection explainer #824
Conversation
Signed-off-by: Zoltan Kis <[email protected]>
> about the likely need for a caller to know whether a particular device is supported or not, because an app may want to (if, say, GPU is not supported) use a different, more performant fallback rather than have WebNN silently fall back to CPU. For example, if GPU was unavailable (even though you preferred high performance), it might be faster to execute the model with WebGPU shaders than with WebNN on CPU; or CPU might be acceptable, but the app could load a different, more CPU-friendly model if it knew that was the case.
>
> That sparked a discussion in "Query mechanism for supported devices" (#815).
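To make that scenario concrete, here is a rough sketch of what the fallback decision could look like from the app's side. `queryDeviceSupport()` is a hypothetical stand-in for the query mechanism proposed in #815 (nothing like it exists in WebNN today), and the `navigator.ml` / `navigator.gpu` shapes are simplified ambient declarations so the snippet stands alone:

```ts
// Simplified ambient shapes so this sketch stands alone; a real page would use
// the WebNN and WebGPU IDL. Everything here is illustrative only.
declare const navigator: {
  ml: {
    createContext(options?: {
      powerPreference?: "default" | "high-performance" | "low-power";
    }): Promise<unknown>;
  };
  gpu?: { requestAdapter(): Promise<unknown | null> };
};

// Hypothetical query in the spirit of #815 -- not part of WebNN today.
declare function queryDeviceSupport(): Promise<Array<"cpu" | "gpu" | "npu">>;

// Choose an execution path following the fallback logic in the quoted text.
async function pickExecutionPath() {
  const supported = await queryDeviceSupport();

  if (supported.includes("gpu")) {
    // GPU is usable by WebNN: ask for a high-performance context and run there.
    const context = await navigator.ml.createContext({
      powerPreference: "high-performance",
    });
    return { backend: "webnn-gpu", context } as const;
  }

  // GPU is not usable by WebNN: custom WebGPU shaders may still beat WebNN on CPU.
  const adapter = await navigator.gpu?.requestAdapter();
  if (adapter) {
    return { backend: "webgpu-shaders", adapter } as const;
  }

  // Last resort: accept the CPU path, but swap in a smaller, more CPU-friendly
  // model variant instead of pushing the GPU-sized model through WebNN on CPU.
  const context = await navigator.ml.createContext();
  return { backend: "webnn-cpu", context, modelVariant: "cpu-optimized" } as const;
}
```

The exact API shape is not the point; the point is that the app can only make this kind of choice if it can learn, before building the graph, which devices the implementation would actually use.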
I'm not sure about the value of including this specific idea: while it garnered a lot of interest, (1) it is not clear it's actually implementable on all platforms, and (2) the proposed output is very ambiguous.
Capturing the use cases from #815 makes a lot of sense, though!
This use case should be added here: #815 (comment)
When we go deeper into the solution space a bit later (hoping we're able to extract some more use cases!), querySupport could go into the considered alternatives along with a description of its known issues.
OK, I will add the other use cases and reformulate the specific idea as an example of a starting point that could be explored next.
Signed-off-by: Zoltan Kis <[email protected]>
Updated the device selection explainer with the latest discussions.
Fixed links that refer to past spec versions.