
Support SPMD placeholder tensors #9489


Draft · wants to merge 1 commit into master from rpsilva_spmd_placeholder

Conversation

rpsilva-aws (Collaborator)

@rpsilva-aws rpsilva-aws commented Jul 17, 2025

This PR extends the placeholder feature (#8612) to accommodate sharded tensors for SPMD. It also fixes a typo in the existing binding for collecting placeholder tensors.

Refer to #8785 for the existing survey of how to leverage the object's address as the handle address for placeholder tensors. In addition, we introduce placeholder-specific handling for mark sharding, since mark sharding currently entails an async data transfer. Note that for sharded data, we do not generate PjRtShardedData objects in the BackendData.

This allows users to leverage placeholder tensors to stage computations in their program without invoking any data transfers or allocating PJRT buffers under the hood.
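To illustrate the idea (not the actual torch_xla API — all class and method names below are hypothetical), a sharded placeholder can be modeled as a tensor-like object that records shape, dtype, and an SPMD sharding spec, but never holds a device buffer; marking it sharded then only updates metadata rather than triggering a transfer:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical sketch of the placeholder concept described in this PR.
# These names do not exist in torch_xla; they only model the behavior:
# a placeholder exposes a handle but owns no PJRT buffer, so sharding it
# cannot involve any async data transfer.

@dataclass
class ShardingSpec:
    mesh_shape: Tuple[int, ...]      # e.g. (2, 4) for an 8-device mesh
    partition_dims: Tuple[int, ...]  # tensor dims that are partitioned

@dataclass
class PlaceholderTensor:
    shape: Tuple[int, ...]
    dtype: str
    sharding: Optional[ShardingSpec] = None
    # Deliberately no buffer field: a placeholder has a handle address
    # (here, just the Python object's identity) but no device data.

    def mark_sharding(self, spec: ShardingSpec) -> None:
        # For a placeholder, mark-sharding is pure metadata: nothing is
        # resharded because there is no buffer to move.
        self.sharding = spec

    @property
    def has_device_data(self) -> bool:
        return False  # placeholders never back onto PJRT buffers


# Stage a computation's input shape without moving any data.
x = PlaceholderTensor(shape=(8, 128), dtype="f32")
x.mark_sharding(ShardingSpec(mesh_shape=(2, 4), partition_dims=(0,)))
assert not x.has_device_data
assert x.sharding.partition_dims == (0,)
```

The point of the sketch is the invariant the PR description states: sharding a placeholder must not generate sharded device objects or kick off a data transfer, so only the spec is recorded.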

@rpsilva-aws force-pushed the rpsilva_spmd_placeholder branch from 12c858c to 88aa24f on July 17, 2025 21:18
@rpsilva-aws force-pushed the rpsilva_spmd_placeholder branch 6 times, most recently from e09b12f to 0cdb6fb on July 21, 2025 20:11
@rpsilva-aws force-pushed the rpsilva_spmd_placeholder branch from 0cdb6fb to 3403525 on July 21, 2025 20:11
@rpsilva-aws rpsilva-aws marked this pull request as draft July 22, 2025 16:44
Labels
None yet
Projects
None yet

1 participant