[BugFix] Fix non-deterministic key order in stack #1230
```diff
@@ -1757,7 +1757,7 @@ def _check_keys(
     strict: bool = False,
     include_nested: bool = False,
     leaves_only: bool = False,
-) -> set[str]:
+) -> set[str] | list[str]:
     from tensordict.base import _is_leaf_nontensor

     if not len(list_of_tensordicts):
```
```diff
@@ -1769,27 +1769,29 @@ def _check_keys(
     )
     # TODO: compile doesn't like set() over an arbitrary object
     if is_compiling():
-        keys = {k for k in keys}  # noqa: C416
+        keys_set = {k for k in keys}  # noqa: C416
     else:
-        keys: set[str] = set(keys)
+        keys_set: set[str] = set(keys)

     for td in list_of_tensordicts[1:]:
         k = td.keys(
             include_nested=include_nested,
             leaves_only=leaves_only,
             is_leaf=_is_leaf_nontensor,
         )
         if not strict:
-            keys = keys.intersection(k)
+            keys_set = keys_set.intersection(k)
         else:
             if is_compiling():
                 k = {v for v in k}  # noqa: C416
             else:
                 k = set(k)
-            if k != keys:
+            if k != keys_set:
                 raise KeyError(
                     f"got keys {keys} and {set(td.keys())} which are incompatible"
                 )
-    return keys
+    if strict:
+        return list(keys)
+    return keys_set
```

> **Comment:** Out of curiosity, is it much more efficient using […]
>
> **Reply:** `torch.compile` used to not understand `set()`, that's all. I should check if it's still the case.
> **Comment:** If keys can be exclusive, their order becomes arbitrary.
>
> **Comment:** Out of curiosity, what are the downstream functions that would be impacted by this? In other words, in which context is […]
>
> **Reply:** Yes, when using lazy stacks, iirc.
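The gist of the fix: a Python `set` has no stable iteration order, so returning a set from `_check_keys` made the resulting key order arbitrary, while `list(keys)` preserves the insertion order of the first tensordict's key view. A minimal standalone sketch of that idea, using plain dicts and a hypothetical `check_keys_sketch` helper (not tensordict's actual code):

```python
def check_keys_sketch(dicts, strict=False):
    """Intersect (or, if strict, compare) the key sets of several dicts.

    With ``strict=True``, return the keys as an ordered list taken from the
    first dict, so callers see a deterministic key order; otherwise return
    the (unordered) intersection set, mirroring the shape of the PR's fix.
    """
    keys = dicts[0].keys()   # ordered view of the first dict's keys
    keys_set = set(keys)     # unordered set, used for comparisons
    for d in dicts[1:]:
        if not strict:
            keys_set = keys_set.intersection(d.keys())
        elif set(d.keys()) != keys_set:
            raise KeyError(
                f"got keys {keys_set} and {set(d.keys())} which are incompatible"
            )
    if strict:
        return list(keys)    # deterministic: first dict's insertion order
    return keys_set          # a set's iteration order is arbitrary


d1 = {"foo": 0, "bar": 1}
d2 = {"foo": 2, "bar": 3}
print(check_keys_sketch([d1, d2], strict=True))  # ['foo', 'bar'], every time
```

Since Python 3.7, `dict` (and hence `dict.keys()`) is guaranteed to preserve insertion order, which is what makes the `list(keys)` return deterministic.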
```diff
@@ -1887,6 +1887,20 @@ class MyDataNested:
         ):
             torch.stack([data1, data3], dim=0)

+    def test_stack_keyorder(self):
+
+        class MyTensorClass(TensorClass):
+            foo: Tensor
+            bar: Tensor
+
+        tc1 = MyTensorClass(foo=torch.zeros((1,)), bar=torch.ones((1,)))
+
+        for _ in range(10000):
+            assert list(torch.stack([tc1, tc1], dim=0)._tensordict.keys()) == [
+                "foo",
+                "bar",
+            ]
+
     def test_statedict_errors(self):
         @tensorclass
         class MyClass:
```

> **Comment:** Suggested change: […]
>
> **Reply:** This was on purpose to avoid any artifacts caused by […]
> **Comment:** Not sure if this added space is on purpose.
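A side note on why this class of bug is hard to catch with an in-process loop like the test above: string hashes are randomized per interpreter run (controlled by `PYTHONHASHSEED`), so a set's iteration order is stable within one process but can differ between runs. A hedged sketch, not part of the PR, that makes this visible by spawning subprocesses with different hash seeds (assumes `sys.executable` can be launched from the current environment):

```python
import os
import subprocess
import sys

# The iteration order of a set of strings depends on the per-process hash
# seed, so {"foo", "bar"} can come out in either order across runs.
code = 'print(list({"foo", "bar"}))'

orders = set()
for seed in range(8):
    env = {**os.environ, "PYTHONHASHSEED": str(seed)}
    out = subprocess.run(
        [sys.executable, "-c", code],
        env=env,
        capture_output=True,
        text=True,
    ).stdout.strip()
    orders.add(out)

print(orders)  # both orderings typically appear across different seeds
```

This is consistent with the reviewer's point that looping 10000 times within a single process avoids artifacts from one particular hash seed, while still exercising the order-preserving code path.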