Describe the bug
A PSCollection should contain optimizer states in addition to weights. The optimizer state tensors are obtained directly from the EmbeddingCollection module.
However, `sharded_module.fused_optimizer.state_dict()['state']` does not contain the key `{table_name}.momentum2`, because:
- `TBE::get_optimizer_state()`, which is used by PSCollection, does not return keys like `xxx.momentum1` or `xxx.momentum2`; those state names are customized by TBE.
- The state keys are renamed by `torchrec::EmbeddingFusedOptimizer`: the first state falls back to `xxx.momentum1`, while the remaining keys are copied from the results retrieved above.
See the illustration below, where the optimizer is Adam. The expected number of state tensors is 2, but the state dict eventually exposes only `momentum1` and leaves `momentum2` (synonymously, `exp_avg_sq`) out.
This affects every optimizer that maintains a `momentum2` state.
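The key mismatch can be illustrated with a minimal sketch. Plain dicts stand in for the contents of `sharded_module.fused_optimizer.state_dict()['state']`; the table name `t1` and the helper `missing_momentum_keys` are hypothetical, only the key-naming scheme follows the issue above.

```python
# Hypothetical illustration of the bug described above.
# For Adam, each table should expose two optimizer states
# (exp_avg -> momentum1, exp_avg_sq -> momentum2), but after the
# renaming done by torchrec::EmbeddingFusedOptimizer only the first
# key ("<table>.momentum1") survives in the fused optimizer's state dict.

def missing_momentum_keys(state, table_name, num_states=2):
    """Return the expected per-table momentum keys absent from `state`."""
    expected = [f"{table_name}.momentum{i}" for i in range(1, num_states + 1)]
    return [key for key in expected if key not in state]

# Mimics state_dict()['state'] as observed in the bug report:
observed_state = {"t1.momentum1": "tensor(...)"}

print(missing_momentum_keys(observed_state, "t1"))  # -> ['t1.momentum2']
```

With a correct state dict containing both `t1.momentum1` and `t1.momentum2`, the helper would return an empty list.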