Conversation

@HosseinZaredar

Hi,

There was a small problem with the mask returned by TextTokenizer's forward function. The next function that consumes this mask expects a 2D tensor, so the mask should not be unsqueezed before TextTokenizer returns it.

The problem is fixed in this pull request.
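For illustration, here is a minimal sketch of the shape mismatch and the fix, assuming a PyTorch-style tokenizer whose forward returns a padding mask; the function names, pad_id parameter, and tensor shapes are illustrative assumptions, not the repository's actual code:

```python
import torch

# Hypothetical simplified version of TextTokenizer.forward; names and
# shapes are assumptions for illustration, not the repository's code.
def forward_before_fix(token_ids: torch.Tensor, pad_id: int = 0):
    mask = token_ids != pad_id   # shape: (batch, seq_len) -- 2D
    mask = mask.unsqueeze(1)     # shape: (batch, 1, seq_len) -- now 3D
    return token_ids, mask       # downstream code expecting 2D breaks here

def forward_after_fix(token_ids: torch.Tensor, pad_id: int = 0):
    mask = token_ids != pad_id   # stays 2D: (batch, seq_len)
    return token_ids, mask       # matches what the next function expects

ids = torch.tensor([[5, 7, 0], [3, 0, 0]])
_, bad_mask = forward_before_fix(ids)
_, good_mask = forward_after_fix(ids)
print(bad_mask.shape)   # torch.Size([2, 1, 3])
print(good_mask.shape)  # torch.Size([2, 3])
```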

