
iRoPE (-p > 8192) #4063


Closed

@y-sq wants to merge 1 commit

Conversation

@y-sq commented May 1, 2025

Summary:
When max_seq_len is larger than 8192, one input sample is divided into multiple sequences. For example, with bs = 2 and seqlen = 7, the prefill attention uses seq_lens = [0, 7, 7, 7, 7, 14, 14, 14, 14]. Decoding does not need this split, as it is handled by the gappy bias.
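
For illustration only, here is a minimal Python sketch (not the FBGEMM kernel or its API) of how such cumulative seq_lens offsets could be built; the helper name build_chunked_seq_lens and the assumption of four 8192-token chunks per sample (i.e., max_seq_len = 32768) are hypothetical:

```python
# Hypothetical sketch, not FBGEMM code: build cumulative seq_lens offsets when
# each sample is split into fixed-size chunks for the prefill attention.
def build_chunked_seq_lens(bs: int, seqlen: int, chunk_size: int = 8192,
                           num_chunks: int = 4) -> list[int]:
    # num_chunks = max_seq_len // chunk_size; 4 assumes max_seq_len = 32768.
    offsets = [0]
    for _ in range(bs):
        for i in range(num_chunks):
            # Tokens of this sample that fall into the i-th chunk
            # (0 once the sample has been fully consumed).
            tokens_in_chunk = min(max(seqlen - i * chunk_size, 0), chunk_size)
            offsets.append(offsets[-1] + tokens_in_chunk)
    return offsets

# bs = 2, seqlen = 7 -> [0, 7, 7, 7, 7, 14, 14, 14, 14]
print(build_chunked_seq_lens(bs=2, seqlen=7))
```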

Differential Revision: D73833204

@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D73833204


netlify bot commented May 1, 2025

Deploy Preview for pytorch-fbgemm-docs ready!

🔨 Latest commit: a1f8206
🔍 Latest deploy log: https://app.netlify.com/projects/pytorch-fbgemm-docs/deploys/682e3e744996ac0008ba4faf
😎 Deploy Preview: https://deploy-preview-4063--pytorch-fbgemm-docs.netlify.app

@y-sq force-pushed the export-D73833204 branch from c499e1b to f44be73 on May 3, 2025 09:21

@y-sq force-pushed the export-D73833204 branch from f44be73 to 30b9e0a on May 21, 2025 20:21
y-sq added a commit to y-sq/FBGEMM-1 that referenced this pull request May 21, 2025
Summary:

X-link: facebookresearch/FBGEMM#1149

When max_seq_len is larger than 8192, one input sample is divided into multiple sequences. For example, with bs = 2 and seqlen = 7, the prefill attention uses seq_lens = [0, 7, 7, 7, 7, 14, 14, 14, 14]. Decoding does not need this split, as it is handled by the gappy bias.

Reviewed By: sijiac

Differential Revision: D73833204
y-sq added a commit to y-sq/FBGEMM-1 that referenced this pull request May 21, 2025
@y-sq force-pushed the export-D73833204 branch from 30b9e0a to da7fb7a on May 21, 2025 20:21
y-sq added a commit to y-sq/FBGEMM-1 that referenced this pull request May 21, 2025
@y-sq force-pushed the export-D73833204 branch from da7fb7a to 4ba0fe0 on May 21, 2025 20:22
y-sq added a commit to y-sq/FBGEMM-1 that referenced this pull request May 21, 2025
@y-sq force-pushed the export-D73833204 branch from 4ba0fe0 to 1af3b4a on May 21, 2025 20:23

@y-sq force-pushed the export-D73833204 branch from 1af3b4a to bc2bf6f on May 21, 2025 20:24
y-sq added a commit to y-sq/FBGEMM-1 that referenced this pull request May 21, 2025

y-sq added a commit to y-sq/FBGEMM-1 that referenced this pull request May 21, 2025
@y-sq force-pushed the export-D73833204 branch from bc2bf6f to 40279d4 on May 21, 2025 20:36

y-sq added a commit to y-sq/FBGEMM-1 that referenced this pull request May 21, 2025
@y-sq force-pushed the export-D73833204 branch from 40279d4 to 67191bc on May 21, 2025 20:48

@y-sq force-pushed the export-D73833204 branch from 67191bc to a1f8206 on May 21, 2025 20:58
@y-sq closed this on May 22, 2025