
fix: set Content-Length on presigned PUT uploads for S3 compatibility #1246

Open
mishushakov wants to merge 7 commits into main from mishushakov/fix-sdk-issues

Conversation

@mishushakov
Member

Summary

  • JS SDK: Buffer the gzipped tar stream and set explicit Content-Length header instead of streaming with Transfer-Encoding: chunked via fetch. Removes the duplex: 'half' workaround.
  • Python SDK: Set explicit Content-Length header on httpx PUT calls for presigned URL uploads.
  • S3 presigned PUT URLs do not support Transfer-Encoding: chunked, causing 501 NotImplemented on self-hosted AWS deployments.
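The gist of the fix can be sketched as follows. This is an illustrative Python sketch, not the SDK's actual code: the helper name `build_upload_request` and its signature are made up for this example. The point is that buffering the gzipped tar fully in memory lets the request carry an explicit Content-Length, so the HTTP client never falls back to `Transfer-Encoding: chunked`.

```python
# Illustrative sketch of the fix (not the SDK's actual code): buffer the
# gzipped tar in memory so the upload can carry an explicit Content-Length
# instead of Transfer-Encoding: chunked, which S3 presigned PUT rejects.
import gzip
import io
import tarfile


def build_upload_request(files: dict[str, bytes]) -> tuple[bytes, dict[str, str]]:
    """Pack files into an in-memory gzipped tar and return (body, headers)."""
    raw = io.BytesIO()
    with tarfile.open(fileobj=raw, mode="w") as tar:
        for name, data in files.items():
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))
    body = gzip.compress(raw.getvalue())
    # The length is known up front because the body is fully buffered.
    headers = {"Content-Length": str(len(body))}
    return body, headers


body, headers = build_upload_request({"hello.txt": b"hello world"})
# An actual upload would then do something like:
#   httpx.put(presigned_url, content=body, headers=headers)
```

The trade-off, noted in the review below, is peak memory proportional to the compressed template size, in exchange for compatibility with S3-backed presigned URLs.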

Closes #1235
Closes #1243

Test plan

  • Build a template with .copy() on a self-hosted AWS (S3) deployment
  • Build a template on E2B Cloud (GCS) to verify no regression
  • Verify both JS and Python SDKs

🤖 Generated with Claude Code

S3 presigned PUT URLs do not support Transfer-Encoding: chunked. The JS
SDK was streaming the gzipped tar body via fetch without Content-Length,
causing 501 NotImplemented on S3-backed deployments. Buffer the
compressed tar and set Content-Length explicitly in both JS and Python
SDKs.

Closes #1235
Closes #1243

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@cursor

cursor bot commented Mar 30, 2026

PR Summary

Medium Risk
Changes template file upload behavior in both the JS and Python SDKs to avoid chunked transfer encoding; buffering could increase memory usage for large templates, and upload behavior may differ across HTTP clients.

Overview
Fixes presigned S3 PUT template uploads by ensuring requests include a Content-Length instead of using chunked transfer encoding.

In the JS SDK, uploadFile now buffers the gzipped tar stream into a Buffer before fetch (and drops the duplex: 'half' streaming workaround). In the Python SDK (sync + async), uploads now explicitly set the Content-Length header (and async tar creation is moved to a thread executor).

Written by Cursor Bugbot for commit 4de627c. This will update automatically on new commits.

@changeset-bot

changeset-bot bot commented Mar 30, 2026

🦋 Changeset detected

Latest commit: 4de627c

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 2 packages
Name Type
@e2b/python-sdk Patch
e2b Patch


The tar Pack type doesn't expose [Symbol.asyncIterator] in its type
definitions, even though it implements it at runtime. Cast through
unknown to satisfy the CLI package's stricter typecheck.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@github-actions
Contributor

github-actions bot commented Mar 30, 2026

Package Artifacts

Built from 7499699. Download artifacts from this workflow run.

JS SDK (e2b@2.18.1-mishushakov-fix-sdk-issues.0):

npm install ./e2b-2.18.1-mishushakov-fix-sdk-issues.0.tgz

CLI (@e2b/cli@2.9.1-mishushakov-fix-sdk-issues.0):

npm install ./e2b-cli-2.9.1-mishushakov-fix-sdk-issues.0.tgz

Python SDK (e2b==2.19.0+mishushakov-fix-sdk-issues):

pip install ./e2b-2.19.0+mishushakov.fix.sdk.issues-py3-none-any.whl

@mishushakov mishushakov marked this pull request as ready for review March 30, 2026 20:22

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 575c2f5f02

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

Use a block scope so the chunks array is eligible for GC immediately
after Buffer.concat, preventing peak memory from holding both the
individual chunks and the concatenated buffer simultaneously.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
The synchronous tar_file_stream call blocks the event loop in the async
upload_file function, preventing other coroutines from running during
tar creation. Offload to a thread pool via run_in_executor.
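The offloading pattern described above could look roughly like this. The names `tar_file_stream` and `upload_file` mirror the commit message but the bodies are hypothetical; the point is calling `asyncio.get_running_loop()` inside the coroutine and pushing the blocking tar work to the default thread pool via `run_in_executor`:

```python
# Rough sketch of the pattern (function bodies are hypothetical): offload
# blocking tar creation to a worker thread so the event loop stays free.
import asyncio
import io
import tarfile


def tar_file_stream(files: dict[str, bytes]) -> io.BytesIO:
    """Blocking helper: pack files into an in-memory tar archive."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for name, data in files.items():
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))
    buf.seek(0)
    return buf


async def upload_file(files: dict[str, bytes]) -> bytes:
    # get_running_loop() (not the deprecated get_event_loop()) is safe here
    # because we are inside a running coroutine; executor=None uses the
    # default ThreadPoolExecutor.
    loop = asyncio.get_running_loop()
    buf = await loop.run_in_executor(None, tar_file_stream, files)
    return buf.getvalue()


tar_bytes = asyncio.run(upload_file({"hello.txt": b"hi"}))
```

While the tar is being built in the worker thread, other coroutines on the loop can continue to run, which is exactly what the synchronous in-coroutine call prevented.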

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 2 potential issues.


Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.

mishushakov and others added 2 commits March 30, 2026 22:48
- JS: remove explicit Content-Length header (undici auto-sets it for
  Buffer bodies; Content-Length is a forbidden Fetch spec header)
- Python: use get_running_loop() instead of deprecated get_event_loop()

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@djeebus
Contributor

djeebus commented Apr 1, 2026

Can we get some tests verifying the new functionality?

@mishushakov
Member Author

@djeebus I really don't have a repro here - I'm assuming it affects self-hosters using regular AWS S3 rather than the GCS S3-compatible backend.

