
Feat: AE-2353: bump max flash tarball size#258

Open
jhcipar wants to merge 3 commits into main from jhcipar/bump-max-flash-tarball-size

Conversation

@jhcipar (Contributor) commented on Mar 5, 2026

We will have to keep an eye on storage costs and SLS cold starts on the host side as part of this, but ML workloads often require fairly large dependencies.

@jhcipar changed the title from "Feat: AE-2343: bump max flash tarball size" to "Feat: AE-2353: bump max flash tarball size" on Mar 5, 2026
@runpod-Henrik (Contributor) left a comment


Review

Clean, complete change. The constant is used consistently everywhere (build.py:346, app.py:108) so bumping it propagates correctly — no hardcoded 500s left behind.


Question: is 1.5GB the actual platform-enforced limit?

The client-side check (MAX_TARBALL_SIZE_MB) is a pre-flight guard. If the RunPod platform upload endpoint enforces a different limit, users will pass the client check but get an opaque server-side error. Does the backend actually accept up to 1.5GB today, or is this change ahead of a platform change?
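For context, a client-side pre-flight guard of the kind discussed here might look like the following. This is a minimal sketch: `MAX_TARBALL_SIZE_MB` is the constant from the PR, but the function name and error handling are illustrative, not the actual runpod_flash implementation.

```python
import os

# Value taken from this PR; everything else below is an illustrative
# sketch, not the actual runpod_flash code.
MAX_TARBALL_SIZE_MB = 1500


def check_tarball_size(path: str) -> None:
    """Pre-flight guard: reject tarballs above the client-side limit.

    Note this only mirrors the client check; a server-side limit can
    still reject the upload after this passes.
    """
    size_mb = os.path.getsize(path) / (1024 * 1024)
    if size_mb > MAX_TARBALL_SIZE_MB:
        raise ValueError(
            f"Tarball is {size_mb:.0f}MB, exceeding the "
            f"{MAX_TARBALL_SIZE_MB}MB limit"
        )
```

The point of the question stands regardless of the exact implementation: if the platform's real limit differs, this check gives a false green light.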


Nit: unit inconsistency in updated strings

The app.py and build.py docstrings say 1500MB, while the markdown docs say 1.5GB. Both are correct but inconsistent; worth picking one style across all four places.


Positives

  • All enforcement code references the constant, not a literal — single change propagates everywhere correctly
  • Test updated to 1600MB to stay above the new limit — correct
  • No stray 500 references left in size-related code

Verdict: PASS pending confirmation that the backend supports 1.5GB today.

🤖 Reviewed by Henrik's AI-Powered Bug Finder

Copilot AI (Contributor) left a comment


Pull request overview

This PR bumps the maximum allowed Flash deployment tarball size from 500MB to 1500MB (1.5GB) to better support ML workloads that have large dependency requirements.

Changes:

  • Raises MAX_TARBALL_SIZE_MB constant from 500 to 1500
  • Updates all documentation (docstrings and markdown) to reflect the new 1.5GB limit
  • Updates the test's mocked oversized file from 600MB to 1600MB to remain above the new limit

Reviewed changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated no comments.

Summary per file:

  • src/runpod_flash/core/resources/constants.py: updates the MAX_TARBALL_SIZE_MB constant from 500 to 1500
  • src/runpod_flash/core/resources/app.py: updates the upload_build docstring to reference the new 1500MB limit
  • src/runpod_flash/cli/commands/build.py: updates the run_build docstring to reference the new 1500MB limit
  • src/runpod_flash/cli/docs/flash-build.md: updates user-facing size limit documentation to 1.5GB
  • src/runpod_flash/cli/docs/flash-deploy.md: updates user-facing size limit documentation to 1.5GB
  • tests/unit/core/resources/test_app.py: updates the mocked oversized file to 1600MB so it remains above the new 1500MB limit
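The last item hints at a common testing pattern: rather than writing a real 1600MB file to disk, the oversized tarball's size can be mocked. A minimal sketch under assumed names (`is_within_limit` and the test below are illustrative; the repo's actual test fixtures may differ):

```python
import os
from unittest import mock

MAX_TARBALL_SIZE_MB = 1500  # value from this PR


def is_within_limit(path: str) -> bool:
    """Return True if the tarball at `path` fits under the size limit."""
    return os.path.getsize(path) / (1024 * 1024) <= MAX_TARBALL_SIZE_MB


def test_oversized_tarball_rejected():
    # Report a 1600MB size without allocating any disk space.
    with mock.patch("os.path.getsize", return_value=1600 * 1024 * 1024):
        assert not is_within_limit("fake.tar.gz")
```

Patching `os.path.getsize` keeps the test fast and filesystem-independent, which matters once the threshold under test is measured in gigabytes.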


