fix: use file instead of in-memory buffer for download #22


Merged: 2 commits into main, Apr 30, 2025

Conversation

@pepperoni505 (Contributor) commented Apr 22, 2025

@pepperoni505 pepperoni505 requested a review from a team as a code owner April 22, 2025 08:33
@SkySails SkySails requested a review from Copilot April 22, 2025 09:13
@Copilot Copilot AI left a comment


Pull Request Overview

This PR updates the download process by switching from an in-memory buffer to a file-based approach to reduce memory usage during downloads.

  • Updated fs and io imports to use File and BufReader instead of Cursor.
  • Introduced a constant (DOWNLOAD_TEMP_FILE_PATH) to store the temporary download file and modified the download logic accordingly.
  • Adjusted extraction to load the zip archive from the file.
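The bullets above can be sketched with the standard library alone. This is a minimal illustration, not the PR's actual code: the function names and the local chunk slice are stand-ins, the real implementation streams chunks from the network response, writes to the `DOWNLOAD_TEMP_FILE_PATH` constant, and hands the `BufReader` to the zip archive loader.

```rust
use std::fs::File;
use std::io::{BufReader, Read, Write};

// Illustrative: in the real code, chunks arrive from the network response.
fn download_to_file(path: &str, chunks: &[&[u8]]) -> std::io::Result<()> {
    let mut out = File::create(path)?;
    for chunk in chunks {
        // Write each chunk to disk as it arrives, so the full archive
        // never has to sit in memory at once.
        out.write_all(chunk)?;
    }
    out.flush()
}

fn read_archive(path: &str) -> std::io::Result<Vec<u8>> {
    // Extraction reopens the temp file through a BufReader; the real code
    // would pass this reader to the zip archive loader instead of
    // reading it into a Vec.
    let mut reader = BufReader::new(File::open(path)?);
    let mut data = Vec::new();
    reader.read_to_end(&mut data)?;
    Ok(data)
}

fn main() -> std::io::Result<()> {
    let chunks: [&[u8]; 2] = [b"PK\x03\x04", b"...archive bytes..."];
    download_to_file("ng_download.temp", &chunks)?;
    let data = read_archive("ng_download.temp")?;
    assert_eq!(data.len(), chunks.iter().map(|c| c.len()).sum::<usize>());
    std::fs::remove_file("ng_download.temp")?;
    Ok(())
}
```

The key design point is that memory usage stays bounded by the chunk size rather than the archive size, which is what the review below verifies.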
Comments suppressed due to low confidence (1)

src/wasm/src/funcs.rs:28

  • [nitpick] Consider reviewing the file path for cross-platform compatibility; using Path::new or platform-specific path delimiters may prevent issues on non-Windows environments if applicable.
const DOWNLOAD_TEMP_FILE_PATH: &str = "\\work/ng_download.temp";
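One way to address this nitpick, sketched here with illustrative names, is to build the path from components with `PathBuf` so the separator handling is left to the platform rather than hard-coded in the literal:

```rust
use std::path::PathBuf;

fn main() {
    // "\\work" stands in for the work-directory root used in the PR;
    // push() appends the file name with the platform's own separator
    // instead of one embedded in the string.
    let mut temp_path = PathBuf::from("\\work");
    temp_path.push("ng_download.temp");
    println!("{}", temp_path.display());
}
```

Whether this matters in practice depends on the target: if the code only ever runs inside the simulator's WASM sandbox, the fixed literal may be acceptable.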

@SkySails (Contributor) left a comment


Seems to work as advertised! Memory usage stays put, and each chunk is successfully flushed to the temporary file. To test this, I manually throttled my internet connection to 300 KB/s, which made one download take over 2 minutes; without chunking, that download would normally have failed.

[Screenshot: mid-download, with the first chunks already flushed to the file while memory remains stable.]

@pepperoni505 pepperoni505 merged commit abd627c into main Apr 30, 2025
5 checks passed
@pepperoni505 pepperoni505 deleted the download-memory-optimize branch April 30, 2025 16:50