Feature request: memory limit #75
I agree — Guetzli uses too much memory. My Linux machine only has 4 GB of RAM and gets very laggy.
The 300 MB/MPix estimate is reasonably conservative. There is no mechanism that can cause memory usage to grow superlinearly with the size of an image.
I second the memory limit request. This becomes even more important if you run the algorithm in parallel on multiple images at once (to help with the slow performance when batch-processing images). I've also found it easy to bring my Mac to a screeching halt. Those with fewer than 16 GB of RAM will be especially vulnerable.
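For the batch-processing case, the README's per-megapixel bound is enough to pick a safe degree of parallelism up front. Below is a minimal sketch (not Guetzli code — the function name and the "at least one job" policy are assumptions) that caps the number of concurrent jobs so their combined worst-case footprint stays inside a RAM budget:

```cpp
#include <algorithm>
#include <cstdint>

// README's documented upper bound: ~300 MB of working memory per megapixel.
constexpr uint64_t kBytesPerMegapixel = 300ull * 1024 * 1024;

// How many Guetzli processes can run at once without exceeding the budget.
// Illustrative helper; always allows one job so the batch can make progress.
uint64_t MaxParallelJobs(uint64_t ram_budget_bytes,
                         uint64_t megapixels_per_image) {
  const uint64_t per_job = megapixels_per_image * kBytesPerMegapixel;
  return std::max<uint64_t>(1, ram_budget_bytes / per_job);
}
```

With an 8 GiB budget (half of a 16 GiB machine) and 12-megapixel photos, this yields two concurrent jobs; on smaller machines it degrades to sequential processing rather than swapping.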
This limit gets compared against the upper bound that's already provided in README. Providing a facility to handle the memory limit internally makes it easier to change the estimate used in the future. Fixes google#75.
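The pre-flight check described in that PR can be sketched as follows. This is an illustrative reconstruction, not the PR's actual code: the function name, the flag plumbing, and the round-up rule are all assumptions; only the ~300 MB/megapixel figure comes from the README.

```cpp
#include <cstdint>

// Decide, before compression starts, whether the README's upper bound on
// memory use fits inside a user-supplied --memlimit value (in bytes).
bool FitsInMemLimit(uint64_t width, uint64_t height, uint64_t limit_bytes) {
  const uint64_t kBytesPerMegapixel = 300ull * 1024 * 1024;
  // Round megapixels up so small images are not underestimated.
  const uint64_t mpix_ceil = (width * height + 999999ull) / 1000000ull;
  return mpix_ceil * kBytesPerMegapixel <= limit_bytes;
}
```

Keeping the estimate behind one function is what makes the PR's point about maintainability: if the 300 MB/MPix figure is ever tightened, only this check changes.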
I've tried Guetzli on a large batch of images, and my macOS machine ended up locking hard. I presume it's because macOS doesn't fail `malloc`, but instead switches to a combination of compressed RAM and swap, which under such a large memory demand brings performance to a halt. I'm interested in limiting Guetzli's memory use to a percentage of the machine's RAM size (e.g. no more than half of all RAM).
Will it be reliable enough if it's done heuristically, based on the number of pixels in the image?
Is `CacheAligned::Allocate` used for the majority of allocations? Would it be better to put a limit there?