Allocating maximum memory to chunks

I read large files (>10 GB) in chunks and later perform some arithmetic operations on each chunk.

Reading in chunks is needed to avoid memory issues. On my machine, each chunk can be as large as 150 MB before those issues appear. The problem is that users of my code may have different machines, so the maximum chunk size they can allocate will be different.
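For illustration, a minimal sketch of the chunked reading described above; the file name and process_chunk are placeholders, not my actual code:

# Fixed-size chunked read; 150 MB was found by trial and error on my machine.
CHUNK_SIZE = 150 * 1024 * 1024  # bytes

def process_chunk(chunk: bytes) -> None:
    ...  # placeholder for the arithmetic performed on each chunk

with open("big_file.bin", "rb") as f:
    while chunk := f.read(CHUNK_SIZE):  # an empty bytes object ends the loop
        process_chunk(chunk)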

How can I automate this in my script so that the chunk size is selected automatically by pulling some system-specific information (available RAM, maybe)? Of course, I want to make the chunk size as large as possible.

Finally, any thoughts on how 150 MB relates to my machine's specification (link below)?

System specs (link)
The psutil package (https://pypi.org/project/psutil/) will let you get the available and used system memory. You can write a simple heuristic that takes, say, 30-50 % of the memory free when your program starts and uses that for your chunk size.
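A minimal sketch of that idea; the 30 % fraction and the floor/cap bounds are illustrative choices of mine, not psutil defaults:

import psutil

def pick_chunk_size(fraction=0.3, floor=1 * 2**20, cap=1 * 2**30):
    """Return a chunk size based on the memory free at startup.

    fraction, floor, and cap are illustrative: the clamp keeps tiny
    machines making progress and stops huge ones from allocating
    needlessly large buffers.
    """
    available = psutil.virtual_memory().available  # bytes currently free
    return max(floor, min(cap, int(available * fraction)))

chunk_size = pick_chunk_size()
print(f"Reading in chunks of {chunk_size / 2**20:.0f} MB")

Staying well below 100 % of free memory matters because the arithmetic on each chunk will typically make temporary copies, and other processes keep allocating after your program starts.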