I read large files (>10 GB) in chunks and later perform some arithmetic operations on those chunks.
Reading in chunks is needed to avoid memory issues. On my machine, each chunk can be as large as 150 MB before I run into problems. The issue is that users of my code may have different machines, so the maximum chunk size they can allocate will be different.
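For reference, my reading loop is roughly along these lines (a simplified sketch; the file name and `process()` are placeholders, and the 150 MB is currently hard-coded):

```python
CHUNK_SIZE = 150 * 1024 * 1024  # 150 MB, chosen by trial and error on my machine

def process(chunk):
    ...  # arithmetic operations on the chunk

with open("huge_file.bin", "rb") as f:
    while True:
        chunk = f.read(CHUNK_SIZE)
        if not chunk:  # end of file
            break
        process(chunk)
```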
How can I automate this in my script so that the chunk size is selected automatically by pulling some system-specific information (available RAM, maybe)? Ideally I want the chunk size to be as large as possible.
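Is something like the following the right idea? This is just a sketch of what I have in mind, using `psutil` to read the currently available memory; the fraction and cap values are arbitrary numbers I made up:

```python
import psutil

def pick_chunk_size(fraction=0.25, cap=2 * 1024**3):
    # Take a fraction of the RAM that is free right now, capped at 2 GB.
    available = psutil.virtual_memory().available  # bytes currently available
    return min(int(available * fraction), cap)

chunk_size = pick_chunk_size()
```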
Finally, any thoughts on how the 150 MB figure relates to my machine's specification (link below)?
System_specs (link)