Python Forum
Allocating maximum memory to chunks
#1
I read large files (>10 GB) in chunks and later perform some arithmetic operations on these chunks.

Reading in chunks is needed to avoid memory issues. On my machine, each chunk can be as large as 150 MB before I run into trouble. The problem is that users of my code may have different machines, which means the maximum chunk size that can be allocated on their machines will be different.
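
Roughly, my reading loop looks like this (a simplified sketch; the file name, dtype, and the 150 MB figure are just placeholders for what I actually use):

```python
import numpy as np

CHUNK_SIZE = 150 * 1024 * 1024  # 150 MB, hard-coded for my machine

def process(chunk):
    # stand-in for the arithmetic I run on each chunk
    return chunk.sum()

results = []
with open("big_data.bin", "rb") as f:  # hypothetical file name
    while True:
        raw = f.read(CHUNK_SIZE)
        if not raw:
            break
        chunk = np.frombuffer(raw, dtype=np.float64)
        results.append(process(chunk))
```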

How can I automate this in my script so that the chunk size is selected automatically by pulling some system-specific information (available RAM, maybe)? Of course, I want to make the chunk size as large as possible.

Finally, any thoughts on how the 150 MB figure relates to my machine's specifications (link below)?

System_specs (click here)
#2
The package psutil (https://pypi.org/project/psutil/) will let you query the available and used system memory.
You can write a simple routine that takes, say, 30-50 % of the memory free when your program starts and uses that as your chunk size.
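
Something along these lines should work (an untested sketch; the 40 % fraction and the 1 GB cap are arbitrary values you would tune for your workload):

```python
import psutil

def pick_chunk_size(fraction=0.4, cap=1024**3):
    """Pick a chunk size in bytes based on the RAM available right now."""
    available = psutil.virtual_memory().available  # bytes currently free
    return int(min(available * fraction, cap))

chunk_size = pick_chunk_size()
print(f"Using chunks of about {chunk_size / 1024**2:.0f} MB")
```

Calling this once at startup gives you a chunk size adapted to whatever machine the script runs on.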

