Hi,
I am facing a memory leak issue when working with a pandas DataFrame.
import os
import sys
import time
import gc

import pandas as pd
import psutil
from memory_profiler import profile

sys.path.append("gui/gui_utils")
import mongo_db


@profile
def get_from_to_date_for_days(days):
    current_date = time.strftime("%Y-%m-%d")
    from_date = time.strftime("%Y-%m-%d", time.gmtime(time.time() - 60 * 60 * 24 * days))
    current_date = pd.to_datetime(current_date).to_pydatetime()
    from_date = pd.to_datetime(from_date).to_pydatetime()

    df = mongo_db.mongo_db().queryBetweenDates(
        "stock_historical_data_10y_1d", "360ONE", from_date, current_date, "Date"
    )

    # code memory usage
    print(df.memory_usage(index=True).sum())

    del df
    gc.collect()


if __name__ == "__main__":
    start = psutil.Process().memory_info().rss / (1024 * 1024)
    get_from_to_date_for_days(30)
    end = psutil.Process().memory_info().rss / (1024 * 1024)
    print("Done ", (start, " ", end))
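For context: a stable or slightly higher RSS after del df and gc.collect() does not by itself prove a leak. CPython and the C allocator often keep freed pages in reserve for reuse instead of returning them to the OS, so psutil's RSS stays elevated even when the Python objects are gone. A minimal sketch, with no MongoDB involved (a synthetic frame stands in for the queryBetweenDates result), uses tracemalloc to check the Python-level allocations directly:

```python
import gc
import tracemalloc

import numpy as np
import pandas as pd


def build_frame(rows=50_000):
    # Synthetic stand-in for the MongoDB query result.
    return pd.DataFrame({
        "Open": np.random.rand(rows),
        "Close": np.random.rand(rows),
    })


tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()

df = build_frame()
during, _ = tracemalloc.get_traced_memory()

del df
gc.collect()
after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"before={before} during={during} after={after}")
# 'after' drops back close to 'before' once the frame is deleted, even
# though process RSS (what psutil reports) may stay elevated, because
# the allocator keeps the freed pages in reserve.
```

If `after` returns close to `before` while RSS does not shrink, the frame was released and what you are seeing is allocator behaviour, not a DataFrame leak.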
Done -3.23828125
PS E:\Pyhton\Ktrader_gui> e:; cd 'e:\Pyhton\Ktrader_gui'; & 'C:\Users\Administrator\AppData\Local\Microsoft\WindowsApps\python3.11.exe' 'c:\Users\Administrator\.vscode\extensions\ms-python.python-2023.14.0\pythonFiles\lib\python\debugpy\adapter/../..\debugpy\launcher' '62523' '--' 'e:\Pyhton\Ktrader_gui\test.py'
Output: start :: isCollectionExist
360ONE Exist
1196
Filename: e:\Pyhton\Ktrader_gui\test.py
Line # Mem usage Increment Occurrences Line Contents
=============================================================
11 75.7 MiB 75.7 MiB 1 @profile
12 def get_from_to_date_for_days(days):
13
14 75.7 MiB 0.0 MiB 1 current_date = time.strftime("%Y-%m-%d")
15 75.7 MiB 0.0 MiB 1 from_date = time.strftime("%Y-%m-%d", time.gmtime(time.time() - 60 * 60 * 24 * days))
16 76.1 MiB 0.4 MiB 1 current_date = pd.to_datetime(current_date).to_pydatetime();
17 76.1 MiB 0.0 MiB 1 from_date = pd.to_datetime(from_date).to_pydatetime();
18
19
20
21 78.0 MiB 1.9 MiB 1 df = mongo_db.mongo_db().queryBetweenDates("stock_historical_data_10y_1d","360ONE",from_date,current_date,"Date");
22
23 #code memory usage
24
25
26 78.2 MiB 0.2 MiB 1 print(df.memory_usage(index=True).sum())
27
28 78.2 MiB 0.0 MiB 1 del df
29 78.2 MiB 0.0 MiB 1 gc.collect()
Done (75.046875, ' ', 78.23046875)
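One way to tell a genuine per-call leak from one-time warm-up cost (module imports, pandas caches, allocator reserve) is to call the function repeatedly and watch whether RSS keeps climbing. A self-contained sketch, with the query simulated by a throwaway frame since mongo_db is not available here:

```python
import gc

import numpy as np
import pandas as pd
import psutil


def work():
    # Simulated stand-in for the query: allocate and drop a frame.
    df = pd.DataFrame(np.random.rand(100_000, 5))
    total = df.memory_usage(index=True).sum()
    del df
    gc.collect()
    return total


proc = psutil.Process()
rss = []
for _ in range(5):
    work()
    rss.append(proc.memory_info().rss / (1024 * 1024))

print([round(m, 1) for m in rss])
# If only the first iteration bumps RSS and later ones plateau, it is
# warm-up overhead, not a per-call leak. A leak would show RSS growing
# by roughly the same amount on every iteration.
```

In your profile above, the growth happens on the first (and only) call, which is consistent with warm-up rather than a leak; running the loop above against your real function would confirm it.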
Please help if there is any way to reduce memory usage; I use DataFrames extensively throughout the project.
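On reducing DataFrame memory itself: by default pandas loads numbers as float64/int64 and text as Python objects, all of which can often be shrunk. A sketch with a hypothetical OHLCV-style frame (column names are illustrative, not taken from your schema):

```python
import numpy as np
import pandas as pd

# Hypothetical frame similar in shape to a historical-data query result.
rows = 10_000
df = pd.DataFrame({
    "Symbol": ["360ONE"] * rows,                      # highly repetitive text
    "Open": np.random.rand(rows) * 1000,              # float64 by default
    "Volume": np.random.randint(0, 1_000_000, rows),  # int64/int32 by default
})

before = df.memory_usage(index=True, deep=True).sum()

# Downcast numerics and turn the repetitive string column into a category.
df["Open"] = pd.to_numeric(df["Open"], downcast="float")
df["Volume"] = pd.to_numeric(df["Volume"], downcast="unsigned")
df["Symbol"] = df["Symbol"].astype("category")

after = df.memory_usage(index=True, deep=True).sum()
print(f"{before} -> {after} bytes")
```

Downcasting floats halves them to float32 (fine for price display, check whether your calculations tolerate the precision loss), and categoricals store each distinct string once. You could also apply the same idea at the source by projecting only the columns you need in queryBetweenDates instead of fetching whole documents.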
Larz60+ write Jun-14-2024, 06:42 PM:
Please post all code, output and errors (in its entirety) between their respective tags. Refer to BBCode help topic on how to post. Use the "Preview Post" button to make sure the code is presented as you expect before hitting the "Post Reply/Thread" button.
Code tags have been added this time. Please use BBCode tags on future posts.