Running two scripts
#1
Hi all,
I have two scripts, say script1.py and script2.py. In script1 I calculate a lot of parameters and matrices, and it takes about 30 minutes to run. Then I want to use these calculated parameters to make graphs or do other small calculations. So I try to import script1 into script2 and do the plotting there. However, whenever I import script1, the entire script runs again and takes another 30 minutes. So even if I only have to make a small change to a graph, I need to re-run the code for half an hour. How can I do this efficiently? Basically I want to know: once I run my script, is there a way to save all the calculated variables somehow so that I can import them multiple times later, instead of running the entire code again?
#2
(Mar-21-2018, 10:21 PM)arka7886 Wrote: is there a way to save all the calculated variables somehow so that I can import them multiple times later, instead of running the entire code again.
You can save numpy arrays or pandas dataframes containing your results in files on disk.
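A minimal sketch of that idea, assuming script1.py produces a NumPy matrix and a pandas DataFrame of parameters (the file names results.npz and params.pkl are just placeholders):

# at the end of script1.py: save the expensive results once
import numpy as np
import pandas as pd

big_matrix = np.random.rand(1000, 1000)         # stand-in for the 30-minute computation
params = pd.DataFrame({"alpha": [0.1], "beta": [2.5]})

np.savez("results.npz", big_matrix=big_matrix)  # np.savez_compressed also works
params.to_pickle("params.pkl")

# script2.py: load the saved results and plot, without recomputing anything
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

data = np.load("results.npz")
big_matrix = data["big_matrix"]
params = pd.read_pickle("params.pkl")

plt.imshow(big_matrix)
plt.show()

You can then tweak the plotting code in script2.py as often as you like; only the load step is repeated.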
#3
(Mar-21-2018, 11:02 PM)Gribouillis Wrote:
(Mar-21-2018, 10:21 PM)arka7886 Wrote: is there a way to save all the calculated variables somehow so that I can import them multiple times later, instead of running the entire code again.
You can save numpy arrays or pandas dataframes containing your results in files on disk.

Thanks for the reply. But sometimes there are large arrays with many dimensions, and saving everything into output files did not seem like a very smart idea to me.
#4
(Mar-21-2018, 11:11 PM)arka7886 Wrote: saving everything into output files did not seem like a very smart idea to me.
You want to save the variables and you don't want to save them. Don't you think there is a contradiction in your wishes? You can save them in a memory file if you don't want to write to disk.
#5
I meant that I am trying to avoid writing all the data arrays/matrices to output files: that would generate hundreds of output files, or a large .csv file, which I would then have to read back in from the other program. I was trying to find out whether we can save the outputs in memory. I do not know what a memory file is, though.
#6
(Mar-22-2018, 04:31 AM)arka7886 Wrote: that would generate hundreds of output files, or a large .csv file, which I would then have to read back in from the other program.
Just reading files can be much faster than the 30 minutes of calculations.

You can also use Python's pickle module to store the data in binary form; numpy also has its own binary format (np.save / np.savez).
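A hedged sketch of the pickle route, assuming your results can be bundled into one ordinary Python object (the file name results.pkl is arbitrary):

import pickle

# in script1.py: bundle everything into one object and dump it once
params = {"alpha": 0.1, "beta": 2.5}             # placeholders for the real results
matrices = [[1.0, 2.0], [3.0, 4.0]]
with open("results.pkl", "wb") as f:
    pickle.dump({"params": params, "matrices": matrices}, f)

# in script2.py: load it back in seconds instead of recomputing
with open("results.pkl", "rb") as f:
    results = pickle.load(f)
print(results["params"])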

As for a memory file, it behaves like an ordinary file but is stored in RAM. If you're on Linux, it is usually available through the /dev/shm directory, although this question may be worth reading.
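On Linux you could point the same save/load calls at a path under /dev/shm so the file lives in RAM; a small sketch, with an arbitrary file name:

import numpy as np

shm_path = "/dev/shm/script1_results.npz"        # tmpfs on most Linux systems, i.e. RAM-backed

# script1.py writes the results
np.savez(shm_path, big_matrix=np.random.rand(500, 500))

# script2.py reads them back
big_matrix = np.load(shm_path)["big_matrix"]
print(big_matrix.shape)

Note that anything under /dev/shm disappears on reboot, so this only helps between runs within the same session.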

