Sharing imported modules with Sub Processes?

Hi all, I am new to Python (and to coding as well, I guess). I have a menu script that monitors two buttons and calls one of two different scripts as a subprocess depending on the button pressed. The problem is that the subprocess takes a fair few seconds to load, which I think is just all the imports at the start.

I need to speed this up. Many of the modules imported by the subprocess script are loaded by the menu script as well; do I need to import them again in the subprocess script?

If so, is there a way of caching / speeding this up?

This is running on a Raspberry Pi 3 Model B using a normal SD card. I have to run the scripts as subprocesses because I need to time them out after 5 seconds; the subprocess script works in a way that traditional timing loops can't handle, so I have to terminate it instead.
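For what it's worth, imports are per-process: a script launched with subprocess starts a fresh interpreter that re-reads every module from the SD card, which is likely where the delay comes from. One possible workaround, sketched below on the assumption that the button scripts could be refactored into functions, is multiprocessing with the fork start method (the Linux default): the forked child inherits the parent's already-imported modules instead of re-importing them. The json and sqlite3 imports and the scan_task function are hypothetical stand-ins for whatever the real scripts use.

import multiprocessing
import time

# Heavy imports done once in the menu (parent) process. With the
# "fork" start method, a child process inherits these already-loaded
# modules rather than re-importing them from the SD card.
import json      # stand-in for a heavy module
import sqlite3   # stand-in for a heavy module

def scan_task():
    # Hypothetical stand-in for the button-handler script's work.
    print("child started almost instantly; json is", json.__name__)
    time.sleep(60)   # pretend to block on the RFID reader

if __name__ == "__main__":
    multiprocessing.set_start_method("fork")  # explicit for clarity
    p = multiprocessing.Process(target=scan_task)
    p.start()
    p.join(timeout=5)        # give it 5 seconds, as the menu does
    if p.is_alive():
        p.terminate()        # same terminate-on-timeout pattern
        p.join()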

Perhaps I could load some stuff into RAM? Would that help? No idea how, of course :)
Is it possible to do threads?
I'm not sure; without looking it up, I believe a thread runs simultaneously whereas a subprocess pauses? I have already used threads in the subprocess script, to beep when an incorrect device is scanned on the reader. The danger is that if another button were pressed while the threaded script was running (the one invoked by the first button press), another instance would try to run, which would not be good. I suppose there could be some simple logic to check whether either of the two possible threaded scripts is running and prevent another from starting.
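A minimal sketch of that "simple logic", assuming a single shared threading.Lock in the menu script; handle_button and on_button_press are hypothetical names, not anything from your code:

import threading

# One lock guards both button handlers; acquire(blocking=False)
# returns False immediately if a handler is already running, so a
# second button press during a scan is simply ignored.
handler_running = threading.Lock()

def handle_button(name):
    if not handler_running.acquire(blocking=False):
        print(f"{name}: another handler is still running, ignoring press")
        return
    try:
        print(f"{name}: do the scan / beep work here")
    finally:
        handler_running.release()

def on_button_press(name):
    threading.Thread(target=handle_button, args=(name,), daemon=True).start()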

Also, given that the module in question literally pauses the script forever while waiting for a scan on an RFID reader, there would have to be a way to terminate the thread from the menu script after x seconds, as it will never close otherwise.
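On the termination point: Python threads cannot be killed from outside, so a thread blocked forever on an RFID read really would never close. The terminate-after-5-seconds pattern only works at the process level, roughly like this (scan_script.py is a hypothetical stand-in for the button script):

import subprocess

# Run the script in a separate process so it can be killed outright
# after 5 seconds; a thread blocked on the reader could not be.
proc = subprocess.Popen(["python3", "scan_script.py"])
try:
    proc.wait(timeout=5)
except subprocess.TimeoutExpired:
    proc.terminate()
    proc.wait()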

Assuming we could resolve my second point, then yes, threads could be used. Would this be quicker?