Python Forum
Killing processes via python
#1
Hello,

my knowledge of Python is rather limited, so I might just be poking in the dark...

What I am attempting to do is open a file, write some stuff to it and close it.

The problem I have is: what if that file is already open elsewhere?

To fit my needs, it needs to be closed, no matter what is currently using it, so that I can open it from my Python application.

So if I can't open a file that's already open, can I force it closed and then open it via Python?

I found this library: https://psutil.readthedocs.io/en/latest/

It kind of does what I want; maybe I'm just missing the how.

It currently returns a list of all processes and gives me PIDs that I can use to kill processes.

In reality, though, I would like to close test.csv that is open in Excel instead of closing all of Excel. Any ideas how I can achieve that?
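
Here is roughly what I have so far (just a sketch; matching Excel by process name is my own guess at how to find it):

import psutil

# List every running process and kill anything that looks like Excel.
# This closes the whole application, which is exactly what I want to avoid.
for proc in psutil.process_iter(['pid', 'name']):
    name = proc.info['name'] or ''
    if 'excel' in name.lower():
        print('killing', proc.info['pid'], name)
        proc.kill()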
#2
Usually my answer to this is:
Don't ask for permission, ask for forgiveness.

But your problem is different. I guess program A is writing something to file A and afterwards program B should work with file A or vice versa.

One little trick to get around this problem is renaming.

First you write your data to some_name.txt.0.
When everything is done and the file is closed, the program should rename the file to some_name.txt.
The second program is not allowed to open the *.0 file.
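
A minimal sketch of the writer side (the file names are just examples):

import os

def write_and_publish(data, final_name='some_name.txt'):
    tmp_name = final_name + '.0'
    # Write to the temporary *.0 file first; readers must ignore *.0 files.
    with open(tmp_name, 'w') as fd:
        fd.write(data)
    # The file is closed here; renaming makes it visible under its final name.
    # os.replace() also overwrites an existing target.
    os.replace(tmp_name, final_name)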

Here is a fuser-like example using psutil:

import psutil

def fuser(file):
    # Print every PID that currently has `file` open.
    for proc in psutil.process_iter():
        try:
            for open_file in proc.open_files():
                if file == open_file.path:
                    print(proc.pid, open_file.path)
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            pass
But this check does not guarantee that the file is still not in use after the function call.



On Linux there is also the fuser command, which shows which PIDs are accessing a file.
The Linux command lsof can do this as well, but it can also list open network sockets.
#3
It is better to use the multiprocessing module, which works in much the same way and has a terminate() method for killing a process. To kill a process, you can simply call the method:

yourProcess.terminate()  # kill the process!

Python will kill your process (on Unix through the SIGTERM signal, on Windows through the TerminateProcess() call). Be careful when using it together with a Queue or a Pipe: it may corrupt the data in the Queue/Pipe.
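
A small, self-contained sketch of how that looks (the worker function is just a placeholder):

import time
from multiprocessing import Process

def worker():
    # Stand-in for some long-running job.
    while True:
        time.sleep(1)

if __name__ == '__main__':
    p = Process(target=worker)
    p.start()
    time.sleep(3)
    p.terminate()   # SIGTERM on Unix, TerminateProcess() on Windows
    p.join()
    print('exit code:', p.exitcode)   # negative signal number on Unix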