Dec-02-2017, 01:42 AM
Thanks for the reply Buran...
I ended up getting it to work a different way - I'm sure there's a much better way to do this, but it works the way I need, so it'll do for now until I learn more.
It loops through looking for these file names; when one is found it adds 1 to the total_files variable, removes that file from the expected_files list, and continues the loop looking for whatever is left in expected_files.
Once total_files equals the number of expected files from the query, it starts the download.
If it has been waiting for 2 hours, it calls update_job_log_table to update a table in a DB, sends a notification email about the failure, and then exits.
The only part I haven't tested is the time check that exits after 2 hours; everything else works fine.
If someone reads this and wants to suggest better ways of doing this, I'm more than happy to have a read and give it a shot.
t0 = time.time()
while total_files != total_files_required:
    time.sleep(10)
    remote_files = sftp.listdir("REMOTE_DIRECTORY_TO_WATCH")
    logging.info("Robot is currently waiting for the total matched filenames to = the total expected matches, current number of matches is: " + str(total_files))
    for filename in expected_files:
        t1 = time.time()
        logging.info("Looking for: " + filename + " in the remote directory.")
        total_time = t1 - t0
        # >= rather than ==: an exact equality check on elapsed seconds
        # would almost never fire, so the 2-hour abort would never run
        if total_time >= 7200:
            logging.warning("Robot has been waiting for all of the expected extract files for 2 hours & has now aborted.")
            update_job_log_table("subject for email notification", "job status", "email template to use", job_number)
            sys.exit(1)
        if filename + SUFFIX_TO_FETCH in remote_files:
            logging.info("I've found filename: " + filename + " in the remote directory & will remove it from the list of required files variable so I'm only looking for files that are still missing.")
            expected_files = [f for f in expected_files if f != filename]
            total_files = total_files + 1

So basically I have a query before this part which tells me the filenames that are expected in the remote folder, plus the count of expected files.
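For what it's worth, here's a tidier version of the same waiting loop I might move to later - just a sketch, so treat the names wait_for_files, timeout and poll as made up, and it assumes sftp.listdir works the same way as in the code above:

```python
import logging
import time

def wait_for_files(sftp, expected_files, suffix, remote_dir,
                   timeout=2 * 60 * 60, poll=10):
    """Poll remote_dir until every expected file shows up.

    Returns True once all files are present, or False on timeout, so the
    caller can decide whether to start the download or log the failure
    (e.g. via update_job_log_table) and exit.
    """
    # Track what is still missing as a set, instead of rebuilding a list
    missing = {name + suffix for name in expected_files}
    deadline = time.monotonic() + timeout
    while missing:
        remote = set(sftp.listdir(remote_dir))
        for name in missing & remote:
            logging.info("Found %s in the remote directory.", name)
        missing -= remote
        if missing:
            # A deadline comparison (>) can't be skipped over the way an
            # exact elapsed-time equality check can
            if time.monotonic() > deadline:
                logging.warning("Still missing %d files after the timeout; aborting.",
                                len(missing))
                return False
            time.sleep(poll)
    return True
```

That way the loop body stays small, and the success/failure decision (start the download vs. update the job log table and exit) lives in the caller instead of inside the loop.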