Jan-21-2020, 06:15 PM
Hi everyone,
I have a module that connects to a db to check for new entries, and if new, trigger more functions etc.
My question is regarding the connection itself. As it currently stands, I create and close the connection inside each function, every time a function needs to interact with the db. I'm wondering how this affects performance, and I'd think it better to connect and close just once when the loop is starting up.
My issue is that the module runs a polling loop, and I don't want to pile up an ever-growing number of connections... at the same time, I don't want to open a single connection and just have it time out or something.
What's the best practice for this?
Here's what some of my code looks like:
def beginSending():
    global last_sent
    cnxn = pyodbc.connect('UID=' + dbUser + ';PWD=' + dbPassword + ';DSN=' + dbHost)
    cursor = cnxn.cursor()
    cursor.execute(pullId)
    results = cursor.fetchone()
    last_sent = results
    cnxn.close()
    Logger.writeAndPrintLine('Send listener started.', 0)
    sendLoop()

def sendLoop():
    while True:
        sendListener()
        time.sleep(2.0)

def sendListener():
    global last_sent
    cnxn = pyodbc.connect('UID=' + dbUser + ';PWD=' + dbPassword + ';DSN=' + dbHost)
    cursor = cnxn.cursor()
    cursor.execute(pullId)
    results = cursor.fetchone()
    if results != last_sent:
        sendSMS()
    else:
        # note: the connection is only closed when nothing new was found
        cnxn.close()
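For reference, here's a rough sketch of the "connect once, reconnect on failure" alternative I'm considering. The `DbSession` class and the injected `connect` factory are just illustrative names I made up, not anything from pyodbc; the idea is to keep one connection open across loop iterations and only reconnect when a query fails (e.g. after the server drops an idle connection):

```python
class DbSession:
    """Holds one open connection for the whole polling loop and
    transparently reconnects if a query raises (e.g. idle timeout)."""

    def __init__(self, connect):
        # connect is a zero-argument factory, e.g.:
        #   lambda: pyodbc.connect('UID=...;PWD=...;DSN=...')
        self._connect = connect
        self._cnxn = None

    def _ensure(self):
        # Open a connection lazily, and only once, until it breaks.
        if self._cnxn is None:
            self._cnxn = self._connect()
        return self._cnxn

    def fetchone(self, sql):
        try:
            return self._ensure().cursor().execute(sql).fetchone()
        except Exception:
            # Connection probably timed out: drop it and retry once
            # on a fresh connection.
            self._cnxn = None
            return self._ensure().cursor().execute(sql).fetchone()
```

The loop would then create one `DbSession` up front and call `session.fetchone(pullId)` every two seconds, instead of calling `pyodbc.connect(...)` on each iteration. (This relies on pyodbc's `cursor.execute` returning the cursor, which it does, so the calls can be chained.)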