Here's my attempt at an explanation, and again I am very new to Python:
Assignment: read the CSV files and load the data into tables in a Postgres database.
The code for reading and loading the files into Postgres tables is working (it uses pandas, numpy, and sqlalchemy).
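For context, the loading step can be sketched like this (a minimal sketch, not the actual script; the `load_csvs` helper, the file names, and the table names are placeholders I made up for illustration):

```python
import pandas as pd
from sqlalchemy import create_engine

def load_csvs(connection_url, csv_to_table):
    """Load each CSV file into its target table (replacing the table if it exists)."""
    engine = create_engine(connection_url)
    for csv_path, table_name in csv_to_table.items():
        df = pd.read_csv(csv_path)
        df.to_sql(table_name, engine, if_exists="replace", index=False)

# e.g. load_csvs("postgresql://<id>:<password>@hostname:port/databasename",
#                {"table1.csv": "table1"})
```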
Requirement: I need to record whether the script ran fine and all the tables got loaded.
One approach I had was to come up with an audit table that holds the name of the table, load_status, and load_date.
There are 12 tables. If the count is greater than 0 for each table, that means the table got loaded with records.
Here is what I have so far (the part that inserts into the audit table is what I was unsure how to achieve):

```python
from datetime import date
from sqlalchemy import create_engine, text

# create_engine returns an Engine, not a DBAPI connection,
# so open a connection/transaction explicitly instead of calling .cursor()
engine = create_engine("postgresql://<id>:<password>@hostname:port/databasename")

with engine.begin() as conn:
    conn.execute(text("""
        CREATE TABLE IF NOT EXISTS audit (
            table_name  text,
            load_status text,
            load_date   date)"""))

    # I have to check all 12 tables, so loop over their names
    for table in ["table1"]:  # add the other 11 table names here
        count = conn.execute(text(f"SELECT COUNT(*) FROM {table}")).scalar()
        status = "Successful" if count > 0 else "Failed"
        # this is the part I was not sure how to achieve:
        # use a parameterized INSERT instead of string formatting
        conn.execute(
            text("INSERT INTO audit VALUES (:name, :status, :load_date)"),
            {"name": table, "status": status, "load_date": date.today()},
        )
```
I am not sure whether this is the correct approach.
Any other suggestions for achieving the requirement stated above are also appreciated.
Thanks!