Hi Team,
Is it possible to use multiprocessing in the scenario below? I am downloading a SQL table into CSV files and want to speed up the download.
import csv
import time

# conn and cursor are assumed to come from a database connection
# established earlier in the script
initial = time.time()
query = "select * from UK_Table"
cursor.execute(query)

x = 0
with open("E:\\backup\\output.csv", "w", newline="") as outfile:
    writer = csv.writer(outfile, quoting=csv.QUOTE_NONNUMERIC)
    # write the header row (column names)
    writer.writerow([col[0] for col in cursor.description])
    while True:
        rows = cursor.fetchmany(10000)
        if len(rows) == 0:
            print("no records found")
            break
        x = x + len(rows)
        print(x)
        writer.writerows(rows)
conn.close()
print("success")
print("time taken", time.time() - initial, "Seconds")

1) Example of multiprocessing.

import multiprocessing
import os

def square(n):
    print("Worker process id for {0}: {1}".format(n, os.getpid()))
    return n * n

if __name__ == "__main__":
    # input list
    arr = [1, 2, 3, 4, 5]
    # creating a pool object
    p = multiprocessing.Pool()
    # map list to target function
    result = p.map(square, arr)
    print("Square of each element:")
    print(result)
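One way to combine the two ideas above is to split the table into row ranges and let each worker process export its own range to its own CSV file, exactly like the Pool example maps `square` over a list. This is a minimal sketch, not a definitive answer: `fetch_rows` is a hypothetical stand-in that returns dummy rows so the sketch runs end to end; in a real script each worker would open its own database connection and run something like `SELECT * FROM UK_Table WHERE id BETWEEN ? AND ?` (workers cannot share one cursor across processes). Whether this is actually faster depends on whether the bottleneck is the database server, the network, or the disk.

```python
import csv
import multiprocessing
import os

def fetch_rows(start, end):
    # Hypothetical stand-in for a per-worker database query; a real
    # version would open its own connection here and SELECT the range.
    return [(i, "name_{0}".format(i)) for i in range(start, end)]

def export_chunk(task):
    # Each worker writes its row range to its own CSV file.
    start, end, path = task
    rows = fetch_rows(start, end)
    with open(path, "w", newline="") as outfile:
        writer = csv.writer(outfile, quoting=csv.QUOTE_NONNUMERIC)
        writer.writerow(["id", "name"])  # header row
        writer.writerows(rows)
    return len(rows)

def parallel_export(total_rows, chunk_size, out_dir):
    # Build (start, end, output-path) tasks, one per chunk.
    tasks = [
        (start,
         min(start + chunk_size, total_rows),
         os.path.join(out_dir, "output_{0}.csv".format(start)))
        for start in range(0, total_rows, chunk_size)
    ]
    # Map the tasks over a pool of worker processes, as in the
    # square() example above.
    with multiprocessing.Pool() as pool:
        counts = pool.map(export_chunk, tasks)
    return sum(counts)

if __name__ == "__main__":
    written = parallel_export(total_rows=100, chunk_size=25, out_dir=".")
    print(written, "rows written")
```

If a single output file is required, the per-chunk files can be concatenated afterwards, or one process can merge them; writing to separate files first avoids having the workers contend for one file handle.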