Python Forum

Full Version: How to multiprocess API POST requests
I can successfully post each record from a CSV file one by one. However, I'm trying to implement multiprocessing to make it more efficient, so it can handle large data files in the future.
    import csv
    from itertools import groupby
    
    import requests
    
    ENDPOINT_URL = 'https://example.com'
    headers = {'Api-key': '123abc'}
    
    with open("student.csv", "r") as csv_ledger:
        r = csv.DictReader(csv_ledger)
        data = [dict(d) for d in r]
        groups = {}
    
        # Note: groupby only groups *consecutive* rows, so the CSV must be
        # sorted by 'name' (or sort `data` first)
        for k, g in groupby(data, lambda r: r['name']):
            # My data mapping (builds `groups` for this batch)
    
            # POST each record
            post_api = requests.post(ENDPOINT_URL, json=groups, headers=headers)
Is there an easy way to do multiprocessing for API requests?
Here are a couple of resources to read: https://pymotw.com/3/concurrent.futures/
https://docs.python.org/3/library/concur...tures.html
https://www.blog.pythonlibrary.org/2016/...es-module/
There are other approaches as well (asyncio among them), but concurrent.futures is one of the easiest to use.
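To make that concrete, here is a minimal sketch of how the posting loop from the question could be parallelized with concurrent.futures. Since HTTP POSTs are I/O-bound, a ThreadPoolExecutor is usually a better fit than actual multiprocessing. The per-group payload (a plain list of row dicts) is an assumption, since the question's data mapping is elided, and ENDPOINT_URL and headers are the dummy values from the question.

    import csv
    from concurrent.futures import ThreadPoolExecutor, as_completed
    from itertools import groupby
    
    import requests
    
    ENDPOINT_URL = 'https://example.com'   # dummy value from the question
    headers = {'Api-key': '123abc'}        # dummy value from the question
    
    def post_group(payload):
        # One HTTP POST per group; runs in a worker thread.
        resp = requests.post(ENDPOINT_URL, json=payload, headers=headers)
        resp.raise_for_status()
        return resp.status_code
    
    with open("student.csv", "r") as csv_ledger:
        reader = csv.DictReader(csv_ledger)
        # Sort first so groupby sees each name's rows consecutively.
        data = sorted((dict(d) for d in reader), key=lambda r: r['name'])
    
    # Assumed payload shape: one list of row dicts per name.
    payloads = [list(g) for _, g in groupby(data, key=lambda r: r['name'])]
    
    # POST all groups concurrently; max_workers is tunable.
    with ThreadPoolExecutor(max_workers=10) as pool:
        futures = [pool.submit(post_group, p) for p in payloads]
        for future in as_completed(futures):
            print(future.result())

Each group becomes one task; the executor runs up to max_workers of them at a time, and as_completed yields results (or raises the worker's exception) as each request finishes rather than in submission order.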