Python Forum
How to multiprocessing Api post request - Printable Version

+- Python Forum (https://python-forum.io)
+-- Forum: Python Coding (https://python-forum.io/forum-7.html)
+--- Forum: General Coding Help (https://python-forum.io/forum-8.html)
+--- Thread: How to multiprocessing Api post request (/thread-17226.html)



How to multiprocessing Api post request - kinojom - Apr-02-2019

I successfully post each record one by one from a CSV file. However, I'm trying to implement multiprocessing to make it more efficient at handling large data files in the future.
    import csv
    import requests
    from itertools import groupby

    ENDPOINT_URL = 'https://example.com'
    headers = {'Api-key': '123abc'}

    with open("student.csv", "r") as csv_ledger:
        r = csv.DictReader(csv_ledger)
        data = [dict(d) for d in r]
        groups = {}

        for k, g in groupby(data, lambda r: r['name']):
            #My data mapping

            #for loop to post each record
            post_api = requests.post(ENDPOINT_URL, json=groups, headers=headers)
Is there an easy way to do multiprocessing for the API requests?


RE: How to multiprocessing Api post request - Larz60+ - Apr-03-2019

Here are a couple of resources to read: https://pymotw.com/3/concurrent.futures/
https://docs.python.org/3/library/concurrent.futures.html
https://www.blog.pythonlibrary.org/2016/08/03/python-3-concurrency-the-concurrent-futures-module/
There are other approaches such as asyncio, but concurrent.futures is one of the easiest to use.
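To make the concurrent.futures suggestion concrete, here is a minimal sketch of posting the grouped records with a thread pool. It assumes the per-group dicts from the original loop have already been built into a list; the `post_all` / `post_fn` names are hypothetical, not part of any library. Since each request is I/O-bound (waiting on the network), a thread pool is usually a better fit than a process pool here.

```python
from concurrent.futures import ThreadPoolExecutor

ENDPOINT_URL = 'https://example.com'   # placeholder from the original post
headers = {'Api-key': '123abc'}

def post_all(payloads, post_fn, max_workers=8):
    """Run post_fn over every payload concurrently in a thread pool.

    Results come back in the same order as the input list, so each
    status can still be matched to the record that produced it.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(post_fn, payloads))

# With the real API, post_fn would wrap the original requests call,
# for example:
#   def post_one(payload):
#       return requests.post(ENDPOINT_URL, json=payload,
#                            headers=headers).status_code
# and then: statuses = post_all(list_of_group_dicts, post_one)
```

Passing the posting function in as `post_fn` keeps the pooling logic separate from the HTTP call, so it is easy to test and to swap in retries or a shared `requests.Session` later.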