Apr-02-2019, 11:20 PM
I can successfully post each record one by one from a CSV file. However, I'm trying to implement multiprocessing to make it more efficient for handling large data files in the future.
import csv
from itertools import groupby

import requests

ENDPOINT_URL = 'https://example.com'
headers = {'Api-key': '123abc'}

with open("student.csv", "r") as csv_ledger:
    r = csv.DictReader(csv_ledger)
    data = [dict(d) for d in r]

groups = {}
for k, g in groupby(data, lambda r: r['name']):
    # my data mapping
    # for loop to post each record
    post_api = requests.post(ENDPOINT_URL, json=groups, headers=headers)

Is there any easy way to do multiprocessing for the API requests?
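Since the bottleneck here is network I/O rather than CPU, one common approach is a thread pool rather than true multiprocessing. Below is a minimal sketch using `concurrent.futures.ThreadPoolExecutor`; the `ENDPOINT_URL`, `Api-key` header, and `student.csv` layout are taken from the question, while `post_group`, `post_all`, and the `poster` parameter are hypothetical names introduced only for illustration. Note that `itertools.groupby` only groups adjacent rows, so the data is sorted by the grouping key first.

```python
import csv
from concurrent.futures import ThreadPoolExecutor, as_completed
from itertools import groupby

import requests

ENDPOINT_URL = 'https://example.com'   # placeholder endpoint from the question
HEADERS = {'Api-key': '123abc'}

def post_group(session, payload):
    # One HTTP POST per grouped payload; returns the response status code.
    resp = session.post(ENDPOINT_URL, json=payload, headers=HEADERS)
    return resp.status_code

def post_all(payloads, max_workers=8, poster=None):
    # poster is injectable (e.g. for testing); by default it reuses one
    # requests.Session shared across the worker threads.
    if poster is None:
        session = requests.Session()
        poster = lambda p: post_group(session, p)
    results = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(poster, p) for p in payloads]
        for fut in as_completed(futures):
            results.append(fut.result())
    return results

if __name__ == "__main__":
    with open("student.csv") as csv_ledger:
        data = [dict(d) for d in csv.DictReader(csv_ledger)]
    # groupby requires rows sorted by the grouping key
    data.sort(key=lambda r: r['name'])
    payloads = [list(g) for _, g in groupby(data, key=lambda r: r['name'])]
    print(post_all(payloads))
```

A shared `requests.Session` reuses the underlying TCP connections, which usually matters more for throughput than the worker count; `max_workers` can be tuned to what the API endpoint tolerates.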