Oct-28-2020, 11:40 AM
import os
import pandas as pd

df_result = pd.DataFrame()
directory = os.path.join("D:\\", "PythonCodes", "inputmultifiles")
for root, dirs, files in os.walk(directory):
    for file in files:
        f = os.path.join(root, file)  # join with root so files in subfolders resolve correctly
        if f.endswith(".csv"):
            ff = pd.read_csv(f)
            tmp = ff['Name']
            print(tmp)
            df_result = pd.concat([df_result, ff['Name']])
df_result = df_result.reset_index(drop=True)
df_result.columns = ['New_col']

If the files are large, each iteration takes a while, and the next one has to wait for the previous one to finish. Now I want to use multithreading to trigger all the iterations at once and combine the results from each iteration.
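One way to do this is with `concurrent.futures.ThreadPoolExecutor`: each worker thread reads one CSV and returns its `Name` column, and `executor.map` hands the columns back in the same order the paths were submitted. The sketch below is a minimal example, not your exact setup: the directory, file names, and sample data in the demo are hypothetical, and `max_workers=4` is an arbitrary choice you should tune.

```python
import os
import tempfile
import pandas as pd
from concurrent.futures import ThreadPoolExecutor

def read_name_column(path):
    # Each worker reads one CSV and returns its 'Name' column.
    return pd.read_csv(path)["Name"]

def collect_names(directory):
    # Gather every .csv under the directory, including subfolders.
    # Sorting keeps the combined result in a deterministic order.
    csv_paths = sorted(
        os.path.join(root, f)
        for root, dirs, files in os.walk(directory)
        for f in files
        if f.endswith(".csv")
    )
    # executor.map returns results in the same order as csv_paths,
    # even though the reads run concurrently.
    with ThreadPoolExecutor(max_workers=4) as executor:
        columns = list(executor.map(read_name_column, csv_paths))
    # Combine all the 'Name' columns into one DataFrame column.
    return pd.concat(columns, ignore_index=True).to_frame("New_col")

# Demo with two throwaway CSV files (hypothetical data):
with tempfile.TemporaryDirectory() as tmp:
    pd.DataFrame({"Name": ["a", "b"]}).to_csv(os.path.join(tmp, "one.csv"), index=False)
    pd.DataFrame({"Name": ["c"]}).to_csv(os.path.join(tmp, "two.csv"), index=False)
    result = collect_names(tmp)
    print(result)
```

Note that threads mainly help while the work is I/O-bound (waiting on disk); if CSV parsing itself dominates, swapping `ThreadPoolExecutor` for `ProcessPoolExecutor` may give better speedups, at the cost of pickling the results between processes.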