Python Forum
Appending Dataframes along columns
#1
I wanted to append only the non-empty dataframes from a set of dataframes, using Python. I tried the following code:
dfs = [df2_test, df3_test, df4_test, df5_test, df6_test, df7_test, df8_test, df9_test, df10_test, df12_test, df13_test, df14_test, df15_test, df16_test, df17_test, df18_test, df19_test, df20_test, df21_test, df22_test, df23_test, df24_test, df25_test, df26_test, df27_test, df28_test, df29_test, df30_test]
df_final = pd.concat([df for df in dfs if not df.empty], axis=1, join='outer')
df_final.to_csv("D:/tool/optimized_final_test.csv", header=True, index=False)

Can anyone help me fix it?

Note: each non-empty dataframe is 30 x 4. For example, if I have 4 non-empty dataframes, I should get a 30 x 16 dataframe.
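A minimal sketch of the column-wise concatenation described above, using illustrative stand-ins for the df2_test ... df30_test variables (the names, sizes, and data here are made up for the example):

```python
import pandas as pd
import numpy as np

# Illustrative stand-ins: four non-empty 30 x 4 dataframes plus a few empty ones
non_empty = [pd.DataFrame(np.ones((30, 4))) for _ in range(4)]
empty = [pd.DataFrame() for _ in range(3)]
dfs = non_empty + empty

# Keep only the non-empty frames and concatenate along columns (axis=1)
df_final = pd.concat([df for df in dfs if not df.empty], axis=1, join='outer')
print(df_final.shape)  # 4 non-empty 30x4 frames -> (30, 16)
```

Note that to_csv returns None, so it should be called on df_final as a separate step rather than chained onto the pd.concat assignment.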
#2
I think you want axis=0, not axis=1.

Why do you have so many dataframes and so many dataframe variables? I would cull the list at the point where the dataframes are created.

I could do this:
import pandas as pd
import random

data_frames = []
for _ in range(15):
    count = random.choice([0, 4])  # Replace with some test that can determine if the df is empty
    data_frames.append(pd.DataFrame({i: [i] * count for i in range(10)}))
ne = [df for df in data_frames if not df.empty]
df = pd.concat(ne)
print(len(data_frames), len(ne), len(df))
But it is more efficient to skip the empty dataframes in the first place:
import pandas as pd
import random

data_frames = []
for _ in range(15):
    if random.choice([0, 4]) > 0:
        data_frames.append(pd.DataFrame({i:[i]*4 for i in range(10)}))
df = pd.concat(data_frames, axis=0)
print(len(data_frames), len(df))
Both give the same result, but I have 1 variable instead of 29, and I don't create a bunch of empty dataframes.