Python Forum
slice per group
#1
Hi,

I would like to extract the first 50 data points of each group factor in a data frame.

So far, I stumbled over:

grouped = df.groupby('factor').first()
which extracts only the first data point of each group (and does so even when the data is not a time format, despite what the documentation suggests)

grouped = df.groupby('factor').nth(n)
which extracts only the nth data point, i.e. a single row per group instead of a slice

grouped = df.groupby('factor').apply(lambda x: x.loc[0:50])
which does extract the first 50 rows, but only for the first group instead of for all groups.

Can someone please shed some light on this for me?
Thank you!

I got it. You have to use ".iloc" instead of ".loc"

grouped = df.groupby('factor').apply(lambda x: x.iloc[0:50])
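
For what it's worth, a more direct route to the same result (assuming you simply want the first 50 rows of every group) is GroupBy.head, which avoids the apply call:

# take the first 50 rows of each 'factor' group in a single call
first50 = df.groupby('factor').head(50)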
#2
You need to decide where these groups should be stored: in a list, or do you want to concatenate them into a new data frame?

import pandas as pd
import numpy as np

# generate sample data: 1000 rows, 'factor' drawn from 5 group labels
df = pd.DataFrame({'factor': np.random.choice(range(5), 1000),
                   'value': np.random.rand(1000)})

# take the first 50 row labels of each group and look them up in the original frame
dfs = [df.loc[v[:50]] for g, v in df.groupby('factor').groups.items()]
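
If you would rather end up with one data frame than a list, a minimal sketch is to concatenate the pieces afterwards:

# stitch the per-group slices back into a single data frame
first50 = pd.concat(dfs)
# sanity check: each factor should contribute at most 50 rows
print(first50.groupby('factor').size())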
#3
They are stored in the grouped dataframe?

What do g and v do in the following line?
dfs = [df.loc[v[:50]] for g, v in df.groupby('factor').groups.items()]
#4
(Jul-19-2019, 01:50 PM)Progressive Wrote: They are stored in the grouped dataframe?

dfs is a list of data frames, one per group, each holding the first 50 rows of that group. g is the group name (here g = 0, 1, 2, 3, 4) and v is the Index of row labels belonging to that group.
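
To make that concrete, here is a quick sketch (using the sample df from above) of what .groups.items() yields:

# .groups maps each group name g to v, the Index of row labels in that group
for g, v in df.groupby('factor').groups.items():
    print(g, len(v), v[:5].tolist())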