Pandas Dataframe to Google Big Query

Hiya,

Just when I almost get it, it slips away.

I have a dataframe that I would like to put into a Google BigQuery table.
However (aha), as per usual, when I try it, it just errors (pyarrow cannot convert bytes to integer on an object/string column).

So I thought I would try to force a type on a column of the dataframe to see if that would allow it to go up.
Can't even get that to work.
df.reset_index()
df.rename(mapper={ 
    'year':'theyear', 'month':'themonth', 
    'excl':'totalexclbtw','incl':'totalinclbtw',
    'autotrading':'automatedtrading'
    }, axis='columns',inplace=True)
df[ 'dateofprocesing' ] = None
df[ 'publisherid' ] = 0
df[ 'siteid' ] = 0
df.astype([{'theyear':'int64'},{'themonth':'int64'}], copy=False, errors='raise')

df.dtypes
The month and the year should both be numbers, but it appears they have been assigned dtype object (presumably strings).

How do I change the type of a column?
If you had looked here, you'd have seen that you need to provide a dictionary, not a list.
df = df.astype({'theyear': 'int64', 'themonth': 'int64'}, copy=False)
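A minimal sketch of that pattern, using a small made-up frame in place of the real data (column names taken from the rename in the question), to show that astype takes a dict of column-to-dtype mappings and returns a new DataFrame rather than converting in place:

import pandas as pd

# made-up stand-in for the real data, with the renamed columns from the question
df = pd.DataFrame({'theyear': ['2023', '2023'], 'themonth': ['1', '2']})
print(df.dtypes)  # both columns are object (strings)

# astype returns a new DataFrame, so assign the result back
df = df.astype({'theyear': 'int64', 'themonth': 'int64'})
print(df.dtypes)  # theyear int64, themonth int64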
Thank you.
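On the original upload error: the thread does not show the load call, but if the DataFrame goes up through google-cloud-bigquery's load_table_from_dataframe (which hands the frame to pyarrow), casting the object columns first and passing an explicit schema usually avoids the "cannot convert" error. A rough sketch, where the project, dataset, table, and sample values are placeholders, not taken from the thread:

import pandas as pd
from google.cloud import bigquery

# hypothetical stand-in for the renamed DataFrame from the question
df = pd.DataFrame({
    'theyear': ['2023'], 'themonth': ['7'],
    'totalexclbtw': [100.0], 'totalinclbtw': [121.0],
})

# cast the string columns before handing the frame to pyarrow
df = df.astype({'theyear': 'int64', 'themonth': 'int64'})

client = bigquery.Client(project='my-project')        # hypothetical project id
table_id = 'my-project.my_dataset.trading_summary'    # hypothetical table

job_config = bigquery.LoadJobConfig(
    schema=[
        bigquery.SchemaField('theyear', 'INTEGER'),
        bigquery.SchemaField('themonth', 'INTEGER'),
        bigquery.SchemaField('totalexclbtw', 'FLOAT'),
        bigquery.SchemaField('totalinclbtw', 'FLOAT'),
    ],
    write_disposition='WRITE_APPEND',
)

job = client.load_table_from_dataframe(df, table_id, job_config=job_config)
job.result()  # block until the load job finishes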