Hello
I want to import data into ArangoDB, so I need .json files. All of my data are in .xlsx files (big ones, 10 to 20 MB each) with 25 to 35 sheets each. So I created a loop with this code:
    # sheets is a list with all the sheet names from the file
    for i in sheets:
        df = pd.read_excel(file, sheet_name=i, index=None, header=1)
        json_file = df.to_json("{}.json".format(i))

I have the following questions:
1. The code works (I have some of the .json files to prove it), but after a few of the sheets the program crashes. Can anyone help with why this is happening? I am using a Jupyter notebook, and I got an error that the kernel is dead.
2. Obviously, when the loop is running it reads the .xlsx file every time and creates a new dataframe. Is there a way to load the file into memory only once and get the dataframes from that instead of reading it again on every iteration? Something like the sketch below is what I have in mind.
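This is only a rough sketch of what I mean, assuming pd.ExcelFile lets me keep the workbook open and parse each sheet from it (the file name here is just a placeholder):

    import pandas as pd

    file = "data.xlsx"  # placeholder; mine are the 10-20 MB workbooks

    # Open the workbook once; each sheet is then parsed from this
    # already-loaded object instead of re-reading the .xlsx from disk.
    xlsx = pd.ExcelFile(file)

    for i in xlsx.sheet_names:
        df = xlsx.parse(i, header=1)      # parse one sheet into a DataFrame
        df.to_json("{}.json".format(i))   # write it out as <sheet name>.json

I have also seen that pd.read_excel(file, sheet_name=None) returns a dict of DataFrames for all sheets in one call, but I assume that would keep every sheet in memory at the same time, which may be worse for my memory problem.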