Jul-16-2019, 11:23 AM
(Jul-16-2019, 11:12 AM)perfringo Wrote: The question is too ambiguous to answer: "its bulky and takes some time to read in to python". What specifically is bulky, and what time performance is considered satisfactory? And the main question: into what data structure do you read the file, and what would you do with the acquired data?
The file grows to a size that takes a noticeable amount of time to read into Python. Based on my experience in R, reading a saved R object back into R is near instant, and that is what I would consider satisfactory.
The object would hold the table, and the downstream processes would manipulate the data for later analysis of the dataframe. A pandas DataFrame is the ultimate goal, which again is absolutely fine when reading in the CSV file itself. I was just wondering if there was a known Python-style object that holds the CSV data...
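Something roughly analogous to R's saveRDS/readRDS workflow is what I have in mind. A sketch of the closest thing I've found so far, using pandas' pickle support (file names here are just placeholders):

```python
import pandas as pd

# Stand-in for the real table; in practice this comes from a large CSV
df = pd.DataFrame({"id": [1, 2, 3], "value": [0.1, 0.2, 0.3]})
df.to_csv("data.csv", index=False)

# One-time cost: parse the CSV into a DataFrame
df = pd.read_csv("data.csv")

# Save the parsed DataFrame as a pickled Python object; reloading it
# later skips CSV parsing entirely, much like readRDS() in R
df.to_pickle("data.pkl")
restored = pd.read_pickle("data.pkl")
```

Whether pickle is actually faster than re-parsing the CSV presumably depends on the data, so I'd have to benchmark it on my own files.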