Jan-19-2020, 09:14 PM
That's what I'm finding - lots of ways to store the data. Each way I have tried has limitations in data indexing.
With the MATLAB code I can index the number of workers in all files at once, like filedata[:].numWorkers, and I can access the locations of all trucks in a given file, like filedata[1].truck[:].location. I haven't come across a way to do that in Python without n-dimensional arrays, and my problem is that the number of trucks or workers changes from shift to shift, requiring different-sized vectors in each tier of the structure/class/whatever I set up. It may just be that it isn't possible to do this in Python the way I need to. I guess I could create large arrays to ensure my data will fit, and just store the size of each set of variables for bookkeeping (or search for non-NaN values).
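For what it's worth, plain Python lists plus list comprehensions can do this kind of indexing without padding or NaN sentinels, since each list can be a different length. Here's a minimal sketch; the class and field names (FileData, Truck, num_workers, location) are hypothetical stand-ins for the MATLAB struct fields:

```python
from dataclasses import dataclass, field

@dataclass
class Truck:
    location: tuple  # e.g. (x, y) -- hypothetical field

@dataclass
class FileData:
    num_workers: int
    trucks: list = field(default_factory=list)  # variable length per shift

# Each shift/file can hold a different number of trucks -- no fixed-size arrays.
filedata = [
    FileData(num_workers=5, trucks=[Truck((0, 0)), Truck((3, 4))]),
    FileData(num_workers=7, trucks=[Truck((1, 2))]),
]

# MATLAB: filedata(:).numWorkers  ->  comprehension over all files
all_workers = [f.num_workers for f in filedata]

# MATLAB: filedata(1).truck(:).location  ->  comprehension over one file's trucks
first_file_locations = [t.location for t in filedata[0].trucks]
```

The comprehension replaces the [:] slice-across-structs syntax, and because each trucks list is independent, shifts with different truck counts coexist without any bookkeeping of sizes.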