Jul-14-2020, 08:29 AM
Hello everyone.
I have an issue with my data, or rather with the simulations that need to be run on it.
I record data myself and also receive data from several other groups. All of it is pressure-time data, which looks similar to this:
![[Image: pressure-time.png]](https://i.ibb.co/f2TSkzB/pressure-time.png)
(pressure starts between 0.1 and 1 bar and increases to somewhere between 10 and 50 bar)
There are now two problems:
1) The data is noisy (which on its own would not be much of a problem, since I could use something like scipy.signal.bessel),
but:
2) the data comes at various sample rates:
from 250k or 50k data points down to 9.6k or 5k.
All of these are too large for the simulations that follow. A reasonable size would be somewhere around 100 to maybe 1000 data points.
How can I reduce the number of data points without losing information (such as the initial pressure and the measured peak)? A sketch of the filter-then-thin approach I have been considering is below.
Best,
René