Jan-28-2021, 06:47 PM
Hi to all! Can anyone help me with this:
The program works like this: there are two measuring devices, a calibrator and a multimeter. The range is set on the multimeter (the key from the dictionary), the calibrator supplies the given points to the multimeter (the values from the dictionary), the multimeter then measures each point (two measurements per point here), and the measurements are added to the dataframe. I would like to see the dataframe in this form (based on the values from the dictionary, which can change):
        0.1        -0.1          1          -1
0  0.099999  -0.099998   1.0000000  -1.0000032
1  0.099998  -0.0999997  1.0000076  -1.0000031
2  0.099999  -0.099998   1.0000032  -1.0000031
and like this:
0    0.1    0.099999   0.099998    0.099999    0.099998
1   -0.1   -0.099998  -0.0999997  -0.099998   -0.0999997
2    1      0.999999   0.999989    1.0000032   1.0000031
3   -1     -0.99999   -0.99998    -1.0000032  -1.0000035
Here 0.1, -0.1, 1, -1 are the values from the dictionary (set_value = float(value)) that are set on the calibrator, and next to them are the values measured on the multimeter (dmm_val = dmm.get_data()). The dataframe should then continue in the same way for the rest of the dictionary.
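To make the two layouts concrete, here is a small standalone sketch of both shapes, with random jitter standing in for dmm.get_data() (the names data, wide, and tall are just for illustration, not from my real script):

import random
from collections import OrderedDict

import pandas as pd

ver_val = OrderedDict()
ver_val[0.1] = [0.1, -0.1]
ver_val[1] = [1, -1]

samples = 3  # measurements per point

# collect one column of fake readings per set point
data = OrderedDict()
for rang in ver_val:
    for value in ver_val[rang]:
        # random.uniform stands in for dmm.get_data()
        data[value] = [value + random.uniform(-1e-5, 1e-5) for _ in range(samples)]

wide = pd.DataFrame(data)       # first layout: one column per set point
tall = wide.T.reset_index()     # second layout: one row per set point
tall.columns = ['set'] + ['meas%d' % (i + 1) for i in range(samples)]
print(wide)
print(tall)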
I apologize for these questions; I have just started learning Python from examples.
My code:
import time
from collections import OrderedDict

import pandas as pd

# cal (the calibrator) and dmm (the multimeter) are created earlier in the script

ver_val = OrderedDict()
ver_val[0.1] = [0.1, -0.1]
ver_val[1] = [1, -1]
ver_val[10] = [1, 5, 10, -1, -5, -10]
ver_val[100] = [100, -100]
ver_val[1000] = [1000, -1000]

results = pd.DataFrame()
measurements = pd.DataFrame()
calc = {}
meas = {}
calc_pool = []
# val2 = []

# Loop that sets the range and zeroes the multimeter
for rang in ver_val:
    cal.out_set("0 mV")
    cal.out_enable()
    # Wait for output to settle and read output
    cal.inst.write("*WAI; OUT?")
    time.sleep(3)  # take a nap for a few seconds
    dmm.set_range(rng=str(rang))
    # dmm.set_null()
    time.sleep(3)

    # Loop that sets the calibration points
    for value in ver_val[rang]:
        set_value = float(value)
        cal.out_set(set_value)
        # enable output
        cal.out_enable()
        # Wait for output to settle and read output
        cal.inst.write("*WAI; OUT?")
        time.sleep(3)  # take a nap for a few seconds
        cal.out_read()

        # Loop over the number of measurements at one point
        val = []
        for samples in range(2):  # number of measurements
            dmm_val = dmm.get_data()
            meas['Measurements'] = dmm_val
            print("Measurement %d: %.9f V" % (samples + 1, dmm_val))
            val.append(dmm_val)  # builds the list of measurements at this point
            measurements = measurements.append(meas, ignore_index=True)
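If it is useful, this is the collection pattern I have in mind for the first layout, again as a standalone sketch with a fake reading in place of dmm.get_data():

import pandas as pd

# keyed by set point; each entry is the list of samples at that point
columns = {}
for set_value in [0.1, -0.1, 1.0, -1.0]:
    val = []
    for samples in range(2):  # number of measurements per point
        dmm_val = set_value * 0.999999  # stands in for dmm.get_data()
        val.append(dmm_val)
    columns[set_value] = val

# one column per set point, one row per sample
results = pd.DataFrame(columns)
print(results)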