Python Forum

Full Version: transform result to DataFrame
Hi all! Can anyone help me with this?
My code:
# imports needed for this snippet
from collections import OrderedDict
import time

import pandas as pd

# cal (calibrator) and dmm (multimeter) are instrument handles defined elsewhere
ver_val = OrderedDict()
ver_val[0.1] = [0.1, -0.1]
ver_val[1] = [1, -1]
ver_val[10] = [1, 5, 10, -1, -5, -10]
ver_val[100] = [100, -100]
ver_val[1000] = [1000, -1000]
results = pd.DataFrame()
measurements = pd.DataFrame()
calc = {}
meas = {}
calc_pool = []
# val2 = []
# Loop to set the range and zero the multimeter
for rang in ver_val:
    cal.out_set("0 mV")
    cal.out_enable()
    # Wait for output to settle and read output
    cal.inst.write("*WAI; OUT?")
    time.sleep(3)  # take a nap for a few seconds
    dmm.set_range(rng=str(rang))
    # dmm.set_null()
    time.sleep(3)
    # Loop over the calibration points
    for value in ver_val[rang]:
        set_value = float(value)
        cal.out_set(set_value)  # enable output
        cal.out_enable()
        # Wait for output to settle and read output
        cal.inst.write("*WAI; OUT?")
        time.sleep(3)  # take a nap for a few seconds
        cal.out_read()
        # Loop over the number of measurements at one point
        val = []
        for samples in range(2):  # number of measurements per point
            dmm_val = dmm.get_data()  # read one sample from the multimeter
            meas['Measurements'] = dmm_val
            print("Measurement %d: %.9f V" % (samples + 1, dmm_val))
            val.append(dmm_val)  # collects the samples taken at this point
            measurements = measurements.append(meas, ignore_index=True)
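A side note on measurements.append(...): DataFrame.append was deprecated and has been removed in pandas 2.x, so on a recent pandas that line will fail. The usual replacement is to collect the rows in a plain list and build the frame once after the loop. A minimal sketch of that pattern, reusing the dmm handle from the code above:

rows = []                                    # one dict per sample instead of appending
for samples in range(2):
    dmm_val = dmm.get_data()
    rows.append({'Measurements': dmm_val})   # each sample becomes one row
measurements = pd.DataFrame(rows)            # build the DataFrame once, after the loop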
The program works like this: there are two measuring devices, a calibrator and a multimeter. The range (a key from the dictionary) is set on the multimeter, the calibrator supplies the given points (the values from the dictionary) to the multimeter, the multimeter measures each point (two samples per point here), and the measurements are appended to the DataFrame. I would like to see the DataFrame in this form (based on the values from the dictionary, which can change):

        0.1        -0.1          1          -1
0  0.099999   -0.099998  1.0000000  -1.0000032
1  0.099998  -0.0999997  1.0000076  -1.0000031
2  0.099999   -0.099998  1.0000032  -1.0000031

and like this:

0   0.1    0.099999   0.099998    0.099999   0.099998
1  -0.1   -0.099998  -0.0999997  -0.099998  -0.0999997
2   1      0.999999   0.999989    1.0000032  1.0000031
3  -1     -0.99999   -0.99998    -1.0000032  -1.0000035


Here 0.1, -0.1, 1, -1 are the values from the dictionary (set_value = float(value)) that are set on the calibrator, and below them are the values measured on the multimeter (dmm.get_data()). The DataFrame should then continue in the same way for the rest of the dictionary.
I apologize for these questions; I have just started learning Python from examples.
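One way to get the first layout is to collect the readings for every set point in a dictionary of lists while the loops run, and build the DataFrame once at the end. Below is a minimal sketch along those lines; it reuses the cal and dmm handles and the ver_val dictionary from the code above, abbreviates the settling/sleep steps, and the samples_per_point name is only an illustration:

import pandas as pd

samples_per_point = 2
readings = {}                                  # {set point: [sample 1, sample 2, ...]}

for rang in ver_val:
    dmm.set_range(rng=str(rang))               # same range setup as in the code above
    for value in ver_val[rang]:
        set_value = float(value)
        cal.out_set(set_value)                 # same calibrator setup as above
        cal.out_enable()
        samples = [dmm.get_data() for _ in range(samples_per_point)]
        readings[set_value] = samples          # key = set point, value = its samples

# first layout: one column per set point, one row per sample
results = pd.DataFrame(readings)

# second layout: one row per set point, the samples spread across columns
results_by_point = pd.DataFrame(
    [[point] + vals for point, vals in readings.items()],
    columns=['Point'] + ['Sample %d' % (i + 1) for i in range(samples_per_point)],
)

Note that with ver_val as posted, the points 1 and -1 appear in both the 1 range and the 10 range, so a dict keyed by the point alone would overwrite the earlier readings; keying by the (range, point) pair avoids that, and it also leads naturally to the two-level column names asked about below.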
Okay, I worked out my first question myself. Now I have another question:
How can I give a column a two-level name, like this:
   Range 0.1   Range 0.1
   Point 0.1   Point -0.1

0   0.099999   -0.099999
1   0.099998   -0.099997
2   0.099999   -0.099999
3   0.099997   -0.099999
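Two-level column names like this are what pandas calls a MultiIndex on the columns. A minimal sketch, assuming each column is keyed by a (range, point) pair and using the numbers from the small table above as placeholder data:

import pandas as pd

columns = pd.MultiIndex.from_tuples([(0.1, 0.1), (0.1, -0.1)],
                                     names=['Range', 'Point'])
df = pd.DataFrame(
    [[0.099999, -0.099999],
     [0.099998, -0.099997],
     [0.099999, -0.099999],
     [0.099997, -0.099999]],
    columns=columns,
)
print(df)  # prints a two-row column header: Range on top, Point below

If you build the frame from a dict whose keys are (range, point) tuples, pd.DataFrame should create the MultiIndex columns automatically, and df.columns.names = ['Range', 'Point'] then labels the two header rows.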