Jul-06-2017, 02:57 PM
Hi guys,
I wrote some code to print tables, and to do that I used the following lines to fill my cells with the values of my array p:
for row in range(prism):
    cell_text.append(['%d' % item for item in p[row, 0:4]])

That works fine. But now I have a list data_prec of values representing the precision of the values in the array p.
So, at position (0, 0) of data_prec I have the precision for the value of p at position (0, 0). That is, for the value 1 in the array p, the precision is 0.01, as you can see in the following example:
Output:
Array p:
[1, 2, 3, 4]
List data_prec:
[array([0.01, 0.02, 0.05, 0.02])]
So I need to fill the cells of my tables with an element of p and its corresponding precision, i.e. something like 1 +- 0.01. I tried to modify the code above, but it didn't work. Here's my attempt:
for row in range(prism):
    cell_text.append(['%d +- %d' % item for item in p[row,0:4], % elem for elem in data_prec[row,0:4]])

I'd appreciate it if anyone can help me. Thanks!!
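One way the broken attempt above could be written is with zip, pairing each value with its precision in a single comprehension. This is a minimal sketch assuming data_prec is a list holding one NumPy precision array per row of p; prism, p, and data_prec are reconstructed here from the example output in the post:

```python
import numpy as np

# Hypothetical data matching the shapes shown in the post's output
prism = 1
p = np.array([[1, 2, 3, 4]])
data_prec = [np.array([0.01, 0.02, 0.05, 0.02])]

cell_text = []
for row in range(prism):
    # zip pairs each value of p with its precision; data_prec[row]
    # is the precision array for that row
    cell_text.append(['%d +- %.2f' % (item, prec)
                      for item, prec in zip(p[row, 0:4], data_prec[row][0:4])])

print(cell_text)
# [['1 +- 0.01', '2 +- 0.02', '3 +- 0.05', '4 +- 0.02']]
```

Note that the list data_prec is indexed with data_prec[row] first (picking the array for that row) and then sliced, unlike the 2D array p, which can be indexed as p[row, 0:4] directly.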