Need help with For Loop logic for reading docx Table column data - Printable Version
+- Python Forum (https://python-forum.io)
+-- Forum: Python Coding (https://python-forum.io/forum-7.html)
+--- Forum: General Coding Help (https://python-forum.io/forum-8.html)
+--- Thread: Need help with For Loop logic for reading docx Table column data (/thread-23629.html)
Need help with For Loop logic for reading docx Table column data - vintysaw - Jan-09-2020

Hi Everyone,

Please help with the loop logic. There are multiple tables in the Word docx. I am trying to read one particular table, e.g. Table2, in the document. I need to read the contents of only column 3 of Table2, whose column name is 'Status'. With the current logic, it reads all the row data from Table2 through Table3. How do I make it read only the column 3 data from Table2?

Table1

Sr.  Name     Age  City
1    Mr. XYZ  12   sfs
2    Mr. YDS  34   safdas
3    Mr. ZSD  20   asfd
4    Mr. BSD  50   asd

Table2

Sr.  Name     Status    City
1    Mr. XYZ  Single    sfs
2    Mr. YDS  Married   safdas
3    Mr. ZSD  Single    asfd
4    Mr. BSD  Divorcee  asd

Table3

Sr.  Name     Age  City    State
1    Mr. XYZ  12   sfs     MK
2    Mr. YDS  34   safdas  JS
3    Mr. ZSD  20   asfd    SD
4    Mr. BSD  50   asd     WE

```python
import docx

file2 = docx.Document("C:\\Test_Document.docx")
document1 = file2
columns1 = []
i = 0
j = 0

def function_verifystatus(document1):
    flag1 = True
    for (table_no, table) in enumerate(document1.tables):
        for (i, row) in enumerate(table.rows):
            # Scan the row for a cell whose text is "Status".
            for (j, cell) in enumerate(row.cells):
                if cell.text == "Status":
                    k = j
                    flag1 = False
            # flag1 is never reset inside the loops, so once it goes False
            # every later row (all columns, all later tables) is collected.
            if (flag1 == False):
                for (i, k) in enumerate(row.cells):
                    columns1.append(k.text)
                    print("status value is:", k.text)
    flag1 = True
    return columns1

result1 = function_verifystatus(document1)
```

RE: Need help with For Loop logic for reading docx Table column data - Larz60+ - Jan-09-2020

You may get an answer to your question here, but perhaps you should contact the author of the package here: [email protected]

RE: Need help with For Loop logic for reading docx Table column data - vintysaw - Jan-10-2020

Thanks Larz. I checked that Google group. It seems it has not been active for a while now.
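A minimal sketch of one way to do what the original post asks with python-docx, assuming Table2 is the second table in the document (`document.tables[1]`) and that its header row contains a cell whose text is exactly "Status". The function name `read_status_column` and the file path are illustrative, not from the thread:

```python
import docx

def read_status_column(path):
    """Return the 'Status' values from the second table of a .docx file."""
    document = docx.Document(path)
    table = document.tables[1]  # Table2 (document.tables is zero-indexed) -- assumption

    # Locate the column whose header cell reads "Status".
    status_col = None
    for idx, cell in enumerate(table.rows[0].cells):
        if cell.text.strip() == "Status":
            status_col = idx
            break
    if status_col is None:
        return []

    # Collect that column's value from every data row, skipping the header row.
    values = []
    for row_idx, row in enumerate(table.rows):
        if row_idx == 0:
            continue
        values.append(row.cells[status_col].text)
    return values

# Hypothetical usage; the path matches the one in the original post.
print(read_status_column("C:\\Test_Document.docx"))
```

Determining the column index from the header row first, and then reading only that cell from each data row of that one table, avoids the flag in the original loop that is never reset and therefore keeps collecting every cell of every later row and table.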