Mar-07-2021, 09:03 AM
Hi,
Here is a code:
def beginning_zeros(x):
    count = 0
    for letters in x:
        if letters == '0':
            count += 1
        else:
            return count

I wrote this code to count the zeros at the beginning of a string. When I executed it with '0001' it gave the right answer (3), but when I ran it with '0000' it gave me 1. Can someone explain why, and what should I do to fix this?
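Note: the root cause seems to be that when the string is all zeros, the else branch (and its return) is never reached, so the function ends without an explicit return. A sketch of a version that returns the count in every case, assuming the goal is counting leading '0' characters:

    def beginning_zeros(x):
        # Count consecutive '0' characters at the start of x.
        count = 0
        for letter in x:
            if letter == '0':
                count += 1
            else:
                break  # stop at the first non-zero character
        return count   # reached whether or not the loop broke early

    print(beginning_zeros('0001'))  # 3
    print(beginning_zeros('0000'))  # 4

The key change is moving the return after the loop, so an all-zero input like '0000' still returns its count instead of falling off the end of the function.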