Oct-06-2019, 01:09 PM
(This post was last modified: Oct-06-2019, 01:11 PM by Gribouillis.)
It is because your test is incorrect: the expected output should be -4.5. Nevertheless, here is a new version, because I discovered that
generate_tokens()
is kept only for backwards compatibility and is undocumented. Here is the version using the official tokenize()
function:

import ast
import re
from tokenize import tokenize, NUMBER, ENCODING

__version__ = '2019.10.6'


def val(s):
    # Strip all whitespace, then encode to bytes for tokenize()
    t = re.sub(r'\s+', '', s).encode()
    sign = True
    for tok in tokenize(iter([t + b'\n', b'']).__next__):
        if tok[0] == ENCODING:
            continue
        # Accept a single leading sign character
        if sign and tok[1] in ('-', '+'):
            sign = False
            continue
        if tok[0] == NUMBER:
            # Evaluate the input up to the end column of the number token
            return ast.literal_eval(t[:tok[3][1]].decode())
        return 0
    return 0


if __name__ == '__main__':
    sample = [
        ("45", 45),
        ("++++++45", 0),
        ("-+54", 0),
        ("4.5", 4.5),
        ("4.5ABC", 4.5),
        ("ABCD", 0),
        ("AB45", 0),
        (" 2 45 7 ", 2457),
        (" 2 4 . 5 7 ", 24.57),
        ("- 12 e - 7", -12e-7),
        ("+4.5", 4.5),
        ("-4.5", -4.5),
    ]
    for inp, out in sample:
        print(inp, out, val(inp))
        assert out == val(inp)
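In case the readline trick above is unfamiliar: tokenize() expects a callable that returns one line of bytes per call, so iter([...]).__next__ over a list of byte lines stands in for a real file object. A minimal standalone sketch (the sample input "-4.5rest" is my own choice for illustration):

```python
from tokenize import tokenize, NUMBER, tok_name

# tokenize() wants a readline callable returning bytes; an iterator's
# __next__ over a list of byte lines is a lightweight stand-in for a
# file opened in binary mode.
source = b"-4.5rest\n"
readline = iter([source, b""]).__next__

for tok in tokenize(readline):
    # tok.type/tok.string/tok.end mirror tok[0]/tok[1]/tok[3] above
    print(tok_name[tok.type], repr(tok.string), tok.end)
```

The NUMBER token for "4.5" ends at column 4, which is exactly the index the val() function uses to slice the numeric prefix out of the input before passing it to ast.literal_eval().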