i have a function with a bunch of arguments, 3 of which have a lot of choices. i worry that it is easy for development of the calling code to make mistakes, so the function has thorough error checking. this error checking could, in some cases, be more work than the function's actual actions. what i am doing is adding a keyword argument, bypass_error_checks=False, for developers to set to True once they have tested their code, so things will run faster.

is this a reasonable thing to do?
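something like this is what i mean ... a minimal sketch of the pattern, with made-up names (run_pipeline and _validate are not my real function names):

    def _validate(commands, out):
        # the expensive checks live here
        if not isinstance(commands, (list, tuple)):
            raise TypeError("commands must be a list or tuple of commands")

    def run_pipeline(commands, out=None, bypass_error_checks=False):
        if not bypass_error_checks:
            _validate(commands, out)  # skipped once the caller is debugged
        # ... the function's actual work goes here

    run_pipeline([["echo", "hi"]])                            # checked call
    run_pipeline([["echo", "hi"]], bypass_error_checks=True)  # fast path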
the function runs a command pipeline ... a list of lists of strings. a tuple can be used in place of a list. bytes or bytearray can be used in place of str. it checks that valid types are used, applying the checks to every item in every list. there is a keyword argument out= that specifies where output of the pipeline goes, such as an open file or a file name. there are a few other keywords, and it also cross-checks the combinations of settings.
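to make that concrete, here is roughly what the checks look like ... a sketch with illustrative names, not my actual code:

    def check_pipeline(pipeline, out=None):
        if not isinstance(pipeline, (list, tuple)):
            raise TypeError("pipeline must be a list or tuple of commands")
        for i, command in enumerate(pipeline):
            if not isinstance(command, (list, tuple)):
                raise TypeError(f"command {i} must be a list or tuple")
            for j, word in enumerate(command):
                if not isinstance(word, (str, bytes, bytearray)):
                    raise TypeError(f"command {i}, item {j}: expected "
                                    "str, bytes, or bytearray")
        # cross-check: out= must be a file name or something writable
        if out is not None and not isinstance(out, str) \
                and not hasattr(out, "write"):
            raise TypeError("out= must be a file name or an open file")

    check_pipeline([["ls", "-l"], ("grep", b"py")], out="listing.txt")

doing isinstance on every item of every command is where the cost adds up, which is why i want a way to skip it.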
what about setting an environment variable to enable the maximum level of checks? the developer could leave that set while doing development. without it set, the function would assume the call is from well-debugged code and that the checks would pass.

these checks are intended to meaningfully report to the developer what coding errors she has made.
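as a sketch, assuming a made-up variable name PIPELINE_DEBUG and the check_pipeline helper from my previous post:

    import os

    # read once at import time; PIPELINE_DEBUG is a made-up name
    _FULL_CHECKS = os.environ.get("PIPELINE_DEBUG", "") != ""

    def run_pipeline(pipeline, out=None):
        if _FULL_CHECKS:
            check_pipeline(pipeline, out)  # maximum checks for development
        # with the variable unset, assume the call is well debugged
        # ... actual pipeline work goes here

the developer would export PIPELINE_DEBUG=1 in her shell while developing and leave it unset in production.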