I like this
def split_by_n(seq, n):
    """A generator to divide a sequence into chunks of n units."""
    while seq:
        yield seq[:n]
        seq = seq[n:]

print(list(split_by_n("1234567890", 4)))
@[metulburr]: that code is very inefficient; it will run in quadratic time relative to the length of the sequence. If you were iterating over a file with 10000 lines or something, that would take much, much longer than it needs to.
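To illustrate the point: each `seq = seq[n:]` copies the whole remaining tail, so over len(seq)/n iterations the total copying is roughly quadratic. A minimal sketch of a linear alternative (my own illustrative function name, not from the thread) steps an index instead of re-slicing the tail:

```python
def split_by_n_linear(seq, n):
    """Yield chunks of n items by stepping an index.

    Each chunk is sliced once and the tail is never copied,
    so total work is linear in len(seq).
    """
    for i in range(0, len(seq), n):
        yield seq[i:i + n]

print(list(split_by_n_linear("1234567890", 4)))
```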
(Oct-07-2016, 08:32 PM)micseydel Wrote: that code is very inefficient; it will run in quadratic time relative to the length of the sequence.
Would you please explain how to determine the efficiency of code? Maybe in the Tutorials section.
My plan is to repost this this weekend, but here's the one metulburr already migrated:
http://python-forum.io/Thread-Efficiency-Crash-Course
The idea is basically the same as in the example above. Slicing a string, like building a string with concatenation, creates a whole new string each time.
Also notable, I suppose, on top of efficiency, is the API. I would argue an ideal API supports arbitrary iterables, not just sliceable sequences.