May-21-2018, 09:31 PM
Since this is an SSD, you can increase the buffer size, for example to 10M.
Also, because the buffer size is fixed, you know exactly how many bytes have been copied; you can do the math yourself instead of calling os.stat in every iteration.
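To illustrate the "do the math" idea, here is a minimal sketch. The function name copy_with_progress and the in-memory streams are hypothetical, just for the demo; the point is that the running byte count replaces the os.stat call inside the loop.

```python
import io

def copy_with_progress(src_f, dest_f, total_size, buff=10 * 1024**2):
    """Copy src_f to dest_f in buff-sized chunks, computing progress
    from a running byte count instead of os.stat each iteration."""
    copied = 0
    while True:
        chunk = src_f.read(buff)
        if not chunk:
            break
        dest_f.write(chunk)
        copied += len(chunk)  # math instead of os.stat
        percent = copied * 100 // total_size
        print(f'\r{copied}/{total_size} bytes ({percent}%)', end='')
    print()
    return copied

# Demo with in-memory streams standing in for real files (hypothetical data)
data = b'x' * (25 * 1024**2)  # 25M of dummy bytes
src = io.BytesIO(data)
dest = io.BytesIO()
copied = copy_with_progress(src, dest, len(data))
```

With real files you would open them in 'rb'/'wb' mode and get total_size from a single os.stat call before the loop.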
Or you can use tqdm.
Something like this:
I didn't test the script.
import os
from tqdm import tqdm

src = './source/big_file.big'
dest = './path/big_file.big'

f_size = os.stat(src).st_size
buff = 10485760  # 1024**2 * 10 = 10M
num_chunks = f_size // buff + 1

with open(src, 'rb') as src_f, open(dest, 'wb') as dest_f:
    try:
        for _ in tqdm(range(num_chunks)):
            chunk = src_f.read(buff)
            dest_f.write(chunk)
    except IOError as e:
        print(e)
    finally:
        print(f'Done! Copied {f_size} bytes.')

It will give you a nice progress bar.