Jun-06-2024, 08:47 AM
Quote:I think I'll also be better to read SD disk-image data in as "chunks" is there a library for that?
Following the Linux motto "Everything is a file", you can easily read your file in chunks:
# set chunk_size much bigger than 1024; this is 1024 * 1024
def read_in_chunks(file_object, chunk_size=1048576):
    while True:
        data = file_object.read(chunk_size)
        if not data:
            break
        yield data

path2data = '/home/pedro/myPython/books/Ashwin Pajankar - Practical Python Data Visualization_ A Fast Track Approach To Learning Data Visualization With Python (2021, Apress) - libgen.li.pdf'

with open(path2data, 'rb') as f:
    for piece in read_in_chunks(f):
        print(len(piece))
        # do something

Looks like this:
Output:
for piece in read_in_chunks(f):
    print(len(piece))
1048576
1048576
1048576
1048576
845377
0
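Since the original question was about reading SD-card disk-image data, here is a minimal sketch (not from the post above) that builds on the same generator to copy an image file chunk by chunk while computing a SHA-256 checksum, so you can verify the copy afterwards. The function name copy_with_checksum is my own invention, just for illustration:

```python
import hashlib

def read_in_chunks(file_object, chunk_size=1048576):
    """Yield successive chunks from a binary file object."""
    while True:
        data = file_object.read(chunk_size)
        if not data:
            break
        yield data

def copy_with_checksum(src, dst, chunk_size=1048576):
    """Copy src to dst chunk by chunk and return the SHA-256 hex digest,
    so large disk images never have to fit in memory at once."""
    h = hashlib.sha256()
    with open(src, 'rb') as fin, open(dst, 'wb') as fout:
        for piece in read_in_chunks(fin, chunk_size):
            h.update(piece)
            fout.write(piece)
    return h.hexdigest()
```

Run the same function against the destination file afterwards: if the two digests match, the copy is byte-for-byte identical.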
That said, if I want to copy anything from somewhere to somewhere, I would just use rsync. This copies from my laptop to a USB stick, but you can copy to anywhere on a network where you have write permission.
In bash:
Quote:rsync -av -e "ssh" --progress /home/pedro/myPython [email protected]:/media/pedro/295df732-017f-490a-b6cb-19061b2965e8/home/pedro/
rsync checks whether data has changed before overwriting existing files, which saves a lot of time.
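If you wanted to stay in Python, you could approximate rsync's "quick check" (skip files whose size and modification time are unchanged) with a short sketch like this. It is only a rough imitation of rsync's behaviour, not a replacement for it, and the function name sync_file is made up for the example:

```python
import os
import shutil

def sync_file(src, dst):
    """Copy src to dst only if dst is missing or differs in size or
    modification time -- roughly rsync's quick check, not the real thing.
    Returns True if a copy happened, False if the file was skipped."""
    if os.path.exists(dst):
        s, d = os.stat(src), os.stat(dst)
        # compare size and whole-second mtime, as rsync does by default
        if s.st_size == d.st_size and int(s.st_mtime) == int(d.st_mtime):
            return False  # unchanged, skip
    shutil.copy2(src, dst)  # copy2 preserves the mtime, like rsync -a
    return True
```

Calling it a second time on the same pair returns False, because copy2 preserved the modification time on the first pass.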