Oct-09-2020, 05:42 PM
My test:
[deadeye@nexus ~]$ dd if=/dev/urandom of=random.bin bs=1M count=64
64+0 records in
64+0 records out
67108864 bytes (67 MB, 64 MiB) copied, 0.815292 s, 82.3 MB/s
[deadeye@nexus ~]$ python file2hex.py
[deadeye@nexus ~]$ md5sum random.bin random2.bin
929b3a89653f956721743a93955e2ec2  random.bin
929b3a89653f956721743a93955e2ec2  random2.bin
Code:
from binascii import hexlify, unhexlify


def file2hex(input_file, output_file):
    with open(input_file, "rb") as fd_in, open(output_file, "wb") as fd_out:
        while chunk := fd_in.read(20):
            fd_out.write(hexlify(chunk))
            fd_out.write(b"\n")


def hex2file(input_file, output_file):
    with open(input_file, "rb") as fd_in, open(output_file, "wb") as fd_out:
        for line in fd_in:
            fd_out.write(unhexlify(line.rstrip()))


file2hex("random.bin", "random.hex")
hex2file("random.hex", "random2.bin")
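Note the `:=` walrus operator in the read loop requires Python 3.8 or newer. The same round trip can also be checked in memory without touching the filesystem; this is a small hypothetical demo of the technique, not part of the original script:

```python
from binascii import hexlify, unhexlify

# In-memory demo of the same round trip: bytes -> hex lines -> bytes.
data = b"\x00\x01\xfe\xff" * 10  # 40 bytes of sample data

# Encode in 20-byte chunks, one hex line per chunk, as file2hex does.
lines = [hexlify(data[i:i + 20]) for i in range(0, len(data), 20)]

# Decode each line and reassemble, as hex2file does.
restored = b"".join(unhexlify(line.rstrip()) for line in lines)

assert restored == data
```

Because `hexlify` never emits whitespace, stripping the trailing newline with `rstrip()` before `unhexlify` is safe and the output is byte-identical to the input, which is what the matching md5sums above confirm.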
Almost dead, but too lazy to die: https://sourceserver.info
All humans together. We don't need politicians!