Sep-15-2018, 05:15 AM
I am designing a program that will send tarballs to AWS S3. This means the data in tar format needs to be buffered and, instead of being written to disk, handed to functions in the botocore module to be stored in an S3 object, with no more I/O than reading the disk files being archived and the network traffic with AWS. The problem is that I see no interface for writing the tarball to a buffer. And I need to stream this, since the size can be many gigabytes or even many terabytes (this may be run on an EC2 instance, where the network capacity to do that is available). This will usually involve compression and may also involve encryption.
The only way I can think of doing this is to make a Unix named pipe that the tarfile module will write to in a separate process. Does anyone know of a cleaner way to do this that does not involve any OS features and is solved entirely in Python? Python 3 will be used for this project.
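For what it's worth, one pure-Python approach that avoids named pipes is to open the tarfile in streaming mode (`"w|gz"`, which never seeks) on a custom writable file object whose `write()` hands each chunk to a callback. The callback here just collects chunks into a list as a stand-in; in the real program it would feed an S3 multipart upload (noting that S3 multipart parts must be at least 5 MiB, so some buffering would be needed before each upload call). The `ChunkWriter` class name is illustrative, not part of any library. A minimal sketch:

```python
import io
import tarfile

class ChunkWriter(io.RawIOBase):
    """Writable file-like object that hands every chunk to a callback.

    tarfile streams into this instead of a disk file. The callback is a
    stand-in for whatever consumes the bytes (e.g. an S3 multipart
    upload); nothing here assumes a specific AWS API.
    """
    def __init__(self, callback):
        self.callback = callback
        self.bytes_written = 0

    def writable(self):
        return True

    def write(self, b):
        self.callback(bytes(b))
        self.bytes_written += len(b)
        return len(b)

# Demo: stream a gzip-compressed tarball through the writer.
chunks = []
writer = ChunkWriter(chunks.append)
with tarfile.open(fileobj=writer, mode="w|gz") as tar:
    data = b"hello, world\n"
    info = tarfile.TarInfo(name="hello.txt")
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))

# The collected chunks concatenate into a valid .tar.gz archive.
archive = b"".join(chunks)
with tarfile.open(fileobj=io.BytesIO(archive), mode="r:gz") as tar:
    assert tar.extractfile("hello.txt").read() == data
```

Because `"w|gz"` is stream mode, tarfile only ever calls `write()` and never seeks, so memory use stays bounded by the chunk size regardless of archive size. Encryption could be layered in by transforming each chunk inside the callback before it is uploaded.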
Tradition is peer pressure from dead people
What do you call someone who speaks three languages? Trilingual. Two languages? Bilingual. One language? American.