Not able to use boto library with compressed content in python3
I am trying to use the boto library with Python 3 to upload data to GCS buckets. My data is latin-1 encoded, but it fails with the following exception:
>>> import zlib
>>> from boto.utils import get_utf8_value
>>> get_utf8_value(zlib.compress(b'test').decode('latin-1'))
b'x\xc2\x9c+I-.\x01\x00\x04]\x01\xc3\x81'
>>> from boto.compat import StringIO
>>> StringIO(get_utf8_value(zlib.compress(b'test').decode('latin-1')))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: initial_value must be str or None, not bytes
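As far as I can tell from the output above, get_utf8_value() hands back UTF-8 encoded bytes on Python 3, while boto.compat.StringIO maps to io.StringIO, which only accepts str. A minimal sketch of the same mismatch without boto (the variable names are just for illustration):

import io
import zlib

compressed = zlib.compress(b'test')      # bytes
as_text = compressed.decode('latin-1')   # str view of the same byte values
utf8_bytes = as_text.encode('utf-8')     # bytes again, matching the value printed above
io.StringIO(utf8_bytes)                  # raises TypeError: initial_value must be str or None, not bytes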
To be more specific, I am using the latest version of the boto library: 2.49.0.
Please advise whether some other version of the boto library would be better suited for uploading zlib-compressed strings to GCS buckets, or whether some other API should be used.
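One workaround I have been experimenting with is skipping the latin-1 round-trip entirely and passing the compressed bytes through a file-like object with Key.set_contents_from_file(). I have not fully verified this path on Python 3, but it avoids the str/bytes conversion; the credentials and bucket/object names below are placeholders:

import io
import zlib
import boto

# Placeholder credentials and bucket name -- adjust for your own project.
conn = boto.connect_gs('GS_ACCESS_KEY_ID', 'GS_SECRET_ACCESS_KEY')
bucket = conn.get_bucket('my-example-bucket')
key = bucket.new_key('test-object.zlib')

# Hand boto a file-like object of raw bytes instead of a latin-1 decoded str,
# so nothing needs to pass through get_utf8_value()/StringIO.
payload = io.BytesIO(zlib.compress(b'test'))
key.set_contents_from_file(payload, headers={'Content-Type': 'application/gzip'})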
I am resharing the code (from the boto library) that gets invoked as part of the call to k.set_contents_from_string(zlib.compress(b'test').decode('latin-1'), headers={'Content-Type': "application/gzip"}) for uploading compressed content.

Quote:>>> from boto.utils import get_utf8_value
>>> get_utf8_value(zlib.compress(b'test').decode('latin-1'))
>>> from boto.compat import StringIO
>>> StringIO(get_utf8_value(zlib.compress(b'test').decode('latin-1')))
The exception thrown is:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: initial_value must be str or None, not bytes
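To frame the "other APIs" part of my question, this is roughly how the same upload would look with the google-cloud-storage client, which accepts bytes directly (assuming that package is installed and default application credentials are configured; the bucket and object names are placeholders):

import zlib
from google.cloud import storage

client = storage.Client()                    # uses default application credentials
bucket = client.bucket('my-example-bucket')  # placeholder bucket name
blob = bucket.blob('test-object.zlib')       # placeholder object name

# upload_from_string() accepts bytes, so the compressed payload can be passed as-is.
blob.upload_from_string(zlib.compress(b'test'), content_type='application/gzip')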