I have a SQLite3 database that I use to store testing data. Most of the columns are small, but one column can be HUGE, as it stores the JSON results from our tests. This column can be up to 500KB in size, with the average being around 62KB. My database file is about 80MB after about a month's usage.
I've done some testing using zlib to compress the data in that large column, and I can get about a 5:1 compression ratio with it. That means my current database would go from 80MB to 16MB, which is a great savings in disk space.
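For reference, here's roughly what my compression test looks like, storing the zlib-compressed JSON as a BLOB (the table and column names are just placeholders for my real schema):

```python
import json
import sqlite3
import zlib

# In-memory DB for the example; my real DB is a file on disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (id INTEGER PRIMARY KEY, data BLOB)")

payload = json.dumps({"test": "example", "values": list(range(1000))})

# Compress before inserting; pass bytes so SQLite stores it as a BLOB.
compressed = zlib.compress(payload.encode("utf-8"))
conn.execute("INSERT INTO results (data) VALUES (?)", (compressed,))

# Decompress on read.
row = conn.execute("SELECT data FROM results WHERE id = 1").fetchone()
restored = zlib.decompress(row[0]).decode("utf-8")
assert restored == payload
print(len(payload.encode("utf-8")), "->", len(compressed), "bytes")
```

This works fine, but it means every read and write in my code has to remember to compress/decompress by hand, which is why I'm hoping for something that handles it transparently.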
What I want to know is: are there any databases available to Python that let you mark a column for compression, or that do data compression in the background?