Description
Is there an existing method or workflow that would meet the needs of this scenario:
- Have 1000 .xml files
- Build a dictionary based on those files
- Compress all .xml files with the dictionary from step #2
- 1000 new files added and need to be compressed (2000 files total now)
- Update dictionary (or build new dict) with all 2000 files as the source
- Compress all files using the newly updated dictionary
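
For concreteness, here is roughly what the first pass (train, then compress) looks like using the python-zstandard bindings; that's just one possible interface, and the directory name, output names, and 110 KB dictionary size are placeholders, not part of the actual setup:

```python
from pathlib import Path
import zstandard as zstd

xml_dir = Path("xml_files")  # hypothetical source directory

# Collect the current 1000 .xml files as training samples.
samples = [p.read_bytes() for p in xml_dir.glob("*.xml")]

# Train a dictionary from those samples (110 KB is just a placeholder size).
dict_data = zstd.train_dictionary(110 * 1024, samples)
Path("xml.dict").write_bytes(dict_data.as_bytes())

# Compress every file with that dictionary.
cctx = zstd.ZstdCompressor(dict_data=dict_data, level=19)
for p in xml_dir.glob("*.xml"):
    p.with_name(p.name + ".zst").write_bytes(cctx.compress(p.read_bytes()))
```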
This could be done manually by decompressing all the files to a temp directory, building a new dictionary, and then re-compressing everything, but that's a lot of disk churn when you're talking about thousands of files.
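
A rough sketch of that manual cycle, again with the python-zstandard bindings and the same placeholder names as above (holding the samples in memory instead of a temp directory, though every file still gets read and rewritten in full):

```python
from pathlib import Path
import zstandard as zstd

xml_dir = Path("xml_files")  # hypothetical layout: archives and new files side by side

# Load the existing dictionary and decompress every archive back to XML in memory.
old_dict = zstd.ZstdCompressionDict(Path("xml.dict").read_bytes())
dctx = zstd.ZstdDecompressor(dict_data=old_dict)

originals = {}
for p in xml_dir.glob("*.xml.zst"):
    originals[p.stem] = dctx.decompress(p.read_bytes())  # p.stem == "name.xml"
for p in xml_dir.glob("*.xml"):                          # the 1000 newly added files
    originals[p.name] = p.read_bytes()

# Retrain over all 2000 files, then recompress everything with the new dictionary.
new_dict = zstd.train_dictionary(110 * 1024, list(originals.values()))
Path("xml.dict").write_bytes(new_dict.as_bytes())

cctx = zstd.ZstdCompressor(dict_data=new_dict, level=19)
for name, data in originals.items():
    (xml_dir / (name + ".zst")).write_bytes(cctx.compress(data))
```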
In other words, we have a directory of XML files that grows over time. The dictionary created on day one may not be the most efficient for XML files added four weeks later. Is there a way to keep that dictionary up to date and efficient for the new files coming in?