incedo / fabricate

Automatically exported from code.google.com/p/fabricate

Compress redundant data in .deps file (or just gzip it) to reduce size #18

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
As per Michael Haggerty
http://groups.google.com/group/fabricate-users/browse_thread/thread/64670b732d071b75
-- for larger builds, the .deps file is both huge and full of redundant information.

Consider compressing the redundant data somehow -- gzip might be the simplest way, and for a 17MB file this would actually speed things up, because gzipping is faster than writing big files.

There are other, more code-complex ways to compress this data too. But if gzip is chosen, the .deps file will no longer be human-readable, so provide a no-zip override for debugging (though how often does one read their .deps file?).

Original issue reported on code.google.com by benh...@gmail.com on 7 Aug 2009 at 2:12

GoogleCodeExporter commented 9 years ago

Original comment by benh...@gmail.com on 7 Aug 2009 at 2:13

GoogleCodeExporter commented 9 years ago
See especially Michael Haggerty's comments about "normalizing the data" in this
message: 
http://groups.google.com/group/fabricate-users/msg/b3d5e3466559b416?hl=en

Compressing redundant data in the .deps file and reducing the number of MD5s we need to compute (per Issue 17) are definitely related. Assuming it's not ultra-complicated, this would be a better solution than gzipping.
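One way to read the "normalizing the data" suggestion (a sketch under assumed structure, not fabricate's implementation): instead of repeating each dependency's path and MD5 under every command that touches it, store each (path, md5) pair once in a table and have commands reference integer ids. Shared headers like stdio.h then cost one table entry plus one small id per command.

```python
def normalize(deps):
    """Deduplicate a deps map of the form {command: {path: md5}}.

    Returns (file_table, compact) where file_table is a list of unique
    (path, md5) pairs and compact maps each command to a list of ids
    into that table.
    """
    file_table = []   # id -> (path, md5)
    index = {}        # (path, md5) -> id
    compact = {}
    for cmd, files in deps.items():
        ids = []
        for path, md5 in sorted(files.items()):
            key = (path, md5)
            if key not in index:
                index[key] = len(file_table)
                file_table.append(key)
            ids.append(index[key])
        compact[cmd] = ids
    return file_table, compact
```

For a build where many commands include the same headers, the normalized form grows roughly with the number of unique files rather than with commands × files, which is where the redundancy in large .deps files comes from.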

Original comment by benh...@gmail.com on 10 Aug 2009 at 9:53

GoogleCodeExporter commented 9 years ago
Note: if you gzip the file and rename it .deps.gz, then people will have no 
trouble reading it - they can just gunzip it, or for many editors, open it 
as-is and let the editor decompress the file.

Original comment by seth.laf...@ridemission.com on 28 Oct 2011 at 6:08

GoogleCodeExporter commented 9 years ago
I don't agree with gzipping the .deps file; it just adds overhead to every build and only benefits the rare use case of very limited disk space.

Agree with Ben's comment 2 that removing redundant data to reduce the overhead 
is useful.

Original comment by ele...@gmail.com on 28 Oct 2011 at 10:48

GoogleCodeExporter commented 9 years ago
This issue was closed by revision r150.

Original comment by benh...@gmail.com on 9 Nov 2011 at 2:26

GoogleCodeExporter commented 9 years ago
This issue was closed by revision cc5a43f974f8.

Original comment by simon.al...@gmail.com on 15 Apr 2013 at 4:38