valette / Wavemesh

Progressive compression of 3D triangular meshes
https://www.creatis.insa-lyon.fr/~valette/public/project/wavemesh/
GNU General Public License v3.0

Usage of Wavemesh #1

Closed: Seven377 closed this issue 6 years ago

Seven377 commented 8 years ago

Can you provide instructions for Wavemesh? I don't know how to use it.

valette commented 8 years ago

As explained in the Readme file:

execute wavemesh without arguments to see the available options.

Also, feel free to submit pull requests if you want to add more documentation!

Thanks

Seven377 commented 8 years ago

I have successfully run Wavemesh, but I have two questions:

  1. When I set the coordinate quantization to 16 bits, I get an error: "terminate called after throwing an instance of 'std::length_error'  what():  vector::_M_fill_insert".
  2. I don't know how to compute compression rates (bpv), including the connectivity and geometry compression rates.

valette commented 8 years ago

When I set the coordinate quantization to 16 bits, I get an error: "terminate called after throwing an instance of 'std::length_error'  what():  vector::_M_fill_insert"

I think you cannot use 16 bits; 15 bits should work, however.

I don't know how to compute compression rates (bpv), including the connectivity and geometry compression rates

During decompression, a file "report.txt" is written; it should give you all this information.

Seven377 commented 8 years ago

I found it, thank you very much.

Seven377 commented 8 years ago

I understand the meaning of each value in the file "report.txt". However, I want to know whether the "total data" includes the compression rates of both connectivity and geometry. Can I obtain the geometry compression rate at each level of detail?

valette commented 8 years ago

OK, the data is not completely clear:

Let's see:

Level 25: 11547f, 5826v, valence entropy= 2.2838, total data: 4.51972 bits/v (connectivity: 25757bits, 3.73387 bits/vertex for this level)

At level 25, the total data weighs 4.51972 bits/v, i.e. 4.51972 × 5826 = 26331 bits. As written, connectivity took 25757 bits, so geometry took the remaining part.

Seven377 commented 8 years ago

I have tried to compute the geometry compression rate this way; however, the total data is sometimes smaller than the connectivity. For example, with the Bunny model:

Level 16: 5618f, 2811v, valence entropy= 2.31131, total data: 3.19518 bits/v (connectivity: 13092bits, 3.92571 bits/vertex for this level)

The total data weighs 3.19518 bits/v, i.e. 3.19518 × 2811 = 8982 bits, but connectivity took 13092 bits. In fact, this problem exists from level 1 to level 16.

valette commented 8 years ago

Oops, sorry, I made a mistake in my explanation. On bunny:

Filename: out.ddd
Quantization : 12 bits
No lifting
No Wavelet Geometrical Criterion
Total execution time : 0.75816 seconds : 91604.7 faces/s
Level 0 : 26f, 19v, total data: 1456 bits (connectivity: 422bits)
Level 1: 27f, 20v, valence entropy= 2.22821, total data: 0.050066 bits/v (connectivity: 606bits, 184 bits/vertex for this level)
[...cut...]
Level 25: 25587f, 12874v, valence entropy= 2.0304, total data: 8.37412 bits/v (connectivity: 48430bits, 3.29857 bits/vertex for this level)
Level 26: 69451f, 34834v, valence entropy= 1.19835, total data: 16.2191 bits/v (connectivity: 96366bits, 2.18288 bits/vertex for this level)
Global coding: 16.2191 bits/vertex, connectivity : 2.76644 bits/vertex, geometry : 13.4527 bits/vertex
File size: 70622 bytes

For level 25: the total data weighs 8.37412 bits/v, i.e. 8.37412 × 34834 = 291704 bits. For this model, you should always multiply the bitrate (in bits/v) by the original number of vertices: 34834 in the bunny case.

Sorry for the mixup. Is that better now?
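
If it helps, here is a minimal Python sketch (just an illustration, not Wavemesh code) that redoes the arithmetic on the bunny figures quoted above. With the original vertex count, the geometry size at level 16 comes out positive, which resolves the contradiction you observed:

```python
# Minimal sketch, not Wavemesh code: redo the arithmetic on the bunny
# numbers quoted above, using the ORIGINAL vertex count (34834).
n_orig = 34834

# (level, total bits/v, connectivity bits) as printed in report.txt
levels = [
    (16, 3.19518, 13092),
    (25, 8.37412, 48430),
]

for level, total_bpv, conn_bits in levels:
    total_bits = total_bpv * n_orig     # multiply by the original vertex count
    geom_bits = total_bits - conn_bits  # geometry = total - connectivity
    print(f"level {level}: total {total_bits:.0f} bits, "
          f"geometry {geom_bits:.0f} bits = {geom_bits / n_orig:.5f} bits/v")
```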

Seven377 commented 8 years ago

I have two questions:

  1. For the Bunny model, at each level, the geometry compression rate = (total data × 34834 - connectivity) / 34834, right? Or should I divide by the current number of vertices at that level (12874, taking level 25 as an example)?
  2. Should I use the original number of vertices for all models? This problem also occurs when I use the Sphere model.

valette commented 8 years ago

1 - correct. 2 - Yes, you should always use the original number of vertices. This is the common usage for progressive compression (all papers report results this way).
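
If you want to automate this, here is a minimal Python sketch (not part of Wavemesh; the report.txt line format is assumed from the excerpts quoted earlier in this thread) implementing the formula above:

```python
import re

# Minimal sketch, not part of Wavemesh: compute per-level geometry rates
# from report.txt. The line format is assumed from the excerpts quoted in
# this thread, e.g.:
# Level 25: 25587f, 12874v, valence entropy= 2.0304, total data: 8.37412
#   bits/v (connectivity: 48430bits, 3.29857 bits/vertex for this level)
LINE = re.compile(
    r"Level (\d+):.*?total data: ([\d.]+) bits/v "
    r"\(connectivity: (\d+)\s*bits"
)

def geometry_rates(report_path, n_orig):
    """Yield (level, geometry bits/vertex), always dividing by the
    ORIGINAL vertex count n_orig, as recommended above."""
    with open(report_path) as f:
        for line in f:
            m = LINE.search(line)
            if not m:
                continue
            level = int(m.group(1))
            total_bpv = float(m.group(2))
            conn_bits = int(m.group(3))
            # geometry bits = total bits - connectivity bits
            yield level, (total_bpv * n_orig - conn_bits) / n_orig

# Example: the bunny mesh has 34834 vertices at full resolution.
for level, geom_bpv in geometry_rates("report.txt", 34834):
    print(f"level {level}: geometry {geom_bpv:.5f} bits/v")
```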

Seven377 commented 8 years ago

What I have learned is still far from enough for mesh compression; I will continue my efforts. Thank you for your patient explanation.

valette commented 8 years ago

you're welcome!