jonnenauha / obj-simplify

Object File (.obj) simplifier
MIT License

Add 'Quickstart' section to README.md #3

Closed by donmccurdy 6 years ago

donmccurdy commented 6 years ago

Figured things out from #2, but a quickstart section would be very helpful for those of us who are new to Go but want to optimize some models.

Thanks for making this! I just ran it over a bunch of models: ~70% average file-size reduction (they were messy models...) and far fewer draw calls. About 4% of models failed or timed out, but I set a 10s timeout, so that's on me. :)

jonnenauha commented 6 years ago

@donmccurdy What did you mean by timeouts? Did this tool time out, or did your network requests fetching the files time out?

If it was this tool, you could upload the files somewhere and share the link with me (here or by private message). I know there are problems with very big files (gigabytes). I don't think there is any "timeout" in the code, but for me the OS runs out of memory and the Go process dies :) That is ~6-7 GB of memory usage on the process (I have only 8 GB on my home machine). There is nothing inherent in Go that would make big allocations problematic.

I have been working on a branch (well, I haven't touched it in a long time) that parses multi-gigabyte files really fast, and its deduplication is a lot simpler: a plain string compare (after trimming trailing zeros etc.) rather than parsing each string to a float and comparing with an epsilon. This will not catch all duplicates, but it will probably find ~95% of them in typical .obj files I've seen out in the world.
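
For illustration, a minimal Go sketch of the two comparison strategies (the helper names and epsilon value are mine, not from that branch):

package main

import (
	"fmt"
	"strconv"
	"strings"
)

const epsilon = 1e-6

// floatEqual is the precise path: parse both coordinate strings to
// float64 and compare with an epsilon tolerance.
func floatEqual(a, b string) bool {
	fa, errA := strconv.ParseFloat(a, 64)
	fb, errB := strconv.ParseFloat(b, 64)
	if errA != nil || errB != nil {
		return false
	}
	diff := fa - fb
	if diff < 0 {
		diff = -diff
	}
	return diff < epsilon
}

// normalize is the fast path: trim trailing zeros (and a dangling
// decimal point) so "1.500000" and "1.5" compare equal as plain strings.
func normalize(s string) string {
	if strings.Contains(s, ".") {
		s = strings.TrimRight(s, "0")
		s = strings.TrimRight(s, ".")
	}
	return s
}

func main() {
	a, b := "1.500000", "1.5"
	fmt.Println(floatEqual(a, b))             // true: parse + epsilon
	fmt.Println(normalize(a) == normalize(b)) // true: zero-trimmed string compare

	// The fast path misses near-duplicates that the epsilon compare catches:
	c, d := "0.1000001", "0.1"
	fmt.Println(floatEqual(c, d))             // true
	fmt.Println(normalize(c) == normalize(d)) // false
}

That last pair is the ~5% the string compare would leave behind: values that differ in the text but are equal within epsilon.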

Anyhow, I had a multi-gigabyte source file myself that I tested with; I don't think it's a typical use case for users. Perhaps I could make the tool auto-select the simpler/faster mode if the file size is >100 MB or something.
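
If that heuristic lands, the selection could be as simple as a size check on the input. A sketch, assuming a 100 MB cutoff; chooseMode and the mode names are hypothetical:

package main

import (
	"fmt"
	"os"
)

// Assumed cutoff; the comment above suggests something around 100 MB.
const fastModeThreshold = 100 * 1024 * 1024

// chooseMode picks a deduplication strategy from the input file size:
// the precise parse-and-epsilon compare for small files, the simpler
// string compare for very large ones.
func chooseMode(path string) (string, error) {
	info, err := os.Stat(path)
	if err != nil {
		return "", err
	}
	if info.Size() > fastModeThreshold {
		return "fast string compare", nil
	}
	return "precise epsilon compare", nil
}

func main() {
	mode, err := chooseMode("model.obj")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("selected mode:", mode)
}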

donmccurdy commented 6 years ago

> What did you mean by timeouts? Did this tool time out, or did your network requests fetching the files time out?

I don't think it's your tool's fault. I have a lot (~10,000) of mostly small files (each <1 MB), and simplifying certain files hung indefinitely, so I imposed a time limit myself:

ls -1 "$in" | while read -r model; do
  if [ ! -f "$out/$model" ]; then
    echo "Simplifying $model..."
    # gtimeout is GNU coreutils 'timeout' (brew install coreutils on macOS);
    # kill any run that takes longer than 10 seconds
    gtimeout 10s obj-simplify -in "$in/$model" -out "$out/$model" -no-progress -quiet
  fi
done

With that, about 4% of models got skipped. I haven't gone through those files yet to figure out what was wrong (maybe they were larger? maybe weird geometry artifacts?), but if it turns out to be something that looks fixable I'll file a bug and share some of the source models.