Closed donmccurdy closed 6 years ago
Hm, not only does `KeyframeTrack` call `optimize()` on itself when created, but `AnimationClip` calls `optimize()` on each track again when they're passed into its constructor? That does seem redundant... 🤔
Yeah, optimization and validation in exporters rather than in importer/viewer sounds more reasonable.
> Hm, not only does `KeyframeTrack` call `optimize()` on itself when created, but `AnimationClip` calls `optimize()` on each track again when they're passed into its constructor? That does seem redundant... 🤔
I haven't looked into the code yet (I'm in the airport now!) but I'm wondering what the CPU time for `optimize()` will be if we remove the redundancy.
Unless there is a valid reason to call `optimize` more than once on the same instance, wouldn't it be better to just set an `isOptimized` flag once an instance is optimized? On consecutive calls to `optimize`, it just checks whether the flag is set and aborts immediately. That way we won't have breaking changes in the API, while still removing the redundancy.
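A minimal sketch of the flag idea, using a hypothetical stand-in class rather than three.js itself (all names here are illustrative only):

```javascript
// Illustrative stand-in for KeyframeTrack; not the real three.js class.
class Track {
  constructor(times, values) {
    this.times = times;
    this.values = values;
    this.isOptimized = false;
    this.optimizePasses = 0; // only for demonstrating the guard
  }

  optimize() {
    if (this.isOptimized) return this; // consecutive calls abort immediately
    this.optimizePasses++;
    // ...actual keyframe deduplication would go here...
    this.isOptimized = true;
    return this;
  }
}

const track = new Track([0, 1, 2], [0, 1, 1]);
track.optimize();
track.optimize(); // no-op: the flag short-circuits the second call
console.log(track.optimizePasses); // 1
```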
> wouldn't it be better to just set an `isOptimized` flag once an instance is optimized? That way we won't have breaking changes in the API, while still removing the redundancy.
+1
I'm a bit wary of `optimize`; I think importers/exporters should take care of that, or alternatively it should be in the user domain. Validation only makes sense in the asset pipeline, I think; let's say deserialization code is justified in performing validation, but that's about it.
> Unless there is a valid reason to call `optimize` more than once on the same instance, wouldn't it be better to just set an `isOptimized` flag once an instance is optimized?
The problem is there are two redundancies: we call `optimize()` twice, but even one call would be pointless if the author has already optimized their animation, which we have no way of knowing (and most likely, they have not).
> That way we won't have breaking changes in the API, while still removing the redundancy.
I'm not suggesting any breaking API changes, just a way for an author to indicate their animation does not need to be optimized without forking KeyframeTrack.
> I'm a bit wary of `optimize`; I think importers/exporters should take care of that, or alternatively it should be in the user domain.
It would be ideal for optimization to happen before the animation is brought into three.js, but that's a much harder problem. In any case we should keep the `optimize()` method around, because using it and then storing the result with `clip.toJSON()` is a completely valid way of creating an optimization step.
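An offline build step of that kind could look roughly like this (a sketch only; `clips` stands in for loaded AnimationClip-like objects with the `optimize()` and `toJSON()` methods discussed above, and `bakeClips` is an invented name):

```javascript
// Sketch of an offline optimization step: optimize each clip once,
// then serialize the result so viewers can load pre-optimized data.
function bakeClips(clips) {
  return clips.map((clip) => {
    clip.optimize();      // strip redundant keyframes once, at build time
    return clip.toJSON(); // store this JSON alongside the asset
  });
}
```

Viewers that load the baked JSON would then have no need to re-run `optimize()` at startup.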
> I think importers/exporters should take care of that, or alternatively it should be in the user domain.
That's true, but there are millions of assets uploaded on thousands of servers. Do you expect them to be re-exported in the "optimized" way?
P.S. Backwards compatibility is (or should be) a critical factor in software development.
@RemusMar & @donmccurdy
> That's true, but there are millions of assets uploaded on thousands of servers. Do you expect them to be re-exported in the "optimized" way?
> It would be ideal for optimization to happen before the animation is brought into three.js, but that's a much harder problem.
An asset pipeline is something everyone has; some put more effort into it than others, some pipelines are completely custom while others are heavily standardized, etc. I am not advocating for removal of the validation and optimization methods: I think they are wonderful tools that serve a real need. I believe it's reasonable to assume that once an animation is loaded it is valid and optimized.
When these assumptions are insufficient, I believe it is reasonable to offer the tools (`validate` and `optimize`) to the user in order to deal with such use cases.
@donmccurdy the writeup on Oat the Goat was quite insightful, especially the part about the glTF exporter. They said they wrote 13k lines for that interactive experience, which made me think that there's a lot of room for improvement in glTF export as well as the animation module and three.js in general.
@RemusMar
> Backwards compatibility is (or should be) a critical factor in software development.
"Everything in moderation". I don't subscribe to your view, backwards compatibility is critical for some applications and harmful for others. Language design is one example where backwards compatibility is a clear net negative for the user, take C++ as example, there are a fraction of a percent of a percent of C++ developers who can rightfully claim to know all of 'standard' C++ - thanks to backwards compatibility. Backwards compatibility is expensive, it bloats your code, it takes time, it leads to rigidity in your architecture which strongly opposes evolution and progress. Three.js is not backwards compatible, I enjoy migration and deprecation notices, but that's not backwards compatibility, I can't boot up my application written against three.js r50 in three.js r93 and expect it to work - it doesn't. There is something called "semantic versioning" which might be a good thing for people to familiarize themselves with, if not to practice religiously. /rant
> I don't subscribe to your view; backwards compatibility is critical for some applications and harmful for others.
That's the problem with most juniors these days. In their opinion, everything new is better. As a result, the internet is full of bloatware and spyware.
If I love Microsoft for something, it is that I can play most DOS games (25 years old) on Windows 10. That's a lesson in backwards compatibility for all the juniors and noobs.
Let's start with one of three things:
I would prefer (1).
@RemusMar given that we currently call `optimize()` by default, do you consider (3) to be backward-incompatible? I think that is the only option on the table that potentially breaks back-compat.
> ...there's a lot of room for improvement in glTF export as well as the animation module and three.js in general.
Sure. Keep in mind that they evaluated FBX, JSON, and DAE and had even less success with those. glTF is a new format; we're fully aware that great support from exporters is going to take time, and that's a high priority for the Khronos Group. The most effective way to improve things, other than PRs, is to give specific feedback in the appropriate repositories. I'm also working on other points mentioned in the article, not just this one.
Don,
All my assets are highly optimized from any point of view, but that's irrelevant. I was talking about millions of older assets stored everywhere.
> I would prefer (1).
That's a good idea.
A fourth idea: defer automatic validation and optimization until the `clipAction` call, so that it still happens by default, but the user can opt out through the mixer without a global setting:
```js
var track = new KeyframeTrack( ... );             // not optimized
var clip = new AnimationClip( 'run', [ track ] ); // not optimized
var action = mixer.clipAction( clip );            // validated and optimized

// (a)
mixer.autoValidate = false;
mixer.autoOptimize = false;
var action = mixer.clipAction( clip ); // validation and optimization skipped

// (b)
var action = mixer.clipAction( clip, null, false ); // validation and optimization skipped
```
This requires an internal isOptimized/isValidated flag on the clip so that creating multiple actions from one clip won't re-optimize it.
I don't love that this modifies `clip` as a side effect of the method, so I'm still leaning toward (1) above, using a global opt-out.
How do you see the global opt-out working out in practice? Some assets may have been optimized, some have not.
For example: I'm building a map editor for my game. I know my assets are all optimized, but I can't guarantee that the assets of the end users are. How are we to deal with this?
I still prefer a call-once and store the state option, with the possibility of an opt-out for the situations where you have full control over the assets your 'engine' is loading.
> I still prefer a call-once and store the state option, with the possibility of an opt-out for the situations where you have full control over the assets your 'engine' is loading.
Another good idea (and 100% backwards compatible)
How would you suggest implementing the "possibility of an opt-out for the situations where you have full control over the assets"?
@donmccurdy I believe you already gave a suggestion that I would personally be very happy with:
```js
let action = mixer.clipAction(clip, null, false); // Skip validation & optimization.
let action = mixer.clipAction(clip, null, true);  // Do validation & optimization.
```
However, I believe the default behavior should be `true` instead of `false`. This way the behavior doesn't change compared with the current situation, whilst giving developers the freedom to skip this process since they know about their own assets.
The problem with changing this API as you initially proposed is that both settings are valid. Either skip validation and optimization (which would be the default from what I get from your suggestion) or enable it. The problem is that since both ways are valid, you can't post a warning message stating that the validation is skipped. This would cause console log clutter for no valid reason.
Since you can't reliably warn the user about it (if that's even needed, but I would presume so for performance reasons), there would be a likely chance of people reporting issues due to - what they think is - unexpected behavior.
That said, I would propose exactly what you did but swap the default behavior around.
After I gave it some thought, I don't like the situation where `ClipAction` would rely on a "global" variable declared on the `THREE` object, since this could potentially mess up module loading or create unwanted dependencies.
Ok, thanks! I'd meant to say the default should be to optimize (i.e. keep current behavior) as well, and just provide an option, somewhere, to disable it, so I think we're on the same page.
@RemusMar When commenting on my expression of professional views, you wrote:
> That's the problem with most juniors these days. In their opinion, everything new is better. As a result, the internet is full of bloatware and spyware.
And then immediately after:
> If I love Microsoft for something, it is that I can play most DOS games (25 years old) on Windows 10. That's a lesson in backwards compatibility for all the juniors and noobs.
Let this remain here for the public record. I don't think I could add to this.
On the topic of compatibility: I don't know if we are talking about the same point here. Let's say I had an invalid animation that failed to load before; now it loads but potentially doesn't work. That's one issue, and it is a break from previous behaviour. When it comes to optimization, I don't see the argument, so if someone could explain it I would appreciate it; the way I see it, the behaviour is still the same. Sure, it might run a bit (or a lot) slower, but functionally it's the same.
@haroldiedema
> For example: I'm building a map editor for my game. I know my assets are all optimized, but I can't guarantee that the assets of the end users are. How are we to deal with this?
That's a good example, and my answer would be: just run `.optimize()` explicitly on any imported assets.
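Sketched out for assets of unknown provenance (a hedged example: it assumes clips expose a `tracks` array whose items have the `validate()`/`optimize()` methods discussed in this thread, and that `validate()` reports success as a boolean; `optimizeImportedClips` is an invented name):

```javascript
// Opt in explicitly: validate and optimize only the clips you don't
// control, e.g. user-uploaded assets in an editor.
function optimizeImportedClips(clips) {
  for (const clip of clips) {
    for (const track of clip.tracks) {
      if (track.validate()) { // skip optimization of invalid tracks
        track.optimize();
      }
    }
  }
}
```

Assets you authored yourself (and already optimized) would simply skip this call.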
My objection stems from my experience with compilers: there's a strong need for correct functionality, but performance is typically an optional requirement, or at least it has a gradient. You produce functionally correct code, and then you either optimize it as needed (online optimizations such as hot-spot) or you have special flags in static compilers, and even projects like https://github.com/google/souper which go above and beyond "reasonable" optimizations. This kind of approach puts the trade-off between startup time and runtime performance into the hands of users, instead of saying "we know what's best for you, you're going to have it this way, and you're going to enjoy it", which, by the way, is also a totally valid strategy.
@donmccurdy
> Sure. Keep in mind that they evaluated FBX, JSON, and DAE and had even less success with those.
Sure. I feel like I say this rather too often: "I didn't mean this as a negative criticism of [insert here]". So... I didn't mean this as a negative criticism of glTF. I think it's a wonderful format; the tooling is not yet up to scratch, and I have contributed my meager efforts towards improving that a bit myself. I do seem to rub people the wrong way a lot; sorry about that.
Proposed fix in https://github.com/mrdoob/three.js/pull/14345.
@Usnul thanks, yeah, misunderstandings are easy in this format. Appreciate your feedback. 🙂
Let me reopen this issue until #14385 is merged.
I still have one thing I'm not really sure about.
Why do we validate/optimize animation keyframe tracks on the fly by default, while we generally don't validate/optimize other assets (models, textures, and so on) or user inputs? A lot of existing animation assets aren't validated/optimized, while a lot of other assets are. Or are there historical reasons?
The docs say equivalent sequential keys "are common in morph target sequences". I don't know much more than that, but Blender similarly overbakes animations commonly. I don't feel especially strongly about the default, but would like to provide some way to turn this optimization off.
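To illustrate what this optimization removes, here is a simplified stand-in for the idea behind `KeyframeTrack.optimize()` (not the real three.js code): an interior key whose value equals both of its neighbors adds no information under linear interpolation and can be dropped.

```javascript
// Drop interior keys that equal both neighbors ("equivalent sequential keys").
function dropEquivalentSequentialKeys(times, values) {
  const outTimes = [];
  const outValues = [];
  for (let i = 0; i < times.length; i++) {
    const interior = i > 0 && i < times.length - 1;
    const redundant =
      interior && values[i] === values[i - 1] && values[i] === values[i + 1];
    if (!redundant) {
      outTimes.push(times[i]);
      outValues.push(values[i]);
    }
  }
  return { times: outTimes, values: outValues };
}

// An overbaked morph-target-style track: the value holds at 1 for a while.
const result = dropEquivalentSequentialKeys([0, 1, 2, 3, 4], [0, 1, 1, 1, 0]);
console.log(result.times);  // [ 0, 1, 3, 4 ]
console.log(result.values); // [ 0, 1, 1, 0 ]
```

The interior key at t=2 is dropped because its neighbors at t=1 and t=3 carry the same value; the interpolated playback is unchanged.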
It won't be productive to have a strict rule that all automatic optimization is bad... we should try to provide good defaults to users, and make any performance issues easy to understand and debug. I don't know how often KeyframeTrack's automatic optimizations pay off, and because I don't know, I was hoping to avoid changing it while providing the opt-out...
/cc @bhouston
I think disabling it is fine and requiring it to be called explicitly. I put it in the constructor when I wrote KeyframeTrack because I wanted to make it easy to test. So I ran it a lot and, yeah, it works well. But yes, let's make it explicit. Should be a big deal.
Should NOT be a big deal.
I'm OK with removing the automatic optimization here then. Or any of my previous PRs. @takahirox preference?
I agree with removing automatic optimization/validation and asking users to explicitly call `.validate()`/`.optimize()`.
And I don't think we need `isValidated`/`isOptimized` flags yet if we remove them. As I mentioned in another thread, the flags can become inconsistent. We can restart the discussion about the flags if users request such an optimization.
By default, the KeyframeTrack constructor calls `this.validate()` and `this.optimize()`. Those calls are unnecessary for animations that are already optimized, and problematic if the track is going to be split later, such as in https://github.com/mrdoob/three.js/pull/13430 (from the Oat the Goat technical writeup). An option in the KeyframeTrack constructor doesn't really help, because tracks are typically not created directly, and the option would need to be passed through all of the loaders. We could disable the optimization and let users trigger it manually, but I do think it's a reasonable default. What about a (global) opt-out flag?
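One possible shape for such a global opt-out, sketched with invented names (`AnimationSettings`, `optimizeKeyframeTracksByDefault`, and `DemoTrack` are hypothetical stand-ins, not real three.js API):

```javascript
// Hypothetical module-level switch consulted by the track constructor.
const AnimationSettings = { optimizeKeyframeTracksByDefault: true };

// Stand-in for KeyframeTrack, just to show the constructor hook.
class DemoTrack {
  constructor(times, values) {
    this.times = times;
    this.values = values;
    this.wasOptimized = false;
    if (AnimationSettings.optimizeKeyframeTracksByDefault) this.optimize();
  }
  optimize() {
    // ...keyframe deduplication elided...
    this.wasOptimized = true;
  }
}

console.log(new DemoTrack([0, 1], [0, 0]).wasOptimized); // true (default behavior)
AnimationSettings.optimizeKeyframeTracksByDefault = false;
console.log(new DemoTrack([0, 1], [0, 0]).wasOptimized); // false (opted out)
```

Since the flag lives outside the constructor signature, loaders wouldn't need to thread an option through every call site.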