floatingworld3 opened this issue 6 years ago
Yes, an internet connection is necessary for the trial version to keep working for longer than 2 minutes.
Oh, one last thing: if I wanted to track a point, say under the eyes, or the sneer points on the nose, can I add these points to the list of tracking points? If so, how do I do that? By clicking on a point?
No, the model of 68 points is fixed; you will always get those 68 points. You can, of course, calculate the points you want from the 68 that are provided. An example of that is the extended face, which adds 6 forehead points that are not tracked but calculated.
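For illustration, here is a minimal sketch of deriving your own point from the 68 landmarks, similar in spirit to how the extra forehead points are calculated. It assumes the common 68-point landmark layout (indices 40/41 as the lower right eyelid, 30 as the nose tip) and a BRFv4-style face object with a `points` array of `{x, y}`; the function name and the 0.35 offset factor are made up for the example.

```javascript
// Sketch: derive an "under the right eye" point from the tracked 68 landmarks.
// Assumed 68-point layout: indices 40/41 = lower right eyelid, 30 = nose tip.
function underRightEye(face) {
  var lowerLid0 = face.points[40];
  var lowerLid1 = face.points[41];
  var noseTip   = face.points[30];

  // Midpoint of the lower eyelid.
  var midX = (lowerLid0.x + lowerLid1.x) * 0.5;
  var midY = (lowerLid0.y + lowerLid1.y) * 0.5;

  // Shift that midpoint part of the way toward the nose tip so it lands on
  // the under-eye / upper-cheek area. The 0.35 factor is an arbitrary choice.
  var t = 0.35;
  return {
    x: midX + (noseTip.x - midX) * t,
    y: midY + (noseTip.y - midY) * t
  };
}
```

Since the derived point is just arithmetic on the tracked landmarks, you would recompute it every frame after the tracker updates.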
OK, I see. You might want to provide a little YouTube tutorial on how that's done; many of the 68 points are not very useful for animating a rig, and only a fraction of them are. It's also a pity that the pupils aren't tracked; clmtracker does that, but I guess you can't have everything... lol
Yeah, you can't have all the nice things. Those guys did quite well, I guess: https://stinkmoji.cool
But I'm not a 3D professional, so I can't tell if that's what you are looking for.
Well, my friend, you could consult one when you come to do version 5, because you could offer a streamlined version that tracks only the key expression points animators are interested in. That would make it more efficient and responsive, at least for my purposes. For example, there's no need to track all those points along the jaw, because they don't add much to an expression, but the areas under the eyes and the inner cheeks are quite important. By the way, when getting the position of a vertex using, say, px = points[index].x, how do you specify which tracked face is being referred to?
brfManager.getFaces() returns an array of BRFFace objects, and the array index is all the mapping there is. It does not keep track of who is who: if you track two faces and both are lost, the assignment depends on the order of re-detection.
While faces are being tracked, though, the order is kept.
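A minimal sketch of what that looks like in practice, based on the BRFv4-style API discussed above (getFaces(), a per-face state, and a points array appear in the library's public examples); treating index 30 as the nose tip is an assumption about the 68-point layout:

```javascript
// Sketch: reading landmarks per face slot. Identity is only the array index.
var faces = brfManager.getFaces();

for (var i = 0; i < faces.length; i++) {
  var face = faces[i];

  // Skip slots that are not currently tracking a face.
  if (face.state !== brfv4.BRFState.FACE_TRACKING) { continue; }

  // points[index] belongs to this particular face slot;
  // index 30 is the nose tip in the usual 68-point layout (assumption).
  var px = face.points[30].x;
  var py = face.points[30].y;
  console.log("face slot " + i + ": nose tip at " + px + ", " + py);
}
```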
So, sorry, one last question: is the demo version limited to a few seconds when disconnected from the web?