Open DelSystem32 opened 9 years ago
There also seems to be a problem when using SetAlpha() and then reading the colorTransform of an LWF.Movie: SetAlpha() doesn't update the alpha value of the movie's colorTransform, so if you later read that colorTransform, modify it, and pass it back through SetColorTransform(), you may end up unintentionally overwriting the alpha.
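To make the suspected desync concrete, here is a toy Python mock (not the real LWF code; the class and member names only mirror the ones mentioned in this report) showing how a SetAlpha() that bypasses the cached colorTransform leaves a stale alpha that a later read-modify-write through SetColorTransform() silently clobbers:

```python
class ColorTransform:
    """Minimal RGBA multiplier, mimicking a Flash-style color transform."""
    def __init__(self, red=1.0, green=1.0, blue=1.0, alpha=1.0):
        self.red, self.green, self.blue, self.alpha = red, green, blue, alpha

class Movie:
    """Toy stand-in for LWF.Movie exhibiting the suspected bug."""
    def __init__(self):
        self._render_alpha = 1.0               # value actually used when drawing
        self.colorTransform = ColorTransform() # cached transform exposed to callers

    def SetAlpha(self, a):
        # The bug being described: the rendered alpha changes,
        # but the cached colorTransform is NOT kept in sync.
        self._render_alpha = a

    def SetColorTransform(self, ct):
        # Applies the whole transform, including its (possibly stale) alpha.
        self._render_alpha = ct.alpha
        self.colorTransform = ct

m = Movie()
m.SetAlpha(0.5)
print(m.colorTransform.alpha)   # -> 1.0 (stale: does not reflect SetAlpha)

# Read-modify-write through colorTransform resets alpha behind your back:
ct = m.colorTransform
ct.blue = 0.0                   # caller only meant to change the tint...
m.SetColorTransform(ct)
print(m._render_alpha)          # -> 1.0 (the 0.5 set earlier is lost)
```

With a fix, SetAlpha() would also write `colorTransform.alpha`, so the round-trip through SetColorTransform() would preserve the 0.5.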
I have a MovieClip in my Flash file that is tinted 100% blue. When I run the following code in Unity:
The MovieClip does indeed change color correctly. However, I'm not getting the correct console output:
Here is what the output should look like:
It looks like LWF.Movie's SetColorTransform() applies the changes but does not keep its colorTransform field in sync afterwards (the field just gets reset). This is probably a bug, right?