Closed kdschlosser closed 4 years ago
Hmmm well whatdayaknow... it sorta works. Still has the BLACK issue on wxPy4.1 tho.... I'm beginning to think that whatever changed in wxWidgets is what may be causing the performance issues on wxPy4.1 also, but am not sure...
brushwithalpha.png
I'm guessing you ripped the constants out of win32 to make it pure Python. Not sure what you would need to do for that on Linux or Mac...
With the stacked-frames approach I made, I eventually got to a point where I rage quit about 10 times, because flattening out the font into a PNG made the stroke look worse and worse as I went. Then I found the left edge wasn't rendering the first pixels on each line for who knows what reason. Hardcoding -1 didn't work either... I figured I could paint the logo onto a walrus sitting on a couch, combine it with my animated Python-powered APNG, and re-render it so it might be able to be ripped into a throbber for an animated splash screen. Needless to say, I kept getting frustrated by the little problems. Part of the issue is that the font isn't free, which is why I was rendering it as a PNG.
I think with your approach, combined with 2 stacked frames, it will be possible to do a fancy animated splash with alpha. I even spent the time to write a custom fade-in timer for it that looks nice. ... But yeah, I might have to add +1 or +2 to the frame size to work around the edge issue. That will likely uglify the code a bit by scattering extra variables everywhere, and most folks will then ask "Why did you add this hardcoded +2?"
Also, with your approach, live painting directly from Krita onto a frame will now support an alpha background, which is nice.
I do not believe in the phrase "It cannot be done"; I will however accept "With our present state of technology we cannot do it." LOL
I knew there was a better way; I have been scouring the Windows SDK... actually I have ported almost a million lines of it to Python (pure).
This is the real bonus of the way I have gone about it: if you draw an alpha frame that has a colored background, then draw a PNG on top of it, and that PNG is not 100% opaque, you would normally end up with the color from the background changing the colors in the PNG. The mechanics are there to stop that from happening.
You like how I made the hole through the middle of the frame?
Both the text alpha and the overlap of the pen and brush can be coded around. Doing so is expensive, and it would be nice to have those issues fixed at the lower level of wxPython/wxWidgets.
I did want to let you know that a fade in and out is super simple to accomplish.
I have not tested this code, but it will give you the general idea anyhow.
Also, you can ditch the use of GraphicsContext altogether and only use GCDC; the output is the same. I had it in there for the purposes of testing the text alpha. The black issue might go away if you do that.
Yeah, a timer fade is simple enough. Mine is basically like that with range 256, but it skips a bit slower each iteration to make it a slowing curve, though I don't use a thread, just a sleep call. Then I invert the logic and run it about 5x faster on destruction. The curve makes it look a bit nicer than just a hard number for the timer. It still causes issues with wxPy4.1 at the moment, though. I did actually write a starter for a pie menu I'm working on which is similar to the cutout. Instead of cutting out, I invert the logic to make slices in a drawn graphic, then convert the graphic to a mask and replace the green with black or whatever color I want when I have my bitmap to splat on the frame mask. You have to keep track of the regions to tell if the mouse is over a slice section (button, etc.).
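The slowing-curve fade described above can be sketched as a pure easing function (the function name and step count here are mine, not from either script):

```python
def fade_alphas(steps=32, max_alpha=255):
    """Alpha values for a fade-in that starts fast and slows near full opacity."""
    # Quadratic ease-out: the increment shrinks on every iteration, which is
    # the "slowing curve" effect; reverse the list for a faster fade-out.
    return [round(max_alpha * (1 - (1 - i / steps) ** 2)) for i in range(steps + 1)]
```

Each value would then be fed to something like wx.Frame.SetTransparent from a timer tick or a sleep loop.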
@kdschlosser If you really want a single-file test piece, you can use my Python-powered logo APNG to test. I know there aren't a lot of files that can be ripped into all the tests, but it does 99% of them. apngdis/apngasm can do most of the take-apart and put-back-together work, along with Pillow. This one renders slowly in browsers, since it's a GIF.
If you really want the APNG source files in Blender I can send 'em to ya, though I would bet 99% of 'em will cause a BLACK screen on wxPy4.1 if rendered.
If you are reading this line, you will notice the difference between GIF and APNG in your browser. My opinion is it should look the same with alpha in a basic demo, like with the brush alpha PNG I provided.
OK so I did some messing about with your animated PNG file. It is not that hard to create a wx Animation or splash screen from it. The APNG specification is an easy one. There is actually a Python library that will break the APNG apart into separate PNG images. Go figure... the name of the library is "apng" LOL..
It also provides the frame header data, so you know how long to wait between frame renderings, how to clear the old frame, and how to lay in the new frame. That is the part you would have to figure out how to do with masks. I did a quick and dirty version; it didn't come out 100% correct, but the animation portion works just fine. You would need to create a thread to handle the renderings.
The hardest part of the whole deal is going to be the masks, because APNG has some goofy ways of handling how a new frame gets drawn. It can either get drawn over the old frame, or the old frame needs to get blanked to black with an alpha of 0 first and then the new frame gets drawn, or the whole animation takes a step back by a single frame and the new frame gets drawn on top of the frame prior to the one that would normally be getting written over. And with the animation above, the whole thing does not get drawn each time; only the spinning bit does. So you need to mask off portions of the main image in order to be able to draw it properly.
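Those three "goofy ways" map onto the spec's dispose_op values. A rough sketch of the bookkeeping, with plain nested lists of RGBA tuples standing in for the real bitmaps (so this is illustrative only, not the thread's actual script):

```python
# APNG dispose_op values from the specification.
APNG_DISPOSE_OP_NONE = 0        # leave the canvas alone; next frame draws over it
APNG_DISPOSE_OP_BACKGROUND = 1  # clear the frame's region to transparent black
APNG_DISPOSE_OP_PREVIOUS = 2    # revert the region to its state before this frame

def dispose(canvas, previous, region, op):
    """Apply a frame's dispose_op to `canvas` (a list of rows of RGBA tuples)."""
    x, y, w, h = region
    if op == APNG_DISPOSE_OP_BACKGROUND:
        for row in range(y, y + h):
            for col in range(x, x + w):
                canvas[row][col] = (0, 0, 0, 0)
    elif op == APNG_DISPOSE_OP_PREVIOUS:
        for row in range(y, y + h):
            for col in range(x, x + w):
                canvas[row][col] = previous[row][col]
    # APNG_DISPOSE_OP_NONE needs no work.
    return canvas
```

In a wxPython implementation the same logic would be done with masks or region copies on the buffered bitmap rather than per-pixel loops.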
It is something that is very achievable. I have gotten fairly close to getting it right. But seeing as how APNG is just about a dead image format anyhow, I do not see any real need to figure it out. APNG never really took off because it had too many downfalls, like not being able to reuse already-added images. While it was a great attempt at improving upon the animated GIF, it never became widely adopted.
Here is a partially working example of the APNG. There are some rendering anomalies that still need to be sorted out. Read the comments before running the script.
@Metallicow
Here is the APNG support you have requested. You should be able to modify this script to work with any apng file.
I embedded the apng file above into the script.
I did not do a whole mess of performance tweaks, and there are things that could stand to be cleaned up and changed, but it does provide full APNG animation support. It works without any anomalies in the rendering.
Had a few minutes to look over your code. Many strange things occurred, but I managed to code yet another hmmmOpps() function into SourceCoder. I can manage to get a screenie this time before hard crashing.
Just to note, I don't use the apng lib; I have used the official apngasm/apngdis from the Kickstarter onwards, and other image libs, to handle most of what I do with them.
1st weird thing. This is just plain stupid. Probably an issue with the apng lib itself.
Microsoft Windows [Version 6.1.7601]
Copyright (c) 2009 Microsoft Corporation. All rights reserved.
C:\Users\testuser\Desktop>tmprun.py
You need to install the apng library in order to use this script.
pip install apng
wtf... OK... reading the file/your code... checked location... looks OK but could be better... Let's do a runitanyway in SourceCoder... wow, it actually worked... sorta, until the crash...
...apparently both your examples start this way. The APNG starts running at the speed of light, but over time it slows down until it finally crashes.
how long does it take until the crash happens?
I am not using any of the events in wx to handle the drawing operations, like EVT_PAINT, so there is nothing changing hands to other threads other than the main thread passing off to a worker thread, and that worker thread is what is handling the drawing. So there shouldn't be anything that "starts off at the speed of light and slows down".
I have had the animation running for hours and not had it crash. I will start it up again and let it run and see what happens. I will also have it log how long each iteration of the animation it takes.
The apng package does not use any external software to extract the frames inside of an animation; this is the reason why I used it. I had started to write this ability myself, but once I discovered the apng library, and saw it was done fairly close to how I would have done it, I decided not to.
In your test did you run the script exactly how it is right now? or did you modify it in some way?
Nope, no mod. I'd lay a guess that if I run it in x64, the crash might occur faster... maybe it's how the apng lib is loading and sending info... I dunno. It seems like the thread is outrunning the draw function, but I can't be sure. It takes like 4-8 seconds to crash, depending on whether it was precompiled before running.
OK so I did run the test and after 27 iterations I did produce the error you pointed out.
That error has got to be a bug in wxWidgets. The same operation is being performed over and over again and does not have an issue for 27 iterations; then, all of a sudden, out of the blue, it is saying that I have not selected a bitmap before creating a GCDC.
and the thread is running at the same pace or really close to the same pace.
4.180239200592041
3.711212396621704
3.713212251663208
3.713212490081787
3.712212324142456
3.713212251663208
3.712212562561035
3.713212490081787
3.728212833404541
4.181239366531372
3.9932281970977783
3.713212490081787
3.712212562561035
3.71421217918396
3.713212490081787
3.712212324142456
3.727213144302368
3.714212417602539
3.727213144302368
3.713212251663208
3.7282135486602783
3.72821307182312
3.822218656539917
3.714212417602539
3.7582149505615234
3.728213310241699
3.729213237762451
I would need to compile a debugging version of wx and also python to be able to track down the issue.
OK so this is the portion of the wxWidgets code where the problem is stemming from.
wxGraphicsContext * wxGDIPlusRenderer::CreateContext( const wxMemoryDC& dc)
{
    ENSURE_LOADED_OR_RETURN(NULL);
#if wxUSE_WXDIB
    // It seems that GDI+ sets invalid values for alpha channel when used with
    // a compatible bitmap (DDB). So we need to convert the currently selected
    // bitmap to a DIB before using it with any GDI+ functions to ensure that
    // we get the correct alpha channel values in it at the end.
    wxBitmap bmp = dc.GetSelectedBitmap();
    wxASSERT_MSG( bmp.IsOk(), "Should select a bitmap before creating wxGCDC" );
I am in fact selecting the bitmap before the call to construct the GCDC. It turns out that the bitmap, for some reason, is not returning True from IsOk(). I need to figure out why.
I did, however, discover there is a missing method in wxPython: there is no MemoryDC.GetSelectedBitmap. I would love to be able to make a comparison between the bitmap the MemoryDC has selected and the one I have created.
I still need to do some more testing and find out why the bitmap is not OK. This seems to always happen on the 27th loop of the animation, on the 24th frame of that loop. That remains constant.
We have this little python game we play sometimes...
Actually I kinda invented it. I like to call it pyTink, phonetic for "think". Verbal vs interpreter.
Well, there is a piece of hardware that needed to be retired. I didn't want to just destroy it like all the rest, so I made it into a bork machine....
When we had game night, most everyone there knew at least basics of python.
I then installed Python on it and said guess what!?! No more computers vs games tonight.
Anyhow we managed to make up some [if, elif, exec, etc] cards all python based....
The rule of the game was simple: defeat your opponents and be last man standing.
Card deck on the table with obviously "dangerous" ways to play them in it.
Each player is given 1-3 variables to input into the interpreter to play the game. No other players know what you put into it LOL.
Then let the cards fly and have the python god to be doomed sort out who is winner.
Interpreter is of course the game rules so its answer is final..
After many iterations one night we had me vs linuxguru as last surviving players that night.
I had 1 card left and he had all 20. Obviously I had no chance in my mind...
so I just flopped next top card over on table and it was CHEESE!
that's like a wild card with another card played. I just so happened to have the "1-liner" card.
so after like 10 min of arguing who would win, we input it into interpreter. 1 card was obviously if True: run()
It did some weird things and has never worked the same since.
I still can't stop rolling on the floor laughing every time I ask him a question nowadays.
He replies to anything I ask with "Is that TRUE?"
...somehow I got an extra-long exception named after me in his codebase hahaha. It violates all PEP 8 rules.
I found the problem.
There is a memory leak. It is not in wxWidgets or wxPython, but in the script code. I was not cleaning up properly. I needed to release and delete some of the Windows handles, but I also needed to call wxBitmap.Destroy() and delete the Python object.
This solved the memory leak problem.
The error that is produced by wxWidgets is misleading. Because wxWidgets uses wxBitmap.IsOk to determine whether the memory DC has a selected bitmap, the error gets created whenever IsOk returns False. The issue is that there is more than one reason why IsOk would return False, one of them being insufficient memory.
I have modified the script to fix the memory leak, and I also modified it to provide a proper error if there is not enough memory to render the animation.
But that should solve the issue with the crash. I was unable to locate an issue with the speeding-up problem you encountered. One thing you have to remember when dealing with graphics animations is that any large rendering by any application is going to cause a slowdown, even if the rendering is being performed by a completely separate process. I think there is only a single thread/process that actually handles drawing to the screen, so anything that is to be drawn has to "wait in line". This would cause a slowdown. I have not looked into the inner workings of the Windows GDI, so I am not able to tell you if that really is the case, but it does make sense.
The animation should have a total runtime of about 3.7 seconds. I would have to go check what each frame's delay time is and add them up to see how accurately the script is actually running.
I just checked the delays. Each frame has a delay of 0.016666666666666666 seconds, and all the frames added up come out to 1.9833333333333312 seconds. So my time of 3.7-ish is not correct.
I am going to have to add a high-precision timer to time how long the drawing takes and subtract that from the delay. If there is anything left over, wait that time; if not, continue on to render the next frame.
I will add that now. Tho the speed looks good how it is I still want it to be proper!!
OK so I have gotten the animation time to be better than it was.
I am down to 2.264129400253296 seconds of total animation runtime. I am not able to get it exact with what is in the APNG. This is due to how long it takes wxPython to do its rendering operations and also how long it takes for the Windows API calls to do their thing.
264ms off over 118 frames is not horrible. If I used Cython to convert the script into C code and compiled it into a Python extension, I am sure we could get the times to be perfect. I think a 264ms error over 118 frames is not awful, considering the speed at which each frame should be rendered.
It was taking a HUGE amount of time to crash on my machine because I am running 64GB of RAM. When I tested it after you reported the problem, I had a project open in my IDE that is massive, at close to a million lines of code, so my IDE had a grip of RAM gobbled up and there was far less available. When I had the animation running for a really long time to test it, I didn't have much open, so most of the 64GB was available, and at 20MB or so being consumed each loop it would have taken a really long time to run out.
here is a newer version that fixes an issue where the apng will not render properly if frame 0 has an alpha channel apng_script.zip
Adding wx.MilliSleep(8) to the end of the update-animation method fixes the draw rate.
GitHub garbles the filename when attaching files. The APNG should be 120 frames a second. GIFs have problems running at full speed in all browsers.
The last script you uploaded seems to run fine by adding the millisleep line.
I have other versions that have fewer frames but that one is 120 a sec.
I recall that I had an animation where I had my snakey icon bounce around the screen, and I had a few different ways of doing it. A regular timer in wxPython would only render at like 15 fps on MSW; this is like a limitation. But using pygame and a thread got it to run faster on Windows. Though I scrapped my idea when I started figuring out how bad the crashes were when using pygame embedded in wxPy. On Linux the icon bounced around at the speed of light. But yeah, I recall I used a thread to bypass the frame draw limitation.
https://commons.wikimedia.org/wiki/File:Animated_PNG_example_bouncing_beach_ball.png
This one is still showing as black background.
self.apng = apng.APNG.open(file=r'Animated_PNG_example_bouncing_beach_ball.png')
The calculation for the delay I think needs more work, but at least the page tells how many frames there are and the delay.
The Python APNG has a background on the first frame; the bouncing ball doesn't. Maybe that has something to do with it...
Hmmm, I think I might restart over from the one that was working fine and incorporate the later stuff in bit by bit. If you got the unanimated one to work with alpha, then somehow something broke in between. @kdschlosser This is how you would load the PNG from data, the "supposedly" not-lazy way.
import io
import wx

img = wx.Image(io.BytesIO(png.to_bytes()), wx.BITMAP_TYPE_PNG)
I have the black background issue fixed.
I am not sure where you are coming up with 120 frames a second. You have each frame set to a 16-millisecond render time.
16 * 120 = 1920 = 1.92 seconds
If you add any additional delays or waits, you are going to slow the rendering down even further. Right now it is getting as close to the 16 milliseconds per frame as it can; the rendering time is longer than 16 milliseconds. It takes on the order of 17 milliseconds to render each frame.
The only thing I do not like about wx.MilliSleep is the inability to exit the sleep: the program has to wait until the sleep is finished. This is not a good thing when dealing with threads. If a thread needs to be shut down, it usually has to be done at the snap of a finger. This is the same reason why you do not want to use time.sleep() in a thread. Using threading.Event.wait(), you have the ability to exit the wait from another thread if needed by calling threading.Event.set(). The same threading.Event instance can also be used to keep a loop going inside the thread. So if you want the thread to exit, you can do so with 2 commands: one to set the event, and a second to join the thread so your program will not continue until the thread has shut down. This is the way to do it in order to not have PyDeadObject tracebacks take place when closing an application GUI.
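A minimal sketch of that pattern (class and method names here are mine, not from the attached script):

```python
import threading
import time

class AnimationWorker(threading.Thread):
    """Worker whose per-frame delay can be interrupted instantly via an Event."""

    def __init__(self, frame_delay=0.016):
        super().__init__()
        self.frame_delay = frame_delay
        self.close_event = threading.Event()

    def run(self):
        while not self.close_event.is_set():
            # ... render one frame here ...
            # wait() returns early the moment close_event is set,
            # which time.sleep() / wx.MilliSleep() cannot do.
            self.close_event.wait(self.frame_delay)

    def stop(self):
        self.close_event.set()  # command 1: break out of any pending wait
        self.join()             # command 2: block until the thread has exited
```

Calling stop() from the GUI's close handler shuts the worker down immediately, even mid-delay.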
Here is an updated version of the script. It fixes the issue with the background not having an alpha channel. It also removes the writing of the temporary files.
You can run the script from a shell and supply a command line argument which would be a path to an apng file you want to load instead of the embedded one.
I changed the embedded apng to the bouncy ball. apng_script.zip
Oh, and the calculation for the delay is exactly to the APNG specification, which is written here:
The delay_num and delay_den parameters together specify a fraction indicating the time to display the current frame, in seconds. If the denominator is 0, it is to be treated as if it were 100 (that is, delay_num then specifies 1/100ths of a second). If the value of the numerator is 0 the decoder should render the next frame as quickly as possible, though viewers may impose a reasonable lower bound.
Frame timings should be independent of the time required for decoding and display of each frame, so that animations will run at the same rate regardless of the performance of the decoder implementation.
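That quoted rule reduces to a couple of lines (the function name is mine):

```python
def frame_delay_seconds(delay_num, delay_den):
    """Seconds to display a frame, per the fcTL delay rule quoted above."""
    if delay_den == 0:
        delay_den = 100  # spec: a zero denominator is treated as 100
    return delay_num / float(delay_den)  # float() guards against Python 2 int division
```

So 16/1000 comes out as the 16 ms per frame discussed later in this thread.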
I have done this to the best of my ability. Because of how long it takes to render the bmp to the screen (something I do not have control over), I time how long that rendering process takes and subtract it from the delay provided in the APNG frame. If the remaining value is > 0, I have the program wait for that remainder. This is the best I am able to do to get the delays as close as possible. I am handling as much of the rendering of each frame as possible ahead of time, so the only things that end up needing to be done are setting up a region that defines the alpha and copying the buffered frame to a new bmp, and that new bmp is what gets drawn to the screen.
I might be able to squeeze out better performance by reusing some of the Windows structures and possibly setting it up so that I do not have to keep creating and destroying Windows handles.
@kdschlosser I ripped everything out that wasn't needed with apng and started over from the sample that worked on wxPy4.0. I think this should work. Please test. Bouncy ball apng works now with alpha.
The Python APNG has 120 frames and should be 1/60 second, so 1000/120 ≈ 8 milliseconds is how I calculated it. So in order for it to look proper, one rotation should take 1 second.
The animation speed is sporadic on your last example and speeds up and slows down, but yea you managed to get the alpha to look right.
I also noticed that the ball reflection on your sample isn't semi-transparent....
With my example, I need to do some sort of position coding on each frame to get it to look right. It works, but each frame is bouncing against the left wall instead of forcing the Python logo to be square. Also, 16 milliseconds looks more proper; something is wrong with the division somewhere. The ball animation looks OK speed-wise, though. I think the issue is that since APNG was never accepted initially, everyone nowadays has a different idea as to how the delays and everything should be formatted. Though the optimized way I rendered the Python APNG is actually only drawing the background on the 1st frame, so I may need to adjust that bit... with the drawing constants somehow... There never really was an official spec, so I think the apng lib being used might be off also... Still waiting for Pillow to push through the APNG support they are working on.
It appears that depending on the era the APNG was made in, you might get different results, because whatever the math may be, the numbers might be flipped; for example, if you feed that number into Sleep(), MilliSleep(), or MicroSleep() you will get a totally different result. Plus, your example does calculations for the delay which are subtracted, which should already be part of the delay spec in any sane guess. The weird part is that most browsers nowadays render them fine, so however Firefox/Chrome/Opera figures out the delay will probably be the accepted way in the end. The two different APNGs render fine in the browser, but depending on whether I use your example or mine, one is right and the other is wrong, and vice versa. My calculation of 8 milliseconds on your example with the Python PNG runs fine, but the ball example works with my minimal example. Both render just fine in the browser... so obviously the browser code has a way of figuring out what the delay should be.
OK so I have trimmed the fat off of the Windows API function calls. When I use the APNG you made above, the rendering time is down to 6.89748699999998 milliseconds. So now the program is able to properly render the frames at the 16-millisecond speed that is embedded into the frames.
The delay the browser is using is the exact same delay that I am using. The problem was stemming from the rendering time being greater than the delay, which would make the animation slower. I reworked the Windows API calls so they reuse portions between frame renders instead of creating each piece anew every frame. This has reduced the overhead quite a bit.
I checked the APNG file you attached with an APNG disassembler, and it too states that you have a 16ms delay set for each frame, so I know my calculations are correct. The specification states that the rendering portions of the code should not affect the speed at which the APNG can be displayed. I have done as much rendering as possible before the animation starts. I wish there was a way I could remove having to copy the bmp so we can draw it on the screen, but there is no way to achieve that with the Windows calls.
If I set this up to not use the alpha-shaped frame, and set it up to be a panel instead, I could simply write the data to a ClientDC and would not have the additional overhead caused by copying the bmp.
I am not sure how much that would speed things up; I do not believe it would be all that much.
I would love to extend the wx.Bitmap class so it can handle APNG files. There would need to be a whole lot of monkey patching taking place in order to handle the drawing of each frame for each of the various widgets you can use a wx.Bitmap with. I am fairly certain that bitmaps are not "drawn" by wxWidgets; instead, the data is copied out and placed into a Windows bitmap buffer, and that buffer gets passed to a Windows API call to be drawn. Because of the nature of that process, there are not going to be repeated calls unless there is a need to refresh the information on that portion of the screen. So once it is drawn, it does not get redrawn at any kind of interval.
So those redrawing mechanics would need to be added to each of the available widgets. The question is: how is the data contained within the wx.Bitmap retrieved when the bitmap is to be drawn on the screen? If I knew that bit of information, I would be able to refresh and update, which would trigger a collection of data from the wx.Bitmap instance; at that point I would be able to return the proper frame.
All of this could be handled internally to the bitmap if the bitmap had a "parent" that would be the widget. I do not know if this exists. I do not think that it does.
I am going to poke about the wx.Bitmap class and see if I am able to isolate the mechanism that gets used to get the data.
OK, checked out your sample and made a few modifications to fix the sporadic timing issues I'm getting... The reflection on the ball is still incorrect, though; still not semi-transparent.
A zero delay just runs as fast as it can render, so it is actually valid; that's why I changed > to >=. I placed a sanity check in a print statement. If it wasn't in a thread, it would be throttled/limited to ~15ms by how wxPython/wxWidgets deals with windows, so the way you did it bypasses that just fine in a GUI without causing any issues.
def run(self):
    while not self.close_event.is_set():
        start = perf_counter()
        self._update_animation()
        frame = self.frames[self.current_frame]
        stop = perf_counter()
        delay = frame.delay - (stop - start)
        print('debug %f' % delay)
        if delay >= 0:
            print('delay >= 0: %f' % delay)
            self.close_event.wait(delay)
        # wx.MilliSleep(delay)
        if self.current_frame + 1 == len(self.frames):
            self.current_frame = 0
        else:
            self.current_frame += 1
In order to get it to work right with the thread timer, the delay_den should be wrapped as a float to accommodate Python 2; otherwise the delay always ends up as 0.
Change your import to be Py2/3 Phoenix-friendly for perf_counter:
try:
    from time import perf_counter  # NOQA
except ImportError:
    from time import clock as perf_counter  # NOQA
Also tested with this: https://commons.wikimedia.org/wiki/File:APNG_throbber.png The antialiasing on the edges looks grainy and flattened; no semi-transparency. If you can get the edges to look proper on the two with transparency, then I think everything else looks fine with the Python-powered one in this iteration.
Edit: also, I think the perf_counter probably isn't needed anyhow... can't see any reason to do the extra calc...
With the float fix, this would be the optimized run method.
def run(self):
    frames = self.frames
    len_frames = len(frames)
    _update_animation = self._update_animation
    close_event = self.close_event
    while not close_event.is_set():
        current_frame = self.current_frame
        _update_animation()
        close_event.wait(frames[current_frame].delay)
        if current_frame + 1 == len_frames:
            self.current_frame = 0
        else:
            self.current_frame += 1
Creating a value or func/meth to tell if all the delays are the same would be useful also... How useful depends on the user's use of the value, if it's even used at all. For example, any APNG with all the same delays could be scaled to anything and still look "right" to the user, though an animation that takes a minute might be scaled otherwise and look ugly...
wxPython 4.x only runs on Python 3.x so adding the float() is moot and does not need to be done.
That sanity check of >= does not need to be done. It is an additional function call that gets made when delay == 0, and that call does not need to be made. The call takes time to process, so why even do it when it is not necessary?? All it does is add additional processing time, thus slowing down the animation. It may not seem like much, but if you keep doing that over and over again, it will accumulate.
I can also add in a "frame skip" if needed. It would do this when the returned value of running_time % animation_loop_time is >= a frame's delay time in the animation. Skipping a frame only means that it is not going to get drawn to the screen. The only frame that cannot get skipped would be frame 0.
This would keep the animation runtime correct. I use a similar process when I am rendering my security cameras using wxPython. The other option we can add is turning off anti-aliasing when rendering. This should also speed up the rendering.
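The frame-skip idea could be sketched like this (the function name and signature are mine, not from the script):

```python
def frame_to_show(elapsed, delays):
    """Index of the frame that should be on screen after `elapsed` seconds,
    given per-frame delays; frames the renderer is too slow for get skipped."""
    loop_time = sum(delays)
    t = elapsed % loop_time  # position within the current animation loop
    acc = 0.0
    for index, delay in enumerate(delays):
        acc += delay
        if t < acc:
            return index
    return 0  # floating-point edge case: fall back to frame 0
```

The renderer would draw only the returned frame each tick, so a slow machine drops frames instead of stretching the loop.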
I have been writing an APNG addon for wxPython. I am removing the use of the apng library and building in the full APNG specification. I am also breaking down the PNG specification so a user would be able to set the metadata for each frame if they wanted to.
It is also going to handle sequencing of the frames; this is something the apng library does not have the ability to do. The specification does not mandate that the frames be coded in any specific order.
The delay is not something that should be "scaled"; the delay should always remain a constant. Right now we are only rendering the APNG in the exact size it was made in. This is going to change as well: we will scale the size of the APNG when a size event occurs. The smaller an APNG is, the faster it is going to render, and the delay-adjustment code is going to handle that change in rendering speed as needed. The displayed size, coupled with the user's computer speed, is what is going to determine exactly how fast we are able to render each frame. Because both of those factors are unknowns, the only thing we can do in terms of producing a correct animation speed is what I have done above, other than adding in a frame skip. The frame skip is something we can add as an option that can be turned on or off. The other thing we can add is an event that will signal the application that an overrun is occurring; the application can then decide whether it wants to resize the APNG or turn on frame skipping. In the event object we can include the amount of overrun that is occurring.
I am writing this thing to also allow a user to create a new animation. A frame in an animation can be any file type that wx.Image is able to convert into a wx.Bitmap. The apng would also be able to be saved to a file or a file-like object.
The PNG and APNG specifications are really not all that complex. Most of the PNG-specific bits are handled by wxPython/wxWidgets; all that has to be done is to separate the IDAT chunk from a PNG and place it into an fdAT chunk inside the apng. There also needs to be an fcTL chunk added for each frame, which provides the width, height, x offset, y offset, blend op, delay and dispose op data needed to know how to render each frame.
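The fcTL chunk is small enough to sketch directly. Here is a minimal example of packing one with the stdlib, following the field order in the APNG spec (the function name and defaults are my own, not from the addon):

```python
import struct
import zlib


def make_fctl(seq, width, height, x_off=0, y_off=0,
              delay_num=0, delay_den=100, dispose_op=0, blend_op=0):
    # 26-byte fcTL payload, big-endian, per the APNG spec:
    # sequence number, width, height, x/y offset (uint32),
    # delay numerator/denominator (uint16), dispose op, blend op (uint8)
    data = struct.pack('>IIIIIHHBB', seq, width, height, x_off, y_off,
                       delay_num, delay_den, dispose_op, blend_op)
    chunk_type = b'fcTL'
    # Standard PNG chunk framing: length, type, data, CRC over type+data
    return (struct.pack('>I', len(data)) + chunk_type + data +
            struct.pack('>I', zlib.crc32(chunk_type + data) & 0xFFFFFFFF))
```

The sequence number field is what allows frames to be stored out of order yet still be sequenced correctly on playback.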
I will create a repo for the apng library and you can help out with it if you like. It is probably best to move the conversation there anyway; this topic is way off track. I will probably close this issue and possibly reopen it. I have some code examples of what can be done to band-aid the problems that I will post in the new issue.
In the next day or so I will open the repo and post the link in here; then we can continue the conversation there. Even though the apng specification is almost dead, it is a great way to go about providing animation support for any image type that wxPython supports, because it is a really simple specification. Adding support for other animation types can be derived from this model.
One thing I have not checked is how an animation gets loaded using wx.Image. I know that wx.Image supports multiple "layers" or images in a single object. Does it load each frame if I load an apng? Gonna have to try it.
Phoenix has run on Py2 and 3 since it was launched... I ported the whole demo... I'm running all your code on Py2 too lol. To be more specific about the float part: on Py2 the sample ends up doing integer division.
Edit: when you are calculating this in the animation class, the line below will always do true division on both Py2 and Py3, so when it is used in thread.wait() it will get a float instead of 0 or whatever:
self.delay = self.delay_num / float(self.delay_den)
I didn't say Phoenix, I said wxPython 4.x. I didn't think that wxPython 3.x was being developed anymore.
Either way, it is not a big problem to add the float().
I did however run into a roadblock. The components needed from wxWidgets to be able to add custom animation decoders have not been added to wxPython, so I am not able to do what I wanted without writing a new control. The reason I wanted to add a new decoder was that the rendering of the animation to the screen would be handled in C code, which would be faster; a good comparison is racing a top fuel drag car (C code) against a Pinto (Python). Plus everything is already in place to handle sizing events and the running of the animation. It would simply be the best and easiest way to go about it.
wxPy 4 is officially the Phoenix launch. If it doesn't say phoenix when you do wx.version(), then you are using Classic. There were some versions in wxPy 3 that worked, but they were still alpha/beta testing. If it isn't on PyPI then it is probably Classic.
Well I'll be damned. Last time I checked (don't know when that was) Phoenix was not available for Python 2.x. I just checked and it is there now....
So my bad on mentioning 3.x, I have them all mixed up. The 3.x is because last I knew, 3.x was the latest release available for Python 2.x. That is no longer the case.
So the float does in fact need to be there; you are correct.
OK so here is the repository that adds APNG support to wxPython. It is not tested and is a WIP.
https://github.com/kdschlosser/wxAnimation
What I am doing is extending wx.adv.AnimationCtrl and wx.adv.Animation and adding a Python version of wxAnimationDecoder. I have also added the methods for AnimationCtrl and Animation that have not been included in wxPython. I am setting it up so that the original classes get used if an animation is an ANI or GIF; otherwise it uses the pure Python version.
There is actually a problem with the original C code: it uses wx.Timer, which cannot be run with a timeout value of 0. I would have to check, but I believe wx.Timer has resolution down to 1 millisecond, so every frame will have at least a 1 millisecond delay. I am going to alter the code so it does a proper update without needing a 1 ms delay for a frame that may have a delay of 0.
I am also going to add in code to correct the rendering time vs delay time.
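One way to sketch the zero-delay handling (illustrative names, not the actual patch): fold any run of zero-delay frames into a single paint, and clamp the timer interval to wx.Timer's 1 ms minimum:

```python
def frames_for_tick(delays, index):
    # Collect the frames to paint in this timer tick: the current frame
    # plus any frames that immediately follow a zero-delay frame. This
    # avoids asking wx.Timer for an impossible 0 ms interval.
    # `delays` holds each frame's delay in milliseconds.
    batch = [index]
    index += 1
    while index < len(delays) and delays[batch[-1]] == 0:
        batch.append(index)
        index += 1
    # wx.Timer cannot fire with an interval below 1 ms, so clamp.
    interval_ms = max(1, delays[batch[-1]])
    return batch, interval_ms
```

So three frames with delays [0, 0, 50] would all be composited in one tick, and the timer would be restarted with a 50 ms interval.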
iirc wx.Timer was the one restricted to like 15ms on Windows. MilliSleep or a Thread should work around the limitation tho. Might ask Robin again, but I recall that was the case when I ran into it the first time.
I like the approach of using threads anyhow, because it will not tie up the main thread. I have corrected some of the initial problems with the code I posted. I still have to figure out why it is not rendering correctly, and I also have to add support for transparency, as it seems from the code that the original animation code does not support rendering alpha.
It appears that Pillow is finally getting apng support/ImageGrab pushed with 7.1.0. https://github.com/python-pillow/Pillow/issues/4354#issuecomment-606950009 Might be something to look into when it gets released. Tho Pillow 7.1.0 won't support Python 2, so if you are looking for that then you might have to backport or use an alternate apng lib or the wxAnimationDecoder stuff when it matures.
@kdschlosser
I managed to get this to work with the alpha on wxPy 4.1 by doing this. I didn't have the BLACK problem. ... now to figure out how to get rid of the PIL dependency and do it straight in wxPython to create a proper alpha image.
from PIL import Image

# Build a fully transparent RGBA image and hand its pixel buffer to wx
pil_im = Image.new(mode='RGBA', size=(width, height), color=(0, 0, 0, 0))
bmp = wx.Bitmap.FromBufferRGBA(width, height, data=pil_im.tobytes())
tested on wxPython 4.1.1a1.dev4883+75f1081f msw (phoenix) wxWidgets 3.1.4 Scintilla 3.7.2 Python 3.8.3 (tags/v3.8.3:6f8c832, May 13 2020, 22:37:02) [MSC v.1924 64 bit (AMD64)] on win32
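On dropping the PIL dependency: a fully transparent RGBA image is just a zero-filled buffer, so the Image.new step can likely be replaced with plain bytes (a sketch; the FromBufferRGBA call is the same one used in the snippet above):

```python
def transparent_rgba_buffer(width, height):
    # 4 bytes per pixel (R, G, B, A), all zero -> fully transparent black,
    # which is exactly what Image.new(..., color=(0, 0, 0, 0)) produces.
    return bytearray(width * height * 4)

# With wxPython available (untested assumption on my part):
# bmp = wx.Bitmap.FromBufferRGBA(width, height,
#                                transparent_rgba_buffer(width, height))
```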
@Metallicow Embed any images into an example if you could please :smile: If you run the script below and enter the filename of the image, it will print out a base64-encoded version of it.
# Python 2.7 and 3.5+ supported
from __future__ import print_function
import base64

try:
    file_name = raw_input('Enter File Name\n')
except NameError:
    file_name = input('Enter File Name\n')

with open(file_name, 'rb') as f:
    data = f.read()

encoded = base64.b64encode(data)

lines = []
line = "    b'"

for item in encoded:
    try:
        line += chr(item)  # Py3: bytes iterate as ints
    except TypeError:
        line += item  # Py2: bytes iterate as str

    if len(line) + 1 == 78:
        line += "'"
        lines += [line]
        line = "    b'"

if len(line) > 6:
    line += "'"
    lines += [line]

print('IMAGE = (')
print('\n'.join(lines))
print(')')
copy and paste the output into your example and then add the following lines of code to it.
import base64
from io import BytesIO

stream = BytesIO(base64.b64decode(IMAGE))
stream.seek(0)
You should be able to pass the "stream" file object to most classes/functions that deal with handling images.
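For example, with a hypothetical IMAGE tuple in the format the script prints (here encoding the bytes b'hello' rather than a real image), the round trip looks like:

```python
import base64
from io import BytesIO

# Hypothetical output of the encoder script; adjacent bytes literals
# inside the parentheses concatenate into one bytes object.
IMAGE = (
    b'aGVsbG8='
)

stream = BytesIO(base64.b64decode(IMAGE))
stream.seek(0)
# stream now behaves like an open binary file containing the original data
```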
One other thing I did was write a cross-platform high-precision timer for Python, and this timer has microsecond resolution... We cannot use anything built into Python to stall a thread or the program in the traditional manner, because those mechanisms are not stable and can return before the wanted time or well after it. The one I wrote returns either on time or 4-5 microseconds after the desired time, but never before it, and never 10-12 milliseconds after it. This fixes the inconsistent animation speeds.
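A rough sketch of the sleep-then-spin approach such a timer typically uses (my own illustration, not the actual timer described above): sleep coarsely while far from the deadline, then busy-wait on a monotonic clock for the final stretch, so the wait never returns early and overshoot stays small.

```python
import time


def precise_wait(deadline):
    # `deadline` is an absolute time on the time.perf_counter() clock.
    while True:
        remaining = deadline - time.perf_counter()
        if remaining <= 0:
            return  # never returns before the deadline
        if remaining > 0.002:
            # Coarse sleep; may overshoot by a scheduler quantum, which
            # is why we stop sleeping ~2 ms short of the deadline.
            time.sleep(remaining - 0.002)
        # else: busy-spin through the last ~2 ms for precision
```

The spin costs some CPU for up to 2 ms per frame, which is the usual trade-off for this kind of precision.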
I was able to overcome the black by using masks. @RobinD42 extended the Python access to the C functions and methods, so now they can be properly overridden. I have not yet had the opportunity to update the code to work with the new changes.
One thing I would like to know, and maybe @RobinD42 can answer this: is there a way to draw the frame like what is being done in the examples, making the frame transparent, without having to use a shaped frame? A shaped frame cannot be created properly from an image that has an alpha channel. If a frame can be drawn like the above on the other OSes as well, we can write an animation handler that deals with APNG files and, separately, a shaped frame that handles alpha channels. The shaped-frame portion needs to be cross platform.
Operating system:
wxPython version & source:
Python version & source:
Description of the problem: There are a few issues I am running into. I have attached an image to show the problems that are taking place.
I am not able to draw any text that has an alpha channel. Well, let me correct that: it draws the text, but the alpha channel is not used. You can see this in the attached image. Every color used in that image has an alpha channel. I did a screen capture with the frame placed over a white background so you would be able to see the problems; the white is not part of the example code below.
I have tried with GCDC.DrawText and also GraphicsContext.DrawText, and neither of them supports an alpha channel.
I am drawing a simple drop shadow using wx.Region to subtract the actual text from the bitmap where the shadow is drawn. The appearance starts out fine, but as the text continues there end up being areas that get improperly clipped, and the issue gets worse the further the text goes. You can see this problem in 1, 2 and 3 of the attached image.
I have studied this fairly closely, and if you look at the text you will see it being antialiased to black. I am not sure why this is taking place; I have set the text background colour to
wx.TransparentColour
I may have mentioned this before; I do not remember if I have or not. This problem only becomes evident when using an alpha channel: when rendering a rectangle, the filled-in area of the rectangle can be seen under the outline. This can be seen in image item 5; you may need to zoom into the image to see it better.
If you look at item 4 you do not see this issue because the filled in area has an alpha channel of 0.
There needs to be some kind of check: if the pen has an alpha level where 0 < alpha < 255, then the filled-in area needs to be adjusted to be smaller so it does not show through the outline. I am sure this same issue also exists with the other drawing functions. There is also another issue regarding the use of these types of functions. If you take this example
The resulting rectangle is NOT 150 x 150; it ends up being 154 x 154. I know this problem has existed for a very long time, and correcting it would cause a lot of problems for projects already using wxPython. Perhaps a global flag could be added that tells wxPython to render to exact sizes.
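One purely illustrative way to read those numbers (not necessarily wxWidgets' actual algorithm): if the pen is drawn centred on the rectangle's outline, half the pen width spills outside each edge.

```python
def rendered_size(requested, pen_width):
    # Half the pen width spills past the requested edge on each side,
    # so the painted bounds grow by pen_width // 2 twice.
    return requested + 2 * (pen_width // 2)
```

Under that reading, a 150 px rectangle drawn with a 4 px pen would occupy 154 px, matching the measurement above.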
I may be doing something incorrect, tho I do not think that I am.
Here is the frame as seen without the white behind it.
This example script will ONLY run on Windows.
@Metallicow Here is an example of how to overcome the alpha issue with shaped frames; this example only works on Windows. I would think that there are similar API functions for OSX and Linux that can be used to achieve the same results