lvgl / lv_lib_gif

GIF library for LVGL
MIT License

The gif rendering takes too much memory #42

Open · potato1992 opened 1 year ago

potato1992 commented 1 year ago

I see in the code that gd_render_frame(gifobj->gif, (uint8_t *)gifobj->imgdsc.data) renders the whole image at once and stores the color of every pixel in gifobj->imgdsc.data.

I think a better way would be to render only part of the image at a time, say one tenth, and loop until the whole image is rendered; that would save a lot of the memory this library needs (see the sketch at the end of this comment). On most platforms that run LVGL, RAM is very limited, and a 200x200 GIF eats up (color depth in bytes) x 40000 bytes of memory, e.g. about 80 KB at 16-bit color.

This may require deeper integration with the LVGL core instead of using lv_img directly... Alternatively, a simple way to save memory would be to re-organize the raw data so that gifobj->imgdsc.data can point at it directly instead of allocating a new array of colors...
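
To illustrate the banded idea above, here is a minimal sketch in C. It assumes a hypothetical gd_render_rows() call that can decode a row range of the current frame; gifdec's actual API only exposes the whole-frame gd_render_frame() today:

```c
#include <stdint.h>
#include "gifdec.h"

#define BANDS 10  /* decode the frame in 10 horizontal slices */

/* HYPOTHETICAL: gd_render_rows() is not part of gifdec's current API.
 * It stands in for a decoder entry point that writes `rows` rows of
 * the current frame, starting at row `y`, into `buf`. */
extern void gd_render_rows(gd_GIF *gif, uint16_t y, uint16_t rows, uint8_t *buf);

static void render_frame_banded(gd_GIF *gif, uint8_t *band_buf)
{
    uint16_t h = gif->height;
    uint16_t band_h = (h + BANDS - 1) / BANDS;

    for(uint16_t y = 0; y < h; y += band_h) {
        uint16_t rows = (y + band_h <= h) ? band_h : (uint16_t)(h - y);
        gd_render_rows(gif, y, rows, band_buf);
        /* Flush band_buf to the display here, then reuse it for the
         * next slice: only width * band_h pixels are buffered at once,
         * i.e. roughly 1/10 of the full-frame buffer. */
    }
}
```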

kisvegabor commented 1 year ago

I agree that it can be improved.

My idea was to use the indexed image directly, as LVGL can decode indexed images out of the box. However, the palette can change on each frame, and the pixels carried over from the previous frame are assumed to have already been written with that other palette.

If we added a limitation, i.e. "only GIFs with a global palette can be used", the RAM usage would be width x height bytes. I'm sure there are tools to convert GIFs to a global-palette-only format.

I think it'd be the simplest. What do you think?
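
For illustration, a minimal sketch of the global-palette variant, assuming LVGL v8's LV_IMG_CF_INDEXED_8BIT layout (palette stored as ARGB8888 at the start of the pixel buffer, then one index byte per pixel; the 200x200 size is just an example):

```c
#include "lvgl/lvgl.h"

#define GIF_W 200
#define GIF_H 200

/* Palette first (256 entries x 4 bytes, ARGB8888), then one index
 * byte per pixel. With a global palette the first 1024 bytes are
 * written once; each frame only rewrites the index bytes. */
static uint8_t gif_buf[256 * 4 + GIF_W * GIF_H];

static const lv_img_dsc_t gif_dsc = {
    .header.always_zero = 0,
    .header.cf = LV_IMG_CF_INDEXED_8BIT,
    .header.w = GIF_W,
    .header.h = GIF_H,
    .data_size = sizeof(gif_buf),
    .data = gif_buf,
};
```

For a 200x200 GIF this is about 41 KB in total, versus about 80 KB for the same frame stored as 16-bit true color.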

potato1992 commented 1 year ago

Sounds reasonable; it would surely let a lot more chips run lv_lib_gif if the RAM usage could be cut in half!

stale[bot] commented 1 year ago

This issue or pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

kisvegabor commented 2 months ago

@bitbank2 has a decoder that is more optimal in terms of memory usage.