100ideas / live-earth-desktop

A tool that continually downloads the latest GOES-East image of the Earth.

RFP: animation techniques #1

Open 100ideas opened 2 years ago

100ideas commented 2 years ago

Hey @n9Mtq4 - I checked out your robust-looking GOES image scraper and some of your youtube timelapses/animations of GOES full-disc images. They look really smooth! If you have a second, could you share a little bit about your toolchain for making them?

@celoyd, glittering.blue is really beautiful and I thank you for sharing the codebase (https://github.com/celoyd/hi8)! From what I can tell, your videos have smooth motion because they run at 20 fps, compressing 240 frames taken over 24 hrs into 12 sec, like a classic timelapse. But your code seems to do a fair amount of geometric transformation on the input data? Could you explain it a little bit?

@willwhitney, thanks for posting your osx background-image update script live-earth-desktop. Had to give it a try - and I have to say my desktop is looking very neat. I wanted to add some docs about how I got the scripts running and what features to pursue next, hence this repo.

About me & this issue: I remain amazed and in awe of the beautiful, near-real-time imagery NOAA and JMA make available, and I would like to find ways of incorporating it into my environment: perhaps a sort of artful ambient display (based on an old tablet, or a 4K monitor in a picture frame) running an app that presents the whole-disk picture in near-real-time with minimal aesthetics, maybe a mild Ken Burns track-and-pan or interpolation between frames to create a super-slow "real-time-speed" animation.

I'm opening this issue as a sort of informal RFP for sharing ideas and resources related to making beautiful animations and ambient displays of the near-real-time data available from NOAA and JMA via GOES and Himawari. You three have already done interesting projects with the images and I'd like to invite you to collaborate here or share your thoughts.

Welcome!

100ideas commented 2 years ago

First up - an older project that implemented frame interpolation to create slower timelapses:

(I made this comment back in 2017 on maxogden's GOES scraper repo (https://github.com/maxogden/goes-16-cira-geocolor/issues/2)):

animate-earth is an experiment in applying optical-flow-based motion interpolation to processed RGB satellite imagery from Himawari-8's AHI imager, with the aim of regularly producing high-quality smoothed videos from these images, and sharing them with the world. The final products can be found on this Youtube channel, updated nearly daily.

Motion interpolation is accomplished with the open source Butterflow tool by dthpham, which uses OpenCV's implementation of an algorithm devised by Gunnar Farnebäck.
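
For anyone who wants to try the idea without installing Butterflow first, here's a minimal Python/OpenCV sketch of the same flow-then-warp approach (not Butterflow's actual pipeline; filenames and the Farnebäck parameters are placeholders, and real tools blend warps from both neighboring frames and handle occlusions much more carefully):

# Hedged sketch of flow-based frame interpolation (not Butterflow itself).
import cv2
import numpy as np

a = cv2.imread("frame_0000.jpg")
b = cv2.imread("frame_0001.jpg")
gray_a = cv2.cvtColor(a, cv2.COLOR_BGR2GRAY)
gray_b = cv2.cvtColor(b, cv2.COLOR_BGR2GRAY)

# Dense Farneback flow from a to b; positional args are pyr_scale, levels,
# winsize, iterations, poly_n, poly_sigma, flags.
flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None, 0.5, 3, 15, 3, 5, 1.2, 0)

# Backward-warp frame a half a step along the flow to approximate t = 0.5.
h, w = gray_a.shape
gx, gy = np.meshgrid(np.arange(w), np.arange(h))
map_x = (gx - 0.5 * flow[..., 0]).astype(np.float32)
map_y = (gy - 0.5 * flow[..., 1]).astype(np.float32)
midpoint = cv2.remap(a, map_x, map_y, cv2.INTER_LINEAR)
cv2.imwrite("frame_0000_5.jpg", midpoint)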

n9Mtq4 commented 2 years ago

I'd be happy to share how I make the timelapses, but it's nothing special.

GOES-16/17 takes a full disk image every 10 minutes (15 minutes prior to April 2019) and I make the timelapses with the images at 30 fps. They look smooth due to the short 10 minutes between images and the full 30 fps. I use the image scraper I wrote to download the images. If you don't want to dig through the code, it simply grabs the latest image by parsing the directory listing from https://cdn.star.nesdis.noaa.gov/GOES16/ABI/FD/GEOCOLOR/.
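
If you'd rather not dig through the Java at all, a rough Python stand-in for that "parse the directory listing, grab the newest image" step might look like the sketch below. The filename pattern is my assumption about the CDN's current naming convention, so check the listing and adjust the regex if it differs.

import re
import requests

BASE = "https://cdn.star.nesdis.noaa.gov/GOES16/ABI/FD/GEOCOLOR/"

# Fetch the directory listing and pull out the full-resolution full-disk JPEGs.
listing = requests.get(BASE, timeout=30).text
names = re.findall(r"\d+_GOES16-ABI-FD-GEOCOLOR-10848x10848\.jpg", listing)
latest = sorted(set(names))[-1]  # leading timestamps sort lexicographically

with open(latest, "wb") as f:
    f.write(requests.get(BASE + latest, timeout=120).content)
print("saved", latest)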

The full process consists of just these two commands.

java -jar NOAA-GOES-16-Image-Scraper-1.2-SNAPSHOT-all.jar -s GOES16 --resolution 10848x10848 --infotechnique directorylistsize --downloadbatchsize 2

ffmpeg -framerate 30 -pattern_type glob -i './imgs/*.jpg' -c:v libx264 -qp 24 -preset veryslow -tune grain timelapse.mkv

celoyd commented 2 years ago

Glad that code’s useful. It’s old and incomplete but I think the core ideas should be at least a little helpful to anyone on this path.

The videos I made with that code are only motion interpolated in an exceptionally generous sense of the term. Really, I just doubled the frame count and linearly interpolated between real frames – taking simple pixelwise averages, without any awareness of motion per se. I found this was good enough for what I was doing, but you can see the artifacts if you know what to look for. (As a sidebar: with AHI, you also have to account for housekeeping data gaps at 02:40 and 14:40 UTC. At least that was the standard observation schedule when I last used it.)
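
For concreteness, that frame doubling amounts to something like this minimal OpenCV sketch (not the hi8 code itself; paths are placeholders):

import glob
import cv2

# Double the frame count by inserting the pixelwise average of each pair of
# neighboring real frames, with no motion model at all.
frames = sorted(glob.glob("frames/*.png"))  # ordered real frames (placeholder path)
out = 0
prev = cv2.imread(frames[0])
cv2.imwrite(f"out/{out:05d}.png", prev)
out += 1
for name in frames[1:]:
    cur = cv2.imread(name)
    cv2.imwrite(f"out/{out:05d}.png", cv2.addWeighted(prev, 0.5, cur, 0.5, 0))  # blended frame
    out += 1
    cv2.imwrite(f"out/{out:05d}.png", cur)  # real frame
    out += 1
    prev = cur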

If I were working on it today, I would try things like butterflow. I dabbled with them just a little bit a few months ago (butterflow and whatever came up when I searched for “ML motion interpolation”) and found they had trouble on this kind of image sequence. I think the root cause is that the atmosphere doesn’t move very much like the objects in the videos that motion interpolators are designed for:

  1. The atmosphere and its motion are constrained to a very small volume relative to the size of the frame. A standard motion interpolator is going to waste a lot of time looking for potential motions and spatial configurations that simply don’t appear on Earth’s disk. (Or if they did, it would mean a mass extinction.)

  2. Earth’s lighting changes extremely quickly when time-lapsed at the 1,000:1 scale. One artifact I saw was motion interpolators interpreting the terminator as the edge of the Earth. Here’s someone else’s work showing this kind of artifact, although I think it’s even worse there because they’re using L2 data that makes the terminator bright. The way the interpolator tries to attach cloud motion to the terminator’s motion is clearly not ideal.

  3. At this compressed timescale, the atmosphere and hydrosphere change in ways that “typical” video subjects do not. For example, clouds are full of partial transparencies, and the oceans are reflective, two challenges for even cutting-edge video analysis tools. There’s virtually no simple motion of rigid objects or objects with reliably trackable textures (like human faces). Consider that clouds regularly appear from, and disappear into, thin air – I’ve seen motion interpolators panic and try to interpret this as a cloud being occluded by a mountain or something else absurd.

  4. There’s just a lot of motion per frame at full resolution. High clouds in particular can move very, very quickly, across different backdrops and while changing shape. Basically, the source data has a low framerate for what it shows. (Or, at least, for the most interesting things it shows. If you had motion interpolation that worked well for everything except hurricanes, for example, you probably wouldn’t be very happy.)

I list all this in the hope of saving you time, not of discouraging you. None of these problems is insurmountable, and succeeding at this could yield very beautiful results.

One might imagine, for example, training an ML-based interpolator on Himawari data so that it learns to work within the particular constraints of this sort of imagery.

a fair amount of geometric transformation on the input data

From memory and about 20 seconds of looking at the code, it’s doing two things:

  1. With the libraries it uses, at least back then, Himawari-8 data in its native format came out south-up and, I think, east-left. I believe newer versions of most reader software automatically pull it out north-up, east-right. And I think this was before proj gained the geos projection (don’t quote me on that).

  2. AHI’s red band has a ground sample distance of 500 m, while the blue, green, and near infrared are 1 km. I handled this by just shrinking the red band. (Other approaches are possible, he said mysteriously.)

Today you could handle both these issues, if necessary, simply by telling GDAL to project all the images to geos with the correct longitude parameter (14x.x° for Himawari-8, I want to say?) and a resolution of 500 m. Or you could do it in numpy or whatever; there are lots of ways to resample and flip images!
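
If it helps, here's a tiny numpy sketch of those two fixes as I described them (flip to north-up/east-right, box-average the 500 m red band down to the 1 km grid); array names are placeholders and this is only one of many reasonable ways to do it:

import numpy as np

def orient_north_up(band):
    # South-up, east-left -> north-up, east-right: reverse rows and columns.
    return band[::-1, ::-1]

def shrink_red_2x(red_500m):
    # Box-average 2x2 blocks so the 500 m red band matches the 1 km bands.
    h, w = red_500m.shape
    red = red_500m[:h - h % 2, :w - w % 2]
    return red.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))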

beautiful animations and ambient displays

A few things come to mind:

patriciogonzalezvivo commented 2 years ago

Hi! Sorry for jumping in so late. For Hearth/Hogar I'm using something similar to this script to get the OpenCV optical flow of the last two full disks. Note that the optical flow is the Gunnar Farnebäck algorithm. The rest of the technique is a shader that does the interpolation over a period of 10 min (the time it takes for the next payload to arrive), plus a synthetic atmospheric scattering (because the real one is masked on the lens). The result is realistic, continuous motion at real-time speed, just delayed 10 min from the actual view of the Earth.
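
For illustration, here is a hedged CPU-side sketch of that time parameterization (the real thing happens in a shader; this just reuses the flow/remap idea from the Python sketch earlier in the thread, and all names are placeholders):

import cv2
import numpy as np

def interpolated_disk(older_img, flow_older_to_newer, t):
    # Backward-warp the older disk a fraction t (0..1) of the way toward the
    # newer one along the dense flow field between the last two full disks.
    h, w = older_img.shape[:2]
    gx, gy = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (gx - t * flow_older_to_newer[..., 0]).astype(np.float32)
    map_y = (gy - t * flow_older_to_newer[..., 1]).astype(np.float32)
    return cv2.remap(older_img, map_x, map_y, cv2.INTER_LINEAR)

# Advancing t linearly from 0 to 1 over the ~10 min between payloads gives
# continuous, real-time-speed motion, delayed by about one image cadence.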