openclimatefix / power_perceiver

Machine learning experiments using the Perceiver IO model to forecast the electricity system (starting with solar)
MIT License

Do some geometry :) #13

JackKelly opened this issue 2 years ago

JackKelly commented 2 years ago

Consider feeding something like this into the neural network:

What happens when we predict PV power for multiple PV systems at once in, say, a 64 x 64 km square?

In Perceiver models, if we were predicting for just a single PV system then, as part of the spatial encoding, we could encode each pixel's proximity to the direct line of sight between the Sun and the PV system (e.g. the proximity would be 0 for pixels sitting directly on the line of sight). What happens when we're predicting for multiple PV systems? Is the direct line of sight basically the same for all PV systems in a 64 x 64 km square?

In general: maybe we should only give the model imagery along the direct line of sight, plus a blurry image of the region directly above the PV system?
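Below is a minimal sketch (not code from this repo; the function name, arguments, and the straight-line horizontal projection are all assumptions) of one way to compute a per-pixel "proximity to the Sun-to-PV line of sight" feature, given pixel offsets from the PV system and a solar azimuth (e.g. from pvlib):

```python
import numpy as np

def line_of_sight_proximity(pixel_x_km, pixel_y_km, solar_azimuth_deg):
    """Perpendicular distance (km) from each pixel to the horizontal projection
    of the Sun-to-PV line of sight. 0 means the pixel sits directly on the line.

    pixel_x_km, pixel_y_km: arrays of pixel offsets east & north of the PV system.
    solar_azimuth_deg: compass bearing of the Sun (0 = north, 90 = east).
    """
    az = np.deg2rad(solar_azimuth_deg)
    # Unit vector pointing from the PV system towards the Sun (horizontal plane).
    sun_dir = np.array([np.sin(az), np.cos(az)])  # (east, north)
    pixels = np.stack([np.asarray(pixel_x_km), np.asarray(pixel_y_km)], axis=-1)
    # Component of each pixel offset along the Sun direction...
    along = pixels @ sun_dir
    # ...and the perpendicular (cross-track) distance from the line.
    perp = np.abs(pixels[..., 0] * sun_dir[1] - pixels[..., 1] * sun_dir[0])
    # Pixels on the far side of the PV system (away from the Sun) can't shade it,
    # so fall back to their straight-line distance from the PV system.
    return np.where(along >= 0, perp, np.hypot(pixel_x_km, pixel_y_km))
```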

JackKelly commented 2 years ago

Some conclusions:

Parallax at London

Clouds at 10 km altitude will appear to be 16.2 km north of where they really are. Clouds at 2 km altitude will appear to be 3.2 km north of where they really are.

So, if each pixel of HRV satellite imagery represents 2 km north-south, then to capture the vertical column of atmosphere from 0 km to 10 km altitude we need to add an 8-pixel "buffer" to the northern edge of the image (8 pixels x 2 km per pixel = 16 km). (The resolution of HRV is 1 km at the sub-satellite point, but it's more like 2 km over the UK.)
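A rough sketch of that parallax arithmetic (the function name is made up; the 16.2 km / 3.2 km figures above imply a satellite elevation angle of roughly 31-32° at London, which is passed in explicitly here):

```python
import numpy as np

def parallax_shift_km(cloud_altitude_km, satellite_elevation_deg):
    """Apparent horizontal displacement of a cloud towards the satellite's horizon
    (northwards over the UK, since Meteosat sits above the equator)."""
    return cloud_altitude_km / np.tan(np.deg2rad(satellite_elevation_deg))

# parallax_shift_km(10, 31.7) -> ~16.2 km  (≈ 8 pixels at 2 km per pixel)
# parallax_shift_km(2, 31.7)  -> ~3.2 km
```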

image

JackKelly commented 2 years ago

Line-of-sight between the Sun and a point on the Earth

If the Sun is 10° above the horizon, and if clouds are 10 km above the Earth's surface, then the Sun's rays will pass through the clouds 57 km away (along the horizontal plane) from the point of interest on the Earth's surface (in the compass direction of the Sun from the point of interest).

But 10° above the horizon is pretty low in the sky, so there won't be much PV power being generated.

If we want to see all the clouds in the Sun's path, from 0 km altitude up to 10 km altitude, when the Sun is 10° above the horizon, then we'll need a strip of 15 pixels of satellite imagery (15 pixels x 4 km per pixel = 60 km) running along the compass direction between the point of interest and the Sun.

The NWP grids are only 2 km apart, so we'll need to look at cloud cover across 30 NWP grid boxes (looking at low cloud cover in the boxes nearest to the point of interest, then medium cloud cover, then high cloud cover).

Of course, as the Sun's elevation angle above the horizon tends towards zero, the horizontal distance offset tends towards infinity (so we need to watch out for this when we implement the trigonometry in software!).
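Here's a sketch of that trigonometry, with a guard against the singularity (the names and the 5° minimum-elevation cut-off are illustrative assumptions, not code from this repo):

```python
import numpy as np

MIN_SOLAR_ELEVATION_DEG = 5  # below this, PV output is tiny and tan() blows up

def sun_ray_horizontal_offset_km(cloud_altitude_km, solar_elevation_deg):
    """Horizontal distance (along the Sun's azimuth) from the point of interest
    at which the Sun's rays pass through clouds at the given altitude."""
    elevation_deg = max(solar_elevation_deg, MIN_SOLAR_ELEVATION_DEG)
    return cloud_altitude_km / np.tan(np.deg2rad(elevation_deg))

def strip_length_pixels(max_cloud_altitude_km=10, solar_elevation_deg=10, km_per_pixel=4):
    """Pixels of satellite imagery needed along the Sun's azimuth to cover the
    ray path from 0 km altitude up to max_cloud_altitude_km."""
    offset = sun_ray_horizontal_offset_km(max_cloud_altitude_km, solar_elevation_deg)
    return int(np.ceil(offset / km_per_pixel))

# sun_ray_horizontal_offset_km(10, 10) -> ~57 km
# strip_length_pixels()                -> 15 pixels (≈ 60 km, i.e. 30 NWP boxes at 2 km)
```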

image

JackKelly commented 2 years ago

@jacobbieker absolutely no rush but, if you get a second, please could you sanity check the (very scruffy) maths that I've done above?! :slightly_smiling_face:

JackKelly commented 2 years ago

One more bit of (extremely) simple maths:

How far do clouds move?

Assuming wind speeds of 100 km/hour, clouds could move up to 400 km in any compass direction over a 4-hour forecast horizon.

So, yeah, if we want to nowcast up to 4 hours ahead then we need something like a 1024 x 1024 km image. That's a raw input of about 256 x 256 pixels per channel = 65,536 pixels per channel per timestep (because each pixel is about 4 km for satellite imagery and global NWPs). For NWP that could be 10 channels x 2 timesteps (two hours at one timestep per hour); and for satellite that could be 10 channels x 24 timesteps (two hours with one timestep every 5 minutes), for a total of about 17 million input pixels! That's too many (and probably still too many, even using the Hierarchical Perceiver mentioned in issue #14).
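For what it's worth, a quick back-of-envelope check of the 17-million figure (channel and timestep counts as assumed above):

```python
image_size_km = 1024
km_per_pixel = 4  # roughly true for both satellite imagery and global NWPs
pixels_per_channel_per_timestep = (image_size_km // km_per_pixel) ** 2  # 256 x 256 = 65,536

nwp_pixels = pixels_per_channel_per_timestep * 10 * 2         # 10 channels, hourly: 2 timesteps
satellite_pixels = pixels_per_channel_per_timestep * 10 * 24  # 10 channels, 5-minutely: 24 timesteps

print(f"{nwp_pixels + satellite_pixels:,}")  # 17,039,360 ≈ 17 million input pixels
```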

So we probably do want to do tricks like:

JackKelly commented 2 years ago

Still TODO: Create a Google Doc for tracking all these notes. And copy the important conclusions from this thread into that doc.