Unidata / awips2

Weather forecasting display and analysis package developed by NWS/Raytheon, released as open source software by Unidata.
http://unidata.github.io/awips2/

Does AWIPS2 support decoding Himawari satellite data? #657

Closed: GarryLai closed this issue 7 months ago

GarryLai commented 7 months ago

How can I feed Himawari-9 data into my self-hosted EDEX server? Do I need to do some format conversion? The data are from: https://noaa-himawari9.s3.amazonaws.com/index.html

srcarter3 commented 7 months ago

Hi there,

Our AWIPS does have the ability to decode Himawari data.

What I would suggest is to first get a file or two for manual testing and use our manual endpoint: /awips2/edex/data/manual/

If the data gets decoded you should see something new in the /awips2/edex/data/hdf5/ directory.

You should also be able to "watch" the ingest log file when you copy the file into the manual endpoint: tail -f /awips2/edex/logs/edex-ingest-[today's date].log

And that should give you some indication if it was decoded.
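
In practice that manual test boils down to a couple of commands, roughly like the sketch below (stock install paths assumed; the file name is just an example from the AWS bucket):

```bash
# Drop a sample file into the manual endpoint (file name is only an example).
cp HS_H09_20240207_0820_B01_FLDK_R10_S0110.DAT /awips2/edex/data/manual/

# Watch the ingest log for decode messages or errors as the file is picked up.
tail -f /awips2/edex/logs/edex-ingest-$(date +%Y%m%d).log

# Check whether anything new landed in the HDF5 store in the last few minutes.
find /awips2/edex/data/hdf5 -type f -mmin -10
```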

I have a suspicion it might not decode right away, because the file name/header may or may not be the same as what we expect. If it does not decode, you'll want to edit this file: /awips2/edex/data/utility/common_static/base/distribution/goesr.xml. At the bottom of that file you'll see a section for Himawari imagery that matches file names with these regular expressions:

<regex>^OR_HFD</regex>
<regex>^H8FD.B</regex>

You can try editing those or adding another <regex> </regex> line to match on your data.
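
As a rough sketch of that check (the ^AHI- pattern below is only a guess based on the AWS L2 file naming and may need adjusting):

```bash
# Locate the Himawari patterns quoted above in the distribution file.
grep -n -E 'OR_HFD|H8FD' /awips2/edex/data/utility/common_static/base/distribution/goesr.xml

# If your file names start differently (for example AWS L2 products that begin
# with "AHI-"), an extra line could be added next to the existing ones:
#   <regex>^AHI-</regex>
```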

If the data gets ingested, you should be able to see it in CAVE in the Product Browser.

If that all works, then I'd suggest writing a script that retrieves data from your data source and pushes it to your qpid queue for EDEX to decode automatically. You can see a brief example of this in our script /awips2/ldm/dev/checkFileTime.sh at line 43 -- this is where it puts the new file into the qpid queue for decoding.
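
A bare-bones version of such a poller might look like the sketch below (the staging directory is hypothetical, and the manual-endpoint copy is only a stand-in for the qpid notification used in checkFileTime.sh):

```bash
#!/bin/bash
# Minimal polling sketch: download new files into a staging directory, then
# hand them to EDEX. The staging path is hypothetical; in a real setup the mv
# would be replaced by the qpid notification from checkFileTime.sh (~line 43).
STAGE=/data/himawari-staging
mkdir -p "$STAGE"

# ... fetch new files from your data source into "$STAGE" here ...

for f in "$STAGE"/*; do
    [ -e "$f" ] || continue
    mv "$f" /awips2/edex/data/manual/   # stand-in for the qpid queue push
done
```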

After you have all that working, you may want to modify your CAVE menus for easier display access. Some information for that can be found in our documentation website.

Let me know if that all makes sense?

GarryLai commented 7 months ago

It seems that EDEX only supports the NetCDF format. Where can I download Himawari data in NetCDF format?

ERROR 2024-02-07 12:08:44,635 3990 [Ingest.GOESR-1] GoesrNetcdfDecoder: Unable to open the file HS_H09_20240207_0820_B01_FLDK_R10_S0110.DAT for splitting.
java.io.IOException: java.io.IOException: Cant read /tmp/sbn/manual/goesr/20240207/12/HS_H09_20240207_0820_B01_FLDK_R10_S0110.DAT: not a valid CDM file.
    at ucar.nc2.NetcdfFile.open(NetcdfFile.java:427) ~[cdm-4.6.10.jar:4.6.10]
    at ucar.nc2.NetcdfFile.open(NetcdfFile.java:394) ~[cdm-4.6.10.jar:4.6.10]
    at ucar.nc2.NetcdfFile.open(NetcdfFile.java:381) ~[cdm-4.6.10.jar:4.6.10]
    at ucar.nc2.NetcdfFile.open(NetcdfFile.java:369) ~[cdm-4.6.10.jar:4.6.10]
    at com.raytheon.uf.edex.netcdf.decoder.AbstractNetcdfDecoder.split(AbstractNetcdfDecoder.java:151) ~[com.raytheon.uf.edex.netcdf.jar:na]
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:na]
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:na]
    at java.base/java.lang.reflect.Method.invoke(Method.java:566) ~[na:na]
    at org.apache.camel.support.ObjectHelper.invokeMethodSafe(ObjectHelper.java:372) ~[camel-support-3.7.0.jar:3.7.0]
    at org.apache.camel.component.bean.MethodInfo.invoke(MethodInfo.java:489) ~[camel-bean-3.7.0.jar:3.7.0]
    at org.apache.camel.component.bean.MethodInfo$1.doProceed(MethodInfo.java:311) ~[camel-bean-3.7.0.jar:3.7.0]
    at org.apache.camel.component.bean.MethodInfo$1.proceed(MethodInfo.java:281) ~[camel-bean-3.7.0.jar:3.7.0]
    at org.apache.camel.component.bean.AbstractBeanProcessor.process(AbstractBeanProcessor.java:145) ~[camel-bean-3.7.0.jar:3.7.0]
    at org.apache.camel.impl.engine.DefaultAsyncProcessorAwaitManager.process(DefaultAsyncProcessorAwaitManager.java:83) ~[camel-base-engine-3.7.0.jar:3.7.0]
    at org.apache.camel.support.AsyncProcessorSupport.process(AsyncProcessorSupport.java:41) ~[camel-support-3.7.0.jar:3.7.0]
    at org.apache.camel.language.bean.BeanExpression.invokeBean(BeanExpression.java:347) ~[camel-bean-3.7.0.jar:3.7.0]
    at org.apache.camel.language.bean.BeanExpression.evaluate(BeanExpression.java:202) ~[camel-bean-3.7.0.jar:3.7.0]
    at org.apache.camel.language.bean.BeanExpression.evaluate(BeanExpression.java:214) ~[camel-bean-3.7.0.jar:3.7.0]
    at org.apache.camel.processor.Splitter.createProcessorExchangePairs(Splitter.java:150) ~[camel-core-processor-3.7.0.jar:3.7.0]
    at org.apache.camel.processor.MulticastProcessor.process(MulticastProcessor.java:276) ~[camel-core-processor-3.7.0.jar:3.7.0]
    at org.apache.camel.processor.Splitter.process(Splitter.java:145) ~[camel-core-processor-3.7.0.jar:3.7.0]
    at org.apache.camel.impl.engine.CamelInternalProcessor.process(CamelInternalProcessor.java:312) ~[camel-base-engine-3.7.0.jar:3.7.0]
    at org.apache.camel.processor.Pipeline$PipelineTask.run(Pipeline.java:90) ~[camel-core-processor-3.7.0.jar:3.7.0]
    at org.apache.camel.impl.engine.DefaultReactiveExecutor$Worker.schedule(DefaultReactiveExecutor.java:148) ~[camel-base-engine-3.7.0.jar:3.7.0]
    at org.apache.camel.impl.engine.DefaultReactiveExecutor.scheduleMain(DefaultReactiveExecutor.java:60) ~[camel-base-engine-3.7.0.jar:3.7.0]
    at org.apache.camel.processor.Pipeline.process(Pipeline.java:147) ~[camel-core-processor-3.7.0.jar:3.7.0]
    at org.apache.camel.impl.engine.CamelInternalProcessor.process(CamelInternalProcessor.java:312) ~[camel-base-engine-3.7.0.jar:3.7.0]
    at org.apache.camel.impl.engine.DefaultAsyncProcessorAwaitManager.process(DefaultAsyncProcessorAwaitManager.java:83) ~[camel-base-engine-3.7.0.jar:3.7.0]
    at org.apache.camel.support.AsyncProcessorSupport.process(AsyncProcessorSupport.java:41) ~[camel-support-3.7.0.jar:3.7.0]
    at org.apache.camel.component.jms.EndpointMessageListener.onMessage(EndpointMessageListener.java:130) ~[camel-jms-3.7.0.jar:3.7.0]
    at org.springframework.jms.listener.AbstractMessageListenerContainer.doInvokeListener(AbstractMessageListenerContainer.java:736) ~[spring-jms-5.3.20.jar:5.3.20]
    at org.springframework.jms.listener.AbstractMessageListenerContainer.invokeListener(AbstractMessageListenerContainer.java:696) ~[spring-jms-5.3.20.jar:5.3.20]
    at org.springframework.jms.listener.AbstractMessageListenerContainer.doExecuteListener(AbstractMessageListenerContainer.java:674) ~[spring-jms-5.3.20.jar:5.3.20]
    at org.springframework.jms.listener.AbstractPollingMessageListenerContainer.doReceiveAndExecute(AbstractPollingMessageListenerContainer.java:331) ~[spring-jms-5.3.20.jar:5.3.20]
    at org.springframework.jms.listener.AbstractPollingMessageListenerContainer.receiveAndExecute(AbstractPollingMessageListenerContainer.java:270) ~[spring-jms-5.3.20.jar:5.3.20]
    at org.springframework.jms.listener.DefaultMessageListenerContainer$AsyncMessageListenerInvoker.invokeListener(DefaultMessageListenerContainer.java:1237) ~[spring-jms-5.3.20.jar:5.3.20]
    at org.springframework.jms.listener.DefaultMessageListenerContainer$AsyncMessageListenerInvoker.executeOngoingLoop(DefaultMessageListenerContainer.java:1227) ~[spring-jms-5.3.20.jar:5.3.20]
    at org.springframework.jms.listener.DefaultMessageListenerContainer$AsyncMessageListenerInvoker.run(DefaultMessageListenerContainer.java:1120) ~[spring-jms-5.3.20.jar:5.3.20]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[na:na]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[na:na]
    at java.base/java.lang.Thread.run(Thread.java:829) ~[na:na]
Caused by: java.io.IOException: Cant read /tmp/sbn/manual/goesr/20240207/12/HS_H09_20240207_0820_B01_FLDK_R10_S0110.DAT: not a valid CDM file.
    at ucar.nc2.NetcdfFile.open(NetcdfFile.java:825) ~[cdm-4.6.10.jar:4.6.10]
    at ucar.nc2.NetcdfFile.open(NetcdfFile.java:424) ~[cdm-4.6.10.jar:4.6.10]
    ... 41 common frames omitted
srcarter3 commented 7 months ago

Hi there,

I believe they are available from AWS:

https://noaa-himawari9.s3.amazonaws.com/index.html#AHI-L2-FLDK-Clouds/2024/02/07/0800/

GarryLai commented 7 months ago

Same result, not working:

WARN  2024-02-07 16:10:10,457 7974 [Ingest.GOESR-1] GoesrNetcdfDecoder: No valid records were found in file: AHI-CHGT_v1r1_h09_s202402070800203_e202402070809397_c202402070818204.nc
WARN  2024-02-07 16:15:55,130 7976 [Ingest.GOESR-1] GoesrNetcdfDecoder: No valid records were found in file: AHI-CPHS_v1r1_h09_s202402070800203_e202402070809397_c202402070818204.nc
WARN  2024-02-07 16:15:58,132 7978 [Ingest.GOESR-2] GoesrNetcdfDecoder: No valid records were found in file: AHI-CMSK_v1r1_h09_s202402070800203_e202402070809397_c202402070818204.nc
srcarter3 commented 7 months ago

Just as a test, can you ingest any current GOES data successfully?

GarryLai commented 7 months ago

No, nothing found in /awips2/edex/data/hdf5/

GarryLai commented 7 months ago

What data source does edex-cloud.unidata.ucar.edu use?

srcarter3 commented 7 months ago

You have to pull the data or drop in a manual file from somewhere to see if your EDEX will decode GOES data. I believe it is also available through AWS: https://registry.opendata.aws/noaa-goes/
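
For a quick test, pulling a single GOES-R file with the AWS CLI works well enough, roughly like this (bucket, product, and date path are illustrative):

```bash
# List one hour of GOES-16 full-disk CMIP imagery (2024 day-of-year 038 = Feb 7).
aws s3 ls --no-sign-request s3://noaa-goes16/ABI-L2-CMIPF/2024/038/12/

# Copy one of the listed .nc files into the manual endpoint for a decode test,
# substituting a real file name from the listing, then watch the ingest log
# as described earlier.
aws s3 cp --no-sign-request \
    "s3://noaa-goes16/ABI-L2-CMIPF/2024/038/12/<file-from-the-listing>" \
    /awips2/edex/data/manual/
```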

Our data sources are from LDM feeds here in the United States. Are you an academic institution? Or a private company? Or a weather enthusiast?

GarryLai commented 7 months ago

I found that the correct files are under AHI-L2-FLDK-ISatSS, with the prefix OR_HFD. Thanks for your help. BTW, I am a college student from Taiwan majoring in earth science.
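
For reference, fetching those ISatSS files with the AWS CLI looks roughly like this (the date/hour path is illustrative and assumed to follow the same layout as the rest of the bucket):

```bash
# List one scan time under the ISatSS product (path layout is an assumption).
aws s3 ls --no-sign-request s3://noaa-himawari9/AHI-L2-FLDK-ISatSS/2024/02/07/0800/

# Pull the OR_HFD* files for that time straight into the manual endpoint.
aws s3 cp --no-sign-request --recursive \
    --exclude "*" --include "OR_HFD*" \
    s3://noaa-himawari9/AHI-L2-FLDK-ISatSS/2024/02/07/0800/ \
    /awips2/edex/data/manual/
```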

srcarter3 commented 7 months ago

Okay great, I'm glad you were able to get Himawari data to work!