Closed nishadhka closed 9 years ago
It's been a long time and I think I have already lost my WRF-related data. The spreadsheets are typically exported from GIS apps or an emission inventory system.
Screenshots:
Here is what the .xlsx spreadsheet looks like. If the first line of your spreadsheet is a header/column-label row, put "2" in the "starting rows" field so processing starts from row 2. Each row represents one grid cell:
```
x  y  latitude  longitude  CO_EMISSION  NO3_EMISSION  ...
0  0  -7.55     107.711    0.03         0.005
1  0  -7.55     107.712    0.02         0.004
...
```
- `x` is the grid cell's relative x coordinate in meters (a UTM projection is used during conversion).
- `y` is the grid cell's relative y coordinate in meters.
- `latitude` is the grid cell's y coordinate in degrees.
- `longitude` is the grid cell's x coordinate in degrees.
- `CO_EMISSION` is the emission value. If you're not using mol/km^2/hour as the emission unit, put the conversion factor in the "conversion factor" field in emis_converter.py.

You can add several pollutants at once using the "Add Pollutant" button, and save your configuration with the "save list" button.
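To make the expected layout concrete, here is a minimal sketch that writes the sample grid above and computes one possible conversion factor. The file name, grid values, and the kg-to-mol example are all hypothetical (pyWRFChemEmiss itself expects .xlsx, so a CSV like this would still need converting with a spreadsheet tool or pandas/openpyxl first):

```python
import csv

# Hypothetical sample matching the layout described above.
# The first row is a header, so "starting rows" would be set to 2.
rows = [
    ["x", "y", "latitude", "longitude", "CO_EMISSION"],
    [0, 0, -7.55, 107.711, 0.03],
    [1, 0, -7.55, 107.712, 0.02],
]
with open("sample_emissions.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)

# Example conversion factor: if your CO data were in kg/km^2/hour
# instead of mol/km^2/hour, you would divide grams-per-kg by the
# molar mass of CO (~28.01 g/mol).
CO_MOLAR_MASS = 28.01  # g/mol
factor = 1000.0 / CO_MOLAR_MASS
print(round(factor, 1))  # ~35.7 mol per kg of CO
```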
After converting the data using pyWRFChemEmiss, you can proceed to run WRF-Chem's convert_emiss.exe and continue with the WRF simulation workflow.
Note that I haven't tested this app against the latest version of WRF and WRF-Chem, so it may or may not work.
Best of luck :)
That was a very prompt reply, thanks. Let me work on it and post an update here.
No problem. I haven't had a chance to revisit air quality modelling since I've been busy with work, but I would love to come back to it when I have the time. This app would probably be better rewritten as a web app, so you could just upload your emission data grid, select the modelling parameters, and run the model (WRF-Chem or others) on an on-demand Amazon EC2 cluster.
I don't think any such web application has been operationalized. It seems all WRF and air quality model routines were developed without considering the availability of cloud computing and web applications. Your app AQMWEB is working in that direction, isn't it?
Yeah, but the web stack used in that application is already out of date (it's been two years), so I think it's better to start fresh and take advantage of recent advances in web technology. Also, the trend in the HPC community is to use Amazon Cluster GPU instances (see: https://aws.amazon.com/hpc/) instead of building your own physical cluster. I think many agencies are exploring this area for their CFD research, but I'm not sure whether any of them use WRF.
Wouldn't an Amazon cluster be costly for daily routine usage such as operational real-time WRF-Chem modelling? I am thinking of a physical cluster based on single-board computers such as the Radxa, which has a quad-core processor; that would be better cost-wise and energy-wise. Since the modelling routine is set for a limited area and the load is predefined, the dynamic scaling that AWS offers doesn't arise in this use case.
The program works with MIX Asian emission inventory data supplied in netCDF format. A Python script was used to convert the netCDF into CSV, then XLSX, which was fed into pyWRFChemEmiss; it produced the files wrfem_00to12z_d01 and wrfem_12to24z_d01 with a size of 49 MB for my model domain specified in namelist.wps. It took about four hours for four nested domains. These were read by WRF-Chem 3.4.1's convert_emiss.exe, which generated wrfchemi_d01 with a file size of 1.1 MB for the next real.exe step. The size of the wrfchemi_d01 generated from the wrfem_00to12z_d01 file threw doubts on its correctness; this is reported in another issue.
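The netCDF-to-spreadsheet step described above can be sketched roughly as follows. This is a minimal stand-in: the nested list takes the place of a 2-D netCDF emission variable (with real MIX data you would read it via the netCDF4 or xarray package), and the variable names, grid origin, and spacing are all hypothetical.

```python
import csv

# Hypothetical stand-in for a 2-D emission field; with real data this
# would come from something like Dataset("mix.nc")["CO_emission"][:].
emis = [
    [0.03, 0.02],  # grid row 0
    [0.05, 0.04],  # grid row 1
]
lat0, lon0 = -7.55, 107.711  # hypothetical grid origin (degrees)
dlat, dlon = 0.001, 0.001    # hypothetical grid spacing (degrees)

# Flatten the grid into one spreadsheet row per cell, matching the
# x / y / latitude / longitude / CO_EMISSION layout pyWRFChemEmiss expects.
with open("mix_emissions.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["x", "y", "latitude", "longitude", "CO_EMISSION"])
    for j, row in enumerate(emis):
        for i, value in enumerate(row):
            w.writerow([i, j, lat0 + j * dlat, lon0 + i * dlon, value])
```

The resulting CSV would then be saved as .xlsx before being loaded into the program.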
Thanks, emiss_converter.py works perfectly. Please provide a sample .xlsx file; it would be very useful for learning and applying the program to different emission sources.