
Help with Regridding New Canopy Variables in sfc_climo to Regional C793 Grid #855

drnimbusrain opened this issue Sep 25, 2023 · 13 comments


@drnimbusrain

drnimbusrain commented Sep 25, 2023

Dear @GeorgeGayno-NOAA

I am working on adding new sfc_climo files/variables for forest canopies to UFS_UTILS for a project we have on AQM and canopy impacts.

On Mike Barlage's advice, I have used your recent work on adding the soil_color datasets to UFS_UTILS as an analog for adding my changes to our fork of UFS_UTILS.

The UFS_UTILS code builds and runs OK with my changes, and I am using a local copy of the four 1 km global raw canopy files I am testing on Hera, found here: /scratch2/NAGAPE/arl/Patrick.C.Campbell/UFS/fix/sfc_climo/20221017/canopy*.nc

I was wondering if you could point me to how to use my updated UFS_UTILS and the driver scripts to generate such regridded files on the FV3 grid at C793 resolution for the AQM application.

Example C793 files for AQM Tile 7 on Hera: /scratch2/NAGAPE/arl/Patrick.C.Campbell/UFS/DOMAIN_DATA/AQM_NA_13km/

I suppose I am not finding the correct grid definitions for my desired AQM C793 grid in the driver scripts, for example on Hera:
Driver script: /scratch2/NAGAPE/arl/Patrick.C.Campbell/UFS_UTILS/driver_scripts/driver_grid.hera.sh
Example Log file: /scratch2/NAGAPE/arl/Patrick.C.Campbell/UFS_UTILS/driver_scripts/log.fv3_grid_driver

Can you provide some guidance on how to bring in the AQM C793 grid, as in my example files above, so that UFS_UTILS can regrid our 1 km global canopy files for UFS?

Thank you!!

@GeorgeGayno-NOAA
Collaborator

If I understand your question, you want to map your new data to the C793 grid using the existing 'grid' and 'oro' files in /scratch2/NAGAPE/arl/Patrick.C.Campbell/UFS/DOMAIN_DATA/AQM_NA_13km?

@drnimbusrain
Author

Hi @GeorgeGayno-NOAA Thanks for your reply. Yes, I have the new data and updated UFS_UTILS to be able to read it, and would like to map it to the C793 grid files already present. I thought I could use UFS_UTILS to do this, either using the driver scripts offline to pre-generate them (as in my directory above), or using UFS_UTILS in the workflow (i.e., RUN_TASK_MAKE_SFC_CLIMO) itself.

I can run the driver scripts on the new data, but wondered how to specify this particular AQM C793 domain.

Any guidance is much appreciated.

@GeorgeGayno-NOAA
Collaborator

There is a utility to run just the sfc_climo_gen step here:
https://github.com/noaa-oar-arl/UFS_UTILS/tree/feature/aqm_canopy/util/sfc_climo_gen

You will need to edit this file for the location of your 'grid' and 'oro' files:
https://github.com/noaa-oar-arl/UFS_UTILS/blob/feature/aqm_canopy/util/sfc_climo_gen/sfc_gen.sh
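
As a rough sketch, the handful of variables to change near the top of sfc_gen.sh are along these lines (the paths shown are placeholders, not your actual directories):

export res=793
export GRIDTYPE=regional
HALO=4
FIX_REG=/path/to/a/scratch/staging/dir          # regional grid links get staged here
export FIX_FV3=/path/to/your/C793/grid/and/oro/files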

Then, you would run it using the Hera driver script. Let me know if you can't get it working.

Note, we just merged updates that corrected some problems in the VIIRS vegetation data (See #821). So, you may want to merge these changes to your branch to use the new data. As you probably know, permanent land ice points are determined by the vegetation type, and several surface fields are adjusted to be consistent with the properties of land ice.
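
If it helps, one way to pull those updates into your fork's branch is a standard upstream merge (the remote and branch names here are assumptions based on your fork):

git remote add upstream https://github.com/ufs-community/UFS_UTILS.git   # skip if the remote already exists
git fetch upstream
git checkout feature/aqm_canopy
git merge upstream/develop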

@drnimbusrain
Author

@GeorgeGayno-NOAA OK, thank you. I have just synced our fork's branch to merge in your upstream develop branch, including your fixes from #843. It compiles successfully with my updated canopy variable additions.

Thank you for pointing me to the sfc_climo_gen utility; I was unaware of it. I will try that now.

@drnimbusrain
Author

drnimbusrain commented Sep 27, 2023

@GeorgeGayno-NOAA

I think I got it to work OK, see my script:
/scratch2/NAGAPE/arl/Patrick.C.Campbell/UFS_UTILS/util/sfc_climo_gen/sfc_gen.sh

However, there are inconsistencies between the nx and ny dimensions and the halo value I set.

For example, I set HALO=4 in the sfc_climo script and get the following output file, with a "halo5" suffix on the filename:
/scratch1/NCEPDEV/stmp4/Patrick.C.Campbell/sfc.C793/C793.canopy_clumping_index.tile7.halo5.nc

dimensions:
        nx = 808 ;
        ny = 552 ;

However, these dimensions correspond to halo4 in the files I had previously, e.g.:
/scratch2/NAGAPE/arl/Patrick.C.Campbell/UFS/DOMAIN_DATA/AQM_NA_13km/C793.slope_type.tile7.halo4.nc

dimensions:
        nx = 808 ;
        ny = 552 ;

The sfc_climo_gen utility also outputs a "halo0" file:
/scratch1/NCEPDEV/stmp4/Patrick.C.Campbell/sfc.C793/C793.canopy_clumping_index.tile7.halo0.nc

dimensions:
        nx = 798 ;
        ny = 542 ;

But these dimensions are also different from the "halo0" files I had previously, e.g.:
/scratch2/NAGAPE/arl/Patrick.C.Campbell/UFS/DOMAIN_DATA/AQM_NA_13km/C793.slope_type.tile7.halo0.nc

dimensions:
        nx = 800 ;
        ny = 544 ;

Can you help me determine the correct halo settings so the output matches my previous AQM_NA_13km files?
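
If I am reading the numbers right, it looks like a halo of 5 is being applied instead of the 4 I requested (just my arithmetic on the ncdump output above, so please correct me if I am off):

halo0/halo5 pair from the utility:   nx: 798 + 2*5 = 808,   ny: 542 + 2*5 = 552
previous halo0/halo4 pair:           nx: 800 + 2*4 = 808,   ny: 544 + 2*4 = 552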

Thanks again!

@GeorgeGayno-NOAA
Collaborator


I will take a look.

@GeorgeGayno-NOAA
Collaborator

@drnimbusrain I was able to get it to work. You have a different set of files than what the standard UFS_UTILS grid generation scripts would create. That makes me wonder if UFS_UTILS is producing extra files that are no longer used by the regional model. Anyway, to get it to work for your set of files, I updated sfc_gen.sh as follows:

+++ b/util/sfc_climo_gen/sfc_gen.sh
@@ -65,11 +65,11 @@

 set -x

-export res=768.mx025
+export res=793

-#HALO=4
-#export GRIDTYPE=regional
-#FIX_REG=/lfs/h2/emc/stmp/$LOGNAME/fix.reg
+HALO=4
+export GRIDTYPE=regional
+FIX_REG=/scratch2/NCEPDEV/stmp1/George.Gayno/test

 export veg_type_src="viirs.v3.igbp.30s"

@@ -78,7 +78,7 @@ export soil_type_src="bnu.v3.30s"
 export WORK_DIR=/scratch1/NCEPDEV/stmp2/$LOGNAME/work.sfc
 export SAVE_DIR=/scratch1/NCEPDEV/stmp2/$LOGNAME/sfc.C${res}

-export FIX_FV3=${BASE_DIR}/fix/orog/C${res}
+export FIX_FV3=/scratch2/NAGAPE/arl/Patrick.C.Campbell/UFS/DOMAIN_DATA/AQM_NA_13km

 # Requires much more resources when true. On hera, use 6 nodes,
 # 12 tasks per node. On WCOSS2, use 5 nodes, 12 tasks per node.
@@ -95,10 +95,10 @@ if [[ "$GRIDTYPE" = "regional" ]]; then
   mkdir -p $FIX_REG
   ln -fs $FIX_FV3/C${res}_grid.tile7.halo${HALO}.nc $FIX_REG/C${res}_grid.tile7.halo${HALO}.nc
   ln -fs $FIX_FV3/C${res}_oro_data.tile7.halo${HALO}.nc $FIX_REG/C${res}_oro_data.tile7.nc
-  ln -fs $FIX_FV3/C${res}_mosaic.nc $FIX_REG/C${res}_mosaic.nc
+  ln -fs $FIX_FV3/C${res}_mosaic.halo${HALO}.nc $FIX_REG/C${res}_mosaic.nc
   export mosaic_file=$FIX_REG/C${res}_mosaic.nc
   export FIX_FV3=$FIX_REG
-  HALO=$(( $HALO + 1 ))
+# HALO=$(( $HALO + 1 ))
   export HALO

I placed a copy of that script here: /scratch2/NCEPDEV/stmp1/George.Gayno
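
As a quick sanity check after rerunning with those edits, the halo4 output should match the existing AQM_NA_13km dimensions; something along these lines (the output path and field name are just examples from above, assuming the default SAVE_DIR) should report nx = 808 and ny = 552:

ncdump -h /scratch1/NCEPDEV/stmp2/$LOGNAME/sfc.C793/C793.canopy_clumping_index.tile7.halo4.nc | grep -E 'nx = |ny = '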

@drnimbusrain
Author

Thank you @GeorgeGayno-NOAA, and sorry, I should have spotted that the HALO + 1 line needed to be commented out. Closing the issue.

@drnimbusrain
Author

drnimbusrain commented Oct 5, 2023

@GeorgeGayno-NOAA Thank you again for your help with UFS_UTILS and with getting my new canopy fix files onto the C793 grid: /scratch2/NAGAPE/arl/Patrick.C.Campbell/UFS/DOMAIN_DATA/AQM_NA_13km/C793.canopy*

If I may, I would like to ask another question/clarification about getting these new fields into the model (i.e., fv3atm and ccpp-physics).

I have modified the relevant UFS-SRW-App code (e.g., the fixed files mapping and Python namelist generation; see noaa-oar-arl/ufs-srweather-app@af9973f) and the fv3atm code (e.g., GFS typedefs, diagnostics, and GFS meta files) to get these fix file paths written to the main input.nml and to identify the new canopy files in the GFS typedefs and meta files. I mainly used rdlai, xlaixy, and other sfc_climo fields as analogs to follow. It seems that the LAI can still be read from a sfc_climo input file for the lsm_ruc model (rather than using LAI tables, as for Noah and Noah-MP).

Thus, I have started to modify the ccpp-physics code for the routines and meta files where I would like to use these canopy sfc_climo files, but I wondered if you could give me guidance on where these sfc_climo fix files (see the Hera location above) are actually read in and populated into the ccpp-physics code. Is ccpp-physics/physics/sfcsub.F the code that actually reads in the file and assigns it to the correct model variable (e.g., maybe using slope_type or another current field as an analog)?

I know I am reopening this issue, but it is not necessarily UFS_UTILS related. So if you could at least point me to who might be better to ask, I would appreciate that as well. Thank you!

@drnimbusrain reopened this Oct 5, 2023
@GeorgeGayno-NOAA
Collaborator

Are these new fields static or will they get updated as the model runs?

@drnimbusrain
Author

@GeorgeGayno-NOAA Great question! Some are static, and some vary monthly/daily:

C793.canopy_forest_height.tile7.halo0.nc (static)
C793.canopy_forest_fraction.tile7.halo0.nc (static)
C793.canopy_clumping_index.tile7.halo0.nc (monthly)
C793.canopy_leaf_area_index.tile7.halo0.nc (monthly right now, but soon we will update to daily)

@GeorgeGayno-NOAA
Collaborator

Instead of going back and forth in this issue, it might be better if we have a quick telecon so I understand what you want to do. We could invite someone from the land team if needed.

@drnimbusrain
Author

@GeorgeGayno-NOAA I definitely agree. Please feel free to email me or set up something on the calendar at Patrick.C.Campbell@noaa.gov

Would sometime early next week, say Mon - Wed (Oct 9-11) work for you?

I will be out for workshop/conference travel from Oct 12 - 20th, and my schedule will be much more limited.
