ATM419 WRF Real


Real-data WRF: Setup and run
ATM 419
Spring 2016
Fovell

References
ARW users guide (PDF available)
http://www2.mmm.ucar.edu/wrf/users/docs/user_guide_V3/contents.html

Technical description of WRF (PDF)
http://www2.mmm.ucar.edu/wrf/users/docs/arw_v3.pdf

NetCDF operators (NCO) home page

Terms
Parent model = gridded data used for initialization and boundary conditions
GFS/FNL, NAM, RAP/HRRR, reanalyses (NARR, CFSR, NNRP, ERA-Interim, etc.), other WRF runs

WPS = WRF Preprocessing System (consisting of the geogrid, ungrib and metgrid programs)

Case study
One 36-km resolution, 54 x 48 point domain centered over Kansas
48 h simulation, initialized from GFS at 00Z on 3/13/2016
Verify near-surface fields (T, Td, RH at 2 m; 10-m wind; SLP) against ASOS stations using the Model Evaluation Tools (MET) package
See the provided script for implementing this case study

Steps in a real WRF run

Geogrid (geogrid.exe)
Sets up the domain (and nests, if applicable)
Only redone if the domain is altered

Ungrib (ungrib.exe)
Unpacks parent model data
Requires the correct variable table (Vtable) translator

Metgrid (metgrid.exe)
Interpolates pressure-level parent model data to the WRF model grid

Real (real.exe)
Creates initial and boundary condition files for WRF on a model vertical grid of your choice

WRF (wrf.exe)
Compiled as em_real
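The steps above can be sketched as a shell command sequence (a sketch, not the class script: it assumes the links and Vtable.* copies made by make_all_links.csh, and the GRIB file location is hypothetical):

```shell
# Sketch of the full real-data sequence, run from the case directory.
./geogrid.exe                                      # set up the domain -> geo_em.d01.nc
./link_grib.csh /path/to/parent/model/grib/files   # link raw GRIB as GRIBFILE.AAA, ...
ln -sf Vtable.GFS Vtable                           # translator must match the parent model
./ungrib.exe                                       # unpack -> intermediate FILE:* files
./metgrid.exe                                      # horizontal interpolation -> met_em.d01.*.nc
sbatch -p snow submit_real                         # IC/BC on the model vertical grid
sbatch -p snow submit_wrf                          # the simulation itself -> wrfout_d01_* files
```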

Preliminaries
WRF-ARW needs to be compiled as em_real

Namelists are
namelist.wps (used for geogrid.exe, ungrib.exe, metgrid.exe)
namelist.input (used for real.exe, wrf.exe)

Create a directory called KANSAS, copy into it, and unpack this file:
/network/rit/lab/atm419lab/KANSAS/SETUP.TAR

Allocate 1 cpu on snow

make_all_links.csh
In addition to linking to needed programs and support files, this shell script also:
Creates directories called geogrid and metgrid, and places *.TBL files in them. We do not need to alter those files at this time.
Links to a newer version of NCL (ncl62)
Copies several variable translation tables, called Vtable.*

Geogrid
Do the geogrid section of the script
Creates geo_em.d01.nc

[Figure: domain plot, output from plotgrids.ncl]

namelist.wps
&share
wrf_core='ARW',
max_dom=1,
start_date='2016-03-13_00:00:00','2016-03-13_00:00:00',
end_date='2016-03-15_00:00:00','2016-03-15_00:00:00',
interval_seconds=10800,
io_form_geogrid=2,
opt_output_from_geogrid_path='./',
debug_level=0
/

NOTES:
One domain (max_dom=1), so the second columns of start_date and end_date do not matter
interval_seconds is the time resolution of the parent model data, in seconds
(for GFS, we have 3-hourly data, so 10800 sec)
(NAM is hourly to 36 h, 3-hourly thereafter)
(RAP is hourly to 18 h)

namelist.wps
&geogrid
parent_id=1,1,2,
parent_grid_ratio=1,3,3,
i_parent_start=1,82,100,
j_parent_start=1,82,36,
s_we=1,1,1,
e_we=54,214,772,
s_sn=1,1,1,
e_sn=48,196,610,
geog_data_res='usgs_lakes+30s','usgs_lakes+30s','usgs_lakes+30s',
dx=36000.,
dy=36000.,
map_proj='lambert',
ref_lat=38.,
ref_lon=-100.,
truelat1=38.,
truelat2=38.,
stand_lon=-100.,
geog_data_path='/network/rit/lab/fovelllab_rit/GEOG_V371',
opt_geogrid_tbl_path='geogrid/'
/
NOTES:
Again, in this case, only the first column matters
Our domain is 54 x 48 at 36 km resolution
We're using the USGS landuse database; 30 sec (about 1 km) is its resolution
Lambert projection is standard for modest-sized domains in midlatitudes.
Use polar stereographic for high latitudes, Mercator for tropical domains.

[Figure: Lambert conformal projection (from Wikipedia)]
Shape and accuracy depend somewhat on the true latitudes (standard parallels)

ref_lat=38.,
ref_lon=-100.,
truelat1=38.,
truelat2=38.,
stand_lon=-100.,

At the true latitude, there is no map distortion; i.e., the map factor is 1.0
For relatively small domains, the true latitudes can be the same (as here)
Map factor = (horizontal grid size)/(actual distance on the Earth)

Map factors
You specify ∆x (and ∆y) in namelist.input
Map factor m determines the actual grid spacing

So when m > 1.0, your actual grid spacing is smaller than the specified ∆x. This puts stress on your time step.
When m < 1.0, you have less resolution than you thought you had
Stay as close to 1.0 as possible
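As a quick check of what this means for this domain (values from these slides; a sketch using awk for the floating-point division):

```shell
# Actual grid spacing = specified dx / map factor m
dx=36000        # meters, from namelist.wps
m=1.009169      # the domain's largest MAPFAC_M
awk -v dx="$dx" -v m="$m" 'BEGIN { printf "%.0f m\n", dx / m }'   # prints: 35673 m
```

That is the ~35.7 km smallest actual ∆x quoted for this domain.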

Use the NetCDF operators (NCO) to look in the geo_em.d01.nc file

ncwa -y max -v MAPFAC_M geo_em.d01.nc junk.nc
ncdump junk.nc
MAPFAC_M = 1.009169 ;
[i.e., the smallest ∆x is 35.7 km]

ncwa -y min -v MAPFAC_M geo_em.d01.nc junk2.nc
ncdump junk2.nc
MAPFAC_M = 0.9999999 ;
[vs. the maximum of 1.009]

Use ncview (or IDV) to peek at the geo_em.d01.nc file

ncview geo_em.d01.nc
[plot 2D variable MAPFAC_M]

MAPFAC_M increases from ~1.0 to 1.01 away from the central (true) latitude.
You need to keep the map factors close to 1.0.

[Figure: MAPFAC_M viewed in IDV]

Ungrib
In this step, we link to the parent model grids and unpack them into intermediate format files
It is crucial to select the proper variable table (Vtable)
There is a different Vtable for each parent model
make_all_links.csh copies a few Vtable versions
The file must be named Vtable
Other variable tables are found in
/network/rit/home/atm419/WPSV371_ATM419/ungrib/Variable_Tables

Follow the ungrib part of the script
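Concretely, the link-and-unpack step might look like this (a sketch; the Vtable directory is the class path quoted above, while the GRIB file location is hypothetical):

```shell
# Pick the GFS translator; ungrib.exe insists the file be named exactly "Vtable".
ln -sf /network/rit/home/atm419/WPSV371_ATM419/ungrib/Variable_Tables/Vtable.GFS Vtable

# Link the raw parent-model GRIB files as GRIBFILE.AAA, GRIBFILE.AAB, ...
./link_grib.csh /path/to/gfs/grib2/files/*

# Unpack into intermediate-format FILE:* files between start_date and end_date
./ungrib.exe
```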

wgrib2 GRIBFILE.AAA | more
1:0:d=2016031300:UGRD:planetary boundary layer:anl:
2:558813:d=2016031300:VGRD:planetary boundary layer:anl:
3:1093579:d=2016031300:VRATE:planetary boundary layer:anl:
4:1642644:d=2016031300:GUST:surface:anl:
5:2218981:d=2016031300:HGT:10 mb:anl:
6:2813514:d=2016031300:TMP:10 mb:anl:
7:3067356:d=2016031300:RH:10 mb:anl:
8:3351328:d=2016031300:UGRD:10 mb:anl:
9:3634572:d=2016031300:VGRD:10 mb:anl:
10:3964764:d=2016031300:ABSV:10 mb:anl:
11:4325387:d=2016031300:O3MR:10 mb:anl:
12:4692016:d=2016031300:HGT:20 mb:anl:
13:5385492:d=2016031300:TMP:20 mb:anl:
14:5640115:d=2016031300:RH:20 mb:anl:
15:5728322:d=2016031300:UGRD:20 mb:anl:
16:6012559:d=2016031300:VGRD:20 mb:anl:
17:6346812:d=2016031300:ABSV:20 mb:anl:

[GFS model grids in GRIB2 format, on pressure levels]

namelist.wps
&share
wrf_core='ARW',
max_dom=1,
start_date='2016-03-13_00:00:00','2016-03-13_00:00:00',
end_date='2016-03-15_00:00:00','2016-03-15_00:00:00',
interval_seconds=10800,
io_form_geogrid=2,
opt_output_from_geogrid_path='./',
debug_level=0
/
&ungrib
out_format='WPS',
prefix='FILE',
/
&metgrid
fg_name='FILE',
io_form_metgrid=2,
/

NOTES:
Execution of ungrib.exe unpacks the parent model grids into a set of files named by the prefix (here, FILE:)
The program looks for files between the start and end dates, at the interval specified by interval_seconds.
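One way to see which intermediate files ungrib.exe should produce for this case (a sketch assuming GNU date; the times and interval come from this case's namelist):

```shell
# List the expected FILE:* names from start_date to end_date, every interval_seconds.
start="2016-03-13 00:00"
end="2016-03-15 00:00"
step=10800                              # interval_seconds
t=$(date -u -d "$start" +%s)
t_end=$(date -u -d "$end" +%s)
while [ "$t" -le "$t_end" ]; do
  date -u -d "@$t" +"FILE:%Y-%m-%d_%H"  # e.g. FILE:2016-03-13_00
  t=$((t + step))
done
# 17 names in all: FILE:2016-03-13_00 through FILE:2016-03-15_00
```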

Plotting intermediate format files

ncl62 plotfmt.ncl 'filename="FILE:2016-03-13_00"'

Metgrid
Follow the metgrid portion of the script
In this step, we interpolate the intermediate format files onto the WRF horizontal grid
Creates files called met_em*

Files may be viewed with ncview (poorly) or IDV (better)
Use ncdump on any of the met_em* files to get the # of vertical levels and # of soil levels (see next slide)

ncdump met_em.d01.2016-03-13_00:00:00.nc | more

netcdf met_em.d01.2016-03-13_00\:00\:00 {
dimensions:
	Time = UNLIMITED ; // (1 currently)
	DateStrLen = 19 ;
	west_east = 53 ;
	south_north = 47 ;
	num_metgrid_levels = 27 ;
	num_st_layers = 4 ;
	num_sm_layers = 4 ;
	south_north_stag = 48 ;
	west_east_stag = 54 ;
	z-dimension0132 = 132 ;
	z-dimension0012 = 12 ;
	z-dimension0016 = 16 ;
	z-dimension0028 = 28 ;

This parent model data source has 27 vertical atmospheric levels and 4 soil temperature and soil moisture layers (st and sm). These vary among parent model sources.
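To pull out just the dimensions namelist.input will need, a grep over the header is enough (a sketch; the filename is this case's first met_em file):

```shell
# num_metgrid_levels, plus the soil layer counts (st and sm)
ncdump -h met_em.d01.2016-03-13_00:00:00.nc \
  | grep -E 'num_metgrid_levels|num_st_layers|num_sm_layers'
```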

SLP at initial time in domain, as seen with IDV

Running real.exe and wrf.exe: Batch scripts

Batch scripts
Running real-data WRF (real.exe and wrf.exe) is often too resource-intensive to execute with srun from the command line.
As an alternative, we'll run them as batch jobs on the snow cluster. SETUP.TAR provided two files: submit_real and submit_wrf.
Both are presently configured to request 8 cpus on a single node
No need to edit these scripts at this time

submit_real
#!/bin/bash
# Job name:
#SBATCH --job-name=atm419
#SBATCH -n 8
#SBATCH -N 1
#SBATCH --mem-per-cpu=7G
#SBATCH -p snow
#SBATCH -o sbatch.out
#SBATCH -e sbatch.err.out
source /network/rit/home/atm419/.bash_profile
st_tm="$(date +%s)"
echo "running real"
srun -N 1 -n 8 -o real.srun.out ./real.exe

[The -n 8 and -N 1 in the #SBATCH lines and in the srun command need to match. DO NOT CHANGE the source line.]

Steps for running real.exe

Submit the job to Snow:
sbatch -p snow submit_real

To check on your job status, use
squeue -u yournetid

When the job disappears from the queue, check the tail of the rsl.out.0000 file with trsl
Result of real.exe: creation of the files wrfbdy_d01 and wrfinput_d01.

Steps for running wrf.exe

Submit the job to Snow:
sbatch -p snow submit_wrf

To check on your job status, use
squeue -u yournetid

When the job disappears from the queue, check the tail of the rsl.out.0000 file with trsl
Result of wrf.exe: creation of the wrfout_d01 files
[We can combine the real and wrf jobs in a single batch file.]

[Figure: terrain, plotted with terrain.gs]

Look inside
namelist.input

namelist.input
&time_control
run_days=2,
run_hours=0,
run_minutes=0,
run_seconds=0,
start_year=2016,2016,
start_month=03,03,
start_day=13,13,
start_hour=00,00,
start_minute=00,00,
start_second=00,00,
end_year=2016,2016,
end_month=03,03,
end_day=15,15,
end_hour=00,00,
end_minute=00,00,
end_second=00,00,
interval_seconds=10800,
input_from_file=.true.,.true.,
history_interval=60,60,
frames_per_outfile=1,1,
/
[Again, only the first column matters since max_dom is 1.]
NOTES:
We will run for 2 days, starting and ending at the times shown
interval_seconds should match the namelist.wps setting
One history file per history time (frames_per_outfile = 1)
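A quick sanity check on how many wrfout files these settings imply (shell arithmetic; the values come from the namelist above):

```shell
run_hours=48          # run_days=2
history_interval=60   # minutes
# one frame per file (frames_per_outfile=1), plus the initial time
nfiles=$(( run_hours * 60 / history_interval + 1 ))
echo "$nfiles"        # prints: 49
```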

namelist.input
&domains
time_step=180,
time_step_fract_num=0,
time_step_fract_den=1,
max_dom=1,
e_we=54,214,
e_sn=48,196,
e_vert=57,57,
p_top_requested=5000,
num_metgrid_levels=27,
num_metgrid_soil_levels=4,
dx=36000.,4000.,
dy=36000.,4000.,
grid_id=1,2,
parent_id=0,1,
i_parent_start=1,82,
j_parent_start=1,82,
parent_grid_ratio=1,3,
parent_time_step_ratio=1,3,
feedback=1,
/
[Again, only the first column matters since max_dom is 1.]
NOTES:
Domain size must match namelist.wps!
We are requesting 57 vertical levels in real.exe.
Get the num_metgrid* info from the met_em* files via ncdump.

namelist.input
&physics
mp_physics=4,
ra_lw_physics=4,
ra_sw_physics=4,
radt=20,
sf_surface_physics=2,   ! Noah LSM
sf_sfclay_physics=1,    ! Monin-Obukhov surface layer
bl_pbl_physics=1,       ! YSU PBL
bldt=0,
num_soil_layers=4,
num_land_cat=28,
cu_physics=1,
cudt=5,
cugd_avedx=1,
isfflx=1,
ifsnow=0,
icloud=1,
do_radar_ref=1,
surface_input_source=1,
mp_zero_out=2,
mp_zero_out_thresh=1.e8,
/
[Again, only the first column matters since max_dom is 1.]

NOTES:
Many microphysics options are available
Many options are available for the surface, surface layer and PBL schemes
- surface layer (sfclay) and PBL (bl_pbl) schemes usually come as pairs

How PBL and surface layer schemes can mix & match
Some available PBL schemes:
YSU: pbl = 1, sfclay = 1
MYJ: pbl = 2, sfclay = 2
MYNN: pbl = 5, sfclay = 1, 2 or 5
ACM2: pbl = 7, sfclay = 7
Some land surface models:
Noah: surface = 2, soil = 4
NoahMP: surface = 4, soil = 4
TD: surface = 1, soil = 5
RUC: surface = 3, soil = 6
PX: surface = 7, soil = 2
CLM: surface = 5, soil = 10

pbl = bl_pbl_physics
sfclay = sf_sfclay_physics
surface = sf_surface_physics
soil = num_soil_layers
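For example, switching this run from YSU to MYJ means changing the pair together in namelist.input (a sketch; only the two lines shown change, and sfclay = 2 is the Eta similarity surface layer that pairs with MYJ):

```
&physics
 bl_pbl_physics=2,      ! MYJ PBL
 sf_sfclay_physics=2,   ! Eta similarity surface layer
 ...
/
```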

[Figure: wind, plotted with wind.gs (t=13)]

You might also like