Set Up and Run WRF
(ARW-Ideal, ARW-real, and NMM)


         Wei Wang
      NCAR/ESSL/MMM


WRF System Flowchart

[Flowchart: WPS -> initialization programs in WRFV3
(ideal.exe, real.exe, or real_nmm.exe) -> wrf.exe]
Outline
• Running WRF code
   –   Before you run..
   –   Running idealized case
   –   Running ARW real-data case
   –   Running NMM real-data case
• Basic runtime options for a single domain run
  (namelist)
• Check output
• Simple troubleshooting
This talk is complementary to ‘Nesting’ talk later.


Before You Run ..
• Check and make sure the appropriate executables have been
  created in the WRFV3/main/ directory (a quick check is
  sketched below):
      For ARW: ideal.exe, real.exe, wrf.exe, ndown.exe
      For NMM: real_nmm.exe, wrf.exe
• If you are running a real-data case, be sure that the files
  from WPS are correctly generated:
   – met_em.d01.* for ARW, or
   – met_nmm.d01.* for NMM
• Prepare namelist.input for runtime options.
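
  For example, a quick way to verify the executables exist (a
  sketch; the path assumes the standard source tree):

     ls -l WRFV3/main/*.exe    # should list the executables named above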

WRF test case directories
You have these choices in WRFV3/test/
 (made at compile time):
   nmm_real            3d real-data    (NMM)
   em_real             3d real-data    (ARW)
   em_quarter_ss       3d ideal        (ARW)
   em_b_wave           3d ideal        (ARW)
   em_les              3d ideal        (ARW)
   em_heldsuarez       3d ideal        (ARW)
   em_hill2d_x         2d ideal        (ARW)
   em_squall2d_x       2d ideal        (ARW)
   em_squall2d_y       2d ideal        (ARW)
   em_grav2d_x         2d ideal        (ARW)
   em_seabreeze2d_x    2d ideal        (ARW)


Steps to Run
1. cd to run/ or one of the test case
   directories
2. Link or copy WPS output files to the
   directory for real-data cases
3. Edit namelist.input file for the
   appropriate grid and times of the case
4. Run initialization program (ideal.exe,
  real.exe, or real_nmm.exe)
5. Run model executable, wrf.exe
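
A minimal end-to-end sketch of these steps for an ARW real-data
case (the WPS path and processor count are illustrative):

   cd WRFV3/test/em_real
   ln -s ../../../WPS/met_em.d01.* .
   # edit namelist.input for the grid and times of the case
   mpirun -np 4 ./real.exe
   mpirun -np 4 ./wrf.exe &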

WRFV3/run directory
    README.namelist
    LANDUSE.TBL
    ETAMPNEW_DATA
    GENPARM.TBL
    RRTM_DATA
    SOILPARM.TBL
    VEGPARM.TBL
    urban_param.tbl
    tr49t67
    tr49t85
    tr67t85
    gribmap.txt
    grib2map.tbl
    …. (a few more)

These files are model physics data files: they are used either
to initialize physics variables or to make physics computations
more efficient.



WRFV3/run directory after compile
An example after an ARW real-case compile:

LANDUSE.TBL
ETAMPNEW_DATA
GENPARM.TBL
RRTM_DATA
SOILPARM.TBL
VEGPARM.TBL
urban_param.tbl
tr49t67
tr49t85
tr67t85
gribmap.txt
grib2map.tbl
namelist.input -> ../test/em_real/namelist.input
real.exe -> ../main/real.exe
wrf.exe -> ../main/wrf.exe
ndown.exe -> ../main/ndown.exe
…. (a few more)
Running an Idealized Case
       ARW only




Running an Idealized Case
• If you have compiled an ideal case, you should have:
      ideal.exe - ideal case initialization program
      wrf.exe - model executable
• These executables are linked to:
      WRFV3/run
  and
      WRFV3/test/em_<test-case>

→ One can go to either directory to run.


Running an Idealized Case

Go to the desired ideal test case directory: e.g.
    cd test/em_quarter_ss

If there is a run_me_first.csh script in the
   directory, run it first - this links physics data
   files to the current directory:
      ./run_me_first.csh


Running an Idealized Case

Then run the ideal initialization program:
   ./ideal.exe

The input to this program is typically a sounding file (a file
  named input_sounding), or a pre-defined 3D input
  (e.g. input_jet in the em_b_wave case).

Running ideal.exe creates the WRF initial condition file:
  wrfinput_d01

Note that a wrfbdy file is not needed for idealized cases.
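
Putting these steps together (a sketch for the em_quarter_ss case):

   cd WRFV3/test/em_quarter_ss
   ./run_me_first.csh    # link physics data files
   ./ideal.exe           # reads input_sounding, writes wrfinput_d01
   ls -l wrfinput_d01    # confirm the initial condition file exists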

Running an Idealized Case

• To run the model interactively, type
  ./wrf.exe >& wrf.out &
  for a single-processor (serial) or SMP run, or
  mpirun -np N ./wrf.exe &
  for an MPI run (where N is the number of processors
  requested)
• Successful running of the model executable will
  create a model history file called wrfout_d01_<date>
   e.g. wrfout_d01_0001-01-01_00:00:00


Running an Idealized Case

• Edit namelist.input file to change options.
• For your own case, you may provide a different
  sounding.
• You may also edit
  dyn_em/module_initialize_<case>.F to
  change other aspects of the initialization.
Note:
• For the 2D cases and the baroclinic wave case, ideal.exe
  must be run serially
• For all 2D cases, wrf.exe must be run serially or
  with SMP
Running ARW Real-Data Case




Running ARW Real-Data Case
• If you have compiled the em_real case, you should
  have:
      real.exe - real-data initialization program
      wrf.exe - model executable
      ndown.exe - program for doing one-way nesting
• These executables are linked to:
      WRFV3/run
  and
      WRFV3/test/em_real

→ One can go to either directory to run.

WRFV3/test/em_real directory
LANDUSE.TBL -> ../../run/LANDUSE.TBL
ETAMPNEW_DATA -> ../../run/ETAMPNEW_DATA
GENPARM.TBL -> ../../run/GENPARM.TBL
RRTM_DATA -> ../../run/RRTM_DATA
SOILPARM.TBL -> ../../run/SOILPARM.TBL
VEGPARM.TBL -> ../../run/VEGPARM.TBL
urban_param.tbl -> ../../run/urban_param.tbl
tr49t67 -> ../../run/tr49t67
tr49t85 -> ../../run/tr49t85
tr67t85 -> ../../run/tr67t85
gribmap.txt -> ../../run/gribmap.txt
grib2map.tbl -> ../../run/grib2map.tbl
namelist.input       - requires editing
real.exe -> ../../main/real.exe
wrf.exe -> ../../main/wrf.exe
ndown.exe -> ../../main/ndown.exe
…. (a few more)
Running WRF ARW Real-data Cases

• One must successfully run WPS, and create
  met_em.* files for more than one time period

• Link or copy WPS output files to the run
  directory:
  cd test/em_real
  ln -s ../../../WPS/met_em.d01.* .
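
  A quick sanity check that more than one time period is
  present (a sketch):

     ls -1 met_em.d01.* | wc -l    # should print 2 or more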


Running WRF ARW Real-data Cases

• Edit the namelist.input file for runtime options (at
  minimum, one must edit &time_control for start, end
  and integration times, and &domains for grid dimensions)

• Run the real-data initialization program:
   ./real.exe, if compiled for serial / SMP, or
   mpirun -np N ./real.exe, or
   mpirun -machinefile file -np N ./real.exe
     for an MPI job,
  where N is the number of processors requested, and
  file contains a list of CPUs for the MPI job
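
  For example (processor count illustrative; check the end of
  the log afterwards for a success message):

     mpirun -np 4 ./real.exe
     tail rsl.out.0000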

Running WRF ARW Real-data Cases
• Successfully running this program will create the model
  initial and boundary files:

     wrfinput_d01 - single time level data at the
                    model's start time
     wrfbdy_d01   - multiple time level data at the
                    lateral boundary, and only for
                    domain 1
Running WRF ARW Real-data Cases
• Run the model executable by typing:
    ./wrf.exe >& wrf.out &
    or
    mpirun -np N ./wrf.exe &
• Successfully running the model will create a model
  history file:
     wrfout_d01_2005-08-28_00:00:00

  and a restart file if restart_interval is set to a time
  within the range of the forecast time:
     wrfrst_d01_2005-08-28_12:00:00

Running NMM Real-Data Case




Running NMM Real-Data Case
• If you have compiled the nmm_real case, you should have:
      real_nmm.exe - NMM real-data initialization
                         program
      wrf.exe - NMM model executable

• These executables are linked to:
    WRFV3/run
  and
    WRFV3/test/nmm_real

→ One can go to either directory to run.

WRFV3/test/nmm_real directory
LANDUSE.TBL -> ../../run/LANDUSE.TBL
ETAMPNEW_DATA -> ../../run/ETAMPNEW_DATA
GENPARM.TBL -> ../../run/GENPARM.TBL
RRTM_DATA -> ../../run/RRTM_DATA
SOILPARM.TBL -> ../../run/SOILPARM.TBL
VEGPARM.TBL -> ../../run/VEGPARM.TBL
urban_param.tbl -> ../../run/urban_param.tbl
tr49t67 -> ../../run/tr49t67
tr49t85 -> ../../run/tr49t85
tr67t85 -> ../../run/tr67t85
gribmap.txt -> ../../run/gribmap.txt
grib2map.tbl -> ../../run/grib2map.tbl
namelist.input       - requires editing
real_nmm.exe -> ../../main/real_nmm.exe
wrf.exe -> ../../main/wrf.exe

Running WRF NMM Real-data Cases

• One must successfully run WPS, and create
  met_nmm.* files for more than one time period

• Link or copy WPS output files to the run
  directory:
  cd test/nmm_real
  ln -s ../../../WPS/met_nmm.d01.* .


Running WRF NMM Real-data Cases
• Edit namelist.input file for runtime options (at
  minimum, one must edit &time_control for start,
  end and integration time, and &domains for grid
  dimensions)

• Run the real-data initialization program (MPI only):
   mpirun -np N ./real_nmm.exe

• Successfully running this program will create model
  initial and boundary files:
     wrfinput_d01
     wrfbdy_d01
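
  A minimal sketch (processor count illustrative):

     mpirun -np 4 ./real_nmm.exe
     ls -l wrfinput_d01 wrfbdy_d01    # both should exist on success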
Running WRF NMM Real-data Cases
• Run the model executable by typing (MPI only):
   mpirun -np N ./wrf.exe

• Successfully running the model will create a model
  history file:
     wrfout_d01_2005-08-28_00:00:00

  and a restart file if restart_interval is set to a time
  within the range of the forecast time:
     wrfrst_d01_2005-08-28_12:00:00


Basic namelist Options




What is a namelist?
• A Fortran namelist contains a list of runtime options
  for the code to read in during its execution. Use of a
  namelist allows one to change runtime configuration
  without the need to recompile the source code.
• A Fortran 90 namelist has a very specific format, so edit
  with care:
   &namelist-record - start
   /                - end
• As a general rule:
   Multiple columns: domain-dependent
   Single column: value valid for all domains

&time_control
run_days            =   0,
run_hours           =   24,
run_minutes         =   0,
run_seconds         =   0,
start_year          =   2000,   2000,   2000,
start_month         =   01,     01,     01,
start_day           =   24,     24,     24,
start_hour          =   12,     12,     12,
start_minute        =   00,     00,     00,
start_second        =   00,     00,     00,
end_year            =   2000,   2000,   2000,
end_month           =   01,     01,     01,
end_day             =   25,     25,     25,
end_hour            =   12,     12,     12,
end_minute          =   00,     00,     00,
end_second          =   00,     00,     00,
interval_seconds    =   21600
history_interval    =   180,    60,     60,
frames_per_outfile  =   1000,   1000,   1000,
restart_interval    =   360,

(run_* entries are domain 1 options; the second and third
columns of the multi-column entries are nest options)



Notes on &time_control
• run_* time variables:
  – Model simulation length: wrf.exe and domain 1
    only
• start_* and end_* time variables:
  – Program real will use WPS output between these
    times to produce the lateral (and lower) boundary file
  – They can also be used to specify the start and end
    times of the simulation for the coarse grid.




Notes on &time_control
• interval_seconds:
   – Time interval (in seconds) between WPS output times,
     and the LBC update frequency
• history_interval:
   – Time interval in minutes at which history output is
     written
   – The time stamp in a history file name is the time
     when the history file is first written, and multiple
     time periods may be written in one file. e.g. a
     history file for domain 1 that is first written for 1200
     UTC Jan 24 2000 is
     wrfout_d01_2000-01-24_12:00:00

Notes on &time_control
• frames_per_outfile:
   – Number of history times written to one file.
• restart_interval:
   – Time interval in minutes at which a restart file is
     written.
   – By default, a restart file is not written at hour 0.
   – A restart file contains only one time level of data, and
     its valid time is in its file name, e.g. a restart file for
     domain 1 that is valid at 0000 UTC Jan 25 2000 is

     wrfrst_d01_2000-01-25_00:00:00


&time_control

       io_form_history    =   2,
       io_form_restart    =   2,
       io_form_input      =   2,
       io_form_boundary   =   2,
       debug_level        =   0,

IO format options:
   = 1, binary
   = 2, netCDF (most common)
   = 4, PHDF5
   = 5, GRIB 1
   = 10, GRIB 2

io_form_restart = 102:
   write in patch sizes - fast for large grids and
   useful for restart files

Debug print control (debug_level):
   increasing values give more prints.


&domains
time_step             =   180
time_step_fract_num   =   0,
time_step_fract_den   =   1,
max_dom               =   1,
s_we                  =   1,     1,     1,
e_we                  =   74,    112,   94,
s_sn                  =   1,     1,     1,
e_sn                  =   61,    97,    91,
s_vert                =   1,     1,     1,
e_vert                =   28,    28,    28,
num_metgrid_levels    =   21
dx                    =   30000, 10000, 3333,
dy                    =   30000, 10000, 3333,
eta_levels            =   1.0, 0.996, 0.99, 0.98, … 0.0
p_top_requested       =   5000,

(the second and third columns are nest options)
Notes on &domains
• time_step, time_step_fract_num, time_step_fract_den:
   – Time step for model integration in seconds.
   – A fractional time step is specified as separate integer
     numerator and denominator.
   – Recommended: ARW: 6xDX; NMM: 2.25xDX (DX is the grid
     distance in km); e.g. DX = 30 km gives a time step of
     about 180 s for ARW.
• s_we, s_sn, s_vert:
   – Starting indices in X, Y, and Z direction; 1 for domain 1.
• e_we, e_sn, e_vert:
   – Model grid dimensions (staggered) in X, Y and Z directions.
• num_metgrid_levels:
   – Number of metgrid (input) data levels.
• dx, dy:
   – grid distances in meters for ARW; in degrees for NMM.

Notes on &domains
• p_top_requested:
   – Pressure value (in Pa) at the model top.
   – Constrained by the available data from WPS.
   – Default is 5000 Pa (50 hPa)

• eta_levels:
   – Specify your own model levels from 1.0 to 0.0.
   – If not specified, program real will calculate a set of levels for
     you

• ptsgm (NMM only):
   – Pressure level (Pa) at which the WRF-NMM hybrid
     coordinate transitions from sigma to pressure


Where do I start?

• Always start with a namelist template provided in a
  test case directory, whether it is an ideal case, ARW or
  NMM.
   – A number of namelist templates are provided in the
     test/<test-case>/ directories
• Use the documentation to guide the modification of the
  namelist values:
   – run/README.namelist
   – User's Guide, Chapter 5
   – A full list of namelist variables and their default values can
     be found in the Registry files: Registry.EM (ARW), Registry.NMM
     and registry.io_boilerplate (IO options, shared)

To run a job in a different directory..

• The directories run/ and test/<case>/ are
  convenient places to run from, but you do not
  have to use them.
• Copy or link the contents of these directories
  (physics data files, wrf input and boundary files, the
  wrf namelist, and the executables) to another directory,
  and you should be able to run a job anywhere on your
  system (see the sketch below).
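
A sketch of setting up such a directory (all paths are illustrative):

   mkdir -p /scratch/my_wrf_run
   cd /scratch/my_wrf_run
   ln -s /path/to/WRFV3/run/*.TBL .       # physics data files
   ln -s /path/to/WRFV3/run/RRTM_DATA .   # (plus the other data files listed earlier)
   ln -s /path/to/WRFV3/main/wrf.exe .
   cp /path/to/WRFV3/test/em_real/namelist.input .
   ln -s /path/to/wrfinput_d01 .          # from a previous real.exe run
   ln -s /path/to/wrfbdy_d01 .
   mpirun -np 4 ./wrf.exe &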


Check Output




Output After a Model Run

• Standard out/error files:
  wrf.out, or rsl.* files
• Model history file(s):
  wrfout_d01_<date>
• Model restart file(s), optional
  wrfrst_d01_<date>



Output from a multi-processor run

The standard out and error will go to the
  following files for an MPI run:
 mpirun -np 4 ./wrf.exe

   rsl.out.0000          rsl.error.0000
   rsl.out.0001          rsl.error.0001
   rsl.out.0002          rsl.error.0002
   rsl.out.0003          rsl.error.0003


There is one pair of files for each processor
 requested.
What to Look for in a standard out File?

Check the run log file by typing
   tail wrf.out, or
   tail rsl.out.0000


You should see the following if the job is
  successfully completed:
   wrf: SUCCESS COMPLETE WRF
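
Equivalently, grep finds the message without opening the file
(a sketch):

   grep "SUCCESS COMPLETE WRF" rsl.out.0000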




How to Check Model History File?

• Use ncdump (a header dump is also sketched after this list):
  ncdump -v Times wrfout_d01_<date>
  to check output times, or
  ncdump -v U wrfout_d01_<date>
  to check a particular variable (U)

• Use ncview or ncBrowse (great tools!)
• Use post-processing tools (see talks later)
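
A header dump gives a quick overview of the file (a sketch; the
date in the file name is illustrative):

   ncdump -h wrfout_d01_2000-01-24_12:00:00   # dimensions, variables, attributes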


What is in a wrf.out or rsl file?
•    A print of namelist options
•    Time taken to compute one model step:
Timing   for   main:   time   2000-01-24_12:03:00   on    domain     1:      3.25000   elapsed   seconds.
Timing   for   main:   time   2000-01-24_12:06:00   on    domain     1:      1.50000   elapsed   seconds.
Timing   for   main:   time   2000-01-24_12:09:00   on    domain     1:      1.50000   elapsed   seconds.
Timing   for   main:   time   2000-01-24_12:12:00   on    domain     1:      1.55000   elapsed   seconds.


•    Time taken to write history and restart file:
Timing for Writing wrfout_d01_2000-01-24_18:00:00 for domain                    1:     0.14000 elapsed seconds.


•    Any model error prints: (example from ARW run)
5 points exceeded cfl=2 in domain 1 at time 4.200000 MAX AT i,j,k: 123 48 3
     cfl,w,d(eta)= 4.165821
     -> an indication that the model has become numerically unstable


Simple Troubleshooting




Often-seen runtime problems
– module_configure: initial_config: error reading
  namelist: &dynamics

  > Typos or erroneous namelist variables
    exist in namelist record &dynamics in
    namelist.input file

– input_wrf.F: SIZE MISMATCH: namelist
  ide,jde,num_metgrid_levels= 70 61 27 ; input
  data ide,jde,num_metgrid_levels= 74 61 27
  > Grid dimensions in the namelist do not match
    the input data
Often-seen runtime problems
– Segmentation fault (core dumped)
  > Often typing ‘unlimit’ or ‘ulimit -s
    unlimited’ or equivalent can help when
    this happens quickly in a run.
— 121 points exceeded cfl=2 in domain 1 at time
  4.200000 MAX AT i,j,k: 123 48 3 cfl,w,d(eta)=
  4.165821
  > Model becomes unstable due to various
    reasons. If it happens soon after the start
    time, check input data, and/or reduce time
    step.
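
  To locate these messages across all processors (a sketch):

     grep "points exceeded cfl" rsl.error.*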
References

• Information on compiling and running WRF, and
  a more extensive list of namelist options and
  their definitions / explanations, can be found in
  the ARW and NMM User's Guide, Chapter 5
• Also see ‘Nesting Setup and Run’ talk.



