This is a quick guide to using the E3SM diagnostics (e3sm_diags) package.

1) activate environment

This step activates the latest E3SM Unified environment:

  • for Chrysalis
    • source /lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_chrysalis.sh
  • for Compy:
    • source /share/apps/E3SM/conda_envs/load_latest_e3sm_unified_compy.sh

2) install PreAndPostProcessingScripts

This tool package contains the utilities needed to create E3SM-diags input files:

git clone git@github.com:E3SM-Project/PreAndPostProcessingScripts.git

3) create a configuration file

A sample file is mysimulation.cfg. I usually only turn on [climo], [ts], and [glb], and run E3SM-diags separately in the next step. Observation paths needed in the configuration file:

  • for Chrysalis
    • /lcrc/soft/climate/e3sm_diags_data/obs_for_e3sm_diags/
  • for Compy
    • /compyfs/e3sm_diags_data/obs_for_e3sm_diags/
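
The section layout of the configuration file can be sketched as below. Only the section names [climo], [ts], and [glb] come from this guide; all keys inside each section are omitted here, so copy them from the sample mysimulation.cfg shipped with PreAndPostProcessingScripts.

```ini
; Layout sketch only -- section names from this guide, keys omitted.
; Fill each section in from the sample mysimulation.cfg in the repository.
[climo]
; ... climatology-generation options (turned on) ...

[ts]
; ... time-series options (turned on) ...

[glb]
; ... global-mean options (turned on) ...
```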

4) generate E3SM-diags input files

This step converts the E3SM outputs to climatology files (e.g. climo.nc) for E3SM-diags:

python PreAndPostProcessingScripts/postprocessing_bundle/v2/post.py -c mysimulation.cfg
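
Once post.py finishes, the clim directory should contain one climatology file per season. A small sketch for checking that all seasons are present; the filename pattern (season name embedded, ending in _climo.nc) is an assumption based on the climo.nc suffix mentioned above, so adjust it to match what post.py actually produces:

```python
from pathlib import Path

# Seasons that e3sm_diags runs by default when param.seasons is not set.
SEASONS = ["ANN", "DJF", "MAM", "JJA", "SON"]

def missing_seasons(clim_dir):
    """Return the seasons that have no *_climo.nc file in clim_dir.

    Assumes filenames embed the season name, e.g. CASE_ANN_..._climo.nc
    (hypothetical pattern -- verify against your actual output).
    """
    names = [p.name for p in Path(clim_dir).glob("*_climo.nc")]
    return [s for s in SEASONS if not any(f"_{s}_" in n for n in names)]
```

Running missing_seasons on the clim/40yr directory before step 6 can save a failed diagnostics run caused by an incomplete post-processing step.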

5) install E3SM_diag

If you are using the E3SM Unified environment, the package is already installed.

6) create a “run_e3sm_diags.py” script and execute it

import os
from e3sm_diags.parameter.core_parameter import CoreParameter
from e3sm_diags.parameter.area_mean_time_series_parameter import AreaMeanTimeSeriesParameter
from e3sm_diags.run import runner

param = CoreParameter()

param.reference_data_path = '/lcrc/soft/climate/e3sm_diags_data/obs_for_e3sm_diags/climatology/'
param.test_data_path = '/lcrc/group/e3sm/ac.tian.zhou/E3SMv2/v2.LR.amip_trigrid_irrimip_irr_e3/post/atm/180x360_aave/clim/40yr/'
param.test_name = 'v2.LR.amip_trigrid_irrimip_irr_e3'
#param.seasons = ["ANN"]  # if commented out, all seasons ["ANN", "DJF", "MAM", "JJA", "SON"] will run

prefix = '/lcrc/group/e3sm/public_html/diagnostic_output/ac.tian.zhou/irrmip'
param.results_dir = os.path.join(prefix, 'spinup')
# Use the following if running in parallel:
#param.multiprocessing = True
#param.num_workers = 40

# Set specific parameters for new sets
ts_param = AreaMeanTimeSeriesParameter()
ts_param.reference_data_path = '/lcrc/soft/climate/e3sm_diags_data/obs_for_e3sm_diags/time-series/'
ts_param.test_data_path = '/lcrc/group/e3sm/ac.tian.zhou/E3SMv2/v2.LR.amip_trigrid_irrimip_irr_e3/post/atm/180x360_aave/ts/monthly/40yr/'
ts_param.test_name = 'e3sm_spinup'
ts_param.results_dir = os.path.join(prefix, 'spinup')
ts_param.start_yr = '1860'
ts_param.end_yr = '1899'

# Use below to run all core sets of diags:
runner.sets_to_run = ['lat_lon','zonal_mean_xy', 'zonal_mean_2d', 'polar', 'cosp_histogram', 'meridional_mean_2d', 'area_mean_time_series']
runner.run_diags([param, ts_param])

7) batch job

We can create a script (diags.bash) and submit it to run a batch job:

#!/bin/bash -l
#SBATCH --job-name=diags
#SBATCH --output=diags.o%j
#SBATCH --account=e3sm
#SBATCH --nodes=1
#SBATCH --time=01:00:00

source /lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_chrysalis.sh
python run_e3sm_diags.py --multiprocessing --num_workers=32

Then run sbatch diags.bash.

8) online visualization

The previous step creates a set of files under the html path, which is the public portal for the machine you are running on; it maps to an HTTP address for public access:

  • for Chrysalis
    • html_path: /lcrc/group/e3sm/public_html/diagnostic_output/ac.tian.zhou/
    • web_address: https://web.lcrc.anl.gov/public/e3sm/diagnostic_output/ac.tian.zhou/
  • for Compy
    • html_path: /compyfs/www/zhou014/
    • web_address: https://compy-dtn.pnl.gov/zhou014/
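
The mapping from html_path to web_address is a plain prefix substitution, so a results URL can be derived from param.results_dir. A sketch using only the portal prefixes listed above:

```python
# Map a diagnostics output directory to its public URL by swapping the
# filesystem prefix for the web prefix (pairs taken from the list above).
PORTALS = {
    "/lcrc/group/e3sm/public_html/": "https://web.lcrc.anl.gov/public/e3sm/",
    "/compyfs/www/": "https://compy-dtn.pnl.gov/",
}

def to_web_address(html_path):
    """Return the public URL for a path under a known portal directory."""
    for fs_prefix, web_prefix in PORTALS.items():
        if html_path.startswith(fs_prefix):
            return web_prefix + html_path[len(fs_prefix):]
    raise ValueError(f"no known portal prefix for {html_path}")
```

For example, the results_dir used in step 6, '/lcrc/group/e3sm/public_html/diagnostic_output/ac.tian.zhou/irrmip/spinup', maps to https://web.lcrc.anl.gov/public/e3sm/diagnostic_output/ac.tian.zhou/irrmip/spinup.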