Hadley Centre Coupled Model 3 (HadCM3) deployed on HECToR (High-End Computing Terascale Resource) Supercomputer
This computation involved the Hadley Centre Coupled Model 3 (HadCM3) deployed on the HECToR (High-End Computing Terascale Resource) supercomputer. HadCM3, the Hadley Centre Coupled Model version 3, is a coupled atmosphere-ocean general circulation model (AOGCM) developed at the Hadley Centre in the United Kingdom. It was one of the major models used in the IPCC Third Assessment Report in 2001.
Unlike earlier AOGCMs at the Hadley Centre and elsewhere (including its predecessor HadCM2), HadCM3 does not need flux adjustment (additional "artificial" heat and freshwater fluxes at the ocean surface) to produce an almost unbiased simulation: HadCM3 has been run for over a thousand years, showing little drift in its surface climate.
HadCM3 is composed of two components: the atmospheric model HadAM3 and the ocean model (which includes a sea ice model). Simulations often use a 360-day calendar, where each month is 30 days.
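The 360-day calendar mentioned above (every month exactly 30 days) makes date arithmetic trivial compared with the Gregorian calendar. A minimal illustrative sketch (not model code; function names are invented for illustration):

```python
# Illustrative sketch of date arithmetic in a 360-day model calendar,
# where every month has exactly 30 days. Not taken from HadCM3 code.

def day_of_year_360(month: int, day: int) -> int:
    """Day of year (1-360) for a 1-based month/day in a 360-day calendar."""
    assert 1 <= month <= 12 and 1 <= day <= 30
    return (month - 1) * 30 + day

def add_days_360(year: int, month: int, day: int, ndays: int):
    """Advance a 360-day-calendar date by ndays; returns (year, month, day)."""
    total = year * 360 + day_of_year_360(month, day) - 1 + ndays
    year, rem = divmod(total, 360)
    return year, rem // 30 + 1, rem % 30 + 1
```

For example, advancing 30 December by one day rolls over to 1 January of the next year, since "December" also has only 30 days.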
The model's horizontal and vertical representation, resolution, and other important characteristics (as of 2006) are as follows:
1. resolution: T63, L18
2. numerical scheme/grid:
- Grid - Spectral for Temperature, Vorticity, Divergence, Surface Pressure
- Model top - Top level at 4.5 hPa.
- Vertical coordinate - hybrid sigma-pressure.
- Number of layers above 200 hPa - 5
- Number of layers below 850 hPa - 4
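In a hybrid sigma-pressure coordinate, the pressure of level k is p_k = a_k + b_k * p_s: pure sigma (terrain-following) near the surface, blending to pure pressure at the model top. A sketch under illustrative assumptions; the a_k/b_k values below are made-up placeholders, not the model's coefficients:

```python
# Illustrative hybrid sigma-pressure coordinate: p_k = a_k + b_k * p_s.
# Near the top b_k = 0 (pure pressure); near the surface b_k ~ 1 (pure sigma).
# The coefficient values are hypothetical placeholders.

def hybrid_pressure(a_k, b_k, p_surf):
    """Full-level pressures (Pa) from hybrid coefficients and surface pressure."""
    return [a + b * p_surf for a, b in zip(a_k, b_k)]

# Hypothetical 4-level example, listed top to bottom. The top level has
# b = 0 and a = 450 Pa, consistent with the 4.5 hPa model top stated above.
a_k = [450.0, 5000.0, 2000.0, 0.0]   # Pa
b_k = [0.0, 0.2, 0.8, 0.995]
levels = hybrid_pressure(a_k, b_k, 101325.0)
```

Note that the top-level pressure is independent of surface pressure (b = 0), while the lowest level tracks the surface.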
3. list of prognostic variables:
- Temperature, Vorticity, Divergence, Surface Pressure, Atmospheric moisture (vapour, liquid, and ice)
4. Major parameterizations
a. cloud scheme
Stratiform cloud scheme (liquid, ice) following Rotstayn (1997, 1998) and Rotstayn et al. (2000). The convection scheme (see below) also produces a convective cloud fraction.
b. convection
UKMO scheme: Gregory and Rowntree (1990)
c. boundary layer
Follows Louis (1979) with Smith (1990) enhancements.
d. SW, LW radiation
GFDL based (SW: Lacis and Hansen 1974; LW: Fels and Schwarzkopf 1975, 1981; Fels 1985; Schwarzkopf and Fels 1991).
e. any special handling of wind and temperature at top of model
Temperature at top of model for radiation code by simple extrapolation (using vertical coordinate) from top 2 levels.
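The extrapolation described above can be sketched as a simple linear extrapolation in the vertical coordinate from the top two model levels. An illustrative sketch (names and coordinate values are assumptions, not model code):

```python
# Sketch of the "simple extrapolation" described above: estimate the
# temperature at the model top for the radiation code by linear
# extrapolation, in the vertical coordinate, from the top two levels.
# All names and example values are illustrative.

def extrapolate_top_temperature(eta1, t1, eta2, t2, eta_top):
    """Linearly extrapolate T from levels (eta1, t1) and (eta2, t2) to eta_top."""
    slope = (t1 - t2) / (eta1 - eta2)
    return t1 + slope * (eta_top - eta1)
```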
B. ocean
1. resolution
- Horizontal: matches the T63 atmospheric grid zonally but with twice the meridional resolution: ~1.875 degrees EW by approximately 0.84 degrees NS.
- Vertical: 31 levels, spacing increasing with depth, from 10 m at the surface to 400 m in the deep ocean.
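A vertical grid whose spacing grows from 10 m at the surface to 400 m at depth can be generated in several ways; a geometric stretching is one common choice. A sketch under that assumption (the actual level set is not reproduced here):

```python
# Illustrative stretched ocean vertical grid: 31 layers whose thickness
# grows from 10 m at the surface to 400 m at depth. The geometric
# stretching is an assumption for illustration, not the model's level set.

def stretched_layers(n=31, dz_top=10.0, dz_bottom=400.0):
    """Layer thicknesses (m) growing geometrically from dz_top to dz_bottom."""
    r = (dz_bottom / dz_top) ** (1.0 / (n - 1))
    return [dz_top * r ** k for k in range(n)]

dz = stretched_layers()
total_depth = sum(dz)  # total depth implied by this illustrative spacing
```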
2. numerical scheme/grid
MOM2.2 (Pacanowski 1996) model code using an Arakawa B grid with leapfrog time stepping. The QUICKER scheme is used for tracer advection (see Leonard 1979 and Pacanowski 1996). The vertical coordinate is layers of prescribed depth (rigid lid). The freshwater flux at the surface is derived from P-E, river runoff, and ice brine rejection terms, then converted to a virtual salt flux.
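Under a rigid lid, a surface freshwater flux cannot change the ocean volume, so it is applied instead as an equivalent "virtual" salt flux using a reference salinity. A sketch of that conversion (variable names and densities are illustrative):

```python
# Sketch of the virtual-salt-flux conversion described above: a freshwater
# flux (P - E + runoff + brine terms) is converted to an equivalent salt
# flux using a reference salinity. Names and constants are illustrative.

S_REF = 0.035    # reference salinity (kg salt per kg seawater)
RHO_W = 1000.0   # freshwater density, kg m^-3

def virtual_salt_flux(fw_flux_m_per_s):
    """Salt flux (kg m^-2 s^-1) equivalent to a freshwater flux (m s^-1).
    Positive freshwater input (dilution) yields a negative salt flux."""
    return -S_REF * RHO_W * fw_flux_m_per_s
```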
3. list of prognostic variables and tracers
Velocities U and V, Temperature and Salinity.
a. eddy parameterization
Adiabatic eddy-induced transport via the Griffies (1998) implementation of the Gent and McWilliams (1990) scheme.
Vertical mixing of tracers via a modified form of the Bryan and Lewis (1979) profile. In tropical regions (15 degrees S to 15 degrees N), the profile is modified so that values decrease to near zero at the surface.
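The Bryan and Lewis (1979) profile specifies a background vertical diffusivity that increases with depth through an arctangent transition. A sketch with commonly quoted coefficient values (used here for illustration; the model's modified values are not reproduced), plus a simple near-surface ramp standing in for the tropical modification described above:

```python
# Illustrative Bryan-Lewis (1979)-style background vertical diffusivity,
# kappa(z) = kappa0 + (dkappa / pi) * atan(lam * (z - z0)),
# with a hypothetical near-surface ramp between 15S and 15N so the value
# decreases to near zero at the surface. Coefficients are illustrative.
import math

def bryan_lewis_kappa(z, kappa0=0.8e-4, dkappa=1.05e-4, z0=2500.0, lam=4.5e-3):
    """Background vertical tracer diffusivity (m^2 s^-1) at depth z (m)."""
    return kappa0 + (dkappa / math.pi) * math.atan(lam * (z - z0))

def tropical_kappa(z, lat, z_ramp=100.0):
    """Apply a linear near-surface ramp to ~zero within 15 degrees of the equator."""
    kappa = bryan_lewis_kappa(z)
    if abs(lat) <= 15.0 and z < z_ramp:
        kappa *= z / z_ramp  # decreases toward zero at the surface
    return kappa
```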
b. bottom boundary layer treatment and/or sill overflow treatment
c. mixed-layer treatment
Integer Power vertical mixing scheme (Wilson 2000, 2002), based on the Pacanowski and Philander (1981) scheme.
d. sunlight penetration
Paulson and Simpson (1977) dual exponential formulation, using turbidity specified in terms of Jerlov (1976) water types (spatially varying, time invariant).
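In the Paulson and Simpson (1977) dual-exponential formulation, the fraction of net surface shortwave flux remaining at depth z is a weighted sum of two exponentials, with the partition R and attenuation depths set by the local Jerlov water type. A sketch using the commonly quoted Jerlov type I coefficients for illustration:

```python
# Sketch of the Paulson and Simpson (1977) dual-exponential solar
# penetration: I(z)/I(0) = R*exp(-z/zeta1) + (1-R)*exp(-z/zeta2).
# The defaults are the commonly quoted Jerlov (1976) type I values,
# used here for illustration only.
import math

def solar_fraction(z, R=0.58, zeta1=0.35, zeta2=23.0):
    """Fraction of net surface shortwave flux penetrating to depth z (m)."""
    return R * math.exp(-z / zeta1) + (1.0 - R) * math.exp(-z / zeta2)
```

The short-scale term absorbs most of the flux in the top metre or so, while the long-scale term lets a fraction penetrate tens of metres.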
e. tidal mixing
f. river mouth mixing
g. mixing isolated seas with the ocean
Achieved by lateral mixing (at same latitude) to nearest world ocean grid point at a prescribed rate (e.g. Hudson Bay, Mediterranean).
h. treatment of North Pole "singularity"
Artificial island with Fourier filtering for latitudes close to North Pole.
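Zonal Fourier filtering removes high zonal wavenumbers on grid rows near the pole, where converging meridians would otherwise force a tiny timestep. A self-contained sketch (the naive DFT and the cutoff choice are illustrative, not the model's implementation):

```python
# Sketch of zonal Fourier filtering near the pole: transform a periodic
# row of grid values, zero all zonal wavenumbers above a cutoff, and
# transform back. Naive O(n^2) DFT for clarity; cutoff choice illustrative.
import cmath

def fourier_filter_row(values, keep):
    """Keep only zonal wavenumbers |k| <= keep in a periodic row of values."""
    n = len(values)
    coeffs = [sum(values[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                  for j in range(n)) / n for k in range(n)]
    out = []
    for j in range(n):
        s = sum(coeffs[k] * cmath.exp(2j * cmath.pi * k * j / n)
                for k in range(n) if min(k, n - k) <= keep)
        out.append(s.real)
    return out
```

With keep = 0 only the zonal mean survives (the limit used right at the pole row); larger cutoffs retain progressively finer zonal structure.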
C. sea ice
1. horizontal resolution, number of layers, number of thickness categories
Coded as part of the atmospheric model, at the same resolution - equivalent T63 (1.875 degrees EW by approximately 1.875 degrees NS). One or two ice layers depending upon ice depth. Snow on ice is treated as an additional layer. Deep snow is converted to white ice.
2. numerical scheme/grid
- Arakawa B-grid (T points match spectral AGCM grid points), NCAR-based advection scheme in terms of the divergence field (Briegleb). Leapfrog timestepping.
3. list of prognostic variables
Ice depth; Ice temperature(s) (1 or 2 layers); Snow depth; Snow temperature (1 layer); Brine heat reservoir; Leads fraction; Temperature of mixed layer in leads and under ice.
4. physics
Thermodynamics - Semtner (1976) 3-layer thermodynamics
Rheology - Flato and Hibler (1990), O'Farrell (1998)
Leads - Computed fraction, with lateral ice growth/melting
Snow on ice - Has internal layer temperature, surface temperature, and ice-snow interface temperature.
5. treatment of salinity in ice
Ice salinity set at 0.01
6. brine rejection treatment
Ice formation/melting gives brine rejection/uptake (relative to a 0.035 reference salinity).
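Because the ice carries salinity 0.01 while seawater is referenced at 0.035, growing ice rejects the salinity difference into the ocean and melting ice takes it back up. A sketch of that balance (names and the ice density are illustrative):

```python
# Sketch of the brine rejection/uptake described above: when ice of
# salinity S_ICE grows or melts at rate dh/dt, the ocean receives a salt
# flux proportional to the difference from the 0.035 reference salinity.
# Variable names and the ice density are illustrative.

RHO_ICE = 900.0   # kg m^-3, ice density (illustrative value)
S_REF = 0.035     # reference seawater salinity (mass fraction)
S_ICE = 0.01      # ice salinity used by the model

def brine_salt_flux(dh_dt):
    """Ocean salt flux (kg m^-2 s^-1) from ice growth rate dh/dt (m s^-1).
    Growth (dh/dt > 0) rejects brine into the ocean (positive flux)."""
    return (S_REF - S_ICE) * RHO_ICE * dh_dt
```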
7. treatment of the North Pole "singularity"
Artificial island at North Pole. Ice motion filtered towards slab rotation near pole.
D. Land / ice sheets
MOSES II land surface scheme (see Cox et al. 1999 and Essery et al. 2003, as given in the Johns et al. 2006 reference). Same resolution as the atmospheric model.
For details of land/vegetation scheme see technical report (Gordon et al, 2002).
The Land model has T63 resolution, and 6 soil layers (temperature and water/ice).
There are 9 soil types and 13 vegetation types (one soil type and one vegetation type per grid point). No tiling, except that each land point has a prescribed amount of vegetation and bare ground, with separate flux calculations for each.
1. treatment of frozen soil and permafrost
Water in soil is allowed to freeze.
2. treatment of surface runoff and river routing scheme
Surface runoff is taken instantly to the ocean by a downslope method.
3. treatment of snow cover on land
Snow on land has 3 layers (temperatures, snow densities), total snow mass, and an age-dependent albedo.
4. description of water storage model and drainage
Each grid point has an associated field capacity. Drainage occurs from the lowest soil level if moisture exceeds field capacity.
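The field-capacity rule above amounts to draining any excess moisture from the lowest soil layer. A minimal sketch (the instantaneous drainage and names are illustrative assumptions):

```python
# Sketch of the drainage rule described above: the lowest soil layer
# drains any moisture in excess of the grid point's field capacity.
# The instantaneous-drainage assumption and names are illustrative.

def drain_lowest_layer(soil_moisture, field_capacity):
    """Return (new_moisture, drainage) for the lowest soil layer."""
    drainage = max(0.0, soil_moisture - field_capacity)
    return soil_moisture - drainage, drainage
```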
5. surface albedo scheme
Land surface albedo is taken from the SiB data set (varying monthly). Snow albedo changes according to snow age and solar zenith angle.
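An age-dependent snow albedo is commonly modelled as a decay from a fresh-snow value toward an old-snow value. A sketch under that assumption (the decay form and constants are illustrative, not the model's tuned values):

```python
# Illustrative age-dependent snow albedo: fresh snow starts near a high
# albedo and decays toward an old-snow value as the snow ages. The
# exponential form and all constants are assumptions for illustration.
import math

def snow_albedo(age_days, fresh=0.8, old=0.5, tau_days=5.0):
    """Snow albedo decaying exponentially from `fresh` toward `old` with age."""
    return old + (fresh - old) * math.exp(-age_days / tau_days)
```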
6. vegetation treatment
Canopy: big-leaf model
7. list of prognostic variables
- Surface temperature;
- 6 levels of soil temperature and water amount;
- If land is frozen, ice amount per 6 levels;
- Moisture amount on vegetation canopy;
- Puddle depth on land;
- If snow on land:
  - 3 snow layer temperatures,
  - 3 snow densities,
  - Total snow mass,
  - Snow age.
8. ice sheet characteristics
Ice sheets not included.
E. coupling details
1. frequency of coupling
Every timestep (15 minutes)
2. Are heat and water conserved by coupling scheme?
3. list of variables passed between components:
The coupled model consists of 2 major components:
(a) The AGCM + Land + Ice model (the "atmosphere" below)
(b) The Ocean model
a. atmosphere - ocean
- Heat flux
- Water flux
- Solar flux into ocean
- Surface stresses
b. ocean - atmosphere
- Sea surface temperature
- Ocean surface velocities u,v (for driving ice model within AGCM)
4. Flux adjustment?
None; as noted above, HadCM3 does not require flux adjustment.
The HECToR (High-End Computing Terascale Resource) supercomputer, located at the University of Edinburgh, was a Cray XE6 system that (by HECToR phase 3) consisted of 30 cabinets with a total of 704 compute blades. Each blade contained four compute nodes, giving a total of 2816 compute nodes, each with two 16-core 2.3 GHz AMD Opteron "Interlagos" processors, amounting to a total of 90,112 cores. Each 16-core socket was coupled with a Cray Gemini routing and communications chip and shared 16 GB of memory, giving a system total of around 90 TB. The theoretical peak performance of the phase 3 system was over 800 TFlops. In addition, the system had a shared, high-performance parallel filesystem consisting of over 1 PB of high-performance RAID disks, globally accessible from any compute node via the Lustre distributed parallel file system. The archiving system was based on Symantec's Enterprise NetBackup, consisting of 1300 800 GB tapes with a maximum capacity of approximately 1.02 PB.
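The system totals quoted above follow directly from the per-component figures, which can be checked with a few lines of arithmetic:

```python
# Check of the HECToR phase 3 totals from the per-component figures:
# 704 blades x 4 nodes per blade, two 16-core processors per node,
# and 16 GB of memory per 16-core socket.

blades, nodes_per_blade = 704, 4
nodes = blades * nodes_per_blade   # compute nodes
cores = nodes * 2 * 16             # two 16-core Opterons per node
memory_gb = nodes * 2 * 16         # 16 GB per socket, two sockets per node
```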