Martin Janssens and 4 more

Earth's climate sensitivity depends on how shallow clouds in the trades respond to changes in the large-scale tropical circulation with warming. Canonical theory for this cloud-circulation coupling assumes that the clouds are controlled by the field of vertical motion on horizontal scales larger than the convection's depth (~1 km). This assumption has been challenged by both recent in-situ observations and idealized large-eddy simulations (LESs). Here, we therefore bring together the recent observations, new analysis of satellite data, and a forty-day, large-domain (1600 × 900 km²) LES of the North Atlantic from the 2020 EUREC4A field campaign to study the interaction between shallow convection and vertical motions on scales between 10 and 1000 km (the mesoscales), in settings that are as realistic as possible. Across all datasets, the shallow mesoscale vertical motions are consistently represented, ubiquitous, frequently organised into circulations, and formed without imprinting themselves on the mesoscale buoyancy field. We therefore use the weak-temperature gradient approximation to show that, on scales of at least 12.5-400 km, the vertical motion balances heating fluctuations in groups of precipitating shallow cumuli. That is, across the mesoscales, shallow convection controls the vertical motion in the trades, and does not simply adjust to it. In turn, the mesoscale convective heating patterns appear to grow consistently through a moisture-convection feedback. Therefore, to represent and understand the cloud-circulation coupling of trade cumuli, the full range of scales between the synoptic scale and the hectometre must be included in our conceptual and numerical models.
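The weak-temperature gradient (WTG) balance invoked here can be sketched in its textbook form (notation ours, not necessarily the paper's): when gravity waves efficiently erase mesoscale buoyancy anomalies, the thermodynamic equation reduces to a balance between vertical advection of mean dry static energy and the diabatic heating fluctuation, so the mesoscale vertical motion is diagnosed from the convective heating:

```latex
% WTG balance: with mesoscale temperature (buoyancy) anomalies negligible,
% vertical motion adjusts to balance the convective heating fluctuation Q_c'.
\omega' \,\frac{\partial \bar{s}}{\partial p} \approx Q_c'
\quad\Longrightarrow\quad
\omega' \approx \frac{Q_c'}{\partial \bar{s}/\partial p},
```

where $\bar{s}$ is the horizontal-mean dry static energy, $\omega'$ the mesoscale pressure velocity, and primes denote mesoscale fluctuations. In this reading, the heating sets the vertical motion rather than the reverse.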

Frédéric Hourdin and 9 more

We demonstrate a new approach to climate model tuning in a realistic setting. Our approach, described in detail in Part I, systematically uses a single-column configuration of a global atmospheric model on a series of test cases for which reference large-eddy simulations (LES) are available. The space of free parameters is sampled by running the single-column model, and the metrics are then estimated across the full parameter space using emulators. The parameter space is then reduced by retaining only the values that are consistent, within a given tolerance to error, with the metrics computed on the large-eddy simulations. The approach is applied to the recently designed 6A version of the LMDZ model, itself the result of a long investment in the development of physics parameterizations and by-hand tuning. The boundary layer is revisited by increasing the vertical resolution and varying parameters that had so far been kept fixed. The approach allows us to automatically reach a tuning as good as that of the 6A version, once some improvements are made at the process scale. This approach helps accelerate the introduction of new parameterizations by avoiding a tedious manual tuning process and preventing some of the error compensations that could occur if calibration were carried out directly with the full atmospheric model. This way of using machine learning techniques allows us to maintain the physical foundations of the model and to ensure that the improvement of global metrics is obtained alongside reasonable behavior at the process level. That is, we get things right for the right reasons.

Najda Villefranque and 8 more

Process-scale development, evaluation, and calibration of physically based parameterizations are key to improving weather and climate models. Cloud-radiation interactions are a central issue because of their major role in the global energy balance and climate sensitivity. In a series of papers, we propose a strategy for process-based calibration of climate models that uses machine learning techniques. It relies on systematic comparisons of single-column versions of climate models with explicit large-eddy simulations (LES) of boundary-layer clouds. Parts I and II apply this framework to the calibration of boundary-layer parameters, targeting first boundary-layer characteristics and then the global radiation balance at the top of the atmosphere. This third part focuses on the calibration of the cloud geometry parameters that appear in the parameterization of radiation. The solar component of a radiative transfer scheme (ecRad) is run in offline single-column mode on input cloud profiles synthesized from an ensemble of LES outputs. A recent version of ecRad that includes an explicit representation of the effects of cloud geometry and horizontal transport is evaluated and calibrated by comparing radiative metrics to reference values provided by Monte Carlo 3D radiative transfer computations. Errors in the TOA, surface, and absorbed fluxes estimated by ecRad are computed for an ensemble of cumulus fields. The average root-mean-square error can be less than 5 Wm$^{-2}$, provided that 3D effects are represented and that the cloud geometry parameters are well calibrated. A key result is that configurations using calibrated parameters yield better predictions than those using parameter values diagnosed directly from the LES fields.

Fleur Couvreux and 16 more

The development of parameterizations is a major task in the development of weather and climate models. Model improvement has been slow in past decades, due to the difficulty of encompassing key physical processes in parameterizations, but also of calibrating or 'tuning' the many free parameters involved in their formulation. Machine learning techniques have recently been used to speed up the development process. While some studies propose to replace parameterizations with data-driven neural networks, we instead advocate that keeping physical parameterizations is key to the reliability of climate projections. In this paper we propose to harness machine learning to improve physical parameterizations. In particular, we use Gaussian-process-based methods from uncertainty quantification to calibrate the model's free parameters at the process level. To achieve this, we focus on the comparison of single-column simulations with reference large-eddy simulations over multiple boundary-layer cases. Our method returns all values of the free parameters consistent with the references and any structural uncertainties, allowing a reduced domain of acceptable values to be considered when tuning the 3D global model. This tool allows us to disentangle deficiencies due to poor parameter calibration from intrinsic limits rooted in the parameterization formulations. This paper describes the tool and the philosophy of tuning in single-column mode. Part 2 shows how the results of our process-based tuning can help in tuning the 3D global model.
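The core of the calibration method described above is the history-matching implausibility test: a candidate parameter value is ruled out only if the emulated model output lies too many standard deviations from the reference, with emulator, observation, and structural-error variances all counted. Below is a minimal sketch for a one-parameter, one-metric case; `toy_emulator` is an illustrative stand-in for a trained Gaussian-process emulator, and all numerical values are invented for the example.

```python
import math

def implausibility(z_ref, emu_mean, var_emu, var_obs, var_struct):
    """Standardized distance between the emulated model output and the
    reference value, pooling emulator, observation, and structural-error
    variances in the denominator."""
    return abs(z_ref - emu_mean) / math.sqrt(var_emu + var_obs + var_struct)

def not_ruled_out(thetas, emulate, z_ref, var_obs, var_struct, cutoff=3.0):
    """Retain parameter values whose implausibility is below the cutoff
    (a 3-sigma rule is the conventional choice)."""
    kept = []
    for theta in thetas:
        mean, var_emu = emulate(theta)
        if implausibility(z_ref, mean, var_emu, var_obs, var_struct) < cutoff:
            kept.append(theta)
    return kept

# Toy stand-in for a GP emulator of one SCM metric (e.g. boundary-layer
# cloud cover) as a function of one normalized free parameter.
def toy_emulator(theta):
    mean = 0.3 + 0.5 * theta   # emulated metric value
    var_emu = 0.02 ** 2        # emulator (code) uncertainty
    return mean, var_emu

grid = [i / 100 for i in range(101)]   # candidate parameter values in [0, 1]
kept = not_ruled_out(grid, toy_emulator, z_ref=0.55,
                     var_obs=0.02 ** 2, var_struct=0.03 ** 2)
```

In the full method, this test is applied per metric and per LES case, and the retained ("not ruled out yet") space is re-sampled and re-emulated in successive waves of iterative refocusing, shrinking the acceptable parameter domain while honestly carrying the structural-uncertainty term.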

Bjorn Stevens and 291 more

The science guiding the \EURECA campaign and its measurements is presented. \EURECA comprised roughly five weeks of measurements in the downstream winter trades of the North Atlantic — eastward and south-eastward of Barbados. Through its ability to characterize processes operating across a wide range of scales, \EURECA marked a turning point in our ability to observationally study the factors influencing clouds in the trades, how they will respond to warming, and their link to other components of the Earth system, such as upper-ocean processes or the life cycle of particulate matter. This characterization was made possible by roughly 2500 sondes distributed to measure circulations on meso (200 km) and larger (500 km) scales; roughly four hundred hours of flight time by four heavily instrumented research aircraft; four global-ocean class research vessels; an advanced ground-based cloud observatory; a flotilla of autonomous or tethered measurement devices operating in the upper ocean (nearly 10000 profiles), lower atmosphere (continuous profiling), and along the air-sea interface; and a network of water stable isotopologue measurements, complemented by special programmes of satellite remote sensing and modeling with a new generation of weather/climate models. In addition to providing an outline of the novel measurements and their composition into a unified and coordinated campaign, the six distinct scientific facets that \EURECA explored — from Brazil Ring Current Eddies to turbulence-induced clustering of cloud droplets and its influence on warm-rain formation — are presented, along with an overview of \EURECA's outreach activities, environmental impact, and guidelines for scientific practice.

Olivier Audouin and 3 more

The representation of stable boundary layers (SBLs) still challenges the turbulence parameterizations implemented in current weather and climate models. The present work assesses whether these model deficiencies reflect calibration choices or intrinsic limits of currently used turbulence parameterization formulations and implementations. This question is addressed for the CNRM ARPEGE-Climat 6.3 atmospheric model in a single-column model/large-eddy simulation (SCM/LES) comparison framework, using the history matching with iterative refocusing statistical approach. The GABLS4 case, which samples a strong nocturnal SBL observed at Dome C on the Antarctic Plateau, is used. With its standard calibration, the ARPEGE-Climat 6.3 turbulence parameterization produces an SBL that is too deep and a low-level jet that is too high, and misses the nocturnal wind rotation. This behavior is found for both low- and high-vertical-resolution model configurations. The statistical tool then shows that these model deficiencies reflect a poor parameterization calibration rather than intrinsic limits of the parameterization formulation itself. In particular, it highlights the role of two lower bounds that were heuristically introduced during the parameterization implementation to increase mixing in the free troposphere and to avoid runaway cooling in snow- and ice-covered regions. The statistical tool identifies the space of parameterization free parameters compatible with the LES reference, accounting for the various sources of uncertainty. This space is non-empty, thus proving that the ARPEGE-Climat 6.3 turbulence parameterization contains the physics required to capture the GABLS4 SBL. The SCM framework is also used to validate the statistical framework, and a few guidelines for its use in parameterization development and calibration are discussed.