Abstract
The Eddy Covariance method has been actively used by expert micrometeorologists for over 30 years, at 2155 stationary locations globally and in numerous mobile campaigns over land and water surfaces. The latest measurement technologies and automated processing software are rapidly expanding the use of the method into non-micrometeorological research. Regulatory and commercial uses of the method also increase year by year. Despite this widening adoption, academic investigators outside micrometeorology and the majority of non-academic investigators are still not familiar enough with the proper implementation of the method required for conducting high-quality, reliable, traceable, and defensible measurements in their respective areas of interest. Although data collection and processing are now automated, the method still requires significant care to correctly design the experiment, set up the site, and organize and analyze the large amount of data. Efforts of the flux networks (e.g., FluxNet, AmeriFlux, AsiaFlux, ICOS, NEON, and OzFlux) have led to major progress in the standardization of the method. The project-specific workflow, however, is difficult to unify, because different experimental sites and purposes of study demand different treatments and site-, measurement-, and purpose-specific approaches. To address this situation, step-by-step instructions were created to introduce a novice to the general principles, requirements, applications, and processing and analysis steps of the conventional Eddy Covariance technique, in the form of a free electronic resource: a 660-page textbook titled “Eddy Covariance Method for Scientific, Regulatory, and Commercial Applications”. The explanations use easy-to-understand illustrations and real-life examples, and the text is written in non-technical language to be practically useful to those new to the field. Information is provided on the theory of the method (including the state of the methodology, basic derivations, practical formulations, major assumptions, sources of error, and error treatments), the practical workflow (e.g., experiment design, implementation, data processing, quality control, and analysis), data sharing and flux-station networking, key alternative methods, and the most frequently overlooked details.
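For orientation, the core relation behind the conventional Eddy Covariance flux computation can be sketched as follows; this is the standard textbook formulation of the technique, and the symbols used here are generic rather than drawn from the book's own notation:

```latex
% Eddy flux of a gas c: mean dry-air density times the covariance of
% vertical wind speed and gas mixing ratio fluctuations
F \;\approx\; \overline{\rho_d}\,\overline{w' s_c'}
```

Here $\overline{\rho_d}$ is the mean dry air density, $w' = w - \overline{w}$ is the instantaneous deviation of the vertical wind speed from its time average, $s_c'$ is the analogous deviation of the gas mixing ratio, and the overbar denotes averaging over the flux interval (commonly 30 minutes). Practical processing then applies the corrections and quality controls (e.g., coordinate rotation and density-fluctuation terms) of the kind discussed in the book.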