Extraction of PROSAIL-simulated Spectra from Multi-angular UAV
Observations: Application for Leaf Angle Estimation
Abstract
Crop leaf angle is a crucial feature of plant architecture that influences photosynthetic efficiency and yield. High-throughput mapping of leaf angle is therefore of great interest for both precision agriculture and crop breeding. In this study, we
propose a UAV-based hybrid approach that combines a radiative transfer model (PROSAIL) with deep neural networks to estimate leaf angle from multi-angular hyperspectral and LiDAR data. PROSAIL simulates canopy hyperspectral reflectance from a given set of leaf and canopy parameters, one of which is the Average Leaf Inclination Angle (ALIA).
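As a minimal illustration of this forward step, assuming the open-source prosail Python package (not necessarily the implementation used in this study), a single canopy spectrum can be simulated as follows; all parameter values are placeholders, and lidfa acts as the ALIA when typelidf=2:

```python
import numpy as np
import prosail  # open-source PROSAIL implementation (pip install prosail)

alia = 55.0  # placeholder ALIA value (degrees)
# Canopy directional reflectance, 400-2500 nm at 1 nm steps (2101 bands).
reflectance = prosail.run_prosail(
    1.5,    # n: leaf structure parameter
    40.0,   # cab: chlorophyll a+b content (ug/cm2)
    8.0,    # car: carotenoid content (ug/cm2)
    0.0,    # cbrown: brown pigment fraction
    0.01,   # cw: equivalent water thickness (cm)
    0.009,  # cm: dry matter content (g/cm2)
    3.0,    # lai: leaf area index
    alia,   # lidfa: average leaf inclination angle when typelidf=2
    0.01,   # hspot: hot-spot parameter
    30.0,   # tts: solar zenith angle (degrees)
    10.0,   # tto: viewing zenith angle (degrees)
    0.0,    # psi: relative azimuth angle (degrees)
    typelidf=2, rsoil=1.0, psoil=1.0,
)
wavelengths = np.arange(400, 2501)  # nm
print(reflectance.shape)  # (2101,)
```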
The goal is to develop a deep learning-based inversion function that takes UAV hyperspectral reflectance and the other PROSAIL parameters as input and estimates the ALIA of maize as output.
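A minimal sketch of such an inversion function, assuming a fully connected network in PyTorch (the band count, number of auxiliary parameters, and layer sizes are illustrative rather than those used in the study), is shown below:

```python
import torch
import torch.nn as nn

class ALIAInversionNet(nn.Module):
    """Sketch of an inversion network: reflectance plus the remaining
    PROSAIL parameters go in, an ALIA estimate (degrees) comes out."""

    def __init__(self, n_bands=272, n_aux_params=10):  # hypothetical sizes
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_bands + n_aux_params, 256),
            nn.ReLU(),
            nn.Linear(256, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, reflectance, aux_params):
        return self.net(torch.cat([reflectance, aux_params], dim=-1))

# One training step on PROSAIL-simulated samples (placeholder tensors).
model = ALIAInversionNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x_refl, x_aux = torch.rand(32, 272), torch.rand(32, 10)
y_alia = torch.rand(32, 1) * 90.0
optimizer.zero_grad()
loss = nn.functional.mse_loss(model(x_refl, x_aux), y_alia)
loss.backward()
optimizer.step()
```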
The other PROSAIL parameters will be estimated from the UAV hyperspectral and LiDAR data using several machine learning pipelines. We also propose a multi-angular
reflectance scheme in which each image pixel will generate multiple simulated and observed reflectance spectra from the overlapping regions using
different angles (i.e., solar zenith angle, viewing zenith angle, and
relative azimuth angle between the sun and sensor) involved in the
PROSAIL simulation. An automated Python-based tool was developed that calculates all three PROSAIL angles for a given hyperspectral data cube and generates the simulated reflectance for every vegetation pixel per experimental plot.
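The per-pixel angle computation could look like the following sketch (not the tool itself), which assumes metric UAV and pixel coordinates and uses pvlib for the solar position; the function name, coordinate values, and azimuth conventions are simplified assumptions:

```python
import numpy as np
import pandas as pd
import pvlib  # assumed here only for solar geometry; any solar-position routine would do

def prosail_angles(pixel_xyz, sensor_xyz, acq_time_utc, lat, lon):
    """Return (tts, tto, psi) in degrees for one ground pixel and one UAV exposure.
    Coordinates are metric (e.g., UTM easting, northing, altitude)."""
    # Viewing geometry: vector from the ground pixel up to the UAV sensor.
    v = np.asarray(sensor_xyz, float) - np.asarray(pixel_xyz, float)
    horiz = np.hypot(v[0], v[1])
    tto = np.degrees(np.arctan2(horiz, v[2]))             # viewing zenith angle
    view_az = np.degrees(np.arctan2(v[0], v[1])) % 360.0  # azimuth clockwise from north

    # Solar geometry at the acquisition time and location.
    times = pd.DatetimeIndex([pd.Timestamp(acq_time_utc, tz="UTC")])
    solpos = pvlib.solarposition.get_solarposition(times, lat, lon)
    tts = float(solpos["apparent_zenith"].iloc[0])        # solar zenith angle
    sun_az = float(solpos["azimuth"].iloc[0])

    # Relative azimuth between sun and sensor, folded into [0, 180] degrees.
    # (The toward/away-from-sensor convention is simplified here.)
    psi = abs(sun_az - view_az)
    psi = min(psi, 360.0 - psi)
    return tts, tto, psi

# Hypothetical example: pixel 5 m east / 3 m north of nadir, UAV at 60 m altitude.
print(prosail_angles((500005.0, 4200003.0, 0.0),
                     (500000.0, 4200000.0, 60.0),
                     "2021-07-15 17:30", 40.8, -96.6))
```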
Since the proposed method incorporates both physically based crop information (i.e., PROSAIL) and a data-driven approach (i.e., deep learning), it is readily transferable to other study areas and crops and requires less ground-truth data.