Modeling canopy architecture traits using UAS-acquired LiDAR features in
diverse maize varieties
Abstract
Plant growth and development are affected by a plant's ability to capture
resources, including sunlight, which is determined in part by the
arrangement of plant parts throughout the canopy. Canopy architecture is a
complex trait to describe, but it has a major impact on downstream traits
such as biomass and grain yield per acre. Although genetic factors
contributing to leaf angle, maturity, and leaf size and number have been
identified, these discrete traits do not encompass the structural
complexity of the canopy. In addition, modeling and prediction of plant
developmental traits using genomics or phenomics are usually conducted
separately. We have developed proof-of-concept models that incorporate
spatio-temporal features from drone-acquired LiDAR in a maize diversity
panel to predict plant growth and development over time and to improve our
understanding of the biology of canopy formation and development.
Briefly, voxel models of the probability of beam penetration into the
foliage were generated from 3D LiDAR scans collected on seven dates
throughout crop canopy development. From the same plots, key
architectural features of the maize canopy were measured by hand: stand
count; plant, tassel, and flag leaf height; anthesis and silking dates;
ear leaf, total leaf, and largest leaf number; and largest leaf length
and width. We developed a self-supervised autoencoding neural network
architecture that separately encodes plant temporal growth patterns for
individual genotypes and plant spatial distributions for each plot.
Then, leveraging the resulting latent-space encoding of the LiDAR scans,
we trained models that accurately predict the hand-measured crop traits.
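
The following is a minimal sketch, in Python/NumPy, of one way the per-voxel probability of beam penetration described above could be estimated from a plot-level LiDAR point cloud. It is not the authors' implementation: it assumes near-nadir pulses so that each return's path can be approximated by the vertical column of voxels above it, and the function name, voxel size, and grid shape are hypothetical.

```python
"""Minimal sketch of a voxel beam-penetration estimate (not the authors'
implementation). Assumes near-nadir pulses and plot-local coordinates;
function name, voxel size, and grid shape are hypothetical."""
import numpy as np

def beam_penetration_grid(points, voxel_size=0.25, grid_shape=(40, 40, 20)):
    """points: (N, 3) array of LiDAR returns (x, y, z) in plot-local metres.

    Returns an array of shape grid_shape holding, for each voxel, the
    estimated probability that a pulse entering it passes through
    un-intercepted (NaN where no pulse entered).
    """
    idx = np.floor(points / voxel_size).astype(int)
    idx = np.clip(idx, 0, np.array(grid_shape) - 1)

    entered = np.zeros(grid_shape)      # pulses that entered each voxel
    intercepted = np.zeros(grid_shape)  # pulses stopped within each voxel

    for ix, iy, iz in idx:
        # Under the nadir assumption, a downward pulse traversed every
        # voxel in this column at or above the return height before
        # being intercepted in the voxel containing the return.
        entered[ix, iy, iz:] += 1.0
        intercepted[ix, iy, iz] += 1.0

    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(entered > 0, (entered - intercepted) / entered, np.nan)
```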
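
Likewise, the sketch below shows one plausible arrangement of the separate spatial and temporal encoders, a voxel-reconstruction decoder providing the self-supervised objective, and a downstream trait regressor fit on the latent codes. It is an illustrative layout written against PyTorch; the module names, layer sizes, and the choice of a GRU for the temporal branch are assumptions rather than details taken from the paper.

```python
"""Illustrative sketch (assumed layout, not the authors' architecture):
separate spatial and temporal encoders over per-date beam-penetration
voxel grids, a decoder for self-supervised reconstruction, and a small
regressor mapping latent codes to the hand-measured traits."""
import math
import torch
import torch.nn as nn

class SpatialEncoder(nn.Module):
    """Encodes one plot's single-date voxel grid (1 x D x H x W) to a latent vector."""
    def __init__(self, latent_dim=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv3d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
        )
        self.fc = nn.Linear(32, latent_dim)

    def forward(self, x):                     # x: (batch, 1, D, H, W)
        return self.fc(self.conv(x))          # (batch, latent_dim)

class TemporalEncoder(nn.Module):
    """Summarizes a genotype's sequence of per-date spatial latents (seven flights)."""
    def __init__(self, latent_dim=64):
        super().__init__()
        self.rnn = nn.GRU(latent_dim, latent_dim, batch_first=True)

    def forward(self, z_seq):                 # z_seq: (batch, dates, latent_dim)
        _, h = self.rnn(z_seq)
        return h[-1]                          # (batch, latent_dim)

class VoxelDecoder(nn.Module):
    """Reconstructs a voxel grid from a latent code (self-supervised target)."""
    def __init__(self, latent_dim=64, out_shape=(40, 40, 20)):
        super().__init__()
        self.out_shape = out_shape
        self.fc = nn.Linear(latent_dim, math.prod(out_shape))

    def forward(self, z):
        return self.fc(z).view(-1, 1, *self.out_shape)

class TraitRegressor(nn.Module):
    """Maps a frozen latent code to the hand-measured canopy traits."""
    def __init__(self, latent_dim=64, n_traits=11):
        # n_traits: number of hand-measured traits (eleven are listed above)
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(),
                                 nn.Linear(64, n_traits))

    def forward(self, z):
        return self.mlp(z)
```

In a setup of this kind, the encoders and decoder would first be trained with a reconstruction loss (e.g., mean-squared error against the observed penetration grids), and the regressor would then be fit on the resulting latent codes against the hand-measured traits.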