Non-Destructive Monitoring of Underground Root Development with Deep Learning-Based ResNet and In-Soil Fiber Optic Sensors
- Kabir Hossain
- Steven Binder
- Mable P Fok
- Alexander Bucksch
Steven Binder
School of Electrical and Computer Engineering, College of Engineering, University of Georgia
Mable P Fok
School of Electrical and Computer Engineering, College of Engineering, University of Georgia
Alexander Bucksch
School of Plant Science, University of Arizona
Abstract
Non-destructive, real-time monitoring of root development can help farmers improve crop resilience while minimizing resource use (Mervin et al., 2022). However, efficient monitoring of root responses remains a largely unexplored frontier. In this study, we employed three in-soil fiber Bragg grating (FBG) sensors to generate root phenotyping data and developed an automated method based on the deep learning architecture ResNet to monitor underground root development. In a preliminary study, we conducted a simulation experiment using two metal rods with diameters of 1 mm and 5 mm to mimic plant roots. The rods were inserted to a depth of 15 cm in two scenarios lasting 6 and 11 minutes, while the three in-soil FBG sensors continuously collected data: two FBGs were placed on the sides and one at the bottom. Preprocessing the sensor data yielded 3228 samples for the root diameter model and 477 samples for the root depth model. We used an 80/20 train/test split to train ResNet models that predict the artificial root diameter and ten discrete depth levels, achieving accuracies of 0.95 for depth and 0.91 for diameter. Overall, our study demonstrates the potential of ResNet architectures to accurately predict root depth and diameter from fiber optic sensor data, suggesting that non-destructive root phenotyping in agricultural applications is feasible. Future work will evaluate these models in field experiments to assess their real-world performance.

Submitted to NAPPN 2024: 02 Nov 2023. Published in NAPPN 2024: 02 Nov 2023.
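To make the modeling step in the abstract concrete, the sketch below shows how a small 1D ResNet could classify windows of three-channel FBG readings into ten depth levels with an 80/20 split. This is a minimal illustration, not the authors' implementation: the window length (64), channel widths, block count, and training hyperparameters are all assumptions, and the tensors stand in for the 477 preprocessed depth samples described in the abstract.

```python
# Hypothetical sketch of a 1D ResNet depth classifier for FBG sensor windows.
# Architecture details and data shapes are assumptions; the abstract does not
# specify the authors' exact network or preprocessing pipeline.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset, random_split

class ResidualBlock1D(nn.Module):
    """Two conv layers with an identity skip connection (the core ResNet idea)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm1d(channels)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm1d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # shortcut keeps gradients flowing

class FBGResNet(nn.Module):
    """Maps a window of 3-channel FBG readings to 10 depth classes."""
    def __init__(self, in_channels=3, num_classes=10):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3),
            nn.BatchNorm1d(32), nn.ReLU())
        self.blocks = nn.Sequential(ResidualBlock1D(32), ResidualBlock1D(32))
        self.head = nn.Sequential(nn.AdaptiveAvgPool1d(1), nn.Flatten(),
                                  nn.Linear(32, num_classes))

    def forward(self, x):  # x: (batch, 3 sensors, window length)
        return self.head(self.blocks(self.stem(x)))

# Synthetic stand-in for the 477 preprocessed depth samples; 80/20 split
# (382 train / 95 test) mirrors the split reported in the abstract.
X = torch.randn(477, 3, 64)
y = torch.randint(0, 10, (477,))
train_set, test_set = random_split(TensorDataset(X, y), [382, 95])

model = FBGResNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for xb, yb in DataLoader(train_set, batch_size=32, shuffle=True):
    opt.zero_grad()
    loss_fn(model(xb), yb).backward()
    opt.step()
```

The same skeleton could serve the diameter task by swapping the classification head for a single-output regression layer (or a two-class head for the 1 mm vs. 5 mm rods); how the authors framed that task is not stated in the abstract.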