DeepCount: In-Field Automatic Quantification of Wheat Spikes Using Simple Linear Iterative Clustering and Deep Convolutional Neural Networks

Crop yield is an essential measure for breeders, researchers, and farmers and may be calculated from the number of ears per square meter, the number of grains per ear, and the thousand-grain weight. Manual wheat ear counting, required in breeding programs to evaluate crop yield potential, is labor-intensive and expensive; thus, the development of a real-time wheat head counting system would be a significant advancement. In this paper, we propose a computationally efficient system called DeepCount to automatically identify and count the number of wheat spikes in digital images taken under natural field conditions. The proposed method tackles wheat spike quantification by segmenting an image into superpixels using simple linear iterative clustering (SLIC), deriving canopy-relevant features, and then constructing a rational feature model fed into a deep convolutional neural network (CNN) classifier for semantic segmentation of wheat spikes. As the method is based on a deep learning model, it replaces the hand-engineered features required by traditional machine learning methods with more efficient algorithms. The method is tested on digital images taken directly in the field at different stages of ear emergence/maturity (using visually different wheat varieties), with different canopy complexities (achieved through varying nitrogen inputs), and at different heights above the canopy under varying environmental conditions. In addition, the proposed technique is compared with a wheat ear counting method based on a previously developed edge detection technique and morphological analysis. The proposed approach is validated with image-based ear counting and ground-based measurements. The results demonstrate that the DeepCount technique has a high level of robustness regardless of variables such as growth stage and weather conditions, demonstrating the feasibility of the approach in real scenarios. The system is a step toward portable, smartphone-assisted wheat ear counting, reduces the labor involved, and is suitable for high-throughput analysis. It may also be adapted to work on red-green-blue (RGB) images acquired from unmanned aerial vehicles (UAVs).
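
To illustrate the pipeline summarized in the abstract (SLIC superpixel segmentation followed by per-superpixel CNN classification and counting), the sketch below shows one possible implementation, assuming scikit-image for SLIC and PyTorch for the CNN. The patch size, network architecture, and connected-component counting heuristic are illustrative assumptions, not the authors' published implementation.

```python
# Minimal sketch of a DeepCount-style pipeline: SLIC superpixels -> per-superpixel
# CNN classification -> merging spike-labeled superpixels and counting regions.
# Architecture, patch size, and counting heuristic are assumptions for illustration.
import numpy as np
import torch
import torch.nn as nn
from skimage import io
from skimage.segmentation import slic
from skimage.measure import label
from skimage.transform import resize

class SpikeClassifier(nn.Module):
    """Small binary CNN: superpixel patch -> spike / non-spike."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 16 * 16, 2))

    def forward(self, x):
        return self.head(self.features(x))

def count_spikes(image_path, model, n_segments=2000, patch_size=64):
    image = io.imread(image_path)
    # 1) Partition the canopy image into superpixels with SLIC.
    segments = slic(image, n_segments=n_segments, compactness=10, start_label=0)

    spike_mask = np.zeros(segments.shape, dtype=bool)
    model.eval()
    with torch.no_grad():
        for seg_id in np.unique(segments):
            # 2) Crop the bounding box of each superpixel and classify it.
            ys, xs = np.nonzero(segments == seg_id)
            patch = image[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
            patch = resize(patch, (patch_size, patch_size), anti_aliasing=True)
            x = torch.from_numpy(patch.transpose(2, 0, 1)[None]).float()
            if model(x).argmax(1).item() == 1:  # predicted "spike"
                spike_mask[segments == seg_id] = True

    # 3) Merge adjacent spike superpixels and count connected regions as ears.
    return label(spike_mask).max()

if __name__ == "__main__":
    model = SpikeClassifier()  # in practice, load trained weights here
    print("Estimated ear count:", count_spikes("wheat_plot.jpg", model))
```

In practice the classifier would be trained on labeled superpixel patches before use; treating each connected group of spike-labeled superpixels as one ear is a simplification of the counting step described in the paper.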

Data and Resources

Additional Info

Source:
Version:
Authors:
  • Sadeghi-Tehran, Pouria (Corresponding Author)
  • Virlet, Nicolas (Author)
  • Ampe, Eva M. (Author)
  • Reyns, Piet (Author)
  • Hawkesford, Malcolm J. (Author)
Maintainer:
Maintainer Email:
Article Host Type: publisher
Article Is Open Access: true
Article License Type: cc-by
Article Version Type: publishedVersion
Citation Report: https://scite.ai/reports/10.3389/fpls.2019.01176
DFW Organisation: RRes
DFW Work Package: 1
DOI: 10.3389/fpls.2019.01176
Date Last Updated: 2019-10-21T06:29:11.678585
Evidence: open (via page says license)
Journal Is Open Access: true
Open Access Status: gold
PDF URL: https://www.frontiersin.org/articles/10.3389/fpls.2019.01176/pdf
Publisher URL: https://doi.org/10.3389/fpls.2019.01176