
Soybean stress image-based phenotyping

Two examples of projects in this research area:

1. Explainable ML phenotyping: We work to close major gaps in the phenotyping pipeline around automating the extraction of information from digital images for stress phenotyping. We develop explainable deep machine learning models to automate soybean disease phenotyping. These models achieve high accuracy, matching that of trained subject-matter specialists, and can explain which visual symptoms were used to make a disease assessment. The project provides tools that build user confidence in machine-generated information and reduces the large-scale annotation effort otherwise needed to build stress phenotyping models. Our models can be deployed on smartphones and ground/aerial systems, enabling real-time stress detection (a simplified sketch of the explanation idea follows the outcomes below).
Outcomes: The work on explainability of deep learning models is a major advance for the domain, as it gives practitioners confidence in what was previously a black-box approach and enables faster, deeper phenotyping.
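
To make the explanation idea concrete, here is a minimal saliency-map sketch in PyTorch: the gradient of the predicted class score with respect to the input pixels highlights the image regions that drove a disease call. The stand-in ResNet-18 classifier, the 9-way class count, and the image file name are illustrative assumptions, not our published models.

    import torch
    from torchvision import models, transforms
    from PIL import Image

    # Stand-in classifier: a ResNet-18 with a 9-way head (hypothetical class
    # count); the published models differ, this only illustrates the probe.
    model = models.resnet18(weights=None)
    model.fc = torch.nn.Linear(model.fc.in_features, 9)
    model.eval()

    preprocess = transforms.Compose([transforms.Resize((224, 224)),
                                     transforms.ToTensor()])
    image = preprocess(Image.open("leaflet.jpg").convert("RGB")).unsqueeze(0)
    image.requires_grad_(True)

    logits = model(image)
    top_class = logits.argmax(dim=1).item()

    # Gradient of the winning class score with respect to the input pixels;
    # large magnitudes mark regions (lesions, chlorosis) driving the call.
    logits[0, top_class].backward()
    saliency = image.grad.abs().max(dim=1).values.squeeze(0)  # H x W map

Overlaying such a map on the input image is one simple way to check whether a model is attending to disease symptoms rather than background clutter.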

2. ML-enabled image phenotyping: We collaborate in a multi-disciplinary project to develop deep learning architectures that solve complex phenotyping problems. For example, we worked on a ‘rare object identification in clutter-filled images’ problem, building a method that identifies and counts Soybean Cyst Nematode (SCN) eggs and outperforms expert annotation. Another example involves hyperspectral imaging: to overcome the limitations of visual phenotyping, which is slow and error-prone, we develop methods for accurate disease identification and quantification using hyperspectral cameras. We apply data analytic methods such as genetic algorithms and other ML techniques to identify the most effective waveband combinations among the hyperspectral bands (a simplified band-selection sketch follows the outcomes below).
Outcomes: The method significantly reduces the time needed to count SCN eggs and is more accurate than human counts, which are affected by fatigue. Our use of an autoencoder to solve a rare-object detection problem in the agriculture domain was new at the time of publication. For the early disease detection project, our results enable timely intervention and precision disease control, avoiding the indiscriminate chemical spraying that hurts profitability and sustainability. For hyperspectral imaging, the optimal wavebands can identify disease symptoms before they are visible to the human eye. This is noteworthy because early detection allows mitigation steps, such as pesticide applications, to be implemented before crop losses occur.
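
As a rough illustration of the waveband-selection idea, the sketch below runs a small genetic algorithm whose fitness is cross-validated SVM accuracy on the candidate bands, echoing the genetic algorithm plus support vector machine combination in the band-selection paper listed below. The spectra are synthetic, and the band count, population size, and mutation rate are arbitrary illustrative choices rather than the published settings.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_samples, n_bands, n_select = 200, 240, 6
    X = rng.normal(size=(n_samples, n_bands))   # stand-in spectra (synthetic)
    y = rng.integers(0, 2, size=n_samples)      # healthy vs. diseased label

    def fitness(bands):
        """Cross-validated SVM accuracy using only the chosen wavebands."""
        return cross_val_score(SVC(kernel="rbf"), X[:, bands], y, cv=3).mean()

    # Each individual is a set of n_select band indices.
    population = [rng.choice(n_bands, n_select, replace=False) for _ in range(30)]
    for generation in range(15):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[:10]                          # keep the fittest combinations
        children = []
        while len(children) < 20:
            a, b = rng.choice(len(parents), 2, replace=False)
            pool = np.union1d(parents[a], parents[b])  # crossover: pool parental bands
            child = rng.choice(pool, n_select, replace=False)
            if rng.random() < 0.2:                     # mutation: swap in a random band
                child[rng.integers(n_select)] = rng.integers(n_bands)
            children.append(child)                     # duplicate bands are tolerated here
        population = parents + children

    best = max(population, key=fitness)
    print("Selected wavebands:", np.sort(best))

On real data, X would hold per-sample spectra extracted from hyperspectral images and y the corresponding disease scores; the selected bands would then be validated on independent samples.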

Primary collaborators: Dr. Arti Singh, Dr. Baskar Ganapathysubramanian, Dr. Soumik Sarkar, Dr. Daren Mueller, Dr. Greg Tylka.
Funding: Iowa Soybean Association, R F Baker Center for Plant Breeding, Monsanto Chair in Soybean Breeding, USDA-AFRI, Plant Sciences Institute.

Related Publications:

  • Nagasubramanian K, S Jones, AK Singh, S Sarkar, A Singh, B Ganapathysubramanian. 2019. Plant disease identification using explainable 3D deep learning on hyperspectral images. Plant Methods 15:98.
  • Ghoshal, S, D Blystone, AK Singh, B Ganapathysubramanian, A Singh, S Sarkar. 2018. Bringing consistency to plant stress phenotyping through an explainable deep machine vision framework. Proceedings of the National Academy of Sciences. 115 (18) 4613-4618.
  • Akintayo A, GL Tylka, AK Singh, B Ganapathysubramanian, A Singh, S Sarkar. 2018. A deep learning framework to discern and count microscopic nematode eggs. Scientific Reports 8:9145.
  • Nagasubramanian K, S Jones, S Sarkar, AK Singh, A Singh, B Ganapathysubramanian. 2018. Hyperspectral band selection using genetic algorithm and support vector machines for early identification of charcoal rot disease in soybean stems. Plant Methods 14:86.