Peng Song, Jinglu Wang, Xinyu Guo *, Wanneng Yang *, Chunjiang Zhao
a National Key Laboratory of Crop Genetic Improvement, National Center of Plant Gene Research, Huazhong Agricultural University, Wuhan 430070, Hubei, China
b Beijing Key Lab of Digital Plant, Beijing Research Center for Information Technology in Agriculture, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
Keywords: High-throughput phenotyping; Crop breeding; Crop phenomics; Phenotyping platform; Data analysis
ABSTRACT: With the rapid development of genetic analysis techniques and the growth of crop population sizes, phenotyping has become the bottleneck restricting crop breeding. Breaking through this bottleneck will require phenomics, defined as the accurate, high-throughput acquisition and analysis of multi-dimensional phenotypes during crop growth at organism-wide levels, ranging from cells to organs, individual plants, plots, and fields. Here we offer an overview of crop phenomics research from technological and platform viewpoints at various scales, including microscopic, ground-based, and aerial phenotyping and phenotypic data analysis. We describe recent applications of high-throughput phenotyping platforms for abiotic/biotic stress and yield assessment. Finally, we discuss current challenges and offer perspectives on future phenomics research.
Facing the challenges presented by resource shortages, climate change, and an increasing global population, crop yield and quality need to be improved in a sustainable way over the coming decades [1]. Genetic improvement by breeding is the best way to increase crop productivity [2]. With the rapid progression of functional genomics, an increasing number of crop genomes have been sequenced and dozens of genes influencing key agronomic traits have been identified [3,4]. However, current genome sequence information has not been adequately exploited for understanding complex characteristics controlled by multiple genes, owing to a lack of crop phenotypic data [5]. Efficient, automatic, and accurate technologies and platforms that capture phenotypic data linkable to genomic information for crop improvement at all growth stages have become as important as genotyping. Thus, phenotyping has become the major bottleneck restricting crop breeding [6].
Plant phenomics has been defined [7] as the high-throughput, accurate acquisition and analysis of multi-dimensional phenotypes during crop growing stages at the organism level, including the cell, tissue, organ, individual plant, plot, and field levels. Crop phenotypic performance involves a complex interaction between genotypes and environmental factors, which include climate, soil factors, abiotic/biotic factors, and crop management methods [8]. With the rapid development of novel sensors, imaging technology, and analysis methods, numerous infrastructure platforms have been developed for phenotyping.
Previous reviews [7,9-11] have described the development of phenotyping platforms employing diverse sensors and the ways in which phenotyping enhances the use of crop genetic resources. Here we provide an overview of crop phenomics research from technological and platform viewpoints at various scales, including microscopic, ground-based, and aerial phenotyping, for phenotypic data collection and analysis. We describe recent applications of high-throughput phenotyping platforms for the measurement of abiotic/biotic stress and yield estimation. Finally, we discuss current challenges and provide our perspectives on phenomics research.
Over the last two decades, the rapid development of nondestructive sensing and imaging techniques has dramatically advanced the measurement of crop phenotypic traits in controlled environments as well as in the field [12,13]. The imaging techniques include visible, thermal infrared, fluorescence, 3D, and multi- or hyperspectral imaging, as well as tomographic imaging by magnetic resonance imaging (MRI) or X-ray computed tomography (CT) [5].
Integration of sensing technologies, automatic control technology, computers, robotics, and aeronautics has led to the development of an increasing number of high-throughput phenotyping platforms for investigating crop phenotypic traits. Scientists have developed multiple phenotyping platforms for crop traits at multiple application scales. In this review, phenotyping platforms are divided into three types based on the imaging level: microscopic, ground-based, and aerial phenotyping platforms, which allow the characterization of phenotypic traits at the tissue, organ, individual plant, plot, and field levels (Fig. 1).
Various tissue morphologies and micro-phenotypes within plants have attracted much attention in recent years. It is desirable to measure crop micro-phenotypes rapidly and nondestructively. Although the impediments include destructive and complex pretreatment procedures, low microscale imaging resolution, and a lack of automated image analysis techniques [9], many new algorithms and tools have been proposed [14,15] for observing and extracting micro-phenotypes of crop organs or tissues.
Kernel traits are critical determinants of final yield, but their analysis is laborious and often requires destructive harvesting. A current challenge is to develop an accurate, nondestructive method for kernel-trait analysis capable of handling large populations. Hughes et al. [16] proposed a robust method for the accurate extraction and measurement of wheat spike and kernel morphometric parameters from images acquired by X-ray micro-CT. Xiong et al. [17] proposed a novel system based on 3D morphological processing to identify wheat spike kernels in 3D X-ray micro-CT images. A set of newly defined 3D phenotypes, including kernel aspect ratio, porosity, kernel-to-kernel distance, and kernel angle, was also introduced. In maize, 3D models of kernel geometry based on micro-CT images have been generated [18], greatly improving the computational accuracy of estimating trait phenotypes of internal tissue structures of kernels compared with previous methods. These models were proposed to be useful for illustrating relationships between phenotypic traits of tissue structures and their functions.
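To illustrate how such 3D kernel phenotypes can be derived once a kernel has been segmented from micro-CT data, the sketch below computes two of the traits named above, aspect ratio and porosity, from a binary voxel mask. The array layout and trait definitions here are simplifying assumptions for illustration, not the published pipelines.

```python
import numpy as np

def kernel_traits(voxels):
    """Estimate simple 3D kernel traits from a binary micro-CT voxel mask.

    voxels: 3D boolean array; True = kernel material, False = air/void.
    Illustrative only: real pipelines first segment individual kernels
    and calibrate voxel size to physical units.
    """
    zs, ys, xs = np.nonzero(voxels)
    # Bounding-box extent (in voxels) along each axis
    extents = np.array([np.ptp(zs), np.ptp(ys), np.ptp(xs)]) + 1
    aspect_ratio = extents.max() / extents.min()
    # Porosity: fraction of the bounding box not occupied by material
    bbox_volume = np.prod(extents)
    porosity = 1.0 - voxels.sum() / bbox_volume
    return {"aspect_ratio": float(aspect_ratio), "porosity": float(porosity)}

# Toy example: a solid 10x5x5 block has aspect ratio 2 and porosity 0
solid = np.ones((10, 5, 5), dtype=bool)
print(kernel_traits(solid))  # {'aspect_ratio': 2.0, 'porosity': 0.0}
```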
Stalk anatomical traits play key roles in plant function and display evolutionary adaptation to the surrounding environment. The complex microstructure of crop stalks poses a challenge to developing suitable data analysis workflows for the detection and identification of microscopic phenotypes of stalk tissue. Zhang et al. [19] developed a method to quantify the lignification of maize tissues by automated color image analysis of stained maize stem cross sections. Legland et al. [20] presented a method that quantifies the average spatial organization of vascular bundles within maize stems by integrating information from replicated images. A micro-CT technology for stalk imaging was introduced by Du et al. [21], who developed VesselParser 1.0 for automatic and accurate analysis of phenotypic traits of vascular bundles within entire maize-stalk cross sections. Subsequently, based on VesselParser 4.0, a standard process for stem micro-CT data acquisition and an automatic CT image-processing pipeline were developed [22] to estimate vascular bundle traits of stems, including geometry, morphology, and distribution traits. Exploiting years of work on micro-phenotypes, a team (Beijing Key Laboratory of Digital Plant) devised a novel method to improve the X-ray absorption contrast of maize tissue suitable for ordinary micro-CT scanning and developed a set of image-processing workflows for maize roots, stalks, and leaves to effectively extract microscopic phenotypes of vascular bundles [23].
In addition to crop kernels and stalks, roots and root vascular bundles play important roles in plants: they not only anchor the plant firmly in the ground but also influence the absorption of water and nutrients from the soil. Wu et al. [24] introduced computer-aided extraction of wheat root vascular bundle phenotypic information based on sequential images of paraffin sections. Later, RootScan, a program for the semi-automated analysis of anatomical traits in root cross-section images, was developed by Burton et al. [25]. RootScan permits phenotypic scoring of physiologically and agronomically important traits on a large number of genotypes. RootAnalyzer [26], a cross-section image-analysis tool for automated characterization of root cells and tissues, was developed to further improve image-segmentation efficiency while maintaining high accuracy. Keyes et al. [27] applied time-lapse (4D) synchrotron X-ray CT to observe micro-scale interactions between plant roots and soil, identifying marked differences in deformation mechanisms among phenotypes under various soil conditions. Pan et al. [28] performed a 3D reconstruction and visualization of maize root tissues based on X-ray micro-CT and developed an image-processing workflow for the 3D segmentation of metaxylem vessels that allowed accurate estimation of root metaxylem vessel traits.
Table 2. Main data-management systems used in crop phenotyping platforms.
Ground-based phenotyping platforms are effective solutions for crop phenotyping at the individual and field-plot scales. In general, ground-based phenotyping platforms may be classified as portable, stationary, and movable platforms according to the application [5]. Portable instruments are widely used to measure crop phenotypic traits because of their easy operation, portability, and low cost [29]. The availability of smartphones with high-resolution RGB cameras and powerful computing capabilities has led to the creation of phenotyping applications [30]. For example, Leaf-GP, an open-source software package based on crop image sequences captured by mobile devices, can estimate multiple growth traits automatically [31]. With the rapid development of sensor-platform-processing solutions, sensors including high-resolution RGB cameras, incident-light sensors, thermal infrared imaging cameras, and even hyperspectral cameras have been incorporated into specialized portable instruments [32]. For example, PocketLAI, an instrument that integrates an accelerometer and compass orientation capability, can accurately estimate leaf area index (LAI) [33]. LeafSpec, which comprises a hyperspectral camera (HSC), leaf scanner, lightbox, and Advanced RISC Machine-based microcontroller, can distinguish plants by nitrogen fertilizer treatment and genotype [34]. Despite the convenience of portable instruments, their measurement scale and throughput are generally limited, restricting their application in high-throughput phenotyping.
Stationary platforms are fixed-site phenotyping equipment that can carry multiple sensors, such as RGB, multi/hyperspectral, thermal, fluorescence, and 3D imaging sensors, to increase the precision, resolution, and throughput of phenotyping in controlled environments as well as in the field [35,36]. CropDesign (Overijse, Belgium) developed the first high-throughput phenotyping platform, which carries an RGB camera to measure crop morphometric traits (biomass, plant morphology, and color) that might be associated with yield [37]. Yang et al. developed a high-throughput rice phenotyping platform employing a color imaging device and linear X-ray CT for monitoring at least 15 traits for rice gene identification [12]. Rothamsted Research (Harpenden, United Kingdom) introduced a fully automated, rail-based gantry system called the Field Scanalyzer [38] that can measure details of canopy development across all crop growth stages in the field. Stationary phenotyping platforms can provide simultaneous monitoring at high resolution during the whole crop growth period. However, owing to their fixed sites, their operation is restricted to a limited area, and their construction and maintenance are expensive.
A movable phenotyping platform can be built by adding multiple sets of sensors to an existing agricultural vehicle; such platforms include manual carts, tractors, and phenotyping robots. A cart may be powered by an electric wheel and steered by an operator walking behind it, using light/laser radar (LiDAR) for rapid, multi-temporal, and nondestructive estimation of canopy height, ground cover, and aboveground biomass [39]. The U.S. Department of Agriculture Agricultural Research Service developed a field high-throughput phenotypic analysis platform mounted on a high-clearance tractor [40]. Four nadir-view ultrasonic transducers and two LiDAR systems were installed for phenotypic monitoring of cotton and plants with complex canopy structures and short plant types. A robot phenotyping platform can provide driving power and global positioning system (GPS) modules. By carrying imaging and environmental sensors, such a platform can acquire phenotypic information 24 h per day with little human intervention, following a predetermined navigation route [41,42]. However, most optical imaging sensors are sensitive to environmental conditions, making precise measurement of phenotypic traits in the field under varying sunlight challenging. To solve this problem, platforms equipped with multiple sensors within a mobile dark chamber, such as the GPhenoVision system [43], have been designed to exclude the influence of wind and sunlight. Movable phenotyping platforms are much more flexible than stationary platforms and more efficient than portable phenotyping instruments. However, they are restricted by specific conditions, such as phenotyping paddy rice in a flooded field, which may be better achieved by aerial phenotyping.
In recent years, aerial phenotyping platforms, which include mainly satellites and unmanned aerial vehicles (UAVs), have been widely used for crop phenotyping in the field [44]. Compared with ground-based platforms, aerial phenotyping platforms are usually used in large-scale crop breeding. Images from manned aircraft are used mainly to measure canopy temperature and structure, chlorophyll content, nitrogen content, plant height, and biomass in thousands of field plots [45]. Satellite images can be used to study the growth rules of, and genetic differences between, crop cultivars in the field at a large scale [46]. However, their lower image resolution limits their application in crop trial monitoring [47].
UAV phenotyping platforms have shown great potential for high-throughput phenotyping. In contrast to satellite and manned aircraft platforms, UAV-based phenotyping platforms are flexible and easy to operate, present low hardware costs, and potentially provide images with much higher spatial resolutions (~1 mm per pixel) than satellites [7]. For these reasons, UAV system applications in crop phenotyping have increased exponentially [48]. Researchers have used fixed-wing or rotating-wing UAV platforms equipped with multiple imaging sensors. The sensors that UAVs carry typically include visible-light (RGB) cameras, infrared thermal imagers, LiDAR, multispectral cameras, and hyperspectral sensors. RGB cameras can produce high-resolution 2D images and have been adopted for yield prediction, canopy cover estimation, and stress quantification [49,50]. In contrast, LiDAR provides an alternative approach for building plant 3D models [51]. Multispectral and hyperspectral cameras are an efficient means of photosynthetic status detection [52], chlorophyll-based diagnosis [53], and drought stress evaluation [54]. Limiting factors for UAV phenotyping platforms include local airspace regulatory constraints, limited flight time and load capacity, sensitivity of data acquisition to weather conditions such as wind and sunlight, and a lack of universal data processing and modeling methods for diverse environments [5].
Emerging phenotype acquisition technologies, such as new types of physical, chemical, and biological (physiological) sensors, graphics and image technology, artificial intelligence technology, and Internet of Things (IoT) technology, are providing massive amounts of phenotype data for crop research. Transforming terabytes of image, point cloud, spectroscopic, infrared, X-ray, and other multi-scale phenotypic data covering cells, tissues, organs, and plant populations in laboratories and greenhouses into meaningful biological information is a challenging task [55]. Finkel [56] proposed that imaging with phenomics will shift breeding to hands-free work for plant scientists. The increasing number of imaging-based, automated, non-invasive, and nondestructive high-throughput plant phenotyping platforms yields raw data that must be accurately and robustly normalized, reconstructed, and analyzed, requiring the development of advanced image understanding and quantification algorithms for 2D image or 3D mesh processing. Paproki [57] presented a novel 3D mesh-based technique for temporal high-throughput analysis of Gossypium hirsutum vegetative growth. An open-source, flexible image analysis framework called Image Harvest (IH) was developed for processing images originating from high-throughput plant phenotyping platforms [58]. Olsen et al. [59] proposed computer vision systems and approaches to annotate, detect, and count panicles (heads), a key phenotype, in aerial images of sorghum crops. Walter et al. [60] described the application of high-throughput digital phenotyping to color-based traits in field trials of a wheat breeding program and detailed the basic image analysis methods involved.
Plant images provide abundant data for the rapid construction of plant 3D models. Crop 3D structure is also a prerequisite for accurate plant phenotype measurement, which plays an important role in crop phenomics. High-throughput plant phenotypic analysis has become a research focus in recent years, making the rapid acquisition of plant three-dimensional shapes essential. Techniques for plant 3D reconstruction include regular-model L-systems, 3D digitizers, LiDAR [39], ultrasonic sensors, and structured-light imaging sensors. Compared with other 3D reconstruction methods, crop 3D reconstruction based on multi-view images has the advantages of low equipment cost, convenient data acquisition, and flexible use. The acquired point-cloud data contain not only high-density 3D point clouds but also information such as the true color texture of the image. Using a 3D reconstruction method based on multi-perspective images, the structure and phenotype information of crops can be described in detail from the point-cloud features and appearance characteristics recovered from the images. Any photos can be processed, and the reconstruction speed can be accelerated by improving computer performance and the reconstruction algorithm.
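As a minimal illustration of trait extraction from a reconstructed point cloud, the sketch below estimates plant height from an (N, 3) point array. The percentile-based robustness trick and the synthetic cloud are illustrative assumptions; real multi-view pipelines add filtering, ground segmentation, and metric scaling steps.

```python
import numpy as np

def plant_height(points, ground_percentile=2, top_percentile=98):
    """Estimate plant height from a reconstructed 3D point cloud.

    points: (N, 3) array of x, y, z coordinates with z = height.
    Percentiles make the estimate robust to stray reconstruction noise
    above the canopy or below the soil surface.
    """
    z = points[:, 2]
    ground = np.percentile(z, ground_percentile)
    top = np.percentile(z, top_percentile)
    return top - ground

# Synthetic cloud: points spread between z = 0 (soil) and z = 1.0 m
rng = np.random.default_rng(0)
cloud = rng.uniform([0, 0, 0], [0.5, 0.5, 1.0], size=(5000, 3))
print(round(plant_height(cloud), 2))  # roughly 0.96 for uniform z in [0, 1]
```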
Modern integrated sensors (visible light, near-infrared, far-infrared, fluorescence, multispectral, laser, hyperspectral, etc.) allow the acquisition of dynamic plant growth and development phenotypic datasets, which contain information about plant genetics and mutations. They have been widely used in the analysis of plant height, chlorophyll content, LAI, disease susceptibility, drought stress sensitivity, nitrogen content, and yield of crops. Hyperspectral (such as near-infrared) photos combined with multispectral analysis technology can nondestructively reveal nutrient components in rice, wheat, or corn, such as amylose, dextran, and other carbohydrates. Computational image-analysis algorithms can be developed to extract important phenotypic parameters from spectral image data, such as rust infection, germination rate, and flowering date. By integrating multispectral and high-definition imaging equipment, key agronomic traits of crops can be collected with high throughput. Chen et al. [61] acquired more than 400 traits of 18 barley genotypes using visible light, fluorescence, and near-infrared spectroscopy. They selected six trait clusters of principal components using multicollinearity analysis and other methods with the aim of characterizing crop sensitivity to drought. Conventional regression methods and machine learning are often inadequate for retrieving all the information contained in spectral or hyperspectral data. Field crop phenotypic information analysis models based on a single method suffer from poor versatility and poor inter-annual prediction stability. It is desirable to construct phenotypic information analysis models by combining multi-source remote sensing information with deep learning.
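Many of the spectral diagnoses mentioned above rest on simple band-ratio indices. A minimal example is the normalized difference vegetation index (NDVI), computed per pixel from red and near-infrared reflectance; the sample values below are illustrative.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from NIR and red reflectance.

    nir, red: arrays of per-pixel reflectance in [0, 1]. Healthy vegetation
    reflects strongly in the NIR and absorbs red light, so NDVI approaches 1;
    bare soil gives values near 0. eps guards against division by zero.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# A vigorous leaf pixel: high NIR, low red reflectance
print(ndvi([0.6], [0.1]))  # ~0.71
```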
In recent years, the development and construction of crop models have received extensive attention. With research into the mechanisms underlying crop physiological and ecological processes and the application of computer technology, crop models have gradually entered the stage of practical application. Crop modeling is the application of mathematical expressions of physiological and physical processes in agricultural systems to quantitatively and dynamically describe crop growth, development, and yield formation. It uses computer simulation of environmental responses to strengthen understanding of crop physiology, behavior, and response and to provide farmers with management and decision-making suggestions. Crop models are divided mainly into growth models, morphological structure models, and functional structure models. With the large-scale application of 3D digital technology in agriculture, functional structure models are gradually being used to optimize the planting configuration of interspecific cropping, measure the contribution of plant-type plasticity to the improvement of light interception in intercropping systems, and determine planting density based on light distribution in crop populations. The integration of crop functional structure models with remote sensing, geographic information systems and GPS technologies, cloud computing, decision support systems, and the Internet of Things will promote the development of digital agriculture and provide technical support for modern agriculture. Brown et al. [62] reported that traits can be captured throughout development and across environments from multidimensional phenotypes by applying functional structural plant models (FSPMs) to predict plant growth and reproduction in target environments. Vidal et al. [63] proposed that coordination rules for maize could be used in FSPMs. Muller et al. [64] considered crop simulation models to be powerful tools for predicting the impact of climate change and innovative crop management practices on crop production and agricultural systems.
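As a toy illustration of the growth-model idea (not any specific published model), the sketch below integrates logistic biomass accumulation with forward Euler; process-based crop models build on such state equations by adding responses to temperature, radiation, water, and nutrients.

```python
def logistic_growth(biomass0, rate, capacity, days, dt=1.0):
    """Minimal crop growth sketch: logistic biomass accumulation.

    Integrates dB/dt = rate * B * (1 - B / capacity) with forward Euler.
    biomass0: initial biomass; capacity: asymptotic maximum; rate: per-day
    relative growth rate. All values here are illustrative.
    """
    b = biomass0
    series = [b]
    for _ in range(int(days / dt)):
        b += dt * rate * b * (1.0 - b / capacity)
        series.append(b)
    return series

# Biomass rises sigmoidally and saturates near the capacity of 100 g
growth = logistic_growth(biomass0=1.0, rate=0.1, capacity=100.0, days=120)
print(round(growth[-1], 1))  # approaches 100 by day 120
```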
In recent years, various computer vision algorithms, graphics and image processing, and machine learning (ML) classification methods have been applied in phenotypic data analysis on a large scale. By incorporating prior knowledge and expertise, phenotypic features such as plant size, morphology, growth dynamics, and disease status can be automatically extracted. Artificial intelligence technology, such as ML, plays a pivotal role in data mining (DM) and processing, providing relevant information for decision making aimed at achieving breeding targets. Research in crop phenomics is aided by ML methods such as support vector machines (SVMs), random forests, and artificial neural networks (ANNs). Edlich-Muth et al. [65] performed phenomic experiments on 195 inbred and 382 hybrid maize cultivars and followed their progress from 16 days after sowing (DAS) to 48 DAS with 129 image-derived features to investigate the possibility of predicting the biomass of fully grown plants from early-developmental-stage image-derived features. Kamruzzaman et al. [66] presented a new algorithmic approach (Hyppo-X) to visualize complex phenomics data and characterize the effect of environment on phenotypic traits. Hyppo-X was evaluated on two real-world plant (maize) datasets to test the ability of this approach. Rahaman et al. [67] proposed a statistical framework for the processing of phenomics data that integrates DM and ML methods.
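The feature-to-trait mapping underlying such studies can be sketched with a toy regression: synthetic image-derived features predict final biomass, with ordinary least squares standing in for the SVM or random-forest models used in practice. The features, weights, and noise level are all illustrative assumptions.

```python
import numpy as np

# Sketch: predict final biomass from early image-derived features.
# The data are synthetic; real studies use measured features such as
# projected area, height, and color statistics.
rng = np.random.default_rng(42)
n_plants, n_features = 200, 5
X = rng.normal(size=(n_plants, n_features))             # feature matrix
true_w = np.array([3.0, -1.5, 0.5, 0.0, 2.0])           # assumed effects
y = X @ true_w + rng.normal(scale=0.1, size=n_plants)   # "final biomass"

# Fit weights by least squares and evaluate the fit (R^2)
w, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ w
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(r2, 3))  # near 1.0 on this low-noise synthetic data
```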
Deep learning (DL), a subset of ML, has emerged as a versatile tool for assimilating large amounts of heterogeneous data and providing reliable predictions of complex and uncertain phenomena. Convolutional neural networks (CNNs), deep CNNs (DCNNs), and automated image-analysis pipelines have been designed and developed using open-source software libraries such as OpenCV, scikit-image, and TensorFlow, and model frameworks such as You Only Look Once (YOLO) [68-70] and capsule networks are popular in current crop studies. Ubbens et al. [68] introduced an open-source DL tool called Deep Plant Phenomics, which provides pre-trained neural networks for several common plant phenotyping tasks, as well as an easy-to-use platform that plant scientists can use to train these models for their own phenotyping applications.
DL has surged across multiple research fields, producing state-of-the-art results in numerous tasks that were previously assumed to be difficult for computers to handle. It can help researchers transform big multi-omics data into biological knowledge. A variety of DL models and architectures have performed well on a variety of plant phenotyping tasks. The identification and counting of crop traits using DL is a hot research field. To improve the accuracy of rice detection and counting in the field, Zhou et al. [69] applied a CNN to perform automatic rice phenotype measurements. To detect regions containing flowering panicles in ground-level RGB images of paddy rice, Desai et al. [70] used a CNN to estimate heading date automatically. Yang et al. [71] used a feature pyramid network mask (FPN-Mask) to accurately segment rice leaves and panicles and to calculate the leaf-to-panicle ratio (LPR) of the rice canopy during the grain-filling stage. DL has been applied to other crops to carry out identification, segmentation, and counting of specific traits, including counting maize tassels in the wild [72], image-based plant phenotyping for root and shoot feature identification and localization in bread wheat [73], identification and counting of wheat spikes in digital images made under natural field conditions [74], multi-trait prediction in wheat [75], and detection of panicles in cereal crops [76]. Several studies have focused on the classification of crop cultivars using seeds. For example, ten major cultivars of basmati rice were identified and differentiated based on seed images [77], haploid seeds were sorted using features based on color, texture, and morphology [78], and soybean cultivars were quickly identified from seeds by DL [79]. In addition to crop seed differentiation, crop detection and classification is also an interesting research area for DL researchers. It has been used in many crop studies, including plant seedling detection and cotton counting in the field [80], quick and accurate crop imaging during crop growth monitoring [81], and detection of intact green tomatoes regardless of occlusion or fruit growth stage [82].
Crop yield prediction using DL is another popular research field. Wu et al. [83] applied a Faster region-based CNN (Faster R-CNN) to quickly quantify rice kernels per panicle. Khaki et al. [84] used a deep neural network (DNN) for maize yield prediction. Montesinos et al. [75] reported a study in which multi-trait prediction, including the grain yield (GY) of 270 durum wheat lines, was evaluated in 43 environments (country/location/year combinations) across a broad range of water regimes in the Mediterranean Basin and other locations. Xu et al. [85] performed automatic segmentation of wheat ear images captured by handheld devices for rapid and accurate wheat ear counting. As in crop production, weed recognition and segmentation are important in crop breeding. Teimouri et al. [86] used a CNN for automatic estimation of weed species and growth stages in in situ images containing 18 weed species or families. Ma et al. [87] used a fully convolutional network (FCN) for rice seedling and weed image segmentation at the seedling stage in paddy fields. For the recognition of diseases and pests, DL has been used in crops including rice, wheat, tomato, and potato. In rice, Li et al. [88] applied Faster R-CNN and YOLO v3 to construct video detection systems for plant diseases and pests, aiming to build a real-time crop disease and pest video detection system. In wheat, a CNN was used to grade wheat stripe rust [89] and to automatically classify and locate wheat mites [90]. In Table 1, applications of DL in agriculture over the past few years are listed along with the models and architectures used in them.
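At its simplest, the counting step that DL detectors refine can be sketched as connected-component counting on a binary segmentation mask. The toy mask below is hypothetical; real pipelines replace the fixed mask with learned detections or segmentations.

```python
import numpy as np

def count_blobs(mask):
    """Count connected foreground regions in a binary image.

    A minimal stand-in for the counting step of ear/kernel-counting
    pipelines. Uses 4-connectivity and an iterative flood fill so it
    needs no external image-processing library.
    """
    mask = np.asarray(mask, dtype=bool).copy()
    h, w = mask.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j]:
                count += 1
                stack = [(i, j)]
                while stack:  # flood-fill this component, clearing its pixels
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y, x]:
                        mask[y, x] = False
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count

# Three separate "ears" in a toy segmentation mask
img = np.zeros((8, 8), dtype=bool)
img[1:3, 1:3] = True
img[5:7, 0:2] = True
img[4:6, 5:8] = True
print(count_blobs(img))  # 3
```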
Table 1. Applications of deep learning (DL) in crop phenotyping.
Artificial intelligence technology, especially DL, has greatly advanced the recognition, classification, analysis, and visualization of big phenotypic data and will continue to play an important role in advancing research and product development in the agricultural industry. In the future, with the further accumulation of high-throughput, multi-dimensional, and multi-scale phenotypic data, the construction of suitable analysis platforms for updating big-data analysis software and hardware systems, together with the integration and interactive visualization of the data-analysis process, will become both necessary and feasible.
With the development of crop phenomics, phenotyping facilities are required to collect spatio-temporal phenotypic information on samples at high throughput and multiple scales, and the cultivation and collection of samples are time-consuming and laborious. To promote the development of phenomics, it is necessary to promote the sharing and use of data and information, an activity that will require the establishment of data standards. Only through standardization of collection processes, definitions and descriptions, and data analyses can phenomics big data be effectively shared. Standardization and management of phenotypic data can include data preprocessing, data integration, data storage, and other aspects. The ISA-Tab system (cropnet.pl/phenotypes) and MIAPPE (Minimal Information About Plant Phenotyping Experiments) are international phenotype data standards. Both standards describe the metadata required by phenotypic experiments using ontologies that provide standard attribute lists for data-management systems.
Many institutions have constructed data integration and storage systems for crop phenomics data. A joint team from China and the United Kingdom led by Zhou Ji, using IoT technology, has developed CropSight [91], an open-source information management system for automated data acquisition by IoT sensors and phenotyping platforms. The PHIS (Phenotyping Hybrid Information System) [92] of the French National Institute for Agricultural Research (Institut National de la Recherche Agronomique, INRA) integrates and manages phenotypic data from multiple experiments and platforms using an ontology-driven architecture. Phenotyping platforms and data-management systems are listed in Table 2.
One of the challenges in crop breeding is increasing both yield potential and yield stability [93]. Crop phenotypic traits under stress are indicators of yield stability. Some morphological and physiological response traits at different growth stages are used for crop yield-potential estimation. The main application of phenotyping techniques is integrating a variety of phenotypic technologies to evaluate abiotic and biotic stress as well as measure traits associated with yield potential to promote crop genetic improvement and breeding (Table 3).
Drought, salinity, and nutrient deficiencies are abiotic stresses that reduce crop yields worldwide. The responses of plants to abiotic stresses are complex, and a given trait can experience positive, negative, or no effects depending on the stress scenario [94]. Thus, phenotyping abiotic stress resistance is often challenging. Crop phenotyping techniques allow the measurement of plant responses to various types of abiotic stress at all growth stages for use in abiotic resistance breeding.
Drought stress is associated with reduced water availability and cellular dehydration. Typically, plant water loss is determined by stomatal conductance and leaf area. Drought stress reduces stomatal conductance, limiting photosynthesis and growth and lowering heat dissipation, thereby increasing canopy temperature [45], so that temperature can be used as an indicator of plant response to water stress [95]. A ground phenotyping platform with a thermometer sensor and RGB cameras, developed to phenotype traits associated with drought tolerance in the field, required 2 h and 27 min to collect data in a 0.87-hectare field at a speed of 0.75 m s−1 [96]. Thermal cameras combined with UAVs permit scanning larger plot sizes in less time, allowing the selection of water stress-resistant genotypes [97]. Drought stress can also alter crop growth status, detectable by combining RGB imaging and near-infrared information to select water stress-resistant genotypes [98]. For example, Yang et al. [99] developed the RAP system and extracted image-based traits associated with biomass, greenness, and morphology for drought-resistance linkage analyses of 507 rice accessions, identifying 433 loci associated with drought resistance.
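One widely used way to turn canopy temperature into a stress indicator is the crop water stress index (CWSI), which normalizes the measured canopy temperature between fully transpiring and non-transpiring reference temperatures. The implementation below is a minimal sketch with illustrative temperature values.

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index from thermal measurements.

    CWSI = (Tc - Twet) / (Tdry - Twet), where Twet is the temperature of a
    fully transpiring (well-watered) reference and Tdry of a
    non-transpiring one. Values near 0 indicate well-watered plants;
    values near 1 indicate severe water stress.
    """
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Canopy midway between the wet and dry references -> moderate stress
print(cwsi(t_canopy=28.0, t_wet=24.0, t_dry=32.0))  # 0.5
```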
Evaluation of salinity tolerance-associated traits is another target in crop breeding. Salinity stress can also reduce stomatal conductance, and the plant response can be detected in visible (VIS) to near-infrared (NIR) spectral reflectance images [100]. VIS and NIR images of plant growth were captured with Scanalyzer3D [101] to calculate plant growth and leaf health nondestructively and to characterize plant salinity tolerance mechanisms, including Na+ exclusion, osmotic tolerance, and tissue tolerance. Similarly, Hairmansis et al. [102] determined the salinity tolerance of two rice cultivars using an image-based method. They used RGB and fluorescence images to sense tissue ion concentrations, differentiate between the ionic and osmotic stages of salinity stress, and identify the genetic basis of salinity tolerance. With the same platform, the projected shoot area and color parameters were identified as the most informative traits for salt tolerance in lentils [103]. Hyperspectral imaging combined with machine learning can predict crop traits such as sodium concentration, photosynthetic rate, and transpiration, a capability that is useful for studying plant salinity stress [104].
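Projected shoot area, one of the traits found informative for salt tolerance above, can be approximated from a top-view RGB image by counting pixels classified as plant. A simplified sketch using an excess-green threshold; the threshold value and pixel-to-area scale are assumptions for illustration, not the cited platform's method:

```python
def projected_shoot_area(pixels, mm2_per_pixel=0.25):
    """Estimate projected shoot area (mm^2) from a flat list of (R, G, B) pixels.

    A pixel is classified as plant when its excess-green value
    (2G - R - B) exceeds an assumed threshold of 20.
    """
    plant_pixels = sum(1 for r, g, b in pixels if 2 * g - r - b > 20)
    return plant_pixels * mm2_per_pixel

# Three plant-like pixels and one soil-like pixel:
image = [(40, 120, 30), (60, 150, 50), (35, 110, 40), (120, 100, 90)]
print(projected_shoot_area(image))  # 0.75
```

Tracking this area through time under control and salt treatments gives the growth-rate comparison on which image-based salinity screens rest.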
Crop nutrient deficiencies can lead to reduced chlorophyll content and growth rate and result in plant tissue necrosis and higher sensitivity to disease [105]; symptoms can be monitored using RGB, multispectral, and hyperspectral sensors. Nitrogen (N) is the most important nutrient, owing to its connection to biomass and yield [48]. A combination of digital imagery and appropriate machine learning methods facilitates convenient and reliable estimation of crop N nutrition [106]. A movable phenotyping platform integrated with multispectral and hyperspectral sensors could estimate N content under various N treatments [107]. Because existing techniques for assessing crop nutrient deficiencies are expensive, a consumer-grade RGB camera can be used to generate a range of RGB vegetation indices, which show good performance in the assessment of low-N conditions [50]. A sensing system using visible and near-infrared sensors that could capture reflectance at 12 different wavelengths was developed at a cost of USD 200 to predict N levels in leaves [108].
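RGB vegetation indices of the kind used in [50] are typically simple per-pixel combinations of normalized channel values. One common example is the excess-green index (ExG); the function below is an illustrative sketch of this standard index, not necessarily the exact index set from that study:

```python
def excess_green(r, g, b):
    """Excess-green index (ExG) from normalized chromatic coordinates.

    Higher values indicate greener (typically better N-supplied) tissue.
    """
    total = r + g + b
    if total == 0:
        return 0.0
    rn, gn, bn = r / total, g / total, b / total
    return 2 * gn - rn - bn

# A green, well-fertilized pixel vs. a pale, N-deficient one:
print(round(excess_green(50, 150, 50), 2))    # 0.8
print(round(excess_green(120, 140, 100), 2))  # 0.17
```

Averaging the index over a plot and regressing it against measured N content is the usual route from a consumer-grade camera to an N-status estimate.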
Invasion by diseases and pests impairs crop production and food security. Deployment of genes for disease and insect resistance is one of the most economical and effective ways to reduce pesticide residues and is a primary target of crop breeding. Crop biotic stress can cause changes in plant texture and color, canopy morphology, leaf structure, transpiration rate, and optical characteristics [109]. Accordingly, phenotyping platforms and methods have been developed to detect biotic stress using optical sensors [110,111]. To assist in crop disease resistance breeding, phenotyping techniques should be capable of (a) detecting changes in crop health status brought about by pathogens, (b) identifying various diseases, and (c) quantifying disease severity [112].
Pathogens such as rusts and molds can produce symptoms such as yellowing and necrosis of leaves that can be detected by visible imaging. RGB images of wheat spikes can be used to assess wheat Fusarium head blight severity [113]. Compared with RGB images, multispectral and hyperspectral imaging systems are much more sensitive to changes in leaf spectral reflectance due to pathogen infection [114]. Some pathogens can damage the xylem tissue of crops, reducing transpiration, closing stomata, and causing canopy temperature variation [115]. Accordingly, thermal infrared imaging technology has been used [116] to detect changes in transpiration rate in the early stage of disease infection at the plant and canopy levels. Because chlorophyll fluorescence can provide quantitative information about photosynthetic function, it has been used to detect the accumulation of compounds associated with disease resistance in leaf tissues and changes in photosynthetic rate caused by pathogenesis [117]. Although prediction of diseases at an early stage is desirable, disease symptoms tend to be mild in early stages of infection and thus difficult to detect even with high-spatial-resolution and high-spectral-resolution images. A ground-based real-time remote sensing system that integrates hyperspectral and multispectral fluorescence was developed to detect wheat yellow rust before it can be clearly identified [118]. Similarly, chlorophyll fluorescence spectra of early rice blast infection were acquired at the leaf level and changed with the severity of infection [119].
Insect pests are primary vectors of many important plant diseases. With a high-resolution web camera and a remote management system, Fukatsu et al. [120] developed a remote pheromone trap system based on image processing and wireless sensor networks to continuously monitor the presence of rice bugs in the field. An automatic video-tracking platform was developed [121] to quantify aphid feeding behavior for evaluating plant resistance levels; it could screen 100 samples in parallel and could also be adapted for screening large populations for plant resistance to aphids. Leaf surface temperature, photosynthetic activity, gas exchange, carotenoid concentration, and other physiological characteristics of crops also change under insect attack, and these responses are associated with specific spectral features. For this reason, a UAV phenotyping platform is also a good choice for insect resistance screening.
Accurate pre-harvest crop yield estimation is critical for crop field management and food security [122]. Nondestructive prediction of crop yield with high accuracy would allow the identification of high-yielding genotypes [123]. Grain crop yield is generally described as the product of the number of kernels per unit area and the individual kernel weight. Typically, direct or indirect approaches are used to evaluate crop yield.
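The product described above reduces to a unit-conversion calculation; a minimal sketch with illustrative example numbers:

```python
def grain_yield_t_ha(kernels_per_m2, thousand_kernel_weight_g):
    """Grain yield (t/ha) from kernel density and 1000-kernel weight."""
    grams_per_m2 = kernels_per_m2 * thousand_kernel_weight_g / 1000.0
    # 1 ha = 10,000 m^2; 1 t = 1,000,000 g
    return grams_per_m2 * 10000 / 1e6

# 18,000 kernels per m^2 at a 1000-kernel weight of 40 g:
print(grain_yield_t_ha(18000, 40.0))  # 7.2
```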
The direct approach is to calculate the yield from the number of spikelets per unit area, the number of kernels per spike, and the kernel weight. The complexity of the field environment, involving illumination differences and panicle shape deformation, makes accurate panicle segmentation challenging. To solve this problem, Xiong et al. [124] built a Panicle-SEG-CNN model based on rice plot images that gave robust and accurate results for rice panicle segmentation in the field. To measure grain traits, Duan et al. [125] developed a labor-free instrument to thresh rice panicles and evaluate rice yield traits. Based on an RGB camera and a weight sensor, it could calculate the number of filled spikelets, kernel length and width, and 1000-kernel weight, with a throughput of 1440 plants per day. The facility could also accommodate wheat and barley. Similarly, Song et al. [126] designed a high-throughput automatic measuring system for corn kernel traits that could perform ear parameter measurement, ear threshing, kernel parameter measurement, kernel packaging, and label printing. The direct approach can combine both spike and grain characters to calculate crop yield with high precision, but spike and grain characters can be estimated only at late stages of crop growth or after harvest, restricting the efficiency of yield estimation in large-scale breeding.
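Instruments of the kind described above combine an image-based kernel count with a bulk weight reading, from which the 1000-kernel weight follows as a simple ratio. A sketch of that calculation, not the cited instruments' actual code:

```python
def thousand_kernel_weight(total_weight_g, kernel_count):
    """1000-kernel weight (g) from a weight-sensor reading and an image-based count."""
    if kernel_count <= 0:
        raise ValueError("kernel_count must be positive")
    return total_weight_g / kernel_count * 1000.0

# A 30 g threshed sample counted at 1200 kernels:
print(thousand_kernel_weight(30.0, 1200))  # 25.0
```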
The indirect approach is to use ground-based, UAV, and satellite phenotyping platforms to estimate crop yield in different growth periods by collecting and analyzing yield traits before harvest [127]. Traits including crop density, canopy cover, green biomass, canopy water mass, leaf senescence, and chlorophyll content can be assessed from image and spectral information [128]. Under conditions of sufficient water supply, cultivars with lower canopy temperature can show increased yield, so that crop yield prediction can be based on canopy temperature [129]. The relationship between crop photosynthesis and chlorophyll content means that chlorophyll content can also be used to predict crop yield [130]. The accuracy of yield prediction models increases with advances in imaging tools and modeling parameters. Most yield estimation models are based on canopy reflectance.
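Canopy-reflectance yield models commonly start from a spectral index such as the normalized difference vegetation index (NDVI), computed from red and near-infrared reflectance; measured yields are then regressed on the index. A minimal sketch of the index itself, with illustrative reflectance values:

```python
def ndvi(red, nir):
    """Normalized difference vegetation index from reflectances in [0, 1]."""
    if red + nir == 0:
        return 0.0
    return (nir - red) / (nir + red)

# A dense green canopy (low red, high NIR) vs. a sparse or senescing one:
print(round(ndvi(0.05, 0.45), 2))  # 0.8
print(round(ndvi(0.30, 0.40), 2))  # 0.14
```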
Owing to the high efficiency of UAVs for data collection, aerial platforms equipped with multiple sensors are the main method of crop yield estimation. Yang et al. [131] used DCNNs to estimate rice yield at the ripening stage from RGB and multispectral imaging data acquired by a UAV. To improve yield prediction accuracy, Maimaitijiang et al. [132] collected canopy spectral, structural, thermal, and texture features extracted from RGB, multispectral, and thermal images with a low-cost UAV to predict soybean yield, showing that multimodal data integration improves yield prediction accuracy and is adaptable to spatial variation.
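Multimodal integration of the kind reported in [132] can be sketched as feature-level fusion: per-plot features from each sensor are concatenated and fed to a single regression model. The data below are invented for illustration, and the linear model is a stand-in for the more sophisticated learners used in practice:

```python
import numpy as np

# Invented plot-level features: [NDVI, canopy temperature (degC), texture stat]
X = np.array([
    [0.62, 27.1, 0.31],
    [0.71, 25.4, 0.40],
    [0.55, 28.9, 0.22],
    [0.68, 26.0, 0.37],
    [0.59, 27.8, 0.28],
])
y = np.array([2.9, 3.6, 2.3, 3.4, 2.7])  # measured yield (t/ha)

# Fuse by concatenation (one row per plot already holds all modalities)
# and fit a linear model; a ones column provides the intercept.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
print(pred.shape)  # one prediction per plot
```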
The rapid development of sensors, image-processing technology, data-analysis technology, and phenotyping systems at multiple scales, including microscopic, ground, and aerial phenotyping platforms, has provided effective solutions for high-throughput crop phenotyping in the field as well as in controlled environments. By carrying various sensors, modern phenotyping platforms can evaluate crop yield potential and yield stability traits for crop breeding.
Phenotyping platforms with multiple sensors have multiple applications in crop breeding (Table 4). Microscopic phenotyping is based mainly on X-CT or microscopy imaging technology to observe the microstructure of crop organs or tissues, but is too expensive for most research groups. Portable instruments offer great mobility, but they can collect data only at fixed points, limiting their application to high-throughput phenotyping. Stationary phenotyping platforms can carry a variety of sensors to simultaneously monitor potted crops as well as crops in specific field areas. However, their operation is restricted to a limited area and their construction is expensive. Movable phenotyping platforms are ideal ground-based phenotyping solutions, but acquiring sensor data stably during movement is challenging, and there is no universal movable phenotyping platform suitable for all crop growing environments, in particular paddy fields. UAV-based phenotyping platforms are flexible and easy to operate, with low hardware cost and high efficiency in comparison with ground-based platforms. However, restricted by their battery and loading capacities, UAV-based phenotyping platforms can carry only a limited number of sensors for long-term observation. Exploiting the advantages of ground and aerial platforms by combining the two is likely to offer flexible crop phenotyping in the future. For example, a tractor-based proximal crop-sensing platform and a UAV-based platform were combined to target complex traits such as plant height in sorghum [133].
Despite the progress in high-throughput crop phenotyping technology and platforms, many challenges remain to be addressed for more accurate estimation of most crop phenotypic traits.
Stationary phenotyping platforms can be used for monitoring at high resolution, but their operation is restricted to a limited area and is usually expensive. UAV platforms offer advantages for high-throughput phenotyping but are restricted by their flight time and can obtain crop canopy information only at limited resolution. Intelligent phenotyping robots combine the detection accuracy of stationary phenotyping platforms with the detection efficiency of UAV phenotyping platforms. Phenotyping robots equipped with autonomous navigation and power supply systems permit flexible movement over different plots and can simultaneously obtain high-resolution data from the crop canopy and sides as well as crop growth environment information.
Table 4 Advantages and limitations of crop phenotyping platforms and imaging techniques.
Phenotypic data obtained by a single sensor are limited, and the data quality is usually affected by the field environment, making it difficult to meet the demands of modern crop breeding. The combination of multi-source data can reveal new biological phenomena. For example, in screening crops for abiotic and biotic stress resistance, a variety of phenotypic data are generally combined for effective and accurate analysis. Simultaneous acquisition of multiple crop traits by integrating multiple sensors, and the development of novel sensors such as laser-induced breakdown spectroscopy and electrochemical micro-sensing [134] for the detection of plant signaling molecules, are challenges for the future.
Accelerating the application of artificial intelligence and other modern information technologies in agriculture, especially in crop breeding, is urgently needed. DL has been applied to some extent in agriculture for classification of crops and their organs, identification of diseases and insect pests, identification and counting of fruits, weed identification, yield prediction, and other purposes. However, applications in crop breeding and in other fields such as crop irrigation, water stress detection, hail damage assessment, and greenhouse monitoring are still rare. Given that DL usually requires a large amount of data to train models, insufficient sharing of data sets in this field makes data collection time-consuming and difficult. Information isolation should be reduced in the future, and modern information technologies such as DL should play a key role in more agricultural fields.
Massive amounts of crop data have been obtained via different phenotyping platforms, and much progress has been made in the extraction of phenotypic traits with DL methods. However, data collected by various sensors still contain abundant information that invites further exploration. There is still an urgent need to mine sensor data to extract integrated information and conduct association analyses with key crop traits. Because the data types and formats collected by various phenotypic sensors differ, combining and managing the data acquired by multiple sensors is another challenge for future phenotypic data analysis and calls for collaboration among multidisciplinary laboratories.
CRediT authorship contribution statement
Peng Song, Jinglu Wang, Xinyu Guo, and Wanneng Yang jointly wrote the article and prepared the figures and tables. Chunjiang Zhao helped to modify and improve the article.
Declaration of competing interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgments
This work was supported by the National Key Research and Development Program of China (2016YFD0100101-18, 2020YFD1000904-1-3), the National Natural Science Foundation of China (31601216, 31770397), and the Fundamental Research Funds for the Central Universities (2662019QD053, 2662020ZKPY017). We thank Jiawei Shi of Huazhong Agricultural University for assisting in preparing the figure.