Field growth parameter estimation system of winter wheat using RGB digital images and deep learning
Li Yunxia1, Ma Juncheng2, Liu Hongjie3, Zhang Lingxian1※
(1. College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China; 2. Institute of Environment and Sustainable Development in Agriculture, Chinese Academy of Agricultural Sciences, Beijing 100081, China; 3. Wheat Research Institute, Shangqiu Academy of Agricultural and Forestry Sciences, Shangqiu 476000, China)
To obtain winter wheat growth information in the field accurately and rapidly, this study designed and implemented a deep-learning-based system for estimating field growth parameters of winter wheat. The system mainly consists of a growth parameter estimation module and a wheat ear counting module. The growth parameter estimation module builds its estimation model on the residual network ResNet18 to estimate the Leaf Area Index (LAI) and Above Ground Biomass (AGB) of winter wheat at the seedling stage, and its generalization ability was tested via transfer learning. The ear counting module builds its counting model on Faster R-CNN combined with Non-Maximum Suppression (NMS) to count ears accurately at the flowering stage. Results on data from the 2017—2018 and 2018—2019 growing seasons show that the ResNet18-based estimation model achieved coefficients of determination (R²) of 0.83 and 0.80 for LAI estimation and 0.84 for AGB estimation in both seasons, outperforming models built on a conventional Convolutional Neural Network (CNN), VGG16, and GoogLeNet; the generalization test further showed that the model is robust to seasonal differences in the data. After NMS optimization, the R² of the Faster R-CNN-based ear counting model rose from 0.66 to 0.83, an improvement of 25.8%, and the NRMSE fell from 0.19 to 0.05, a reduction of 73.7%. Compared with a CNN-based classification counting model, the Faster R-CNN+NMS counting model performed better, with an R² of 0.83 (33.9% higher) and a per-ear recognition time of 1.009 s (20.7% faster). Overall, the system meets the demand for field growth parameter estimation of winter wheat and can support refined field management.
machine vision; image processing; models; winter wheat; deep learning; leaf area index; above ground biomass; wheat ear counting
Winter wheat is a major grain crop in China. According to the National Bureau of Statistics, winter wheat accounts for about 93.5% of China's total wheat planting area and about 95% of annual wheat output, making it vital to national food security [1]. Refined field management and the breeding of improved varieties are essential for raising winter wheat yield and quality, and non-destructive monitoring of winter wheat growth can provide rich data support for cultivation and breeding. The seedling stage (before jointing) is a critical period that shapes the growth trend and grain yield of winter wheat [2]. During this stage, leaves grow rapidly and biomass accumulates quickly, so the Leaf Area Index (LAI) and Above Ground Biomass (AGB) adequately reflect seedling-stage growth [3-4]. Once winter wheat enters the flowering stage, ear development is largely complete; the ear number is the key parameter characterizing growth at this stage, and also a critical factor in evaluating winter wheat germplasm and yield [5]. Ear number also offers some guidance for nutrient diagnosis and disease identification [6]. Monitoring growth parameters at the key growth stages therefore makes it possible to grasp the crop's condition, optimize field management, and breed high-yield, high-quality varieties, which is of clear practical value.
Traditional measurement of winter wheat growth parameters relies on direct, usually destructive sampling, which consumes substantial labor and materials, is inefficient, and cannot meet the demand for high-throughput, automated growth monitoring [3-4]. For non-destructive crop monitoring, many researchers at home and abroad have used remote sensing data for growth parameter inversion [3,7-9] and yield prediction [7,10-12]. For example, Wu et al. [3] inverted winter wheat LAI from Synthetic Aperture Radar (SAR) data; Fernandez-Gallego et al. [7] counted wheat ears from UAV imagery; Wang et al. [8] inverted field-scale wheat LAI from HJ-CCD satellite imagery; and Xie et al. [11] estimated winter wheat yield from satellite imagery. Remote-sensing-based inversion supports monitoring and decision-making over large areas, but data acquisition is sensitive to weather and terrain, requires dedicated equipment, and lacks flexibility [4]; moreover, post-processing of remote sensing data is relatively complex [13]. Computer vision, meanwhile, has developed rapidly and is widely applied in agriculture, including disease and pest diagnosis [14], crop growth monitoring [15-17], and plant image segmentation [18], making it an effective tool for non-destructive measurement of growth parameters. To cut data acquisition costs and increase flexibility, some researchers capture high-resolution digital images with consumer cameras and apply them to growth parameter acquisition [17,19-20] and disease identification [21]. Fernandez-Gallego et al. [19] built a wheat ear counting model from digital images, and Deng et al. [21] identified wheat smut from digital images. These studies, however, typically involve complex image preprocessing such as denoising, segmentation, and feature extraction, which is vulnerable to environmental noise and subjective choices, so accuracy still needs improvement [22]; deep learning can effectively address these problems. Convolutional Neural Networks (CNN), among the most widely used deep learning models, take images as input and learn image features autonomously, overcoming the limitations of hand-crafted feature extraction, and have been applied broadly in agriculture, including growth parameter acquisition [4,20], quality inspection [23], and ear counting [5,24]. Ma et al. [4,20] combined CNN with visible-light images to build winter wheat LAI and AGB estimation models, and Zhang et al. [24] designed a classification-based CNN system that detects and counts winter wheat ears under field conditions. Although these studies achieved promising results, estimation accuracy, efficiency, and generalization ability still need improvement. The Residual Neural Network (ResNet) [25] is among the most advanced convolutional neural network models, with ResNet18 and ResNet50 as representative architectures. ResNet has a deeper network structure, and the residual modules it introduces greatly reduce the number of model parameters [26-28], effectively overcoming the large parameter counts and high computational costs of current CNN applications.
In summary, this study develops a system for estimating growth parameters of winter wheat at key growth stages based on ResNet18 and RGB images. For estimating seedling-stage LAI and AGB, the ResNet18 architecture is modified and the estimation model is built on winter wheat field image data from two consecutive growing seasons (2017—2018 and 2018—2019). For ear counting at the flowering stage, a counting model is built with the Faster Region-based Convolutional Neural Network (Faster R-CNN), using ResNet50 as the feature extraction backbone, combined with Non-Maximum Suppression (NMS).
1.1.1 Image data acquisition
The experiment was conducted at the wheat experimental station of the Shangqiu Academy of Agricultural and Forestry Sciences, Henan Province. Twelve experimental plots of 2.4 m × 5 m (length × width) were set up, and within each plot three non-overlapping 1 m × 1 m image sampling areas were marked with white borders. High-resolution winter wheat canopy images and the corresponding field sampling data were collected over two consecutive growing seasons (2017—2018 and 2018—2019). In the 2017—2018 season, the wheat was sown on October 14, 2017 and data were collected 17 times; in the 2018—2019 season, the wheat was sown on October 15, 2018 and data were collected 20 times. Canopy images were captured on sunny or lightly cloudy days between 9:00 and 11:00 a.m. with a digital camera (EOS 600D, Canon China), with the flash off and no optical zoom, the lens pointing vertically downward at 1.5 m above the ground. The original canopy images were 5 184 × 3 456 pixels and saved in .jpg format.
In the 2017—2018 season, 612 seedling-stage canopy images were collected, and in the 2018—2019 season, 720; examples are shown in Fig. 1a. Flowering-stage ear images were collected on May 2, 2018 (36 images) and on May 5, 2019 (36 images); examples are shown in Fig. 1b.
1.1.2 Field measurement data acquisition
LAI and AGB field sampling was carried out at the same time as image acquisition, using field sampling followed by laboratory measurement. Five winter wheat plants were randomly sampled from the non-imaged part of each plot (outside the white borders), bagged by plot number, and taken to the laboratory for measurement. Winter wheat LAI was calculated as shown in Eq. (1).
During measurement, 5-cm-long segments were cut from the leaves of the sampled plants; these segments were laid side by side and their width measured to obtain the sample leaf area. The segments and the remaining leaves of the sampled plants were then bagged separately, oven-dried at 70 ℃ [29-30] to constant mass, and weighed on a balance with 0.001 g precision to obtain the sample leaf mass and total leaf mass. After the LAI measurement, the remaining plant material was dried at 70 ℃ to constant mass and weighed, and the measured AGB was calculated with Eq. (2).
where W is the dry mass of the remaining plant material, g.
Measured ear counts were obtained by image-based manual counting under a uniform counting standard: five staff with agronomy backgrounds counted each image independently, and the mean was taken as the measured ear count for that image.
1.1.3 Data augmentation and dataset construction
Separate datasets were built for seedling-stage canopy images and flowering-stage ear images. The seedling-stage canopy images were resized to 224 × 224 pixels, and the 2017—2018 season data were split 2:1 into a training-validation set of 408 images and a test set of 204 images. To further enrich the experimental data and reduce the likelihood of overfitting, the training-validation set was augmented by rotation, flipping, and brightness adjustment [20]. The augmented training-validation set consists of three parts: 1) the original training-validation images; 2) rotated and flipped images, with rotations of 90°, 180°, and 270°, and horizontal and vertical flips; 3) brightness-adjusted images, produced by converting the images from part 2) from the RGB (Red, Green, Blue) color space to the HSV (Hue, Saturation, Value) color space and scaling the Value channel up and down by 10% and 20%. Altogether, this expanded the training-validation set 26-fold.
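The 26-fold expansion above (1 original + 5 geometric variants + 5 × 4 brightness variants) can be sketched in a few lines. The snippet below is an illustrative Python sketch only (the published system was implemented in MATLAB); it uses the standard-library `colorsys` module for the RGB-HSV conversion and a tiny 2 × 2 pixel grid in place of a real 224 × 224 image.

```python
import colorsys

def rotate90(img):
    # rotate a row-major pixel grid 90 degrees clockwise
    return [list(row) for row in zip(*img[::-1])]

def flip_h(img):
    return [row[::-1] for row in img]

def flip_v(img):
    return img[::-1]

def scale_value(img, factor):
    # convert each RGB pixel to HSV, scale the V channel, convert back
    out = []
    for row in img:
        new_row = []
        for (r, g, b) in row:
            h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            v = min(v * factor, 1.0)
            r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
            new_row.append((round(r2 * 255), round(g2 * 255), round(b2 * 255)))
        out.append(new_row)
    return out

def augment(img):
    variants = [img]                       # part 1: the original image
    r90 = rotate90(img)
    r180 = rotate90(r90)
    r270 = rotate90(r180)
    geo = [r90, r180, r270, flip_h(img), flip_v(img)]
    variants += geo                        # part 2: 5 geometric variants
    for g_img in geo:                      # part 3: V channel -20%, -10%, +10%, +20%
        for factor in (0.8, 0.9, 1.1, 1.2):
            variants.append(scale_value(g_img, factor))
    return variants

img = [[(10, 120, 30), (200, 50, 50)],
       [(0, 0, 0), (255, 255, 255)]]
print(len(augment(img)))  # 1 + 5 + 5*4 = 26
```

Applying `augment` to each training-validation image yields the 26-fold expansion described in the text.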
The ear counting dataset contains canopy images and the corresponding label data. Ears were annotated with rectangular boxes using MATLAB's built-in Image Labeler to produce labels matched to each image. The canopy images were resized to 2 500 × 2 500 pixels and split 2:1 into a training-validation set of 48 images and a test set of 24 images. From each training-validation image, 2-3 mutually non-overlapping 640 × 640-pixel detection images were randomly extracted, yielding a training-validation image set of 100 images; these images and their corresponding labels were then split 6:4 into a training set of 60 images and a validation set of 40 images for training the ear counting model. The original test images were left unprocessed.
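As a minimal sketch of the splitting step described above (a ratio-based random split, here applied to the 6:4 division of the 100 extracted patches), assuming a fixed random seed for reproducibility; the file names are hypothetical and the original pipeline ran in MATLAB.

```python
import random

def split(items, ratio, seed=0):
    # shuffle a copy of the items, then cut by ratio, e.g. (6, 4) -> 60% / 40%
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    n_first = len(shuffled) * ratio[0] // sum(ratio)
    return shuffled[:n_first], shuffled[n_first:]

# 100 hypothetical 640x640 patch file names
images = [f"patch_{i:03d}.jpg" for i in range(100)]
train, val = split(images, (6, 4))
print(len(train), len(val))  # 60 40
```

The same function with `ratio=(2, 1)` reproduces the 2:1 split used at the plot-image level.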
1.2.1 Construction of the Leaf Area Index (LAI) and Above Ground Biomass (AGB) estimation model
Seedling-stage growth parameter estimation covers LAI and AGB. This study built the estimation model on ResNet18, one of the most widely used residual network models, which offers few parameters, high accuracy, and good training efficiency [31]. Using transfer learning, the ResNet18 pretrained on the ImageNet database was modified: the pretrained weights were retained, and the Softmax and classification layers tied to the original classification task were replaced with a fully connected layer and a regression layer. The fully connected layer contains 2 neurons, outputting winter wheat LAI and AGB; its weight and bias learning rate factors were set to 10 to speed up learning after transfer. Training used stochastic gradient descent with momentum, an initial learning rate of 0.000 01 (the largest rate at which the model converged), a batch size of 32, and a maximum of 300 epochs [4]. The network structure of the modified seedling-stage estimation model is shown in Fig. 2.
1.2.2 Construction of the ear counting model
Faster R-CNN is among the most widely used and accurate object detection algorithms [26], and Non-Maximum Suppression (NMS) is an effective algorithm for refining detection results. This study built the ear counting model on Faster R-CNN and used NMS to refine the counting results, achieving accurate counting of winter wheat ears. The counting method comprises five steps, as shown in Fig. 3: 1) feature extraction: the full image is input and feature maps are output; 2) region proposal: the feature maps are fed into the Region Proposal Network (RPN), which outputs region proposals; 3) proposal feature extraction: the proposals and feature maps are fed jointly into the Regions of Interest (ROI) pooling layer, which outputs per-proposal feature maps; 4) classification and regression: classification and bounding-box regression output each proposal's class and its position in the image, thereby classifying the target ears; 5) NMS refinement yields the final accurate ear count. Training used stochastic gradient descent with momentum, an initial learning rate of 0.001 (the largest rate at which the model converged well), a batch size of 1, and a maximum of 10 epochs. For ear detection, the negative overlap range was set to [0, 0.3] and the positive overlap range to [0.6, 1] [5].
The confidence score and the Intersection over Union (IoU) are two key parameters of the NMS algorithm. The confidence score is the probability that the object inside a bounding box is an ear: the higher the score, the more likely the box contains an ear. IoU measures the overlap between different boxes on the same ear: the larger the IoU, the greater the overlap between the two boxes. By tuning the score and IoU thresholds, this study optimized ear recognition accuracy and thus achieved accurate ear counting.
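A minimal pure-Python sketch of score-thresholded NMS, using the thresholds eventually adopted in this study (score 0.95, IoU 0.2); the box coordinates and scores below are made-up illustrations, and the real system ran inside MATLAB's detector pipeline rather than this code.

```python
def iou(a, b):
    # boxes as (x1, y1, x2, y2); intersection area over union area
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, score_thr=0.95, iou_thr=0.2):
    # drop low-confidence boxes, then greedily keep the highest-scoring box
    # and suppress any remaining box overlapping it at IoU >= iou_thr
    order = sorted((i for i, s in enumerate(scores) if s >= score_thr),
                   key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_thr]
    return keep

# two ears, each detected twice with heavily overlapping boxes,
# plus one low-confidence box filtered by the score threshold
boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30), (21, 21, 31, 31)]
scores = [0.99, 0.97, 0.98, 0.90]
print(nms(boxes, scores))  # [0, 2]
```

The ear count is simply the number of boxes kept after suppression, which is how duplicate detections of the same ear are removed before counting.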
1.2.3 Evaluation metrics
Model performance was evaluated with the coefficient of determination (R²), the Root Mean Square Error (RMSE), and the Normalized Root Mean Square Error (NRMSE). R² indicates the goodness of fit between estimated and measured values, and RMSE measures the error between them. NRMSE, obtained by dividing RMSE by the mean of the measured values, is a normalized form of RMSE that reflects changes in model performance more intuitively. A higher R² and lower RMSE and NRMSE indicate closer agreement between estimated and measured values and thus more accurate estimation.
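The three metrics can be written directly from their definitions; the small self-contained sketch below follows the NRMSE convention stated above (RMSE divided by the mean of the measured values), with illustrative example numbers.

```python
import math

def r2(y_true, y_pred):
    # coefficient of determination: 1 - SS_res / SS_tot
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

def rmse(y_true, y_pred):
    # root mean square error between measured and estimated values
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def nrmse(y_true, y_pred):
    # RMSE normalized by the mean of the measured values
    return rmse(y_true, y_pred) / (sum(y_true) / len(y_true))

measured = [1.0, 2.0, 3.0, 4.0]
estimated = [1.1, 1.9, 3.2, 3.8]
print(round(r2(measured, estimated), 2))     # 0.98
print(round(rmse(measured, estimated), 4))   # 0.1581
print(round(nrmse(measured, estimated), 4))  # 0.0632
```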
Based on image processing and deep learning, this study designed and implemented a system for estimating winter wheat growth parameters at key growth stages. The system takes field-collected RGB canopy images of winter wheat as input and outputs the growth parameters that characterize winter wheat growth: LAI, AGB, and ear count.
The system was implemented in MATLAB 2019b. The hardware environment was an Intel Xeon processor, 32 GB of RAM, and an NVIDIA Quadro P4000 GPU. The system comprises four functional modules: image acquisition, seedling-stage growth parameter estimation, ear counting, and system management; the overall structure and functional design, following the system's design goals and main functional requirements, are shown in Fig. 4. The image acquisition module imports high-resolution field RGB canopy images into the system, crops them as needed, and stores the cropped images on the server for growth parameter estimation. The seedling-stage estimation module runs the estimation model on the uploaded images to estimate the LAI and AGB of the sampling area, and can also plot growth parameter trends for analyzing seedling-stage growth. The ear counting module feeds the collected RGB canopy images into the counting model to count the ears in the sampling area. The system management module supports querying estimation results and historical records, as well as routine system administration and maintenance.
The system interfaces of the seedling-stage growth parameter estimation module and the ear counting module are shown in Fig. 5. The image import function loads images preprocessed by the acquisition module; the image processing function prepares the imported images to meet the estimation model's input requirements; the model run-and-display function estimates the winter wheat growth parameters and shows the results; and the result saving function stores the batch's results under the current date for later query. At the bottom of the interface, users can view stage-by-stage trends in the winter wheat growth parameters as needed.
The ResNet18-based seedling-stage estimation model takes seedling-stage canopy images as input and outputs the growth parameters LAI and AGB; the 2017—2018 estimation results are shown in Fig. 6. As Fig. 6 shows, the estimated LAI and AGB values fit the field measurements well, with R² of 0.83 and 0.84 and NRMSE of 0.32 and 0.31, respectively.
To further verify the estimation performance, comparison experiments were conducted against other models, with the results shown in Table 2. VGG16 [32], with its deeper network and smaller convolution kernels, reduces the parameters of the convolutional layers while preserving the receptive field for feature extraction. GoogLeNet [33] adopts Inception modules, which make the architecture more modular and increase network width, and adds auxiliary softmax classifiers at different depths to avoid vanishing gradients as the network deepens. In [4], a growth parameter estimation model was built on a conventional Convolutional Neural Network (CNN), improving estimation accuracy under field conditions. As Table 2 shows, the proposed model and the VGG16-based model both achieved good R², but VGG16 has far more parameters than ResNet18: at the same number of iterations, the VGG16-based model required about five times the training time of the ResNet18-based model, so the latter trains more efficiently. The GoogLeNet-based model trained in a time similar to the ResNet18-based model but had the lowest R² of the four models. The CNN-based model of [4] has a comparatively simple, shallow architecture and therefore trains quickly, but its R² is clearly lower than those of the ResNet18-based and VGG16-based models. Compared with the CNN-based model of [4], the ResNet18-based model raised R² for AGB estimation from 0.78 to 0.84 (up 7.7%) and cut NRMSE from 0.35 to 0.31 (down 11.4%); for LAI estimation, R² rose from 0.79 to 0.83 (up 5.1%) and NRMSE fell from 0.34 to 0.32 (down 5.9%). The ResNet18-based seedling-stage growth parameter estimation model thus achieved good results in both estimation efficiency and accuracy.
Table 2 Comparison of estimation results between the ResNet18-based model and other models
To further verify the model's generalization ability, it was tested on 2018—2019 season data. Feeding the preprocessed 2018—2019 images into the ResNet18-based seedling-stage estimation model gave an R² of 0.79 and an NRMSE of 0.36 for AGB (Fig. 7a), and an R² of 0.77 and an NRMSE of 0.50 for LAI (Fig. 7b). Performance dropped relative to the 2017—2018 results, indicating that the 2018—2019 images contain features not present in the 2017—2018 data. Moreover, unlike in 2017—2018, soil moisture was uneven during emergence in the 2018—2019 season: seedlings in drier areas grew more slowly, leading to uneven emergence. Consequently, plants sampled randomly within the same batch varied in growth status, and continuity between adjacent sampling batches was poor, which reduced the model's estimation accuracy.
Transfer learning retains the parameter weights learned on the 2017—2018 image dataset while learning from a new dataset, avoiding repeated learning of shared image features and improving training efficiency. The 2018—2019 images were therefore used for transfer learning, and the transferred ResNet18-based estimation model was retested; the results are shown in Figs. 7c and 7d. After transfer learning, R² for AGB estimation rose from 0.79 to 0.84 (up 6.3%) and NRMSE fell from 0.36 to 0.25 (down 30.6%); R² for LAI estimation rose from 0.77 to 0.80 (up 3.9%) and NRMSE fell from 0.50 to 0.34 (down 32%). These results indicate that the model generalizes well: it adapts to seasonal differences in the data and is robust.
The confidence score and IoU thresholds were tuned on the 40-image validation set. Score thresholds of 0.90 and 0.95 were combined with IoU thresholds from 0.1 to 0.5 in steps of 0.1, giving 10 parameter combinations, as shown in Table 3. As Table 3 shows, with NMS optimization, the R² of the Faster R-CNN+NMS counting model was clearly higher than that of the plain Faster R-CNN counting model. With a score threshold of 0.95 and an IoU of 0.2 or 0.3, the Faster R-CNN+NMS model achieved the best validation R² (0.67), with an NRMSE of 0.095 at IoU = 0.2 and 0.100 at IoU = 0.3. The final NMS parameters adopted in this study were therefore a score threshold of 0.95 and an IoU threshold of 0.2.
Table 3 Results of the confidence score and IoU combination tuning experiment
Note: Model of this study is shown in bold in the table.
The test set contains 24 images with 9 521 ears in total, and was used to test both the Faster R-CNN counting model and the Faster R-CNN+NMS counting model. The Faster R-CNN model counted 8 092 ears on the test set, with an R² of 0.66 and an NRMSE of 0.19 (Fig. 8a). The Faster R-CNN+NMS model counted 9 207 ears, raising R² from 0.66 to 0.83 (up 25.8%) and lowering NRMSE from 0.19 to 0.05 (down 73.7%) (Fig. 8b).
The Faster R-CNN counting model suffers from inaccurate placement of ear bounding boxes and missed detections of overlapping ears, which causes counting errors. As shown in Fig. 9, in the left-hand images overlapping ears go unmarked (Fig. 9a), adjacent ears go unmarked (Fig. 9b), boxes are placed inaccurately (Fig. 9c), and two tightly adjoining ears are detected as one (Fig. 9d); the Faster R-CNN+NMS model corrects these problems during ear recognition. NMS optimization thus markedly improves both the accuracy of ear marking in the images and the model's recognition ability.
To further demonstrate the effectiveness of the proposed counting model, it was compared with the ear counting method of [24]; the results are shown in Table 4. That work built a CNN+NMS counting system that classifies ears, leaves, and shadows with a CNN and counts ears with a sliding window combined with NMS, achieving an R² of 0.62. By comparison, the R² of the Faster R-CNN+NMS counting model in this study is 33.9% higher, reaching 0.83.
Table 4 Comparison of the Faster R-CNN+NMS ear counting model with other ear counting models
In terms of counting efficiency, because the image-processing pipeline of [24] is relatively complicated, recognizing and counting the 3 138 ears (manual count) in 100 test images of 640 × 640 pixels took 3 991.24 s in total, an average of 1.272 s per ear. By comparison, the Faster R-CNN and Faster R-CNN+NMS counting models in this study took 9 690 and 9 611 s, respectively, to recognize and count the 9 521 ears (manual count) in the test set, averaging 1.018 and 1.009 s per ear; the latter is 20.7% faster than the method of [24]. These results show that the Faster R-CNN+NMS counting model clearly improves both accuracy and efficiency, alleviating to some extent the ambiguous and missed detections caused by crossing and occluding ears, and thus has greater practical value.
Targeting the growth parameters of winter wheat at key growth stages, this study designed and implemented a deep-learning-based system for estimating field growth parameters of winter wheat, enabling monitoring of winter wheat growth dynamics and extending the application of deep learning to in-field crop growth parameter estimation.
1) A seedling-stage growth parameter estimation model based on a modified ResNet18 was built, achieving accurate estimation of the Leaf Area Index (LAI) and Above Ground Biomass (AGB). In tests on data from two consecutive growing seasons, the coefficient of determination reached 0.84 for AGB estimation in both seasons and exceeded 0.80 for LAI estimation. The model was also verified to generalize well across seasonal differences in the data.
2) For the ear counting task at the flowering stage of winter wheat, a counting model was built on Faster R-CNN and its results refined with Non-Maximum Suppression (NMS), achieving a coefficient of determination of 0.83, a normalized root mean square error of 0.05, and a per-ear recognition time of 1.009 s. Compared with the counting model without NMS and with the classification counting model based on a Convolutional Neural Network (CNN), both the accuracy and the efficiency of ear counting improved.
Drawing on the characteristics of digital images, this study estimated key growth parameters at the seedling and flowering stages of winter wheat with high accuracy. Future work should explore parameter estimation for the period from jointing to heading, so as to monitor growth across the whole winter wheat growing period, enrich the functions of the field growth parameter estimation system, and support winter wheat field management, breeding, and cultivation.
[1] He Zhonghu, Zhuang Qiaosheng, Cheng Shunhe, et al. Wheat production and technology improvement in China[J]. Journal of Agriculture, 2018, 8(1): 99-106. (in Chinese with English abstract)
[2] Tang Yonglu, Li Chaosu, Wu Chun, et al. Yield component and dry matter accumulation in wheat varieties with 9000 kg/hm2 yield potential in Sichuan Basin[J]. Acta Agronomica Sinica, 2014, 40(1): 134-142. (in Chinese with English abstract)
[3] Wu S R, Yang P, Ren J Q, et al. Winter wheat LAI inversion considering morphological characteristics at different growth stages coupled with microwave scattering model and canopy simulation model[J]. Remote Sensing of Environment, 2020, 240: 111681.
[4] Ma Juncheng, Liu Hongjie, Zheng Feixiang, et al. Estimating growth related traits of winter wheat at seedling stages based on RGB images and convolutional neural network[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2019, 35(5): 183-189. (in Chinese with English abstract)
[5] Madec S, Jin X L, Lu H, et al. Ear density estimation from high resolution RGB imagery using deep learning technique[J]. Agricultural and Forest Meteorology, 2019, 264: 225-234.
[6] Zhou C Q, Liang D, Yang X D, et al. Wheat ears counting in field conditions based on multi-feature optimization and TWSVM[J]. Frontiers in Plant Science, 2018, 9: 1024.
[7] Fernandez-Gallego J A, Lootens P, Borra-Serrano I, et al. Automatic wheat ear counting using machine learning based on RGB UAV imagery[J]. Plant Journal, 2020, 103(4): 1603-1613.
[8] Wang Liai, Zhou Xudong, Zhu Xinkai, et al. Inverting wheat leaf area index based on HJ-CCD remote sensing data and random forest algorithm[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2016, 32(3): 149-154. (in Chinese with English abstract)
[9] Sun Shirui, Zhao Yanling, Wang Yajuan, et al. Leaf area index inversion of winter wheat based on multispectral remote sensing of UAV[J]. Journal of China Agricultural University, 2019, 24(11): 51-58. (in Chinese with English abstract)
[10] Bognár P, Kern A, Pasztor S, et al. Yield estimation and forecasting for winter wheat in Hungary using time series of MODIS data[J]. International Journal of Remote Sensing, 2017, 38(11): 3394-3414.
[11] Xie Y, Wang P X, Bai X J, et al. Assimilation of the leaf area index and vegetation temperature condition index for winter wheat yield estimation using Landsat imagery and the CERES-Wheat model[J]. Agricultural and Forest Meteorology, 2017, 246: 194-206.
[12] Wang Lei, Wang Pengxin, Li Li, et al. Winter wheat yield estimation method based on quantile regression model and remotely sensed vegetation temperature condition index[J]. Transactions of the Chinese Society for Agricultural Machinery, 2017, 48(7): 167-173, 166. (in Chinese with English abstract)
[13] Durbha S S, King R L, Younan N H, et al. Support vector machines regression for retrieval of leaf area index from multiangle imaging spectroradiometer[J]. Remote Sensing of Environment, 2007, 107(1/2): 348-361.
[14] Yao Qing, Zhang Chao, Wang Zheng, et al. Design and experiment of agricultural diseases and pest image collection and diagnosis system with distributed and mobile device[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2017, 33(Supp. 1): 184-191. (in Chinese with English abstract)
[15] Jia Honglei, Wang Gang, Guo Mingzhuo, et al. Methods and experiments of obtaining corn population based on machine vision[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2015, 31(3): 215-220. (in Chinese with English abstract)
[16] Crimmins M A, Crimmins T M. Monitoring plant phenology using digital repeat photography[J]. Environmental Management, 2008, 41: 949-958.
[17] Casadesus J, Villegas D. Conventional digital cameras as a tool for assessing leaf area index and biomass for cereal breeding[J]. Journal of Integrative Plant Biology, 2014, 56(1): 7-14.
[18] Pérez-Rodríguez F, Gomez-Garcia E. Codelplant: Regression-based processing of RGB images for colour models in plant image segmentation[J]. Computers and Electronics in Agriculture, 2019, 163: 104880.
[19] Fernandez-Gallego J A, Kefauver S C, Gutierrez N A, et al. Wheat ear counting in-field conditions: High throughput and low-cost approach using RGB images[J]. Plant Methods, 2018, 14: 22.
[20] Ma J C, Li Y X, Chen Y Q, et al. Estimating above ground biomass of winter wheat at early growth stages using digital images and deep convolutional neural network[J]. European Journal of Agronomy, 2019, 103: 117-129.
[21] Deng Jizhong, Li Min, Yuan Zhibao, et al. Feature extraction and classification of wheat smut disease based on image recognition[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2012, 28(3): 172-176. (in Chinese with English abstract)
[22] Ghosal S, Blystone D, Singh A K, et al. An explainable deep machine vision framework for plant stress phenotyping[J]. Proceedings of the National Academy of Sciences of the United States of America, 2018, 115(18): 4613-4618.
[23] Zhu Shiping, Zhuo Jiaxin, Huang Hua, et al. Wheat grain integrity image detection system based on CNN[J]. Transactions of the Chinese Society for Agricultural Machinery, 2020, 51(5): 36-42. (in Chinese with English abstract)
[24] Zhang Lingxian, Chen Yunqiang, Li Yunxia, et al. Detection and counting system for winter wheat ears based on convolutional neural network[J]. Transactions of the Chinese Society for Agricultural Machinery, 2019, 50(3): 144-150. (in Chinese with English abstract)
[25] He K M, Zhang X Y, Ren S Q, et al. Deep residual learning for image recognition[C]// IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas: IEEE, 2016: 770-778.
[26] Xi Rui, Jiang Kai, Zhang Wanzhi, et al. Recognition method for potato buds based on improved Faster R-CNN[J]. Transactions of the Chinese Society for Agricultural Machinery, 2020, 51(4): 216-223. (in Chinese with English abstract)
[27] Yao Qing, Gu Jiale, Lyu Jun, et al. Automatic detection model for pest damage symptoms on rice canopy based on improved RetinaNet[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2020, 36(15): 182-188. (in Chinese with English abstract)
[28] Wang Y J, Lv J D, Xu L M, et al. A segmentation method for waxberry image under orchard environment[J]. Scientia Horticulturae, 2020, 266: 109309.
[29] Yue J B, Zhou C Q, Guo W, et al. Estimation of winter-wheat above-ground biomass using the wavelet analysis of unmanned aerial vehicle-based digital images and hyperspectral crop canopy images[J]. International Journal of Remote Sensing, 2020, 42(5): 1602-1622.
[30] Liu B, Liu L L, Asseng S, et al. Modelling the effects of post-heading heat stress on biomass partitioning, and grain number and weight of wheat[J]. Journal of Experimental Botany, 2020, 71(19): 6015-6031.
[31] Canziani A, Paszke A, Culurciello E. An analysis of deep neural network models for practical applications[C]// IEEE Computer Vision and Pattern Recognition, Las Vegas: IEEE, 2016.
[32] Simonyan K, Zisserman A. Very deep convolutional networks for large-scale image recognition[C]// International Conference on Learning Representations, San Diego: Yoshua Bengio & Yann LeCun, 2015.
[33] Szegedy C, Liu W, Jia Y Q, et al. Going deeper with convolutions[C]// IEEE Computer Society Computer Society Conference on Computer Vision and Pattern Recognition, Boston: IEEE, 2015: 1-9.
Field growth parameter estimation system of winter wheat using RGB digital images and deep learning
Li Yunxia1, Ma Juncheng2, Liu Hongjie3, Zhang Lingxian1※
(1. College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China; 2. Institute of Environment and Sustainable Development in Agriculture, Chinese Academy of Agricultural Sciences, Beijing 100081, China; 3. Wheat Research Institute, Shangqiu Academy of Agricultural and Forestry Sciences, Shangqiu 476000, China)
Leaf Area Index (LAI) and Above Ground Biomass (AGB) are the key traits that fully reflect the growth of winter wheat at early stages. After winter wheat enters the flowering stage, the development of wheat ears is basically complete. The number of ears is a vital agronomic parameter characterizing the growth of winter wheat at this stage, and also a critical factor in evaluating the germplasm and yield of winter wheat. At present, the traditional methods for measuring LAI, AGB, and ear number are destructive and time-consuming, and the accuracy of estimation methods based on RGB images and shallow machine learning needs further improvement. In order to obtain winter wheat growth information accurately and quickly, and to further improve the accuracy of winter wheat growth parameter estimation, this study developed a winter wheat growth parameter estimation system based on RGB images and deep learning. The system mainly includes a growth parameter estimation module and a wheat ear counting module. Winter wheat data were collected over the consecutive 2017—2018 and 2018—2019 growing seasons. Combined with the characteristics of RGB images, deep learning models were explored that are applicable to obtaining the growth parameters of winter wheat at early stages and to counting wheat ears. For the early-stage growth parameter estimation module, the residual network ResNet18 was used as the base network to establish the estimation model on the 2017—2018 growing season data. Based on this estimation model, the LAI and AGB of winter wheat at early stages were obtained. Moreover, the generalization ability of the ResNet18-based model was tested via transfer learning on the 2018—2019 growing season data.
For the wheat ear counting module, a counting model was built on Faster R-CNN and Non-Maximum Suppression (NMS), achieving accurate counting of wheat ears at the flowering stage. The Faster R-CNN+NMS counting model was compared with the Faster R-CNN counting model without NMS and with a classification counting model based on Convolutional Neural Networks (CNN). The results showed that, for the early-stage growth parameter estimation module, the coefficients of determination of the ResNet18-based model for LAI estimation were 0.83 and 0.80, respectively, on the datasets of the 2017—2018 and 2018—2019 growing seasons, and the coefficients of determination for AGB estimation were both 0.84. The model was superior to the models based on VGG16 and GoogLeNet and to the published CNN-based estimation model, and the generalization test showed that the ResNet18-based model was robust to seasonal differences in the data. For the ear counting module, NMS optimization raised the determination coefficient of the Faster R-CNN counting model by 25.8%, from 0.66 to 0.83, and lowered the NRMSE by 73.7%, from 0.19 to 0.05. Compared with the CNN-based classification counting model, the Faster R-CNN+NMS counting model performed better, with a determination coefficient of 0.83 (33.9% higher) and a per-ear recognition time of 1.009 s (20.7% faster). In conclusion, the system can meet the demand for field growth parameter estimation of winter wheat and provide support for refined field management of winter wheat.
machine vision; image processing; models; winter wheat; deep learning; leaf area index; above ground biomass; wheat ear counting
Li Yunxia, Ma Juncheng, Liu Hongjie, et al. Field growth parameter estimation system of winter wheat using RGB digital images and deep learning[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2021, 37(24): 189-198. (in Chinese with English abstract) doi:10.11975/j.issn.1002-6819.2021.24.021 http://www.tcsae.org
2021-01-21
2021-12-11
Supported by the National Natural Science Foundation of China (31801264) and the Young Elite Scientists Sponsorship Program by CAST (2018QNRC001)
Li Yunxia, Ph.D. candidate, research interest: agricultural information technology. Email: 736927152@qq.com
Zhang Lingxian, professor and doctoral supervisor, research interest: agricultural information technology. Email: zhanglx@cau.edu.cn
10.11975/j.issn.1002-6819.2021.24.021
S512.1+1;TP391.4
A
1002-6819(2021)-24-0189-10