
Traveling trajectory prediction method and experiment of autonomous navigation tractor based on trinocular vision


Tian Guangzhao1, Gu Baoxing1※, Irshad Ali Mari2, Zhou Jun1, Wang Haiqing1

(1. College of Engineering, Nanjing Agricultural University, Nanjing 210031, China; 2. Khairpur College of Engineering and Technology, Sindh Agriculture University, Khairpur 66020, Pakistan)

        為了實(shí)現(xiàn)自主導(dǎo)航拖拉機(jī)離開(kāi)衛(wèi)星定位系統(tǒng)時(shí)能夠持續(xù)可靠工作,該文提出了基于三目視覺(jué)的拖拉機(jī)行駛軌跡預(yù)測(cè)方法。該方法將三目相機(jī)分解為長(zhǎng)短基線2套雙目視覺(jué)系統(tǒng)分時(shí)獨(dú)立工作。通過(guò)檢測(cè)相鄰時(shí)刻農(nóng)業(yè)環(huán)境中同一特征點(diǎn)的坐標(biāo)變化反推拖拉機(jī)在水平方向上的運(yùn)動(dòng)矢量,并通過(guò)灰色模型預(yù)測(cè)未來(lái)時(shí)刻的運(yùn)動(dòng)矢量變化,最終建立不同速度下的前進(jìn)方向誤差模型。試驗(yàn)結(jié)果表明:拖拉機(jī)行駛速度為0.2 m/s時(shí),46.5 s后前進(jìn)方向誤差超過(guò)0.1 m,對(duì)應(yīng)行駛距離為9.3 m。行駛速度上升到0.5 m/s時(shí),該時(shí)間和行駛距離分別降低到17.2 s和8.6 m。當(dāng)行駛速度上升到0.8 m/s時(shí),該時(shí)間和距離分別快速降低至8.5 s和6.8 m。行駛速度越高,前進(jìn)方向誤差增速越高。該方法可用于短時(shí)預(yù)測(cè)拖拉機(jī)的行駛軌跡,為自主導(dǎo)航控制提供依據(jù)。

Keywords: tractor; autonomous navigation; machine vision; trajectory prediction; grey model

0 Introduction

To reduce labor costs, improve working efficiency, and improve operation quality, agricultural machinery with autonomous navigation capability is increasingly used in agricultural production. For example, autonomously navigated tractors can serve as traction machines for field seeding, fertilizing, and ploughing [1-7]. Autonomous combine harvesters can harvest wheat, rice, and maize without human intervention [8-14]. Autonomous rice transplanters can transplant seedlings precisely in paddy fields, greatly improving operation accuracy, with an efficiency about 50 times that of manual work [15-18].

The vision system is an important component of autonomously navigated agricultural equipment. It is mainly used to recognize crop rows, furrows and ridges, or obstacles, and serves as an important tool for sensing the external environment and the machine's own pose during intelligent field operations [19-21]. In particular, when GPS or BeiDou positioning is disturbed and cannot work normally, the vision system can provide auxiliary relative positioning so that navigation can continue [22-27]. A vision system can also be used for prediction, and the predicted results provide a data basis for navigation decision-making and control. Because agricultural machinery navigation control suffers from significant time delay, references [28] and [29] both point out that predictive data can markedly improve conventional PID control, which is of strong practical engineering significance.

Most existing studies discuss navigation control methods under the premise that GPS or BeiDou works reliably. This paper instead addresses how to predict the traveling trajectory of an autonomously navigated tractor using the vision system alone when GPS or BeiDou fails, and proposes a trajectory prediction method based on grey theory.

1 Hardware composition of the vision system

In this study the vision system consists of a Point Grey BBX3 trinocular camera, an IEEE 1394B acquisition card, and an industrial personal computer (IPC).

The trinocular camera is composed of right, middle, and left sub-cameras. The right and middle sub-cameras form a short-baseline binocular vision system, and the right and left sub-cameras form a long-baseline binocular vision system; the trinocular system is thus a superposition of the two binocular systems. The two binocular systems share the same spatial coordinate origin and axis directions: the origin is at the optical center of the right camera, the positive x axis points horizontally to the right, the positive y axis points vertically downward, and the positive z axis points horizontally forward. To improve development efficiency, Point Grey has built the image capture of the other camera in each stereo pair and the vision measurement functions directly into its API [30]. The user therefore does not need the traditional pipeline of binocular image capture, feature point detection and matching, and disparity-based ranging; the depth information of the environment can be obtained from a single right-camera image.
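As a concrete illustration of the shared coordinate convention, the following Python sketch back-projects an image point with known depth into the common camera frame and averages the estimates from the short- and long-baseline stereo pairs. The intrinsic parameters and function names are illustrative assumptions, not values from the paper; in practice the Triclops API performs these conversions internally.

```python
import numpy as np

# Hypothetical intrinsics of the right camera (shared by both stereo pairs);
# real values come from the factory calibration of the trinocular camera.
FX, FY = 800.0, 800.0      # focal lengths in pixels (assumed)
CX, CY = 640.0, 480.0      # principal point in pixels (assumed)

def backproject(u, v, depth):
    """Map an image point (u, v) with depth Z (m) to the shared camera
    frame: x right, y down, z forward (origin at the right camera)."""
    x = (u - CX) * depth / FX
    y = (v - CY) * depth / FY
    return np.array([x, y, depth])

def fuse_baselines(p_short, p_long):
    """Average the 3-D coordinates of the same physical point measured by
    the short- and long-baseline stereo pairs (see Section 2.1)."""
    return 0.5 * (np.asarray(p_short) + np.asarray(p_long))
```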

Fig.1 Structure of the trinocular camera

The 1394B acquisition card receives the digital images returned by the camera at high speed. The IPC is the core of image processing: it controls image acquisition by the camera, runs the image processing programs, and outputs the computed results.

2 Detection and prediction of the tractor motion vector

2.1 Detection of the tractor motion vector

The basic principle of tractor motion vector detection is to use the coordinate changes of the same group of static feature points in the camera coordinate system at adjacent instants to infer the tractor's motion vector. The detection procedure is shown in Fig. 2.

Fig.2 Procedure of tractor motion vector detection

Since the two vision systems share the same spatial coordinate origin, the coordinates of a given physical point in the two systems should theoretically coincide. Because the baseline lengths of the two systems differ, however, their measurements deviate slightly. To obtain accurate results, the long- and short-baseline systems execute the same motion detection procedure and the results are averaged.

The procedure comprises the following steps (an illustrative sketch of the resulting motion-vector computation follows the list):

1) In the complex-background agricultural environment, the right camera captures an image, the image index is incremented by 1, and the image is stored.

2) SIFT feature point detection is performed on the right-camera image and the image coordinates of all feature points in the environment are computed [31]. Because camera images inevitably suffer from distortion, an approximate weight is also estimated for every feature point; the weight estimation for a feature point, illustrated in Fig. 3, is given by Eq. (1).

3) Using the feature points' image coordinates and the depth image acquired in step 1), the API functions provided with the camera convert the image coordinates into camera coordinates.

4) Check whether the current image is the first frame. If it is, store the image coordinates, camera coordinates, and weight of every valid feature point in one array and repeat steps 1)-4). If it is not the first frame, store the same data in a second array.

5) Perform SIFT feature point matching with the previous image and save the image coordinates of the successfully matched feature point pairs in an array.

Fig.3 Calculation of the approximate feature point weights

6) Traverse the match array, look up the camera coordinates and approximate weights corresponding to each successfully matched feature point pair in the arrays from step 4), and save them in a new array.

where n denotes the total number of valid feature points.
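The sketch below illustrates the inversion step described above: the weighted mean displacement of the matched static feature points between two consecutive frames is negated to give the tractor's horizontal motion vector. The function and array names are illustrative assumptions, and the paper's exact weighting formula (Eq. (1)) is not reproduced here.

```python
import numpy as np

def motion_vector(points_prev, points_curr, weights):
    """Estimate the tractor's horizontal motion between two frames.

    points_prev, points_curr : (N, 3) camera coordinates of the same static
        feature points at the previous and current instants.
    weights : (N,) approximate feature point weights (step 2).

    A static point that appears to move by d in the camera frame implies the
    camera (and hence the tractor) moved by -d, so the weighted mean point
    displacement is negated and only the horizontal components are kept.
    """
    p0 = np.asarray(points_prev, dtype=float)
    p1 = np.asarray(points_curr, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                          # normalise the weights
    disp = p1 - p0                           # apparent point displacement
    cam_motion = -(w[:, None] * disp).sum(axis=0)
    return cam_motion[[0, 2]]                # (x, z): lateral, forward
```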

2.2 Prediction of the tractor motion vector

Agricultural tractors mostly work at constant, low speed. Based on this operating characteristic, this paper designs a trajectory prediction scheme based on grey theory.

Fig.4 Obtaining prediction data through a sliding window

Assume that at time $t$ the sliding window contains $n$ motion vectors forming the sample $X^{(0)}$, whose form is given by Eq. (4):

$$X^{(0)} = \left( x^{(0)}(1),\, x^{(0)}(2),\, \dots,\, x^{(0)}(n) \right) \tag{4}$$

To reduce the influence of disturbed data on the valid data, a first-order accumulation is applied to $X^{(0)}$, giving its 1-AGO sequence $X^{(1)}$, as in Eq. (5):

$$X^{(1)} = \left( x^{(1)}(1),\, x^{(1)}(2),\, \dots,\, x^{(1)}(n) \right) \tag{5}$$

where

$$x^{(1)}(k) = \sum_{i=1}^{k} x^{(0)}(i), \qquad k = 1, 2, \dots, n$$

The GM(1,1) model is then expressed as the first-order differential equation of Eq. (6):

$$\frac{\mathrm{d}x^{(1)}}{\mathrm{d}t} + a\,x^{(1)} = b \tag{6}$$

where the development coefficient $a$ and the grey input $b$ are estimated by least squares from the accumulated sequence (Eqs. (7)-(8)). Discretizing the solution of Eq. (8) yields the prediction model of the once-accumulated sequence at time $k+1$, as in Eq. (9):

$$\hat{x}^{(1)}(k+1) = \left( x^{(0)}(1) - \frac{b}{a} \right) e^{-ak} + \frac{b}{a} \tag{9}$$
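A minimal Python sketch of the GM(1,1) procedure above (1-AGO accumulation, least-squares estimation of the grey parameters, and one-step-ahead restoration to the original scale); the function name and the sample values in the usage line are illustrative only.

```python
import numpy as np

def gm11_predict(x0, steps=1):
    """One-step-ahead GM(1,1) forecast for the motion-vector samples inside
    the sliding window (Eqs. (4)-(9))."""
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                              # 1-AGO sequence X^(1)
    z1 = 0.5 * (x1[1:] + x1[:-1])                   # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]     # grey parameters a, b
    k = np.arange(n, n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x1_prev = (x0[0] - b / a) * np.exp(-a * (k - 1)) + b / a
    return x1_hat - x1_prev                         # restore to X^(0) scale

# Example: forecast the next forward displacement from a window of recent
# measurements (the values below are made up for illustration).
print(gm11_predict([0.021, 0.020, 0.022, 0.021, 0.023], steps=1))
```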

3 Tractor trajectory prediction experiments and result analysis

3.1 Experiment design

A Dongfanghong SG250 tractor served as the test platform. The BBX3 trinocular camera was mounted horizontally on the front end of the counterweight beam at the tractor head, 0.6 m above the ground, as shown in Fig. 5, and a centimeter-level RTK-GPS system was installed on top of the tractor. The experiments were carried out on a hard road surface with abundant sand and gravel on a sunny morning with good lighting. Vision detection and RTK-GPS detection were synchronized, both running at 10 Hz. The tractor traveled in straight lines at low speeds of 0.2, 0.5, and 0.8 m/s. The IPC collected the GPS data and the vision prediction data, the measured and predicted trajectories were plotted, valid data over the same traveling distance were extracted, and the prediction accuracy was analyzed. The IPC was an Advantech ARK3500P with an i7-3610 CPU and 4 GB of RAM. Because the GPS used carrier-phase real-time kinematic differential technology, its positioning accuracy reaches the centimeter level, so the GPS positioning data were taken as the reference standard for verifying the motion detection and prediction accuracy of the trinocular vision system.

Fig.5 Mounting position of the trinocular camera

During the experiments, the global GPS position at the initial instant was taken as the reference. Following the method described above, the incremental data obtained by the vision system for the next instant were added to the GPS reference to form the vision system's measurement data (also absolute coordinates). Multiple vision measurements then yielded the vision system's predicted data. At any instant there is inevitably some error between the vision prediction and the GPS position at that instant; this error is decomposed into the forward (z) direction and the lateral (x) direction to obtain the two error components.
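A hedged sketch of this bookkeeping: the per-frame motion vectors from the vision system are accumulated onto the initial GPS fix to give absolute positions, and the difference from the synchronized RTK-GPS track is split into lateral (x) and forward (z) components. Function names and array shapes are assumptions for illustration.

```python
import numpy as np

def vision_track(gps_origin, increments):
    """Accumulate the vision-measured per-frame motion vectors onto the
    initial GPS fix to obtain absolute positions (Section 3.1).

    gps_origin : (2,) initial (x, z) position from RTK-GPS.
    increments : (N, 2) per-frame motion vectors from the vision system.
    """
    return np.asarray(gps_origin) + np.cumsum(increments, axis=0)

def decompose_error(predicted, gps):
    """Split the prediction error into lateral (x) and forward (z) parts."""
    err = np.asarray(predicted) - np.asarray(gps)
    return err[:, 0], err[:, 1]   # x-direction error, z-direction error
```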

3.2 Results and analysis

The tractor traveled in straight lines at constant speeds of 0.2, 0.5, and 0.8 m/s; the trajectory prediction results are shown in Figs. 6-7 and Table 1.

In Figs. 6a, 6c, and 6e, the solid lines are the tractor trajectories plotted from the GPS data and the dashed lines are the trajectories predicted by the trinocular vision method described above. The predicted trajectories basically agree with the GPS-measured trajectories, but as the traveling distance increases the accumulated prediction error becomes more and more apparent.

Figs. 6b, 6d, and 6f show that the error in the z direction is the main cause of the growing deviation between the predicted and measured trajectories; the z-direction error increases while oscillating. Based on the experimental data, quadratic polynomial models of the cumulative z-direction error were established for the different speeds. For traveling speeds of 0.2, 0.5, and 0.8 m/s the models are given by Eqs. (11)-(13), with corresponding R² values of 0.93, 0.97, and 0.98, respectively. In Eqs. (11)-(13), the first derivative of the error with respect to time reflects the rate of change of the cumulative z-direction error. Since the quadratic coefficients are all positive, the first derivatives are all increasing functions, i.e., the rate of change of the z-direction error grows linearly. The computed slopes of this linear growth are 0.000 2, 0.002 6, and 0.005 0, respectively, showing that the higher the tractor's traveling speed, the faster the z-direction error grows. The models can be used to estimate the error state at the current instant.

where t denotes the traveling time (s) and e_z denotes the cumulative error in the z direction (m).
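To make the error-model fitting concrete, the sketch below fits a quadratic polynomial to (time, cumulative z-error) samples and reads off the slope of the linearly growing error rate as twice the quadratic coefficient, which is how slopes such as 0.000 2, 0.002 6, and 0.005 0 quoted above arise from Eqs. (11)-(13). The sample data here are invented for illustration; the real data come from the RTK-GPS comparison.

```python
import numpy as np

# Hypothetical (t, e_z) samples for one speed, invented for illustration.
t  = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 46.5])
ez = np.array([0.00, 0.01, 0.03, 0.05, 0.08, 0.10])

c2, c1, c0 = np.polyfit(t, ez, 2)   # quadratic model e_z = c2*t^2 + c1*t + c0
slope_of_rate = 2 * c2              # the derivative 2*c2*t + c1 grows linearly
print(c2, c1, c0, slope_of_rate)    # with slope 2*c2
```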

Fig. 7 shows that the error in the x direction changes very little at all speeds. The main reason is that the tractor traveled in a straight line during the experiments, so its displacement in the x direction was very small and the accumulated error was also small, within ±5 cm.

Fig.7 Cumulative x-direction error at different constant speeds

Table 1 Cumulative error data in the x and z directions

Table 1 quantitatively analyzes how the cumulative errors in the x and z directions change at different speeds. The faster the traveling speed, the faster the cumulative z-direction error rises: at 0.2 m/s it takes 46.5 s for the cumulative z-direction error to exceed 0.1 m, whereas at 0.8 m/s this time shortens to 8.5 s. The x-direction error shows no obvious regularity. Nonlinear fitting of the data in Table 1 gives the relationship between the rate of change of the cumulative z-direction error (in m/s) and the traveling speed, as expressed by Eq. (14).

Eq. (14) allows the rate of change of the cumulative z-direction error to be computed directly for different traveling speeds.
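Since Eq. (14) itself is not reproduced in this text, the sketch below only shows how such a relationship could be fitted: the average growth rates implied by the times reported in Section 3.2 are regressed against speed using an assumed power-law form. Both the model form and the derived rates are illustrative, not the paper's actual Eq. (14).

```python
import numpy as np
from scipy.optimize import curve_fit

speeds = np.array([0.2, 0.5, 0.8])           # tested traveling speeds, m/s
times_to_0p1 = np.array([46.5, 17.2, 8.5])   # time until e_z exceeds 0.1 m (Section 3.2)
rates = 0.1 / times_to_0p1                   # rough average error growth rates, m/s

def model(v, a, b):
    # assumed power-law form; the paper's Eq. (14) may differ
    return a * np.power(v, b)

(a, b), _ = curve_fit(model, speeds, rates)
print(model(0.6, a, b))                      # estimated growth rate at an untested speed
```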

Among recent similar studies at home and abroad, reference [32] used particle filtering to track a 60 m straight line with a modified agricultural robot, achieving a lateral deviation of 4±0.7 cm. Reference [33] reported a modified Maoyuan 250 tractor navigating visually at 0.58 m/s with a maximum error of 18 cm and an average error of 4.8 cm. The biggest difference between these recent results on machine vision for agricultural machinery navigation and the present study lies in the research content: the studies above combine machine vision with other sensors for straight-line tracking, whereas this paper studies the deviation between the predicted and actual trajectories. Because the motion trajectory here is detected by the vision sensor alone and then further predicted on that basis, accumulated error is unavoidable.

The main sources of error are the influence of natural light and the time delay of image processing. To reduce the accumulated error and obtain better results, a higher-performance IPC and a camera with a faster shutter are recommended.

4 Conclusions

1) A trinocular vision system built by superimposing long- and short-baseline binocular vision systems, combined with a grey prediction algorithm, can indeed predict the tractor's motion trajectory in the plane.

2) There is an accumulated error between the trajectory predicted by the vision system and the true trajectory, caused mainly by the measurement error in the forward direction.

3) The higher the tractor's traveling speed, the faster the cumulative forward-direction error grows. At 0.2 m/s, the time and traveling distance for the cumulative forward-direction error to exceed 0.1 m are 46.5 s and 9.3 m, respectively. At 0.5 m/s they drop to 17.2 s and 8.6 m, and at 0.8 m/s they fall quickly to 8.5 s and 6.8 m.

        [1] Adam J L, Piotr M, Seweryn L, et al. Precision of tractor operations with soil cultivation implements using manual and automatic steering modes[J]. Biosystems Engineering, 2016, 145(5): 22-28.

        [2] Gan-Mor S, Clark R L, Upchurch B L. Implement lateral position accuracy under RTK-GPS tractor guidance[J]. Computers and Electronics in Agriculture, 2007, 59(1/2): 31-38.

        [3] Timo O, Juha B. Guidance system for agricultural tractor with four wheel steering[J]. IFAC Proceedings Volumes, 2013, 46(4): 124-129.

        [4] Karimi D, Henry J, Mann D D. Effect of using GPS auto steer guidance systems on the eye-glance behavior and posture of tractor operators[J]. Journal of Agricultural Safety and Health, 2012, 18(4): 309-318.

[5] Liu Kenan, Wu Pute, Zhu Delan, et al. Autonomous navigation of solar energy canal feed sprinkler irrigation machine[J]. Transactions of the Chinese Society for Agricultural Machinery, 2016, 47(9): 141-146. (in Chinese with English abstract)

        [6] Cordesses L, Cariou C, Berducat M. Combine harvester control using real time kinematic GPS[J]. Precision Agriculture, 2000, 2(2): 147-161.

        [7] Jongmin C, Xiang Y, Liangliang Y, et al. Development of a laser scanner-based navigation system for a combine harvester[J]. Engineering in Agriculture, Environment and Food, 2014, 7(1): 7-13.

[8] Zhang Meina, Lü Xiaolan, Tao Jianping, et al. Design and experiment of automatic guidance control system in agricultural vehicle[J]. Transactions of the Chinese Society for Agricultural Machinery, 2016, 47(7): 42-47. (in Chinese with English abstract)

[9] Ji Changying, Zhou Jun. Current situation of navigation technologies for agricultural machinery[J]. Transactions of the Chinese Society for Agricultural Machinery, 2014, 45(9): 44-54. (in Chinese with English abstract)

[10] Zhang Man, Xiang Ming, Wei Shuang, et al. Design and implementation of a corn weeding-cultivating integrated navigation system based on GNSS and MV[J]. Transactions of the Chinese Society for Agricultural Machinery, 2015, 46(Supp.1): 8-14. (in Chinese with English abstract)

[11] Xie Bin, Li Jingjing, Lu Qianqian, et al. Simulation and experiment of virtual prototype braking system of combine harvester[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2014, 30(4): 18-24. (in Chinese with English abstract)

[12] Ren Shuguang, Xie Fangping, Wang Xiushan, et al. Gas-solid two-phase separation operation mechanism for 4LZ-0.8 rice combine harvester cleaning device[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2015, 31(12): 16-22. (in Chinese with English abstract)

[13] Jiao Youzhou, Tian Chaochao, He Chao, et al. Thermodynamic performance of waste heat collection for large combine harvester with different working fluids[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2018, 34(5): 32-38. (in Chinese with English abstract)

[14] Wei Liguo, Zhang Xiaochao, Wang Fengzhu, et al. Design and experiment of harvest boundary online recognition system for rice and wheat combine harvester based on laser detection[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2017, 33(Supp.1): 30-35. (in Chinese with English abstract)

        [15] Yoshisada N, Katsuhiko T, Kentaro N, et al. A global positioning system guided automated rice transplanter[J]. IFAC Proceedings Volumes, 2013, 46(18): 41-46.

        [16] Tamaki K, Nagasaka Y, Nishiwaki K, et al. A robot system for paddy field farming in Japan[J]. IFAC Proceedings Volumes, 2013, 46(18): 143-147.

[17] Hu Lian, Luo Xiwen, Zhang Zhigang, et al. Design of distributed navigation control system for rice transplanters based on controller area network[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2009, 25(12): 88-92. (in Chinese with English abstract)

[18] Hu Jingtao, Gao Lei, Bai Xiaoping, et al. Review of research on automatic guidance of agricultural vehicles[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2015, 31(10): 1-10. (in Chinese with English abstract)

[19] Song Yu, Liu Yongbo, Liu Lu, et al. Extraction method of navigation baseline of corn roots based on machine vision[J]. Transactions of the Chinese Society for Agricultural Machinery, 2017, 48(2): 38-44. (in Chinese with English abstract)

[20] Leemans V, Destain M F. Line cluster detection using a variant of the Hough transform for culture row localisation[J]. Image and Vision Computing, 2006, 24(5): 541-550.

        [21] Gee C, Bossu J, Jones G, et al. Crop weed discrimination in perspective agronomic image[J]. Computers and Electronics in Agriculture, 2007, 58(1): 1-9.

[22] Jiang Guoquan, Ke Xing, Du Shangfeng, et al. Crop row detection based on machine vision[J]. Acta Optica Sinica, 2009, 29(4): 1015-1020. (in Chinese with English abstract)

[23] Han Y H, Wang Y M, Kang F. Navigation line detection based on support vector machine for automatic agriculture vehicle[C]// International Conference on Automatic Control and Artificial Intelligence (ACAI 2012), Xiamen, 2012: 1381-1385.

[24] English A, Ross P, Ball D, et al. Vision based guidance for robot navigation in agriculture[C]// 2014 IEEE International Conference on Robotics & Automation (ICRA), Hong Kong, 2014: 1693-2698.

        [25] Cariou C, Lenain R, Thuilot B, et al. Motion planner and lateral-longitudinal controllers for autonomous maneuvers of a farm vehicle in headland[C]// 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, USA, 2009: 5782-5787.

[26] Lin Guichao, Zou Xiangjun, Zhang Qing, et al. Visual navigation for automatic guided vehicles based on active contour model[J]. Transactions of the Chinese Society for Agricultural Machinery, 2017, 48(2): 20-26. (in Chinese with English abstract)

[27] Xiang Ming, Wei Shuang, He Jie, et al. Development of agricultural implement visual navigation terminal based on DSP and MCU[J]. Transactions of the Chinese Society for Agricultural Machinery, 2015, 46(Supp.1): 21-26. (in Chinese with English abstract)

[28] Ren Junru. The Research and Design of Improved Predictive PID Controller[D]. Wuhan: Wuhan University of Science and Technology, 2011. (in Chinese with English abstract)

[29] Yu Tianming, Zheng Lei, Li Song. Gray prediction PID control technology of automated mechanical transmission clutch[J]. Transactions of the Chinese Society for Agricultural Machinery, 2011, 42(8): 1-6. (in Chinese with English abstract)

        [30] Point Grey Research, Inc. Triclops software kit Version 3.1 user’s guide and command reference [EB/OL]. [2018-08-25]. https://www.ptgrey.com/support/downloads

[31] Chen Hanjing. Research and Application of SIFT Feature Point Technology[D]. Nanjing: Nanjing University of Science and Technology, 2017. (in Chinese with English abstract)

[32] Hiremath S, Evert F K V, Braak C T, et al. Image-based particle filtering for navigation in a semi-structured agricultural environment[J]. Biosystems Engineering, 2014, 121(5): 85-95.

[33] Shen Wenlong, Xue Jinlin, Wang Dongming, et al. Visual navigation control system of agricultural vehicle[J]. Journal of Chinese Agricultural Mechanization, 2016, 37(6): 251-254. (in Chinese with English abstract)

        Traveling trajectory prediction method and experiment of autonomous navigation tractor based on trinocular vision

        Tian Guangzhao1, Gu Baoxing1※, Irshad Ali Mari2, Zhou Jun1, Wang Haiqing1

(1. College of Engineering, Nanjing Agricultural University, Nanjing 210031, China; 2. Khairpur College of Engineering and Technology, Sindh Agriculture University, Khairpur 66020, Pakistan)

In order to make autonomous navigation tractors work steadily and continuously without the satellite positioning system, a traveling trajectory prediction system and method based on trinocular vision were designed in this paper. The system was composed of a trinocular vision camera, an IEEE 1394 acquisition card and an embedded industrial personal computer (IPC). The right and left sub-cameras constituted a binocular vision system with a long baseline. The right and middle sub-cameras constituted another binocular vision system with a short baseline. To obtain more precise measurement results, the two binocular vision systems worked independently and in time-sharing. Then the motion vectors of the tractor, expressed as horizontal-direction data, were calculated from the coordinate changes of feature points in the working environment of the tractor. Finally, the error models in the heading direction were established at different velocities, and the motion vectors of the tractor were predicted by the models based on the grey method. The contrast experiments were completed with a modified Dongfanghong SG250 tractor at speeds of 0.2, 0.5 and 0.8 m/s. During the experiments, the IPC was used to collect RTK-GPS data and predict movement tracks. The RTK-GPS used in the experiments was a high-precision measuring device whose measuring precision can reach 1-2 cm. Therefore, the location data of the RTK-GPS were taken as the standard against which the data from the trinocular vision system were compared. The experimental results showed that the method mentioned above could accurately predict the trajectory of the tractor on the plane, with an inevitable error which was mainly caused by the visual measurement error in the forward (z) direction. When the tractor travelled at the speed of 0.2 m/s, the time and the distance at which the error in the forward direction exceeded 0.1 m were 46.5 s and 9.3 m, respectively. When the speed increased to 0.5 m/s, the time and the distance decreased to 17.2 s and 8.6 m, respectively. When the driving speed increased to 0.8 m/s, the time and distance quickly decreased to 8.5 s and 6.8 m, respectively. It showed that the higher the tractor traveling speed, the faster the error in the forward direction increased. After that, the relationship between the errors in the forward direction and the traveling time was acquired and analyzed by nonlinear data fitting. In addition, the experimental results showed that the trend of the lateral error (x direction), which was perpendicular to the forward direction, was not regular. When the speed was 0.2 m/s, the average error was 0.002 5 m with a standard deviation (STD) of 0.003 9. When the speed increased to 0.5 m/s and 0.8 m/s, the average errors in the lateral direction were 0.008 2 m with an STD of 0.012 4 and 0.003 6 m with an STD of 0.006 4, respectively. The results showed that the lateral error was very small and almost invariable. Therefore, the errors of the trinocular vision were mainly caused by the errors in the forward direction. The root causes of the error were the natural light and the time delay during image processing. According to the experimental data and results, the system and method proposed in this paper could be used to measure and predict the traveling trajectory of a tractor in a dry agricultural environment with a sudden loss of the satellite signal during a short period of time. The measured and predicted data could provide temporary help for the operations of autonomous tractors.

        tractor; automatic guidance; machine vision; trajectory prediction; gray model

doi: 10.11975/j.issn.1002-6819.2018.19.005

CLC number: S219.1

Document code: A

Article ID: 1002-6819(2018)-19-0040-06

Received date: 2018-06-13

Revised date: 2018-08-27

Foundation items: Fundamental Research Funds for the Central Universities (KYGX201701); National Natural Science Foundation of China (31401291); Natural Science Foundation of Jiangsu Province (BK20140729)

Tian Guangzhao, lecturer, PhD, mainly engaged in research on navigation and control of agricultural machinery. Email: tgz@njau.edu.cn

Gu Baoxing, lecturer, PhD, mainly engaged in research on intelligent agricultural equipment. Email: gbx@njau.edu.cn

田光兆,顾宝兴,Irshad Ali Mari,周俊,王海青. 基于三目视觉的自主导航拖拉机行驶轨迹预测方法及试验[J]. 农业工程学报,2018,34(19):40-45. doi:10.11975/j.issn.1002-6819.2018.19.005 http://www.tcsae.org

        Tian Guangzhao, Gu Baoxing, Irshad Ali Mari, Zhou Jun, Wang Haiqing. Traveling trajectory prediction method and experiment of autonomous navigation tractor based on trinocular vision [J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2018, 34(19): 40-45. (in Chinese with English abstract) doi:10.11975/j.issn.1002-6819.2018.19.005 http://www.tcsae.org
