Underwater Terrain-Aided Navigation Based on Multibeam Bathymetric Sonar Images
Ziqi Song1,2,3, Hongyu Bian1,2* and Adam Zielinski3
1. Acoustic Science and Technology Laboratory, Harbin Engineering University, Harbin 150001, China
2. College of Underwater Acoustic Engineering, Harbin Engineering University, Harbin 150001, China
3. Department of Electrical and Computer Engineering, University of Victoria, Victoria V8W 2Y2, Canada
Underwater terrain-aided navigation is used to complement the traditional inertial navigation employed by autonomous underwater vehicles during lengthy missions. It can provide position-fix estimations by matching real-time depth data with a digital terrain map. This study presents the concept of using image processing techniques in the underwater terrain matching process. The traditional gray-scale histogram of an image is enriched by incorporating the spatial information of pixels. Edge corner pixels are then defined and used to construct an edge corner histogram, which is employed as a template to scan the digital terrain map and estimate the fixes of the vehicle by searching for the correlation peak. Simulations are performed to investigate the robustness of the proposed method, particularly in relation to its sensitivity to background noise, the scale of real-time images, and the travel direction of the vehicle. At an image resolution of 1 m²/pixel, the localization accuracy is better than 10 meters.
Keywords: underwater acoustics; terrain-aided navigation; sonar images; histogram; autonomous underwater vehicle; multibeam bathymetric sonar
Accuracy and reliability are vital for underwater navigation in modern civil and military applications. Many underwater navigation methods are based on the inertial navigation system (INS). Although INS is able to continuously provide the position and velocity of an autonomous underwater vehicle (AUV), it suffers from unbounded growth in drift errors with elapsing operational time (Zhao et al., 2014). Position errors in INS can be corrected using the global positioning system (GPS) or underwater acoustic positioning methods. However, these techniques are not applicable in many cases. For example, to implement corrections based on GPS, the underwater vehicle must surface to receive the radio-frequency signal. In addition, the GPS signal is at times weakened by factors such as ice on the antenna, interference from other communication equipment, and even physical barriers.
Because the energy of an AUV is often limited, surfacing from the sea bottom takes time and energy, which is often unacceptable. Long baseline (LBL) and ultra-short baseline (USBL) are two types of acoustic positioning techniques. However, LBL requires the pre-deployment of underwater transponders in the working sea area, and the precise positions of these transponders must be known in advance as a reference. Consequently, the huge maintenance cost of LBL limits its applications. In addition, as USBL uses high-frequency acoustic signals between the AUV and the surface transponders, the AUV is prevented from cruising freely, and the depths it can reach are restricted to a relatively short range (Jiang et al., 2000).
To make an AUV capable of time-extensive submerged missions, underwater navigation systems based on geophysical characteristics were proposed to supplement the INS (Department of the Navy, USA, 2004). There are three types of geophysical navigation systems: earth-magnetism navigation (Hao et al., 2008; Zhou et al., 2008), gravity gradient navigation (Guo et al., 2003; Yuan et al., 2004; Xu, 2005), and terrain-aided navigation (TAN) (Wang et al., 2006; Nygren, 2005). When an AUV is equipped with a multi-beam bathymetric sonar, TAN can be applied as an error correction method for the INS.
TAN is popular in the aviation field, and many examples of successful applications are found in aircraft and cruise missiles (Zhao et al., 2014; Yun et al., 2014). However, underwater TAN (UTAN) is a relatively new approach (Chen et al., 2012; Paull et al., 2014; Zhang et al., 2014). The UTAN system in the "BP02" AUV was tested in a sea trial in 2002 by the Massachusetts Institute of Technology (MIT), Harvard University, and the Norwegian Defence Research Establishment (FFI) (Harvard University Library, 2002; Marthiniussen et al., 2004). "Lost02," a UTAN system developed by Stennis, an American navy research center, was also tested at sea (Somajyoti, 2001).
Until now, the matching algorithms used in UTAN have often been adopted from surface applications, and can be roughly classified into two categories: terrain contour-matching algorithms based on correlation analysis, such as the terrain contour matching (TERCOM) method (Golden, 1980; Eroglu and Yilmaz, 2014), and extended Kalman filtering (EKF)-based inertial terrain-aided navigation methods, such as Sandia inertial terrain-aided navigation (SITAN) (Hollowell, 1990; Baird et al., 1990). Although these techniques have proven effective in airborne applications, their capacity for accurate localization in the underwater environment is inferior because of the lower resolution of sonar measurements. Therefore, certain algorithms have been modified for underwater applications. The Bayesian method was shown to be the most sophisticated among the popular methods employed to build a state-space model for estimating the position of a vehicle (Anonsen and Hagen, 2009). As examples of recursive Bayesian methods, point mass filters (PMF) and particle filters (PF) work well in underwater navigation, whereas methods based on Kalman filters are not suitable in some cases because of the strong nonlinearity of bathymetric measurements (Jalving et al., 2004; Anonsen and Hallingstad, 2006).
Sharing the same basic principle as airborne TAN, UTAN needs to satisfy two basic requirements. First, the AUV must be equipped with a digital terrain map (DTM) at a suitable resolution determined by the accuracy demand of the mission. Second, the area covered by the DTM needs a degree of depth variation so that a reliable estimation of the fixes can be obtained. A multi-beam bathymetric sonar mounted on an AUV is therefore an excellent choice for UTAN because it can cover a large area below the AUV path (swath) and provide 3D underwater terrain information in real time. For the approach described in this study, a multi-beam bathymetric sonar, RESON 7125 (RESON Inc., 2006), was used to execute the bathymetric survey.
This study investigates the possibility of using image analysis techniques as the matching algorithm in UTAN. Inspired by the color contour images shown on many sonar displays, bathymetric measurements are represented as the gray levels of image pixels, so that features of the sea bottom can be extracted from the resulting images. The gray-scale histogram (GSH) is a statistical method representing the distribution of pixel intensity over a gray scale. It is constructed by counting the number of pixels in each sub-range of the scale, which varies depending on the particular application. Since the sea floor exhibits various depth distribution patterns in acoustic images, the utilization of image analysis techniques for UTAN represents an alternative mode of underwater navigation.
The outline of this paper is as follows. Section 2 explains how to improve the discriminative ability of the conventional gray-scale histogram by using additional information on the spatial distribution of pixels, leading to the novel edge corner histogram. Section 3 then delivers a detailed description of the performance of the edge corner histogram in simulations. Finally, conclusions and suggestions for future work are presented in Section 4.
The GSH of an image represents the distribution of pixels over the gray-level scale, and the GSH of an N-bit gray-level image (bit depth = N) consists of 2^N sub-ranges, where the pixels of a certain sub-range have the same gray level. If all pixels of an image are considered as a set, the pixels belonging to one sub-range correspond to one subset. In this paper, pixels from the same subset are considered together, and are known as a sub-region of the image. A sub-region can be either connected or disconnected, depending on the spatial distribution of the pixels it encompasses.
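This bookkeeping can be sketched as follows, assuming the paper's 1-based gray levels (1 to 2^N); the function name is illustrative:

```python
import numpy as np

def gray_scale_histogram(img, bit_depth):
    """GSH of an N-bit image: count the pixels in each of the 2**N sub-ranges.
    Gray levels are assumed to run 1..2**bit_depth, as in the text."""
    hist = np.zeros(2 ** bit_depth, dtype=int)
    values, counts = np.unique(img, return_counts=True)
    for v, c in zip(values, counts):
        hist[int(v) - 1] = c  # level k is stored at index k-1
    return hist
```

Pixels sharing one sub-range form one subset; the corresponding sub-region of the image is simply the Boolean mask `img == k`.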
However, different gray-level images can have the same GSH. For example, Fig. 1(a) and Fig. 1(b) have the same numbers of black and white pixels (each small square represents one pixel), so they share the same GSH. This inherent drawback makes GSH unreliable when used to discriminate different gray-level images. The risk of a wrong classification increases when the bit depth of the image is reduced, as the image resolution and the number of sub-ranges also decrease. As underwater images from many kinds of sonars, such as multi-beam bathymetric sonars or side-scan sonars, are of low resolution, more information is thus needed (in addition to GSH) to apply a histogram in the matching process of UTAN.
Fig. 1 Examples of different gray-level images that are the same in GSH but have different edge complexities
GSH provides 1D statistics of an image. In contrast, the spatial distribution characteristics of pixels describe 2D statistical features. Therefore, a number of gray-level images that have similar 1D features will become adequately distinctive if spatial distribution characteristics are taken into account together with GSH (Levi and Weiss, 2004; Dalal and Triggs, 2005). In this respect, the novel edge corner histogram is introduced.
2.1 Edge complexity
Pixels in the same sub-region of a gray-level image are generally scattered, and thus form complex edges. Therefore, edge complexity can serve as a spatial characteristic, and can be used to evaluate the complexity of pixel distribution in a certain sub-region. Definitions of edge pixels and edge complexities are presented below as Definition 1 and Definition 2, respectively.
Definition 1. A pixel that has a different gray level from at least one of its 4-connected pixels is considered to be an edge pixel. In particular, the boundary pixels of an image are edge pixels.
Definition 2. Pixels from the kth sub-range of a GSH correspond to a sub-region of the image, known as Rk. All the pixels in Rk form a set, λ(Rk). Edge pixels in Rk are known as Ek, and they form a set, λ(Ek). The element numbers of λ(Rk) and λ(Ek) are [N]λ(Rk) and [N]λ(Ek), respectively. The edge complexity, ηk, of Rk is defined as the ratio of [N]λ(Ek) to [N]λ(Rk):

ηk = [N]λ(Ek) / [N]λ(Rk)    (1)
Numerically, [N]λ(Rk) is equal to the area of Rk, and [N]λ(Ek) is equal to the perimeter of Rk. Based on this interpretation, let ηk = 0 when [N]λ(Rk) = 0.
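Definitions 1 and 2 can be sketched as a minimal implementation (the function names are ours; boundary pixels are flagged as edge pixels per Definition 1):

```python
import numpy as np

def edge_mask(img):
    """Definition 1: a pixel differing from any 4-connected neighbor is an
    edge pixel; image-boundary pixels are edge pixels by convention."""
    n, m = img.shape
    mask = np.zeros((n, m), dtype=bool)
    mask[0, :] = mask[-1, :] = mask[:, 0] = mask[:, -1] = True
    interior = img[1:-1, 1:-1]
    diff = ((interior != img[:-2, 1:-1]) | (interior != img[2:, 1:-1]) |
            (interior != img[1:-1, :-2]) | (interior != img[1:-1, 2:]))
    mask[1:-1, 1:-1] |= diff
    return mask

def edge_complexity(img, level):
    """Definition 2: eta_k = |E_k| / |R_k| (perimeter over area), 0 if R_k is empty."""
    region = img == level
    area = int(region.sum())
    if area == 0:
        return 0.0
    return int((region & edge_mask(img)).sum()) / area
```

For a 3×3 block of one gray level inside a larger image, only the center pixel is non-edge, giving an edge complexity of 8/9.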
In Figs. 1(a) and 1(b), the edge pixels of the black sub-region are marked with a white cross. According to the above definitions, although these two sub-regions have the same number of pixels, the perimeter of the one in Fig. 1(b) is larger than that in Fig. 1(a). This indicates that the pixels in Fig. 1(b) have a more complicated spatial distribution than those in Fig. 1(a). As a result, it is possible to discriminate the two images using their edge complexities.
2.2 Edge corner pixels & edge corner complexity
Gray-level images with the same GSH can be discriminated using the edge complexity given by Eq. (1). However, some gray-level images may have not only the same GSH but also the same edge complexity. An example is shown in Fig. 2, where the number of edge pixels in the black region in Fig. 2(a) equals that in Fig. 2(b), as each black pixel is also an edge pixel. As a result, these two images cannot be distinguished from each other using GSH and edge complexity.
In underwater terrain images suitable for UTAN, most sub-regions take the shape of lines and isolated points, as illustrated in Fig. 2. There are two reasons for this phenomenon. First, UTAN has to be used in areas that have enough variance in depth to obtain sufficient input information. Second, the depth of areas suitable for UTAN varies frequently, so only a limited number of same-depth points belonging to a sub-region cluster together to form a plane. Consequently, most pixels of a given sub-region are classified as edge pixels, and the discriminative capability of edge complexity in the matching process is degraded.
Fig. 2 Example of different gray-level images with the same GSH and edge complexities, but different edge corner complexities
To ameliorate this problem, this study proposes using edge corner pixels to discriminate such gray-level images. Edge corner pixels are extracted from the edge pixels of each sub-region of a gray-level image. The edge corner complexity is then defined to describe the spatial distribution of edge corner pixels in a certain sub-region. The definition of an edge corner pixel is given as Definition 3.
Definition 3. An edge pixel whose gray level differs from that of at least one of its vertical neighbors and at least one of its horizontal neighbors is an edge corner pixel of the sub-region it belongs to.
In a gray-level image with resolution N×M, g(i,j) represents the gray level of the pixel P(i,j), i = 1,2,…,N; j = 1,2,…,M. The set of edge corner pixels in Rk is λ(Ck), and its element number is [N]λ(Ck). Therefore, λ(Ck) must be a subset of λ(Ek), as shown in Eq. (2):

λ(Ck) ⊆ λ(Ek)    (2)

Then, λ(Ck) can be described by Eq. (3):

λ(Ck) = {P(i,j) ∈ λ(Ek) | [g(i,j) ≠ g(i−1,j) or g(i,j) ≠ g(i+1,j)] and [g(i,j) ≠ g(i,j−1) or g(i,j) ≠ g(i,j+1)]}    (3)
Based on Definition 3, the definition of edge corner complexity is given by Definition 4.
Definition 4. The edge corner complexity, γk, of the sub-region Rk is the ratio of [N]λ(Ck) to [N]λ(Ek):

γk = [N]λ(Ck) / [N]λ(Ek)    (4)

In particular, let γk = 0 when [N]λ(Ek) = 0.
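A sketch of Definitions 3 and 4 follows. Treating out-of-image neighbors as differing is our assumption, chosen for consistency with boundary pixels being edge pixels; the function name is illustrative:

```python
import numpy as np

def edge_corner_complexity(img, level):
    """gamma_k = |C_k| / |E_k| (Definition 4). Per Definition 3, an edge corner
    pixel differs from at least one vertical AND at least one horizontal
    neighbor. Out-of-image neighbors are treated as differing (assumption)."""
    pad = np.pad(img, 1, mode="constant", constant_values=-1)  # -1 never matches
    core = pad[1:-1, 1:-1]
    v_diff = (core != pad[:-2, 1:-1]) | (core != pad[2:, 1:-1])   # vertical neighbors
    h_diff = (core != pad[1:-1, :-2]) | (core != pad[1:-1, 2:])   # horizontal neighbors
    edge = v_diff | h_diff
    corner = v_diff & h_diff
    region = img == level
    n_edge = int((region & edge).sum())
    if n_edge == 0:
        return 0.0
    return int((region & corner).sum()) / n_edge
```

A horizontal line of pixels (as in Fig. 2(a)) yields corners only at its two ends, whereas isolated points (as in Fig. 2(b)) are all corners, so the two cases are now distinguishable.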
Based on Definition 3, Fig. 3 shows the edge corner pixels existing in a sub-region. In Fig. 2, the edge corner pixels of the black region are marked with a white cross. It can be seen that Fig. 2(a) has fewer edge corner pixels than Fig. 2(b), so the edge corner complexity of the former is smaller than that of the latter. As a result, the two images can be discriminated from each other using the spatial information carried by their edge corner pixels.
Fig. 3 Edge corner pixels based on Definition 3
2.3 Edge corner histogram
To introduce the spatial information obtained from edge complexity and edge corner complexity to GSH, an edge corner histogram (ECH) is constructed. The ECH is an upgrade of GSH that presents 2D distributional features of pixels for each gray level. Based on Definitions 1, 2, 3, and 4, ECH is defined by Definition 5.
Definition 5. The ECH of a gray-level image represents the distribution of the image's edge corner pixels over the gray-level scale. The value of the kth sub-range in the ECH equals the product of [N]λ(Rk), ηk, and γk.

If H is the ECH of an N-bit gray-level image, and Hk is the value of the kth sub-range of H, k = 1,2,…,2^N, then Hk can be calculated by Eq. (5):

Hk = [N]λ(Rk) · ηk · γk = [N]λ(Ck)    (5)
This result indicates that to obtain the ECH of an image, it is only necessary to count the edge corner pixels in each sub-range of its GSH. The calculation steps of an ECH are therefore as follows:
1) Calculate the GSH of the gray-level image.
2) Consider all pixels of each sub-range and extract the edge pixels.
3) Consider all edge pixels of each sub-range and extract the edge corner pixels.
4) Calculate the number of edge corner pixels in each sub-range.
5) Construct ECH using the results obtained in step 4.
ECH has the same length as GSH. As a result, the calculation complexity and time cost of the matching process in UTAN using ECH are the same as when using GSH.
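The five steps above reduce, via Eq. (5), to counting edge corner pixels per gray level. A compact sketch under the same assumptions as before (1-based levels, out-of-image neighbors treated as differing):

```python
import numpy as np

def edge_corner_histogram(img, bit_depth):
    """ECH per Definition 5: H_k equals the number of edge corner pixels at
    level k, since |R_k| * eta_k * gamma_k telescopes to |C_k|."""
    pad = np.pad(img, 1, mode="constant", constant_values=-1)  # -1 never matches
    core = pad[1:-1, 1:-1]
    v_diff = (core != pad[:-2, 1:-1]) | (core != pad[2:, 1:-1])
    h_diff = (core != pad[1:-1, :-2]) | (core != pad[1:-1, 2:])
    corner = v_diff & h_diff  # edge corner pixels (Definition 3)
    hist = np.zeros(2 ** bit_depth, dtype=int)
    for k in range(1, 2 ** bit_depth + 1):
        hist[k - 1] = int(((img == k) & corner).sum())
    return hist
```

Because the ECH has the same length as the GSH, it can be substituted for the GSH in the matching step without changing the comparison cost.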
Simulations were conducted to investigate the performance of ECH in the matching process for UTAN. An underwater terrain image was constructed from actual depth data collected by a multi-beam bathymetry sonar (RESON 7125) using biharmonic spline interpolation (Sandwell, 1987). The image obtained was then used as the DTM. Depth data of a local region in the DTM were extracted and treated as 2D bathymetric measurements obtained by an AUV in real time. The real-time image (RTI) was constructed from this set of data using the same interpolation method. The RTI was used as a window (template) to scan over the DTM in order to obtain the position (a fix) of the AUV by comparing the similarity of the images. The RTI scans the DTM from left to right and from top to bottom, using a particular searching step. At each step, the ECH of a local region of the DTM is calculated. The mean square difference (MSD) is then chosen as a measure of the similarity between this ECH and that of the RTI. Positions with the largest similarities are then considered to be matching results, and these are used to reset the INS.
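The scan just described can be sketched as follows. `scan_match` and its parameters are illustrative names, and any histogram feature (for example, an ECH function) can be plugged in for `feature_fn`:

```python
import numpy as np

def msd(h1, h2):
    """Mean square difference between two histograms (lower = more similar)."""
    return float(np.mean((np.asarray(h1, float) - np.asarray(h2, float)) ** 2))

def scan_match(dtm, rti, feature_fn, step=10, top=5):
    """Slide an RTI-sized window over the DTM left-to-right, top-to-bottom at
    the given searching step; return the `top` window centers whose feature
    histograms have the smallest MSD against the RTI's."""
    h, w = rti.shape
    target = feature_fn(rti)
    scores = []
    for i in range(0, dtm.shape[0] - h + 1, step):
        for j in range(0, dtm.shape[1] - w + 1, step):
            d = msd(feature_fn(dtm[i:i + h, j:j + w]), target)
            scores.append((d, i + h // 2, j + w // 2))  # (MSD, window center)
    scores.sort()
    return scores[:top]
```

Returning the top five candidates mirrors the figures below, where the five best-ranked rectangular regions are drawn around the true position.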
It should be noted that the DTM and the RTI are 2D projections of their respective 3D underwater terrains, and that they have the same bit depth, N. The range of gray levels, 1 to 2^N, represents the depth range of the corresponding DTM: gray level 1 corresponds to the minimum depth in the DTM, while 2^N corresponds to the maximum depth. The gray level of each pixel in the RTI represents the relative depth of the graphical point in the same way as in the DTM. This encoding is suitable for computer processing, but it cannot give a clear visual impression of the corresponding underwater terrain. For this reason, the original underwater terrain images are not shown in this study, and each simulation is instead presented in the conventional way.
3.1 Simulation 1—Noise
Simulation 1 shows the effect of noise on ECH in the matching process. Gaussian noise was added to the realistic depth data to simulate different noise backgrounds, with the signal-to-noise ratio (SNR) varied from 10 dB to 2 dB. In addition, the RTI included a simulated 5° rotation error relative to the DTM.
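A sketch of the noise model, assuming the SNR is referenced to the variance of the depth data (the paper does not state the reference, so this is an assumption; a constant depth offset carries no terrain information):

```python
import numpy as np

def add_gaussian_noise(depth, snr_db, rng=None):
    """Add zero-mean Gaussian noise scaled so that depth variance over noise
    variance equals the requested SNR (assumption: SNR referenced to the
    variance of the depth data, not its mean square)."""
    rng = np.random.default_rng(0) if rng is None else rng
    p_signal = np.var(depth)
    p_noise = p_signal / (10 ** (snr_db / 10))  # 10 dB -> noise power = signal/10
    return depth + rng.normal(0.0, np.sqrt(p_noise), depth.shape)
```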
Fig. 4(a) shows a 3D model of the DTM. In Fig. 4(b), the area enclosed by the dotted lines is the area that the RTI comes from. The AUV's 2D position is assumed to be at the center point of the RTI. To simplify the calculation, the resolutions of the DTM and the RTI are both 1 m²/pixel. In this test, the DTM is 500×500 pixels, the RTI is 100×100 pixels, and the searching step is 10 pixels.
Fig. 5 presents the matching results under different noise conditions. The local region confined by dotted lines represents the RTI, and the rectangular regions in solid lines are the top five estimated areas (ranked by similarity) after scanning the whole DTM. These estimated regions surround the AUV's true position, which means that the underwater terrain can be localized by matching its ECH. The positions of the rectangular regions change only slightly with an increase in the noise power, indicating that the method has a good anti-noise capacity.
Table 1 shows the errors in each estimated region under different noise conditions. It can be seen that the errors in the X- and Y-axis show little variation when the SNR decreases, and this result agrees with Fig. 5. The MSD values of the rectangular regions increase with the noise strength, which implies that the similarity decreases as the noise power escalates. However, this makes little change to the positions of the estimated regions in terms of their similarity ranking, and therefore the matching results are considered stable.
Fig. 4 Representation of DTM and RTI in Simulation 1
Fig. 5 Matching results under different noise conditions
Table 1 Error statistics of Fig. 5
3.2 Simulation 2—Area of RTI
In Simulation 2, the resolutions of the DTM and the RTI are the same as in Simulation 1, and the SNR is 2 dB. The difference here is that the area of the RTI varies from 50×50 pixels to 200×200 pixels. The 3D model of the DTM is the same as the one shown in Fig. 4(a), but the RTIs come from different local regions, as shown in Fig. 6.
In the matching results presented in Fig. 6, the rectangular regions in solid lines remain located around the true position of the AUV as the RTI area increases, which shows that ECH is robust to scale variance. With an increase in the area of the real-time images, the rectangular regions gather more closely. However, the matching errors in the X- and Y-axis are larger than in Simulation 1, as shown in Table 2. This is because the RTIs in Simulation 2 come from a flatter region than in Simulation 1; as a result, less terrain information is input to the matching method and the quality of the matching results decreases.
Fig. 6 Matching results of different areas of real-time images
Table 2 Error statistics of Fig. 6
It should be noted that there needs to be a minimum area for the RTI in each specific DTM, because the matching method cannot provide reliable results when there is insufficient input terrain. To determine this minimum area, the uniqueness of the underwater region needs to be investigated in advance to assess its navigability. The more unique the underwater region, the smaller the real-time image required.
3.3 Simulation 3—Rotation
In this simulation, the robustness to rotation is investigated. The area from which the real-time data are extracted has different degrees of rotation compared to the orientation of the DTM. This is intended to simulate the possible direction error after an AUV has been cruising for a long time. The SNR is 10 dB in this simulation, and the resolution and area of the DTM and the RTI are the same as before. Fig. 7(a) shows the 3D model of the DTM, and Fig. 7(b) presents the RTI in dotted lines without any rotation.
Fig. 8 presents the matching results using ECH with different rotation errors. The rectangular regions in solid lines are located around the true position of the AUV, and they scatter little when the rotation angle increases, indicating that ECH is robust to rotation error. In practice, it is assumed that the directional error of the AUV would be in the range of 0° to 20°. However, this anti-rotation performance would also be expected for larger rotation errors, as ECH is a statistical method that does not rely on orientation information.
Table 3 shows the matching errors of Fig. 8, where little change can be seen in the estimation errors, while the MSD values increase with the rotation angle (i.e., the similarity decreases). However, this variation does not lead to a big change in the similarity ranking, and therefore the top five estimated regions are robust.
Fig. 7 3D model of DTM in Simulation 3
Table 3 Error statistics of Fig. 8
Fig. 8 Matching results at different rotation angles
This paper presents a novel underwater terrain matching method for UTAN. The method requires a digital terrain map and real-time depth measurements from a multi-beam sonar to estimate the fix of an AUV. GSH is improved by introducing spatial information from edge corner pixels. ECH is then proposed as an image feature and applied as the core algorithm in the terrain matching process. Computer simulations are conducted to investigate its performance, and the results show that ECH has a better discriminative ability than GSH. It is able to provide a robust estimation of the true position of the AUV with an error smaller than the searching step, which is 10 pixels (10 m) in this paper. The uniqueness of the underwater area used in the simulations was suitable for UTAN to obtain a precise match. However, the discrimination potential of a DTM needs to be analyzed before use, because if insufficient information is offered, the matching results could deteriorate. If the planned trajectory of the AUV passes over flat areas, a lower limit on the amount of terrain features, such as a threshold on the terrain's entropy, can be used to decrease the risk of mismatching.
In addition to the statistical features presented by ECH, other image characteristics, such as texture features, also exist in underwater terrain images taken from areas with relatively flat terrain. As a result, the use of image analysis methods in UTAN could broaden the scope of this navigation method. In route planning, it becomes easier to determine in advance whether a certain area is suitable for obtaining position fixes. In areas with slight terrain variation but abundant image features, image-analysis-based navigation methods can assist the INS.
To further the research of image analysis utilization in UTAN, subsequent tests using actual AUV data are required. Measurements from sea trials will thus enrich the database and widen the scope of feature extraction in the future.
Acknowledgement
The authors would like to thank Dr. Pan Agathoklis and Dr. Paul Kraeutner at the Department of Electrical and Computer Engineering, University of Victoria, for their contribution to this paper.
The authors would also like to thank Feng Xu at the Institute of Acoustics, Chinese Academy of Science, for providing the terrain data used in this paper.
References
Anonsen KB, Hagen OK (2009). Terrain aided underwater navigation using pockmarks. IEEE OCEANS 2009, Biloxi, USA, 1-6.
Anonsen KB, Hallingstad O (2006). Terrain aided underwater navigation using point mass and particle filters. Proc. IEEE Position, Location and Navigation Symposium, San Diego, USA, 1027-1035. DOI: 10.1109/PLANS.2006.1650705
Baird CA, Snyder FB, Beierle M (1990). Terrain-aided altitude computations on the AFTI/F-16. Proc. IEEE Position Location and Navigation Symposium, Las Vegas, USA, 474-481. DOI: 10.1109/PLANS.1990.66217
Chen Xiaolong, Pang Yongjie, Li Ye, Chen Pengyun (2012). Underwater terrain matching positioning method based on MLE for AUV. Robots, 34(5), 559-565. (in Chinese) DOI: 10.3724/SP.J.1218.2012.00559
Dalal N, Triggs B (2005). Histograms of oriented gradients for human detection. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Diego, USA, 886-893. DOI: 10.1109/CVPR.2005.177
Department of the Navy, USA (2004). The navy unmanned undersea vehicle (UUV) master plan. Available from http://www.pdfdrive.net/the-navy-unmanned-undersea-vehicleuuv-master-plan-e85437.html [Accessed May 23, 2015].
Eroglu O, Yilmaz G (2014). A terrain referenced UAV localization algorithm using binary search method. Journal of Intelligent & Robotic Systems, 73(4), 309-323. DOI: 10.1007/s10846-013-9922-7
Golden JP (1980). Terrain contour matching (TERCOM): A cruise missile guidance aid. Image Processing for Missile Guidance, San Diego, USA, 10-18. DOI: 10.1117/12.959127
Guo Youguang, Zhong Bin, Bian Shaofeng (2003). The determination of earth gravity field and the matched navigation in gravity field. Hydrographic Surveying and Charting, 23(5), 61-64. (in Chinese) DOI: 1671-3044(2003)05-0061-04
Harvard University Library (2002). Citing electronic sources of information. Harvard University. Available from http://People.seas.harvard.edu/~leslie/ASCOT02.doc [Accessed May 23, 2015].
Hao YL, Zhao YF, Hu JF (2008). Preliminary analysis on the application of geomagnetic field matching in underwater vehicle navigation. Progress in Geophysics, 23(2), 594-598.
Hollowell J (1990). Heli/SITAN: A terrain referenced navigation algorithm for helicopters. Proc. IEEE Position Location and Navigation Symposium, Las Vegas, USA, 616-625. DOI: 10.1109/PLANS.1990.66236
Jalving B, Mandt M, Hagen OK, Pøhner F (2004). Terrain referenced navigation of AUVs and submarines using multibeam echo sounders. Available from http://www.navlab.net/Publications/Terrain_Referenced_Navigation_of_AUVs_and_Submarines_Using_Multibeam_Echo_Sounders.pdf [Accessed May 23, 2015].
Jiang X, Feng X, Wang L (2000). Underwater robots. Liaoning Press of Science and Technology, Liaoyang, China, 293-297. (in Chinese)
Levi K, Weiss Y (2004). Learning object detection from a small number of examples: the importance of good features. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Washington, DC, USA, 53-60. DOI: 10.1109/CVPR.2004.1315144
Marthiniussen R, Vestgard K, Klepaker RA, Storkersen N (2004). HUGIN-AUV concept and operational experiences to date. OCEANS'04, Kobe, Japan, 846-850. DOI: 10.1109/OCEANS.2004.1405571
Nygren I (2005). Terrain navigation for underwater vehicles. KTH, Stockholm, Trita-S3-SB-0571.
Paull L, Saeedi S, Seto M (2014). AUV navigation and localization: A review. IEEE Journal of Oceanic Engineering, 39(4), 131-149. DOI: 10.1109/JOE.2013.2278891
RESON Inc. (2006). Seabat 7125 operator's manual. RESON Inc., Goleta, USA.
Sandwell DT (1987). Biharmonic spline interpolation of GEOS-3 and SEASAT altimeter data. Geophysical Research Letters, 14(2), 139-142. DOI: 10.1029/GL014i002p00139
Somajyoti M (2001). Sensor fusion and feature based navigation for subsea robots. PhD thesis, The University of Sydney, Sydney, Australia, 1-17.
Wang Kedong, Yan Lei, Deng Wei, Zhang Junhong (2006). Research on iterative closest contour point for underwater terrain-aided navigation. International Workshop on Structural, Syntactic and Statistical Pattern Recognition, Hong Kong, China, 252-260. DOI: 10.1007/11815921_27
Xu Daxin (2005). Using gravity anomaly matching techniques to implement submarine navigation. Chinese Journal of Geophysics, 45(4), 812-816. (in Chinese) DOI: 0001-5733(2005)04-0812-05
Yuan Shuming, Sun Feng, Liu Guangjun, Chen Jing (2004). Application of gravity map matching technology in underwater navigation. Journal of Chinese Inertial Technology, 12(2), 13-17. (in Chinese) DOI: 10.3969/j.issn.1005-6734.2004.02.004
Yun SG, Lee W, Park CG (2014). Covariance calculation for batch processing terrain referenced navigation. IEEE Position Location and Navigation Symposium, Monterey, USA, 701-706. DOI: 10.1109/PLANS.2014.6851435
Zhang Kai, Li Yong, Zhao Jianhu, Rizos C (2014). A study of underwater terrain navigation based on the robust matching method. Journal of Navigation, 67(4), 569-578. DOI: 10.1017/S0373463314000071
Zhao Long, Gao Nan, Huang Baoqi, Wang Qianyun (2014). A novel terrain aided navigation algorithm combined with the TERCOM algorithm and particle filter. IEEE Sensors Journal, 15(2), 1124-1131. DOI: 10.1109/JSEN.2014.2360916
Zhou Jun, Ge Zhilei, Shi Guiguo, Liu Yuxia (2008). Key technique and development for geomagnetic navigation. Journal of Astronautics, 29(5), 1467-1472. (in Chinese) DOI: 10.3873/j.issn.1000-1328.2008.05.001
DOI: 10.1007/s11804-015-1334-6
Article ID: 1671-9433(2015)04-0425-09
Received date: 2015-07-10.
Accepted date: 2015-10-03.
Foundation item: Supported by the National Natural Science Foundation of China (Grant No. 41376102), the Fundamental Research Funds for the Central Universities (Grant No. HEUCF150514), and the China Scholarship Council (Grant No. 201406680029).
*Corresponding author. Email: bianhongyu@hrbeu.edu.cn
© Harbin Engineering University and Springer-Verlag Berlin Heidelberg 2015
Journal of Marine Science and Application, 2015, Issue 4