Sensors (Basel, Switzerland). 2019 Jul 1;19(13):2917. doi: 10.3390/s19132917

SINS/Landmark Integrated Navigation Based on Landmark Attitude Determination

Shuqing Xu 1, Haiyin Zhou 1, Jiongqi Wang 1,*, Zhangming He 1,2, Dayi Wang 2
PMCID: PMC6651203  PMID: 31266229

Abstract

The traditional SINS (strapdown inertial navigation system)/CNS (celestial navigation system) integrated navigation system cannot provide all-day and all-weather navigation. To solve this problem, this paper proposes a SINS/Landmark integrated navigation method based on landmark attitude determination. The integrated navigation system takes SINS as the basic scheme and uses landmark navigation to correct the SINS error. Attitude determination is performed by feature-matching the landmark information photographed by the landmark camera. The principle of landmark navigation and the attitude determination process are discussed, and the feasibility of landmark attitude determination is analyzed, including the orthogonality of the attitude transformation matrix and the influence of factors such as the number and geometric distribution of the landmarks. On this basis, the paper constructs the equations of the SINS/Landmark integrated navigation system, verifies the effectiveness of landmark attitude determination for the integrated navigation with a Kalman filter, and improves the navigation precision of the system.

Keywords: SINS/Landmark integrated navigation, landmark attitude determination, all-day and all-weather navigation, image matching, navigation Kalman filter

1. Introduction

In view of their strong autonomy, anti-interference capability and good concealment, SINS and CNS are widely used [1,2]. Inertial devices provide attitude, velocity and position data, while star sensors measure information about astronomical objects. This information can be combined to obtain more precise navigation results [3,4,5,6]. However, in real applications, CNS cannot work continuously throughout the day and is affected by bad weather, thus SINS/CNS integrated navigation cannot provide continual navigation over a long time [7].

Recently, as a new navigation technology, landmark navigation has received wide attention globally. Landmark navigation is technically different from CNS: it is based on the visibility of landmarks, which overcomes the drawbacks of star sensors, and it navigates the aircraft by using the geographical features of the planet’s surface [8,9]. To compensate for this limitation, this paper combines SINS with landmark navigation to construct a high-precision scheme that meets the navigation requirements.

Currently, some studies have been conducted on the theory and application of landmark navigation. For single-landmark navigation, Costello and Castro [10] extracted landmark characteristics from 2D sensor images and combined them with a landmark database to estimate the latitude, longitude and attitude of vehicles on the ground or targets in space. Cesetti et al. [11] proposed a vision-based method for UAV (unmanned aerial vehicle) landing. The basic principle of the guidance method is to define the target area from high-resolution satellite or aerial images, and the key of the navigation strategy is to locate suitable natural landmarks by a feature matching algorithm.

For practical applications of landmark navigation, He et al. [12] proposed a method to obtain the information of a space target emitter by landmarks. The purpose is to obtain the azimuth angle of the target and then calculate the attitude by a planarization algorithm and a linearization technique. In Khuller et al. [13], landmark navigation is applied to the robotics field. Robots can determine their position through landmarks based on the direction information provided by visual inspection. The work proposes the minimum set of landmarks for robot position determination, and the size of this minimum set of landmarks is called the “measurement dimension”.

However, landmark navigation is also limited by weather and environmental conditions, so it should be integrated with other navigation methods to realize more precise navigation. Regarding the integration of landmark navigation with other autonomous navigation methods, Kim and Hwang [14] proposed the integration of SINS and landmark navigation. Their navigation mode is roughly the same as the one in this paper and is designed to solve the problem of visual navigation under poor visibility. When the environment makes it hard to observe landmarks, the landmark visual navigation is interrupted, while the inertial devices continue working and providing position, velocity and attitude information. This method makes up for the shortfalls of single landmark navigation and realizes continuous navigation.

For the combination of landmark navigation with non-autonomous navigation, Babel [15] proposed a navigation algorithm to determine the shortest route of a UAV. Landmark navigation is used to compensate for navigation interruptions of the GPS. Meanwhile, the method regularly updates the landmark database to realize long-term landmark navigation.

The above schemes all use landmarks for position determination, and the precision of their navigation results can be improved greatly. In the practical scenario considered here, both the aircraft and the astronomical objects rotate relative to the inertial system, so it is hard to identify the true attitude of the aircraft with respect to a coordinate system fixed to an astronomical object. To achieve the best navigation result, the landmark information should be fully used to accomplish attitude determination. Therefore, the key of landmark navigation is to determine the attitude based on landmark characteristics.

In view of autonomous navigation around the Earth, this paper proposes a high-precision SINS/Landmark integrated navigation method based on landmark attitude determination to meet the demands of all-day and all-weather autonomous navigation. During landmark navigation, the landmark camera photographs the landmarks at its sampling frequency to capture the landmark characteristics. The characteristics are then recognized and matched with the landmark database to obtain the position and attitude information of the aircraft. The position and attitude information from the landmark navigation is combined with the measurements of the inertial devices to obtain more precise navigation results. This paper also introduces the principle of landmark navigation and the attitude determination process, constructs the equations of the SINS/Landmark integrated navigation system, compares the navigation results with those obtained without attitude determination in the simulations, and verifies the effectiveness of landmark attitude determination for integrated navigation.

The main contributions of this paper include the following. (1) Combining landmark navigation with SINS: this paper proposes a SINS/Landmark integrated navigation mode, which can continue navigating and realize all-day and all-weather navigation when the SINS/CNS mode is not working. (2) The method uses the landmark features obtained by the landmark camera to accomplish attitude determination, and combines the attitude information with the SINS measurements to realize integrated navigation with high precision. (3) The method applies a Kalman filter to the linear navigation system and illustrates the influence of the attitude determination on the navigation accuracy by comparison in the simulations.

The remainder of the paper is structured as follows. Section 2 introduces the principle of landmark navigation, including the acquisition and matching of landmark information. Section 3 introduces the process of determining the attitude by using the landmarks. In this process, the computability of the rotation matrix is verified, and the feasibility of the landmark attitude determination is theoretically explained. Section 4 introduces the SINS/Landmark integrated navigation and gives the system equations. In Section 5, the effectiveness of the SINS/Landmark integrated navigation with landmark attitude determination is verified by experimental simulation. Section 6 is the conclusion.

2. Principles of Landmark Navigation

2.1. Acquisition of Landmark Information

The landmark information can be obtained from the landmark camera, and the operation process is mainly divided into two parts: the shooting process of the landmark camera and the matching process of the landmark feature database.

It is known that the camera follows the pinhole imaging principle, and the landmark position vector obtained from the landmark information is actually the mapping point of the camera imaging focus, similar to a star sensor. However, in practical applications, the feature points of the mapping area may not be obvious or easy to extract. Therefore, the camera selects landmarks with clear features as the selection set, which is used to replace the virtual landmark of the mapping point. As for the equations of position and attitude determination, there is no difference between the two selections.

As shown in Figure 1, when the aircraft (navigation target) orbits the Earth, the landmark camera first captures the landmark images, which are transmitted to the information processing module. The features are then compared and analyzed, including the position information of the landmarks on the Earth, and the subsequent processing follows. To keep the navigation going, the duration of the matching process should be shorter than the shooting interval of the camera.

Figure 1. Acquisition process of landmarks.

Figure 1 shows the processes of shooting, landmark capture and landmark information acquisition, respectively.

(a) Firstly, the landmarks are photographed by the landmark camera. It should be noted that only one photo can be taken at a time, depending on the shooting frequency of the camera.

(b) Secondly, the landmark camera obtains the images of the landmarks. At this time, multiple landmarks and landmark feature points are obtained according to the photos, and these feature points participate in the next step for matching.

(c) Finally, the landmark feature points in the photo are extracted and then matched with the feature point database. When the matching is completed, the aircraft’s space model is established for navigation. The space model is shown in the third picture of Figure 1. The vectors in the figure are all referenced to the Earth-centered-Earth-fixed system (superscript $e$): $\rho_1^e$ and $\rho_2^e$ are the position coordinates of the landmarks; $\rho_0^e$ is the vector from Landmark 1 to Landmark 2; $r^e$ is the position coordinate of the aircraft; and $r_1^e$ and $r_2^e$ are, respectively, the vectors from Landmark 1 and Landmark 2 to the aircraft, i.e., the relative position model of the aircraft and landmarks in space.

When the aircraft is not parallel to the landmark plane, the captured image of the landmark is prone to rotation and deformation, as shown in Figure 2:

Figure 2. The principle of image compression during the landmark matching.

When the landmark camera is not parallel to the landmark plane, the segment $ab$ of the landmark is imaged with length $g$ on the imaging plane of the camera. According to the principle that the proportions of similar triangles are constant, the image of the segment $ab$ in the virtual plane should have length $h$; the image is thus compressed in the $O$ direction, and the compression proportion $\kappa$ is given by Equation (1):

$\kappa = \dfrac{g}{h}$ (1)

When the landmark camera is fixed on the aircraft, the transformation matrix between the landmark camera coordinate system and the aircraft coordinate system is a constant matrix. When the landmark camera changes direction with the flight of the aircraft, the transformation matrix will also be changed. In practice, the transformation matrix needs to be selected and calculated according to the situation.

2.2. Matching Process of the Landmark Images

The image rotation and deformation must be considered when matching the landmarks, which takes more time. In practice, this process can be carried out using a sparse representation method or a feature point detection method.

The sparse representation method expresses the whole signal by a linear combination of a few original (basis) signals [16,17,18], but it takes a long time and cannot meet the real-time requirement. The feature point detection method is widely used in vision-based celestial landing navigation. SIFT (Scale-Invariant Feature Transform) is one of the popular methods with good application results [19,20]; it realizes the identification and matching of landmark images by key point detection, description, matching and elimination of mismatched points. The algorithm has good discriminability, strong scalability and good robustness to target rotation and deformation, and it can extract multiple features of the target.

For the matching process of landmarks, the operational process of the SIFT is shown in Figure 3:

Figure 3. Flow of SIFT.

Scale space theory is used to search for the feature points [21], and the Gaussian pyramid [22] and the difference-of-Gaussian pyramid need to be established for the pixel comparisons. The gradients in the neighborhood of each feature point are calculated, and the dominant orientation is taken at the peak of the gradient histogram. The KD tree [23] is used for the matching process, and the matching threshold should be set according to the circumstances.
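As an illustration, the following sketch implements this detection and KD-tree matching pipeline with OpenCV; the image file names and the 0.7 ratio-test threshold are illustrative assumptions rather than values from this paper.

```python
# Minimal SIFT matching sketch (OpenCV); file names and the 0.7 ratio
# threshold are illustrative assumptions, not values from the paper.
import cv2

# Load the camera photo and the database (reference) landmark image.
photo = cv2.imread("landmark_photo.png", cv2.IMREAD_GRAYSCALE)
reference = cv2.imread("landmark_db.png", cv2.IMREAD_GRAYSCALE)

# Key point detection and description (the scale space and Gaussian
# pyramids are handled internally by the SIFT detector).
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(photo, None)
kp2, des2 = sift.detectAndCompute(reference, None)

# KD-tree based matching via FLANN, followed by Lowe's ratio test to
# eliminate mismatched points.
flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))
matches = flann.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]

print(f"{len(good)} reliable landmark feature matches found")
```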

3. Position and Attitude Determination of the Landmark Navigation

3.1. Attitude Determination of the Landmark Navigation

3.1.1. Calculation for Attitude Angle

The coordinate systems used in this paper are as follows: i is the inertial coordinate system; e is the Earth-centered-Earth-fixed system; b is the aircraft body coordinate system; and li is the launching inertial system.

Assume that the position of the aircraft in the Earth-centered-Earth-fixed system is $r^e = (x^e, y^e, z^e)$, and the position of the observed landmarks is $\rho_i^e = (a_i^e, b_i^e, c_i^e), i = 1, \dots, \tau$, meaning that $\tau$ landmarks can be observed. Assume that the distance between the aircraft and each landmark is $r_i$, and $(\alpha_i, \beta_i, \eta_i)$ are the angles between the axes of the aircraft body coordinate system and the vector from the aircraft to the landmark. $C_e^b$ is the transformation matrix from the Earth-centered-Earth-fixed system to the aircraft body coordinate system; then:

$C_e^b \left[\dfrac{x^e - a_i^e}{r_i},\ \dfrac{y^e - b_i^e}{r_i},\ \dfrac{z^e - c_i^e}{r_i}\right]^T = \left[\cos\alpha_i,\ \cos\beta_i,\ \cos\eta_i\right]^T$ (2)

Let

$\left[\dfrac{x^e - a_i^e}{r_i},\ \dfrac{y^e - b_i^e}{r_i},\ \dfrac{z^e - c_i^e}{r_i}\right]^T = X_i, \qquad \left[\cos\alpha_i,\ \cos\beta_i,\ \cos\eta_i\right]^T = Z_i$ (3)

Because of the measurement error, according to Equation (2), it can be written as Equation (4):

$Z = C_e^b X + \varepsilon$ (4)

where $X = \left[X_1, \cdots, X_\tau\right]$, $Z = \left[Z_1, \cdots, Z_\tau\right]$, and $\varepsilon = \left[\varepsilon_1, \cdots, \varepsilon_\tau\right] \sim N(0, \sigma^2)$. If $\tau \ge 3$, the transformation matrix can be calculated by the least squares (LS) method as Equation (5):

$C_e^b = Z X^T \left(X X^T\right)^{-1}$ (5)

The launch point’s coordinates in the geodetic coordinate system are $(B, L, H)$, and the launch azimuth angle is $A$; then the transformation matrix between the launch inertial system and the geocentric inertial system is given by Equation (6):

$C_{li}^i = \begin{bmatrix} -\sin L & -\cos L & 0 \\ \cos L & -\sin L & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos B & -\sin B \\ 0 & \sin B & \cos B \end{bmatrix} \begin{bmatrix} \sin A & 0 & \cos A \\ 0 & 1 & 0 \\ -\cos A & 0 & \sin A \end{bmatrix}$ (6)

The transformation matrix between the geocentric inertial system and the Earth-centered-Earth-fixed system is

$C_i^e = \begin{bmatrix} \cos\omega T & \sin\omega T & 0 \\ -\sin\omega T & \cos\omega T & 0 \\ 0 & 0 & 1 \end{bmatrix}$ (7)

where $\omega$ is the angular rate of the Earth’s rotation, and $T$ is the elapsed rotation time of the Earth. At this point, the transformation matrix between the launch inertial system and the aircraft body coordinate system can be obtained as Equation (8):

$C_{li}^b = C_e^b C_i^e C_{li}^i$ (8)

Then, the pitch angle, yaw angle and roll angle of the aircraft can be obtained, separately.

$\varphi = \arctan\dfrac{C_b^{li}(2,1)}{C_b^{li}(1,1)}, \qquad \phi = \arcsin\left(C_b^{li}(3,1)\right), \qquad \gamma = \arctan\dfrac{C_b^{li}(3,2)}{C_b^{li}(3,3)}$ (9)

$C_b^{li}$ is the transpose of $C_{li}^b$, representing the transformation matrix from the aircraft body coordinate system to the launch inertial system.
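To make the above procedure concrete, the following NumPy sketch estimates $C_e^b$ by the least-squares formula of Equation (5) from the unit line-of-sight vectors $X_i$ of Equation (3) and the measured direction cosines $Z_i$; the variable names and the synthetic test case are our own assumptions.

```python
# Least-squares attitude solution of Eq. (5); a sketch with assumed inputs:
#   r_e         : aircraft position in the ECEF frame, shape (3,)
#   landmarks_e : landmark positions in the ECEF frame, shape (tau, 3), tau >= 3
#   Z           : measured direction cosines of the same lines of sight,
#                 expressed in the body frame, shape (3, tau)
import numpy as np

def estimate_Ceb(r_e, landmarks_e, Z):
    # Columns X_i of Eq. (3): unit vectors (r^e - rho_i^e) / r_i in the ECEF frame.
    diff = r_e - landmarks_e                                     # (tau, 3)
    X = (diff / np.linalg.norm(diff, axis=1, keepdims=True)).T   # (3, tau)
    # Eq. (5): C_e^b = Z X^T (X X^T)^(-1); needs at least 3 non-collinear landmarks.
    return Z @ X.T @ np.linalg.inv(X @ X.T)

# Noise-free synthetic check: with Z = C_e^b X exactly, the estimate recovers C_e^b.
rng = np.random.default_rng(0)
r_e = np.array([6.7e6, 0.0, 1.0e6])
landmarks_e = np.array([6.371e6, 0.0, 0.0]) + rng.normal(scale=2e5, size=(5, 3))
true_Ceb = np.linalg.qr(rng.normal(size=(3, 3)))[0]              # random orthogonal matrix
diff = r_e - landmarks_e
X = (diff / np.linalg.norm(diff, axis=1, keepdims=True)).T
Z = true_Ceb @ X
print(np.allclose(estimate_Ceb(r_e, landmarks_e, Z), true_Ceb))  # True
```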

3.1.2. Computability of the Transformation Matrix

It is necessary to prove the computability of the transformation matrix, but its orthogonality should be proved first. In theory, the transformation matrix is obviously orthogonal, since it represents the rotation relationship between two coordinate systems. However, in this paper, the transformation matrix $C_e^b$ is obtained from the distance scalars $r_i$ between the aircraft and each landmark, as well as the angles $(\alpha_i, \beta_i, \eta_i)$ between the coordinate axes of the aircraft body coordinate system and the vector from the aircraft to the landmark, thus it is necessary to prove its orthogonality.

The computability of the transformation matrix between the launch inertial system and the aircraft body coordinate system is equivalent to that of the transformation matrix Ceb from the Earth-centered-Earth-fixed system to the aircraft body coordinate system. That is, Equation (5) can be calculated.

According to Equation (5), the matrix $X$ must have full row rank, that is, the vectors from the aircraft to the landmark points must span the entire space.

Proof of the orthogonality of the matrix $C_e^b$:

(1) Invariance of the vector norm: Let $r_i$ be the norm of $\left[x^e - a_i^e,\ y^e - b_i^e,\ z^e - c_i^e\right]^T$; then the norms of the column vectors in the matrix $X$ are all 1. Since $(\alpha_i, \beta_i, \eta_i)$ are the angles between the three axes of the aircraft body coordinate system and the vector from the aircraft to the landmark, the norms of the column vectors of $Z$ are also 1.

(2) Invariance of the angles: For any two landmarks (the $i$th and the $j$th), let $r_i^e$ and $r_j^e$ be the vectors from the aircraft to the two landmarks in the Earth-centered-Earth-fixed system, and $r_i^b$ and $r_j^b$ the corresponding vectors in the aircraft body coordinate system, satisfying

$r_i^b = C_e^b r_i^e, \qquad r_j^b = C_e^b r_j^e$ (10)

It is known that the angle between two vectors has the same size in different coordinate systems, and considering that

$r_i^e \cdot r_j^e = \left\|r_i^e\right\|\left\|r_j^e\right\|\cos\left\langle r_i^e, r_j^e\right\rangle = \cos\left\langle r_i^e, r_j^e\right\rangle, \qquad r_i^b \cdot r_j^b = \left\|r_i^b\right\|\left\|r_j^b\right\|\cos\left\langle r_i^b, r_j^b\right\rangle = \cos\left\langle r_i^b, r_j^b\right\rangle$ (11)

Then,

$r_i^e \cdot r_j^e = r_i^b \cdot r_j^b$ (12)

For convenience, taking three linearly independent column vectors in the two coordinate systems, Equation (4) can be rewritten as Equation (13):

$C_e^b \tilde{X} = \tilde{Z}$ (13)

where $\tilde{X} = \left[r_{k_1}^e, r_{k_2}^e, r_{k_3}^e\right]$, $\tilde{Z} = \left[r_{k_1}^b, r_{k_2}^b, r_{k_3}^b\right]$, and $k_1, k_2, k_3$ are three randomly selected landmarks. According to Equation (12),

$\tilde{X}^T\tilde{X} = \left[r_{k_i}^e \cdot r_{k_j}^e\right]_{i,j=1,2,3} = \left[r_{k_i}^b \cdot r_{k_j}^b\right]_{i,j=1,2,3} = \tilde{Z}^T\tilde{Z}$ (14)

Therefore,

$\tilde{Z}\tilde{X}^{-1} = \left(\tilde{Z}^{-1}\right)^T\tilde{X}^T$ (15)

Calculating the inverse matrix of $C_e^b$ from Equation (13), Equation (16) can be obtained after transposition:

$\left[\left(C_e^b\right)^{-1}\right]^T = \left(\tilde{Z}^{-1}\right)^T\tilde{X}^T$ (16)

Considering Equations (13) and (15),

$C_e^b = \left(\tilde{Z}^{-1}\right)^T\tilde{X}^T$ (17)

According to Equations (16) and (17),

$\left(C_e^b\right)^T = \left(C_e^b\right)^{-1}$ (18)

Therefore, the matrix $C_e^b$ is orthogonal, and hence it is computable.

3.1.3. The Relationship between the Number, Relevance and Accuracy of Landmarks

The selection of landmarks will affect the accuracy of navigation. The influential factors mainly contain the number of landmarks and the relevance of landmark locations (whether they are collinear or not), etc. The two aspects are discussed and the analysis results are given as follows.

From Equation (4), the estimation error covariance of the transformation matrix $C_e^b$ is given by Equation (19):

$\mathrm{Cov}\left(C_e^b\right) = \left(XX^T\right)^{-1}\sigma^2$ (19)

The relationship between the number of landmarks and the accuracy:

According to Gorbenko and Popov [24], for a set of landmarks there is usually a minimal subset that still satisfies completeness. In the position determination process, the purpose is to obtain the coordinates of the aircraft. In general, a three-dimensional coordinate has three unknown elements, which means the number of observable landmarks should be three or more. The specific proof and analysis are given in Section 3.2.

However, for the process of attitude determination, since the solution of the transformation matrix is calculated by the LS method using multiple samples, the relationship between the selection of the landmarks and the accuracy of the attitude needs to be determined according to Equation (19).

The following part is the analysis of the number of selected landmarks in the attitude determination.

Assuming that a new landmark is added on the basis of several original landmarks, Equation (19) can be rewritten as

$\mathrm{Cov}\left(C_e^b\right) = \left(\left[X \;\; X_{\tau+1}\right]\left[X \;\; X_{\tau+1}\right]^T\right)^{-1}\sigma^2 = \left(XX^T + X_{\tau+1}X_{\tau+1}^T\right)^{-1}\sigma^2$ (20)

Since the eigenvalues of the matrix $\left(XX^T + X_{\tau+1}X_{\tau+1}^T\right)^{-1}$ are smaller than those of the matrix $\left(XX^T\right)^{-1}$, within an acceptable amount of calculation, a larger number of landmarks means higher accuracy. Since the landmark camera can only take one picture at a time according to its shooting frequency, and the number of landmarks in a picture is limited, it can be concluded that the more landmarks there are, the higher the attainable attitude accuracy.

The relationship between the landmark correlation and the accuracy:

If there is a strong correlation between the landmarks, for example when the landmarks are selected as points near a straight line, the minimum eigenvalue of the matrix $XX^T$ tends to zero. Performing an eigenvalue decomposition (ED) of the matrix:

$\left(XX^T\right)^{-1} = P\Lambda^{-1}P^{-1}$ (21)

Then, the mean square error (MSE) of the transformation matrix Ceb is as Equation (22):

$\mathrm{MSE}\left(C_e^b\right) = \sigma^2\,\mathrm{tr}\left[\left(XX^T\right)^{-1}\right] = \sigma^2\,\mathrm{tr}\left(P\Lambda^{-1}P^{-1}\right) = \sigma^2\,\mathrm{tr}\left(\Lambda^{-1}P^{-1}P\right) = \sigma^2\,\mathrm{tr}\left(\Lambda^{-1}\right) = \sigma^2\sum_{i=1}^{3}\lambda_i^{-1}$ (22)

Since the landmarks are considered in three-dimensional space, the eigenvalue matrix is $\Lambda = \mathrm{diag}\left(\lambda_1, \lambda_2, \lambda_3\right)$, and $P$ is the eigenvector matrix.

When $\lambda_3$ approaches 0 and the matrix is inverted, $\mathrm{MSE}\left(C_e^b\right)$ tends to infinity. At this time, the calculation error is particularly large, which affects the calculation accuracy; a numerical illustration is sketched below.
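The following NumPy sketch illustrates Equations (19)–(22) numerically: it evaluates $\sigma^2\sum_i\lambda_i^{-1}$ for a well-spread and a nearly collinear set of line-of-sight vectors. The specific vectors and the 30″ noise level are illustrative assumptions, not data from the paper.

```python
# Numerical illustration of Eq. (22): near-collinear landmark directions make
# the smallest eigenvalue of X X^T tiny and blow up the attitude MSE.
# The specific vectors below are arbitrary illustrative choices.
import numpy as np

def attitude_mse(X, sigma=np.deg2rad(30.0 / 3600.0)):
    # Eq. (22): MSE(C_e^b) = sigma^2 * sum_i 1/lambda_i, eigenvalues of X X^T.
    eigvals = np.linalg.eigvalsh(X @ X.T)
    return sigma**2 * np.sum(1.0 / eigvals)

def unit_cols(v):
    # Normalize each column to unit length (columns are the X_i of Eq. (3)).
    return v / np.linalg.norm(v, axis=0, keepdims=True)

# Well-spread line-of-sight directions.
spread = unit_cols(np.array([[1.0, 0.1, 0.0, -0.5],
                             [0.0, 1.0, 0.2, -0.4],
                             [0.2, 0.1, 1.0,  0.8]]))
# Nearly collinear directions: small perturbations of a single direction.
base = np.array([[1.0], [0.5], [0.2]])
collinear = unit_cols(base + 1e-3 * np.random.default_rng(1).normal(size=(3, 4)))

print("well-spread MSE:", attitude_mse(spread))
print("near-collinear MSE:", attitude_mse(collinear))  # several orders of magnitude larger
```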

Simulation data were used to illustrate this.

Assuming that the measurement angle error obeys a normal distribution with zero mean and a standard deviation of 30″, the errors can be calculated for different numbers of landmarks and different correlations. In each case, 100 Monte Carlo runs were performed to obtain the mean error. The accuracy results are shown in Table 1:

Table 1.

The effect of the number and correlation of landmarks on the accuracy of the transformation matrix (unit: arcsec).

Number of landmarks                       3        4        5        6        7
Non-correlation     Pitch angle error     3.4146   2.0300   2.3630   1.6049   1.8437
                    Yaw angle error       4.5328   1.4792   1.9401   1.8308   0.6509
                    Roll angle error      1.5083   2.2644   0.8618   0.7915   0.2807
Strong correlation  Pitch angle error     28.4667  24.9020  13.8017  11.8040  3.3160
                    Yaw angle error       7.9635   2.8863   4.4863   4.6482   4.0131
                    Roll angle error      3.1885   5.7244   3.5935   2.2859   2.3123

The following can be seen in Table 1:

(a) When there is a strong correlation between the landmarks, the angle error becomes larger, especially the pitch angle error.

(b) Without considering the correlation of the landmarks, more landmarks make the angle error smaller; that is, the more accurate the transformation matrix, the higher the accuracy of the attitude determination. This shows that, in the actual navigation process, more landmarks lead to higher accuracy.

(c) As the number of landmarks gradually increases, whether there is a correlation between landmarks has less and less influence on the accuracy of the transformation matrix. This shows that, when the number increases, the collinearity problem inevitably appears. How to balance the number and the locations of the landmarks is the key to improving the navigation accuracy.

3.2. Position Determination of Landmark Navigation

As shown in Section 3.1, the position of the $i$th navigation landmark in the Earth-centered-Earth-fixed system is $\rho_i^e = \left(a_i^e, b_i^e, c_i^e\right), i = 1, \dots, \tau$, the position vector of the aircraft in the Earth-centered-Earth-fixed system is $r^e = \left(x^e, y^e, z^e\right)$, and the relative position of the aircraft to the landmark is $r_i^e$. Let the transformation matrix from the aircraft body coordinate system to the Earth-centered-Earth-fixed system be $C_b^e = \left(C_e^b\right)^T$, where $C_e^b$ can be obtained as in Section 3.1.1. Then,

$r_i^e = \left[x^e - a_i^e,\ y^e - b_i^e,\ z^e - c_i^e\right]^T$ (23)

Suppose that $\rho_i^b$ is the position vector of the $i$th landmark in the aircraft body coordinate system, which can be obtained by the landmark camera. Then, Equation (24) can be obtained:

$r_i^e = C_b^e \rho_i^b, \qquad \begin{bmatrix} x^e - a_i^e \\ y^e - b_i^e \\ z^e - c_i^e \end{bmatrix} = C_b^e \rho_i^b$ (24)

Since $x^e, y^e, z^e$ are the three unknowns, the coordinates of the aircraft in the Earth-centered-Earth-fixed system can be obtained when three or more landmarks can be observed, that is, $\tau \ge 3$; a sketch of this computation is given below.
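Following Equation (24), each matched landmark yields one estimate $r^e = \rho_i^e + C_b^e\rho_i^b$ of the aircraft position; with $\tau \ge 3$ noisy observations, the least-squares solution for the three unknowns reduces to averaging these estimates. The sketch below (NumPy; inputs and names are our own assumptions) shows this computation.

```python
import numpy as np

def estimate_position(Cbe, landmarks_e, rho_b):
    """Position determination sketch based on Eq. (24).

    Cbe         : body-to-ECEF transformation matrix C_b^e, shape (3, 3)
    landmarks_e : landmark positions rho_i^e in the ECEF frame, shape (tau, 3)
    rho_b       : landmark vectors rho_i^b measured by the camera in the
                  body frame, shape (tau, 3)
    """
    # Each landmark gives r^e = rho_i^e + C_b^e rho_i^b (Eq. (24)).
    estimates = landmarks_e + (Cbe @ rho_b.T).T
    # With tau >= 3 noisy observations, the LS solution for (x^e, y^e, z^e)
    # is simply the mean of the per-landmark estimates.
    return estimates.mean(axis=0)
```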

4. SINS/Landmark Integrated Navigation Model

The operation process of SINS/Landmark integrated navigation is based on SINS, and the landmark information is used to correct the error of the SINS. The measurement equation is established according to the position and attitude information obtained by SINS and landmark navigation. The specific navigation process is shown in Figure 4.

Figure 4. Process of the SINS/Landmark integrated navigation.

4.1. State Equation of Integrated Navigation

Combined with the SINS mathematical platform angle error, velocity error, position error, gyroscope and the accelerometer error models, the state equation of the integrated navigation system can be obtained as Equation (25):

$\dot{X}(t) = F(t)X(t) + G(t)W(t)$ (25)

The state vector of the system has 15 dimensions and is written as:

$X(t) = \left[\phi_x, \phi_y, \phi_z, \delta v_x, \delta v_y, \delta v_z, \delta x, \delta y, \delta z, \varepsilon_x, \varepsilon_y, \varepsilon_z, \nabla_x, \nabla_y, \nabla_z\right]^T$ (26)

where $\left[\phi_x, \phi_y, \phi_z\right]^T$ denotes the mathematical platform angle errors about the three axes; $\left[\delta v_x, \delta v_y, \delta v_z\right]^T$ denotes the velocity errors; $\left[\delta x, \delta y, \delta z\right]^T$ denotes the position errors; $\left[\varepsilon_x, \varepsilon_y, \varepsilon_z\right]^T$ denotes the three random constant drifts of the gyroscope; $\left[\nabla_x, \nabla_y, \nabla_z\right]^T$ denotes the three random constant biases of the accelerometer; $F(t)$ denotes the state matrix; and $G(t)$ denotes the process noise matrix.

$F(t) = \begin{bmatrix} 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & C_b^{li} & 0_{3\times3} \\ F_b & 0_{3\times3} & F_a & 0_{3\times3} & C_b^{li} \\ 0_{3\times3} & I_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \end{bmatrix}_{15\times15}$ (27)

$G(t) = \begin{bmatrix} C_b^{li} & 0_{3\times3} \\ 0_{3\times3} & C_b^{li} \\ 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} \end{bmatrix}_{15\times6}$ (28)

where $C_b^{li}$ is the transformation matrix from the aircraft body coordinate system to the launch inertial system, and

$C_b^{li} = \begin{bmatrix} \cos\theta\cos\varphi & \cos\theta\sin\varphi\sin\gamma - \sin\theta\cos\gamma & \cos\theta\sin\varphi\cos\gamma + \sin\theta\sin\gamma \\ \sin\theta\cos\varphi & \sin\theta\sin\varphi\sin\gamma + \cos\theta\cos\gamma & \sin\theta\sin\varphi\cos\gamma - \cos\theta\sin\gamma \\ -\sin\varphi & \sin\gamma\cos\varphi & \cos\gamma\cos\varphi \end{bmatrix}$ (29)

where φ,θ,γ are the yaw angle, pitch angle and roll angle of the aircraft measured by the gyroscope, respectively.

$F_a = \begin{bmatrix} f_{14} & f_{15} & f_{16} \\ f_{24} & f_{25} & f_{26} \\ f_{34} & f_{35} & f_{36} \end{bmatrix}$. The parameters $f_{14}$–$f_{36}$ are the partial derivatives of the gravitational acceleration with respect to the position coordinates, and they vary with the position of the aircraft.

$F_b = \begin{bmatrix} 0 & -a_z & a_y \\ a_z & 0 & -a_x \\ -a_y & a_x & 0 \end{bmatrix}$, where $a_x, a_y, a_z$ are the components of the apparent acceleration along the three axes measured by the accelerometer. The process noise of the navigation system is given by Equation (30):

$W(t) = \left[w_{\varepsilon x}, w_{\varepsilon y}, w_{\varepsilon z}, w_{\nabla x}, w_{\nabla y}, w_{\nabla z}\right]^T$ (30)

where $\left[w_{\varepsilon x}, w_{\varepsilon y}, w_{\varepsilon z}\right]^T$ is the random noise of the gyroscope, and $\left[w_{\nabla x}, w_{\nabla y}, w_{\nabla z}\right]^T$ is the random noise of the accelerometer. The noise covariance matrix of $W(t)$ is

$Q(t) = \mathrm{diag}\left(\sigma_{\varepsilon x}^2, \sigma_{\varepsilon y}^2, \sigma_{\varepsilon z}^2, \sigma_{\nabla x}^2, \sigma_{\nabla y}^2, \sigma_{\nabla z}^2\right)$ (31)
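For concreteness, the 15×15 state matrix of Equation (27) and the 15×6 noise matrix of Equation (28) can be assembled from the 3×3 blocks as in the following NumPy sketch; the blocks $C_b^{li}$, $F_a$ and $F_b$ are assumed to be computed elsewhere, and the function name is our own.

```python
import numpy as np

def build_F_G(Cbli, Fa, Fb):
    """Assemble F(t) of Eq. (27) and G(t) of Eq. (28) from 3x3 blocks.
    State order follows Eq. (26): platform angle error, velocity error,
    position error, gyro constant drift, accelerometer constant bias."""
    Z = np.zeros((3, 3))
    I = np.eye(3)
    F = np.block([
        [Z,  Z, Z,  Cbli, Z],     # attitude error driven by gyro drift
        [Fb, Z, Fa, Z,    Cbli],  # velocity error: attitude, gravity-gradient, accel bias
        [Z,  I, Z,  Z,    Z],     # position error driven by velocity error
        [Z,  Z, Z,  Z,    Z],     # gyro drift modeled as a random constant
        [Z,  Z, Z,  Z,    Z],     # accelerometer bias modeled as a random constant
    ])
    G = np.block([
        [Cbli, Z],
        [Z,    Cbli],
        [Z,    Z],
        [Z,    Z],
        [Z,    Z],
    ])
    return F, G   # shapes (15, 15) and (15, 6)
```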

4.2. Measurement Equation of Integrated Navigation

According to the landmark navigation process, the output is the attitude and position information of the aircraft, from which the attitude and position information measured by the SINS is subtracted to obtain the measurement. Therefore, the observations of the measurement equation are the transformed platform angle error (the navigation frame misalignment angle) $\left(\phi_x, \phi_y, \phi_z\right)$ and the position error $\left(\delta x, \delta y, \delta z\right)$.

Solution process of the mathematical platform misalignment angle $\left(\phi_x, \phi_y, \phi_z\right)$:

In the SINS/Landmark integrated navigation process, SINS obtains the pitch angle $\theta_0$, yaw angle $\varphi_0$ and roll angle $\gamma_0$ of the aircraft through the strapdown solution, and the landmark camera obtains the pitch angle $\theta$, yaw angle $\varphi$ and roll angle $\gamma$; subtracting the two sets of angles gives the three-axis attitude error as Equation (32):

$\Delta a = \begin{bmatrix} a_\theta \\ a_\varphi \\ a_\gamma \end{bmatrix} = \begin{bmatrix} \theta - \theta_0 \\ \varphi - \varphi_0 \\ \gamma - \gamma_0 \end{bmatrix}$ (32)

Since the attitude error equation of SINS is written in terms of the platform angle error (the navigation frame misalignment angle), it is necessary to convert the attitude error angle of Equation (32) into the platform angle error to establish the measurement equation. The conversion relationship is given by Equation (33):

$\Delta a' = M_1 \Delta a$ (33)

where $M_1$ is the attitude angle error transformation matrix, and $\Delta a' = \left[\phi_x, \phi_y, \phi_z\right]^T$ is the platform angle error.

Solution process of the position error $\left(\delta x, \delta y, \delta z\right)$:

The position coordinate $r^e = \left[x^e, y^e, z^e\right]^T$ of the aircraft in the Earth-centered-Earth-fixed system can be obtained by the position determination process of landmark navigation in Section 3.2. The coordinate of the aircraft in the Earth-centered inertial system, i.e., $r^i = \left[x^i, y^i, z^i\right]^T$, can be calculated by the inertial devices. Then,

$\tilde{r}^i = C_e^i r^e$ (34)

The position error under the inertial system is as Equation (35):

$\delta r^i = \begin{bmatrix} \delta x^i \\ \delta y^i \\ \delta z^i \end{bmatrix} = r^i - \tilde{r}^i$ (35)

Converting it to the mathematical platform coordinate system, the conversion relationship is as Equation (36):

$\delta r = M_2\,\delta r^i$ (36)

where $M_2$ is the position error conversion matrix, and $\delta r = \left[\delta x, \delta y, \delta z\right]^T$ is the position error.

Therefore, the measurement equation is as Equation (37):

$Z(t) = \left[\phi_x, \phi_y, \phi_z, \delta x, \delta y, \delta z\right]^T = HX(t) + V(t)$ (37)

where $H = \begin{bmatrix} I_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\ 0_{3\times3} & 0_{3\times3} & I_{3\times3} & 0_{3\times3} & 0_{3\times3} \end{bmatrix}$ and $V = \left[\delta\Delta_x, \delta\Delta_y, \delta\Delta_z, \delta\xi_x, \delta\xi_y, \delta\xi_z\right]^T$. $\left[\delta\Delta_x, \delta\Delta_y, \delta\Delta_z\right]^T$ denotes the difference between the attitude measurement noise of the landmark camera and the constant drift error of the gyroscope, and $\left[\delta\xi_x, \delta\xi_y, \delta\xi_z\right]^T$ is the difference between the position measurement noise of the landmark camera and the constant error of the accelerometer.

4.3. Integrated Navigation Filtering Algorithm

Consider the integrated navigation system given by Equations (25) and (37); discretize it and estimate it with a Kalman filter. The system equations can be written in discrete form as Equation (38):

$X_{k+1} = \Phi_k X_k + \Gamma_k W_k, \qquad Z_k = HX_k + V_k$ (38)

Assuming that T is the sampling interval, then

$\Phi_k = I + F_{k-1}T + \dfrac{1}{2!}F_{k-1}^2T^2, \qquad \Gamma_{k-1} = T\left(I + \dfrac{1}{2!}F_{k-1}T + \dfrac{1}{3!}F_{k-1}^2T^2\right)G_{k-1}$ (39)

The Kalman filter algorithm is expressed as follows:

The one-step prediction equation:

$\hat{X}_{k+1|k} = \Phi_k \hat{X}_k$ (40)

The state estimation:

$\hat{X}_{k+1} = \hat{X}_{k+1|k} + K_{k+1}\left(Z_{k+1} - H\hat{X}_{k+1|k}\right)$ (41)

The gain of the filter:

$K_{k+1} = P_{k+1|k}H^T\left(HP_{k+1|k}H^T + R_{k+1}\right)^{-1}$ (42)

The MSE of the one-step prediction:

$P_{k+1|k} = \Phi_k P_k \Phi_k^T + \Gamma_k Q_{k+1} \Gamma_k^T$ (43)

The MSE of the estimation:

$P_{k+1} = \left(I - K_{k+1}H\right)P_{k+1|k}$ (44)

The above are the basic equations of the discrete Kalman filter. When the initial values $\hat{X}_0$ and $P_0$ are given, the state estimate $\hat{X}_k$ at time $k$ can be obtained through iteration, combining the measurement $Z_k$ at each moment; a compact sketch of one iteration is given below.
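The following NumPy sketch implements the discretization of Equation (39) and one filter iteration of Equations (40)–(44); the matrices F, G, H, Q, R and the measurement z are assumed to be built as in Sections 4.1 and 4.2, and the function names are our own.

```python
# Sketch of the discretization of Eq. (39) and one iteration of the discrete
# Kalman filter, Eqs. (40)-(44). F, G, H, Q, R and the measurement z are
# assumed to be assembled as in Sections 4.1-4.2; variable names are our own.
import numpy as np

def discretize(F, G, T):
    # Eq. (39): second-order truncation of the matrix exponential.
    Phi = np.eye(F.shape[0]) + F * T + (F @ F) * T**2 / 2.0
    Gamma = T * (np.eye(F.shape[0]) + F * T / 2.0 + (F @ F) * T**2 / 6.0) @ G
    return Phi, Gamma

def kalman_step(x, P, z, Phi, Gamma, H, Q, R):
    # One-step prediction, Eq. (40), and its covariance, Eq. (43).
    x_pred = Phi @ x
    P_pred = Phi @ P @ Phi.T + Gamma @ Q @ Gamma.T
    # Filter gain, Eq. (42).
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    # State and covariance update, Eq. (41) and Eq. (44).
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(P.shape[0]) - K @ H) @ P_pred
    return x_new, P_new
```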

5. Simulation and Analysis

The Shandong Peninsula, the Liaodong Peninsula and the Yangtze River Delta were selected as the navigation landmarks. The landmark cameras were used to photograph the landmarks in the flight segment of the aircraft, and then the landmarks were compared to obtain the position and attitude of the aircraft. The matching process is shown in Figure 5, Figure 6 and Figure 7.

Figure 5. Schematic diagram of the matching process about the Shandong Peninsula.

Figure 6. Schematic diagram of the matching process about the Liaodong Peninsula.

Figure 7. Schematic diagram of the matching process about the Yangtze River Delta.

The trajectory of the spacecraft was drawn based on the captured information, as shown in Figure 8, Figure 9 and Figure 10.

Figure 8. 3D trajectory of the aircraft on the Earth.

Figure 9. Partial 3D trajectory of the aircraft in the flight area.

Figure 10. Trajectory of the aircraft under latitude and longitude reference.

The spacecraft was analyzed in the simulation, and the error parameters of SINS and landmark navigation were set as follows.

① The initial position errors in the three directions were all 0 m. ② The initial velocity errors were 0 m/s. ③ The initial attitude angle errors were 0. ④ The gyro random constant drift was 0.1°/h. ⑤ The gyroscope random noise drift satisfied a normal distribution with a mean of 0 and a standard deviation of 0.5. ⑥ The accelerometer random constant bias was 10⁻⁴ g, where g is the Earth gravity constant. ⑦ The accelerometer random noise offset satisfied a normal distribution with a mean of 0 and a standard deviation of 5 × 10⁻⁵ g. ⑧ The standard deviation of the landmark navigation output position error was 50 m. ⑨ The standard deviation of the landmark navigation output attitude error was 3.

The sampling interval of SINS was 0.01 s, that of the landmark navigation was 0.1 s, the combination interval was 0.1 s, and the simulation time was 1110 s; the Monte Carlo method was used for the simulation. Due to the requirements of the launch inertial system, the initial longitude and latitude were 116.34° and 39.98°, respectively, and the height was 0 m. The platform angle error, gyro drift, position error and velocity error were corrected by feedback during the navigation process.

To verify the importance of the landmark attitude determination in the integrated navigation, two methods (with and without attitude determination) were simulated, and the trajectories of the aircraft were obtained, as shown in Figure 11.

Figure 11. The influence of the attitude determination on the accuracy of the aircraft orbit.

It can be seen in Figure 11 that, during the whole navigation process, the orbit determined with attitude determination is almost the same as the real orbit, while the trajectory without attitude determination is far from the real orbit, which indicates that the attitude determination is critical. The comparison of the position and velocity determination in the three directions is given below.

In Figure 12, we can see that, in the X direction, when there is no attitude determination, the position error gradually becomes larger before 900 s and shows a convergence trend after 900 s, while the velocity error grows with time. With attitude determination, the position and velocity errors remain near 0. In Figure 13, for the Y direction, without attitude determination, the position and velocity errors diverge over time. With attitude determination, the position error is very small, almost at the 0 level, and the velocity has a slight error but is also much improved. In Figure 14, for the Z direction, without attitude determination, the position error diverges with time, and the velocity error diverges before 160 s and then gradually converges after 160 s. With attitude determination, the changes of the position and velocity errors are similar to those in the Y direction.

Figure 12. Precision comparison of the X direction.

Figure 13. Precision comparison of the Y direction.

Figure 14. Precision comparison of the Z direction.

Through the comparison of the three directions, it can be seen that the position errors gradually increase with time when the landmark navigation only provides positioning information, and large offsets appear in the three directions, leading to a gradual decrease in navigation accuracy. When the landmark navigation also completes the attitude determination, the navigation result remains stable for a long time. The velocity determination behaves similarly to the position determination.

Figure 12, Figure 13 and Figure 14 give a qualitative comparison of the accuracy with and without attitude determination in the three directions, while Table 2 and Table 3 give the quantitative accuracy statistics.

Table 2.

Position accuracy data statistics (unit: m).

X Y Z
With attitude determination The average error 25.6573 60.1919 175.0093
The maximum error 40.5812 116.1984 240.7109
Without attitude determination The average error 1004.2 2606.1 7664.0
The maximum error 1601.2 6251.0 13615

Table 3.

Velocity accuracy data statistics (unit: m/s).

VX VY VZ
With attitude determination The average error 0.5100 1.1333 3.4702
The maximum error 1.2567 2.0557 6.2684
Without attitude determination The average error 1.6633 5.5829 12.1824
The maximum error 3.1103 7.2776 19.0498

It can be seen in Table 2 and Table 3 that the navigation results differ considerably depending on whether attitude determination is performed, and that attitude determination has a large impact on the navigation accuracy. When the attitude determination process is completed, the position and velocity of the integrated navigation maintain a high precision, and the navigation result approximates the true value.

6. Conclusions

In this paper, faced with the limitation of the traditional SINS/CNS integrated navigation system, which cannot navigate all day and in all weather, a SINS/Landmark integrated navigation method based on landmark attitude determination is proposed. The theoretical analysis of the landmark navigation principle and the simulation of the SINS/Landmark integrated navigation system show that the landmark information is necessary for attitude determination. When the landmark navigation only extracts position information, the accuracy of the integrated navigation fluctuates over a large range. With attitude determination, the accuracy of the integrated navigation reaches a high level and does not decrease with time. This shows that the SINS/Landmark integrated navigation method based on landmark attitude determination is indeed effective and can improve the navigation accuracy to some extent.

Abbreviations

SINS strapdown inertial navigation system
CNS celestial navigation system
UAV unmanned aerial vehicle
GPS global positioning system
SIFT scale-invariant feature transform
KD k-dimensional
ED eigenvalue decomposition
MSE mean square error

Author Contributions

S.X. proposed the main idea and finished the draft manuscript; J.W. conceived of the experiments and drew the figures and tables; Z.H. conducted the simulations; and H.Z. and D.W. analyzed the data.

Funding

This research was supported by the National Natural Science Foundation of China (Grant No. 61773021), the Natural Science Foundation for Distinguished Young Scholars of Hunan Province (Grant No. 2019JJ20018), the Natural Science Foundation of Hunan Province (Grant No. 2019JJ50745), and the Civil Aerospace Advance Research Project (Grant No. D020213).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Ning X., Zhang J., Gui M., Fang J. A Fast Calibration Method of the Star Sensor Installation Error Based on Observability Analysis for the Tightly Coupled SINS/CNS-Integrated Navigation System. IEEE Sens. J. 2018;18:6794–6803. doi: 10.1109/JSEN.2018.2850779.
2. Ning X., Liu L. A two-mode INS/CNS navigation method for lunar rovers. IEEE Trans. Instrum. Meas. 2014;63:2170–2179. doi: 10.1109/TIM.2014.2307972.
3. Han J., Changhong W., Li B. Novel all-time initial alignment method for INS/CNS integrated navigation system. Proceedings of the IEEE 13th International Conference on Signal Processing (ICSP); Chengdu, China, 6–10 November 2016; pp. 1766–1770.
4. Lijun S., Wanliang Z., Yuxiang C., Xiaozhen C. Based on Grid Reference Frame for SINS/CNS Integrated Navigation System in the Polar Regions. Complexity. 2019;2019:1–8. doi: 10.1155/2019/2164053.
5. Pimenta F. Astronomy and navigation. In: Ruggles C.L., editor. Handbook of Archaeoastronomy and Ethnoastronomy. Springer; New York, NY, USA: 2015. pp. 43–65.
6. Katz-Bassett E., Sherry J., Huang T.Y., Kazandjieva M., Partridge C., Dogar F. Helping conference attendees better understand research presentations. Commun. ACM. 2016;59:32–34. doi: 10.1145/2886093.
7. Wang Q., Li Y., Ming D., Wei G., Zhao Q. Performance enhancement of INS/CNS integration navigation system based on particle swarm optimization back propagation neural network. Ocean Eng. 2015;108:33–45. doi: 10.1016/j.oceaneng.2015.07.062.
8. Delaune J., Besnerais G.L., Voirin T., Farges J.L., Bourdarias C. Visual–inertial navigation for pinpoint planetary landing using scale-based landmark matching. Robot. Autom. Syst. 2016;78:63–82. doi: 10.1016/j.robot.2016.01.007.
9. Gupta S., Fouhey D., Levine S., Malik J. Unifying map and landmark based representations for visual navigation. arXiv. 2017;1712.08125.
10. Costello M.J., Castro R. Precision Landmark-Aided Navigation. US Patent 7,191,056, 13 March 2007.
11. Cesetti A., Frontoni E., Mancini A., Zingaretti P., Longhi S. A Vision-Based Guidance System for UAV Navigation and Safe Landing using Natural Landmarks. J. Intell. Robot. Syst. 2010;57:233. doi: 10.1007/s10846-009-9373-3.
12. He Y., Li S., Guo Q. Landmark based position and orientation method with tilt compensation for missile launcher. Proceedings of the 35th Chinese Control Conference (CCC); Chengdu, China, 27–29 July 2016; pp. 5585–5589.
13. Khuller S., Raghavachari B., Rosenfeld A. Landmarks in graphs. Discrete Appl. Math. 1996;70:217–229. doi: 10.1016/0166-218X(95)00106-2.
14. Kim Y., Hwang D.H. Vision/INS integrated navigation system for poor vision navigation environments. Sensors. 2016;16:1672. doi: 10.3390/s16101672.
15. Babel L. Flight path planning for unmanned aerial vehicles with landmark-based visual navigation. Robot. Autom. Syst. 2014;62:142–150. doi: 10.1016/j.robot.2013.11.004.
16. Malioutov D., Cetin M., Willsky A.S. A sparse signal reconstruction perspective for source localization with sensor arrays. IEEE Trans. Signal Process. 2005;53:3010–3022. doi: 10.1109/TSP.2005.850882.
17. Cai T.T., Wang L. Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise. IEEE Trans. Inf. Theory. 2011;57:4680–4688. doi: 10.1109/TIT.2011.2146090.
18. Carmi A., Gurfil P., Kanevsky D. Methods for Sparse Signal Recovery Using Kalman Filtering With Embedded Pseudo-Measurement Norms and Quasi-Norms. IEEE Trans. Signal Process. 2010;58:2405–2409. doi: 10.1109/TSP.2009.2038959.
19. Cheung W., Hamarneh G. n-SIFT: n-Dimensional Scale Invariant Feature Transform. IEEE Trans. Image Process. 2009;18:2012–2021. doi: 10.1109/TIP.2009.2024578.
20. Cruz-Mota J. Scale Invariant Feature Transform on the Sphere: Theory and Applications. Int. J. Comput. Vision. 2012;98:217–241. doi: 10.1007/s11263-011-0505-4.
21. Smirnov F., Zamolodchikov A. On space of integrable quantum field theories. Nucl. Phys. B. 2017;915:363–383. doi: 10.1016/j.nuclphysb.2016.12.014.
22. Lan Z., Lin M., Li X., Hauptmann A.G., Raj B. Beyond gaussian pyramid: Multi-skip feature stacking for action recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; Boston, MA, USA, 8–10 June 2015; pp. 204–212.
23. Li M., Wang L., Hao Y. Image matching based on SIFT features and kd-tree. Proceedings of the 2nd International Conference on Computer Engineering and Technology; Chengdu, China, 16–18 April 2010; pp. 218–222.
24. Gorbenko A., Popov V. The problem of selection of a minimal set of visual landmarks. Appl. Math. Sci. 2012;6:4729–4732.
