Preprints of the 18th IFAC World Congress Milano (Italy) August 28 - September 2, 2011

Adaptive Vision-based Missile Guidance in the Presence of Evasive Target Maneuvers

S. S. Mehta ∗  W. MacKunis ∗∗  J. W. Curtis ∗∗∗

∗ University of Florida, Research and Engineering Education Facility, Shalimar, FL-32579 (e-mail: [email protected]).
∗∗ Physical Sciences Department, Embry-Riddle Aeronautical University, Daytona Beach, FL-32114 (e-mail: [email protected])
∗∗∗ Munitions Directorate, Air Force Research Laboratory, Eglin AFB, FL-32542 (e-mail: [email protected])

Abstract: A nonlinear adaptive visual servo guidance law is presented for a bank-to-turn (BTT) missile airframe that achieves near zero miss distance interception of a target undergoing unknown evasive maneuvers. The controller is developed assuming that the missile linear velocity and the target depth measurements are unknown; hence, it can be regarded as a pure vision-based guidance law. By approximating the unknown, scaled relative velocity via a power series expansion, a continuous adaptive parameter update law is developed to compensate for the unknown missile-target relative velocity and the unknown depth measurements. In addition, robust elements are included in the guidance law to compensate for external disturbances and parameter identification errors. A rigorous Lyapunov-based stability analysis is utilized to prove uniformly ultimately bounded (UUB) stability of the system states, and high-fidelity numerical simulation results are provided to verify the performance of the proposed missile guidance law.

Keywords: Adaptive Control, Vision-based Control, Robust Control, Missile Systems.

1. INTRODUCTION

With the increase in computing power and communication network capabilities, vision-based guidance and control systems have emerged as a key technological component in the autonomy of future defense systems due to the information-rich nature of vision-based feedback. A number of guidance, control, and estimation results can be found for unmanned vehicle systems using active/passive vision (Langelaan [2007], Seetharaman et al. [2006], Saripalli et al. [2003]). Vision-based guidance of missile systems is an interesting problem from the standpoint of unmanned (autonomous/remotely operated) defense systems, but it has received rather less attention due to several challenges inherent in the corresponding guidance law design. For example, the target may perform evasive maneuvers, viz. a defensive motion of the target with an unknown acceleration, resulting in an unknown time-varying relative velocity between the missile/camera and the target. In addition, wind gusts and air density variations may perturb the speed and path of the missile and/or the target (Siouris [2004]), which might cause the target to leave the field-of-view (FOV) of the missile's on-board camera. Typically, vision-based systems (e.g., Bay et al. [2008], Shi and Tomasi [1994]) employ a feature point tracking method to detect and keep track of the target in the image plane during successive frames. Variations in illumination, camera motion, temporary occlusion, and errors in feature point tracking are serious concerns that can cause

feature point drift or loss. Other significant challenges that must be addressed in robust control design for vision-based systems include uncertain time-delays in image acquisition and processing, uncertainties associated with the intrinsic camera calibration parameters, and unknown time-varying feature depth (Sznaier and Camps [1998]). The focus of the presented work is to develop an adaptive visual servo controller to compensate for the unknown time-varying missile-target relative velocity and the unknown feature depth measurements, while achieving robustness with respect to the unknown external disturbances.

In Malyavej et al. [2006], the authors frame an interesting problem of CNG law implementation using robust extended Kalman filter (REKF)-based sensor fusion of networked sensors in the presence of communication constraints. In proportional navigation guidance (PNG), measurement of the line-of-sight (LOS) rate is required to provide acceleration control inputs to a missile autopilot. Various Kalman filter-based LOS rate estimation solutions using an imaging seeker can be found in Lin et al. [2005], Ananthasayanam et al. [2005], and the references therein. In Zaeim et al. [2010], a stabilizing control law is proposed for a gimbaled imaging seeker, and an integrated seeker-missile system is presented to intercept a maneuvering target. While the aforementioned guidance law techniques have been successful in their respective guidance tasks, most of the research in vision-based guidance to intercept a moving target has been 'reactive' in nature, i.e., the control approach does not attempt to compensate for unknown

⋆ This research is supported in part by the US Air Force, Eglin AFB, grant number 00087801-201-19120100.

Copyright by the International Federation of Automatic Control (IFAC)


target maneuvers by including estimates in the control structure. In another course of research, nonlinear control theory has been widely studied to obtain precision missile guidance to intercept a moving target. A Lyapunov-based guidance law is developed in Yanushevsky and Boord [2005] for a nonlinear missile dynamic model in the presence of vanishing disturbances due to a maneuvering target. In Pei-bei et al. [2009], a Lyapunov-based nonlinear guidance law is presented to intercept a target at a desired impact angle under the assumption that the target is stationary and the missile is moving with constant velocity. A smooth second-order sliding mode (SSOSM)-based guidance law is proposed in Shtessel et al. [2007] for a missile-interceptor guidance problem, which compensates for sufficiently smooth uncertain disturbances. By using a nonlinear disturbance observer, the technique in Shtessel et al. [2007] achieves finite-time convergence in the presence of disturbances when measurement noise is absent.

In this paper, a nonlinear vision-based adaptive pursuit guidance law is presented for a bank-to-turn (BTT) missile system with the objective of intercepting a target moving with an unknown time-varying velocity and in the presence of unknown target depth measurements. The linear velocity of the missile is produced by means of a propulsion system, where the thrust force is uncontrolled and directed along the missile longitudinal axis (Yanushevsky [2008]); it is assumed that the missile linear velocity is unknown and unmeasurable. For BTT missile systems, the heading change is obtained by using aerodynamic control surface deflections for pitch, yaw, and roll angles. Therefore, the control objective is to design angular rate control inputs in roll, pitch, and yaw to intercept a target. A monocular vision system, i.e., an optical imaging seeker, provides image feedback of the target, and the unknown relative velocity between the missile and the target is encompassed into the missile-target kinematics by formulating an image Jacobian, or interaction matrix. The resulting image-based visual servo (IBVS) control problem has several advantages, including reduced computation time, robustness to inaccuracies in the camera calibration (Zhang and Ostrowski [1998]), and the ability to maintain the target in the FOV via feature-based motion strategies (Corke and Hutchinson [2001]). To address the issue of robustness of the vision system with respect to extraneous agents (e.g., wind gusts), a continuous, robust feedback element is included in the guidance law, which compensates for an unknown nonlinear disturbance. To this end, a robust and adaptive nonlinear missile guidance law is presented that guarantees uniformly ultimately bounded (UUB) stability of the system states, where the size of the residual set can be made arbitrarily small. A Lyapunov-based stability analysis is provided to prove the theoretical result, and high-fidelity numerical simulation results are presented to demonstrate the performance of the proposed guidance law.

2. IMAGE KINEMATICS

Consider an orthogonal coordinate frame Fm(x, y, z) whose origin is attached to the center of gravity of the missile airframe, a body-carried coordinate frame

Fr(xr, yr, zr) with the origin at the missile center of gravity and orientation fixed to a north-east-down (NED) navigation frame, and an Earth-fixed reference system (ESF) Fe(xe, ye, ze) on the surface of the Earth. Compared to the range of a vision-guided missile, it is assumed that the Earth's curvature is negligible; therefore, the orientation of coordinate frame Fr is considered to coincide with that of frame Fe. The linear and angular velocities of the missile, expressed in Fm with respect to Fe, are represented as

$$v_m(t) = [\,v_x(t)\ \ v_y(t)\ \ v_z(t)\,]^T, \qquad \omega_m(t) = [\,\omega_x(t)\ \ \omega_y(t)\ \ \omega_z(t)\,]^T, \qquad (1)$$

respectively, where vx(t), vy(t), vz(t) ∈ R are the linear velocities along the x-, y-, and z-axes, and ωx(t), ωy(t), ωz(t) ∈ R denote the angular rates about the x-, y-, and z-axes, respectively. A monocular camera is rigidly attached to the center of gravity of the missile airframe¹. A time-varying orthogonal coordinate frame Fc(xc, yc, zc) is attached to the camera. Due to the relative orientation between coordinate frames Fm and Fc, the linear and angular velocities of the camera in frame Fc can be represented, respectively, as

$$v_c(t) = [\,v_y(t)\ \ -v_z(t)\ \ v_x(t)\,]^T, \qquad \omega_c(t) = [\,\omega_y(t)\ \ -\omega_z(t)\ \ \omega_x(t)\,]^T. \qquad (2)$$

A time-varying orthogonal coordinate frame Ft(xt, yt, zt) is rigidly attached to a target which is assumed to perform unknown evasive maneuvers. Without loss of generality, it is assumed that the origin of coordinate frame Ft, denoted by T, is visible in the camera image plane π; the point T will be referred to as the target throughout the remainder of the paper. It is also assumed that the camera maintains the target in the FOV at all times. The unknown time-varying velocity of the target T expressed in the missile coordinate frame Fm is denoted by $v_t(t) = [\,v_{tx}(t)\ \ v_{ty}(t)\ \ v_{tz}(t)\,]^T$, where vtx(t), vty(t), vtz(t) ∈ R are the linear velocities² along the x-, y-, and z-axes, respectively. The time-varying Euclidean coordinates of the target T expressed in the camera coordinate frame Fc(t) are denoted by

$$\bar m(t) \triangleq [\,x(t)\ \ y(t)\ \ z(t)\,]^T. \qquad (3)$$

Remark 1. In (3), the target is assumed to be in front of the camera, i.e., z(t) > ε, where ε ∈ R is a positive scalar. However, the impact happens for some set z ∈ [zmin, zmax], where zmax ≥ zmin > 0 (Garnell and East [1977], Zarchan [1998]). This is due to the fact that the exact intercept value (i.e., the "zero intercept") depends on the size of the ballistic target and the relative position of the camera at impact.

The rate of change of the Euclidean position m̄(t) expressed in Fc due to the relative motion between the camera and the target can be related to the 6-DOF velocity of frame Fc and the linear velocity of the target T as

$$\dot{\bar m} = -v_c + v_t' - \omega_c \times \bar m \qquad (4)$$

¹ This assumption is made without loss of generality, since any deviations can be accounted for by an appropriate coordinate frame transformation.
² The target coordinate frame Ft may undergo 6-DOF motion with respect to frame Fe; however, the motion of the point target T would be seen as purely translational motion along the x-, y-, and/or z-axes of frame Fm.


where $v_t'(t) = [\,v_{ty}(t)\ \ -v_{tz}(t)\ \ v_{tx}(t)\,]^T$ represents the target velocity as expressed in the camera coordinate frame Fc, and vc(t) and ωc(t) are defined in (2). The image coordinates of the target, denoted by $p(t) \triangleq [\,p_x(t)\ \ p_y(t)\,]^T$, and the corresponding Euclidean coordinates m̄(t) are related by perspective projection geometry as

$$p_x(t) = \frac{f a\, x(t)}{z(t)} + p_{x0}, \qquad p_y(t) = \frac{f b\, y(t)}{z(t)} + p_{y0}. \qquad (5)$$

In (5), f, a, b ∈ R represent the constant focal length and camera scaling factors along the x- and y-axes, respectively; px0, py0 ∈ R denote the constant coordinates of the principal point (i.e., the intersection of the optical axis with the image plane) of the camera. Taking the time derivative of (5) and using (1), (3), and (4) in the resulting expression, the velocity of the camera and the target can be related to the velocity ṗ(t) ∈ R² of the feature point in the image plane π as

$$\dot p(t) = \begin{bmatrix} \dot p_x \\ \dot p_y \end{bmatrix} = J_v'\left(\frac{v_m(t) - v_t'(t)}{z}\right) + J_\omega\, \omega_m(t) + d(t) \qquad (6)$$

where Jv'(px, py) ∈ R^{2×3} denotes the following measurable image Jacobian that relates the linear velocities of the camera and target to the target image velocity:

$$J_v' = \begin{bmatrix} p_x & -fa & 0 \\ p_y & 0 & fb \end{bmatrix}. \qquad (7)$$

In (6), Jω(px, py) ∈ R^{2×3} is the measurable image Jacobian that relates the camera angular velocities to the target image velocity as

$$J_\omega = \begin{bmatrix} \dfrac{a\,p_y}{b} & \dfrac{p_x p_y}{fb} & a\!\left(f + \dfrac{p_x^2}{f a^2}\right) \\[2mm] -\dfrac{b\,p_x}{a} & b\!\left(f + \dfrac{p_y^2}{f b^2}\right) & \dfrac{p_x p_y}{fa} \end{bmatrix}. \qquad (8)$$
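To make the use of (6)-(8) concrete, the following minimal Python sketch evaluates the two image Jacobians and the resulting feature-point velocity. The function names, and the idea of passing the scaled relative velocity (vm − vt')/z as a single vector, are illustrative choices for this sketch and are not taken from the paper.

```python
import numpy as np

# Minimal sketch of the image kinematics in (6)-(8); variable names are
# illustrative, not from the paper's simulation code.
def image_jacobians(px, py, f, a, b):
    """Image Jacobians J_v' in (7) and J_omega in (8) evaluated at pixel (px, py)."""
    Jv = np.array([[px, -f * a, 0.0],
                   [py, 0.0,    f * b]])
    Jw = np.array([[a * py / b,  px * py / (f * b), a * (f + px**2 / (f * a**2))],
                   [-b * px / a, b * (f + py**2 / (f * b**2)), px * py / (f * a)]])
    return Jv, Jw

def feature_velocity(px, py, f, a, b, v_rel_scaled, omega_m, d=np.zeros(2)):
    """Feature-point velocity p_dot from (6), with v_rel_scaled = (v_m - v_t')/z."""
    Jv, Jw = image_jacobians(px, py, f, a, b)
    return Jv @ v_rel_scaled + Jw @ omega_m + d
```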

Assumption 1: The unknown nonlinear disturbance d(t) ∈ R² in (6) satisfies

$$\|d(t)\| \le \gamma_0 \qquad (9)$$

where γ0 ∈ R is a known bounding constant and ‖·‖ denotes the L2 vector norm.

3. CONTROL OBJECTIVE

The control objective is to enable the missile airframe Fm to intercept the maneuvering target T, or in a physical sense, to drive the relative distance between the missile and the target to zero. This can be achieved by regulating the time-varying target pixel coordinates p(t) to the desired image coordinates, i.e., the control problem can be set in a pursuit guidance framework. Hence, the control objective can be stated mathematically as

$$p(t) \to p_d \qquad (10)$$

where $p_d = [\,p_{x0}\ \ p_{y0}\,]^T$. In (10), without loss of generality, the optical axis of the camera is assumed to be along the longitudinal axis of the missile.

4. CONTROL DEVELOPMENT

In this section, a robust and adaptive IBVS guidance law will be developed to achieve precision target interception for the nonlinear image kinematics given in (6). To this end, a polynomial approximation of the unknown scaled

relative velocity between the missile and the target will be utilized, and robust and adaptive elements will be injected into the guidance law to compensate for the resulting uncertainty. A function f(t) can be expressed in terms of a Taylor (or power) series, provided the function is continuous and suitably differentiable, and a polynomial approximation of f(t) can be generated from truncation of its power series expansion as provided by the following lemma (Carnahan et al. [1969]):

Lemma 1. If a continuous function f(t) possesses a continuous (n+1)th derivative everywhere on the interval [t0, t], then it can be represented by a finite power series as

$$f(t) = f(t_0) + \frac{(t-t_0)}{1!} f^{(1)}(t_0) + \cdots + \frac{(t-t_0)^{i-1}}{(i-1)!} f^{(i-1)}(t_0) + \cdots + \frac{(t-t_0)^{n}}{n!} f^{(n)}(t_0) + R(t) \qquad (11)$$

where $f^{(i)}(\cdot)$ represents the ith time-derivative of the function f(t) evaluated at the argument, and R(t) denotes the remainder of the Taylor formula given by

$$R(t) = \frac{(t-t_0)^{n+1}}{(n+1)!} f^{(n+1)}(\zeta), \qquad t_0 < \zeta < t. \qquad (12)$$

Based on the control objective defined in Section 3, a regulation error $e(t) = [\,e_1\ \ e_2\,]^T \in \mathbb{R}^2$ is defined as the difference between the target image coordinates and the coordinates of the principal point as

$$e(t) \triangleq p(t) - p_d. \qquad (13)$$

After taking the time-derivative of e(t) and substituting the image kinematic relationship given by (6), the following expression can be obtained:

$$\dot e = \dot p = J_v' v_r' + J_\omega \omega_m + d \qquad (14)$$

where the unknown scaled relative velocity vr'(t) ∈ R³ is assumed to be a continuously differentiable function of class C^p whose first (p+1) derivatives exist³, such that on any interval [t0, t] for t ∈ [t0, t0 + T) it can be represented using (11), where T ∈ R is a constant interval length. Using Lemma 1, each element of vr'(t) within a finite interval T can be represented locally at t0 as a polynomial in time with constant unknown coefficients such that (Pagilla and Zhu [2004])

$$v_r'(t, t_0) = \begin{bmatrix} L(t,t_0) & 0 & 0 \\ 0 & L(t,t_0) & 0 \\ 0 & 0 & L(t,t_0) \end{bmatrix}\theta(t_0) + \delta_{v_r'}(t, t_0) = \Delta(t,t_0)\,\theta(t_0) + \delta_{v_r'}(t, t_0), \qquad t \in [t_0, t_0 + T) \qquad (15)$$

where $L(t,t_0) = [\,1\ \ (t-t_0)\ \ \ldots\ \ (t-t_0)^{p}\,] \in \mathbb{R}^{p+1}$ is a row vector of known functions of time. Also in (15), $\theta(t_0) \triangleq [\,\theta_x^T(t_0)\ \ \theta_y^T(t_0)\ \ \theta_z^T(t_0)\,]^T \in \mathbb{R}^{3(p+1)}$ denotes a vector of unknown coefficients, where θx(t0), θy(t0), θz(t0) ∈ R^{p+1} are obtained by evaluating the components of vr'(t) ∈ R³ and their first p derivatives locally at t0, and δvr'(t, t0) ∈ R³ is a bounded function approximation error (i.e., the remainder from the Taylor series approximation). Note that θ(t0) is only constant within each interval [ti, ti+1) and can differ from one interval to another for a time-varying parameter.

³ In the physical sense, this assumption requires that the unknown depth z(t) is non-zero (see Remark 1) and that vm(t), vt'(t), and z(t) possess (p + 1) bounded derivatives.
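The block structure of the regressor in (15) can be illustrated with the following short sketch. The polynomial order p, the time instants, and the coefficient values are placeholders for illustration; the code is not taken from the paper.

```python
import numpy as np

# Sketch of the block-diagonal regressor Delta(t, t0) from (15); p is the
# polynomial order and theta is a 3(p+1) coefficient vector (illustrative values).
def L_row(t, t0, p):
    """Row vector L(t, t0) = [1, (t - t0), ..., (t - t0)^p]."""
    return np.array([(t - t0) ** k for k in range(p + 1)])

def Delta(t, t0, p):
    """Block-diagonal regressor Delta(t, t0) in R^{3 x 3(p+1)}."""
    L = L_row(t, t0, p)
    n = p + 1
    D = np.zeros((3, 3 * n))
    for i in range(3):
        D[i, i * n:(i + 1) * n] = L
    return D

# Example: polynomial approximation of the scaled relative velocity on [t0, t0 + T)
p, t0 = 2, 0.0
theta = np.zeros(3 * (p + 1))          # unknown in practice; estimated by (19)
vr_hat = Delta(0.05, t0, p) @ theta    # \hat v_r'(t, t0) as in (18)
```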


For any non-negative constant β ∈ R, the function approximation error δvr'(t, t0) satisfies

$$\|\delta_{v_r'}(t, t_0)\| \le \beta \quad \forall t \ge 0. \qquad (16)$$

It can be shown that δvr'(t, t0) can be reduced by selecting a higher order polynomial (i.e., increasing p) and/or by reducing the interval length T. After substituting (15) in (14), the open-loop error system can be obtained as

$$\dot e = J_v'\left(\Delta(t,t_0)\theta(t_0) + \delta_{v_r'}(t, t_0)\right) + J_\omega \omega_m(t) + d. \qquad (17)$$

The unknown relative velocity vr'(t, t0) measured in Fm can be estimated as

$$\hat v_r'(t, t_0) = \Delta(t, t_0)\,\hat\theta(t) \qquad (18)$$

where the time-varying estimate θ̂(t) ∈ R^{3(p+1)} of the unknown constant coefficient vector θ(t0) is designed using the following direct adaptive parameter update law:

$$\dot{\hat\theta} = \mathrm{proj}\left(\Gamma Y^T e\right). \qquad (19)$$

In (19), the function proj(·) denotes a normal projection algorithm, which ensures that the elements θ̂i(t), ∀ i = 1, ..., 3(p+1), of θ̂(t) are bounded as (for further details see Dixon [2007], Zergeroglu et al. [2000])

$$\underline\theta_i \le \hat\theta_i(t) \le \bar\theta_i \qquad (20)$$

where θ_i, θ̄_i ∈ R denote known, constant lower and upper bounds of θ̂i(t), respectively. Also in (19), Y(px, py, t, t0) = Jv'Δ is a known regression matrix and Γ ∈ R^{3(p+1)×3(p+1)} is a diagonal, positive definite gain matrix.
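A minimal discrete-time sketch of the update law (19) is given below, with a simple elementwise clamp standing in for the projection operator that enforces (20); the gains, bounds, and step size are illustrative assumptions, and in practice the projection algorithm cited above (Dixon [2007], Zergeroglu et al. [2000]) would be used.

```python
import numpy as np

# Euler-discretized sketch of the adaptive update law (19), with an
# elementwise clamp standing in for proj(.) so that (20) holds.
# Gains, bounds, and the step size dt are illustrative choices.
def update_theta_hat(theta_hat, Gamma, Y, e, theta_lo, theta_hi, dt):
    theta_dot = Gamma @ Y.T @ e                      # unconstrained update direction
    theta_new = theta_hat + dt * theta_dot
    return np.clip(theta_new, theta_lo, theta_hi)    # keep each element in [lo, hi]

# Usage with the regressor Y = J_v' * Delta(t, t0) from (15) and error e = p - p_d:
# theta_hat = update_theta_hat(theta_hat, Gamma, Jv @ Delta(t, t0, p), e,
#                              theta_lo, theta_hi, dt)
```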

The adaptive update law given in (19) is designed for each time interval of window size T, i.e., the coefficients θ(t0) are constant during a given interval but may vary from one interval to the other. To facilitate the following guidance law design and stability analysis, the scaled relative velocity estimation error ṽr'(t, t0) ∈ R³ is defined as

$$\tilde v_r'(t, t_0) \triangleq v_r'(t) - \hat v_r'(t, t_0). \qquad (21)$$

After substituting (15) and (18) into (21), the scaled relative velocity estimation error can be expressed as

$$\tilde v_r'(t, t_0) = \Delta(t, t_0)\,\tilde\theta(t) + \delta_{v_r'}(t, t_0) \qquad (22)$$

where the estimation error θ̃(t) ∈ R^{3(p+1)} is defined as

$$\tilde\theta(t) \triangleq \theta(t_0) - \hat\theta(t). \qquad (23)$$

Remark 2. It can be shown that continuity of the estimate θ̂(t, t0) can be guaranteed during transitions between the ith and (i+1)th intervals by resetting the value of the estimate at the beginning of the (i+1)th interval for i = 1, 2, ..., as shown in Pagilla and Zhu [2004].

Based on the open-loop error system in (17) and the subsequent stability analysis, the missile angular velocity control input ωm(t) is designed as

$$\omega_m(t) = -J_\omega^{+}\left(k e + Y\hat\theta + u_d\right). \qquad (24)$$

In (24), Jω⁺ ∈ R^{3×2} denotes the pseudo-inverse of the image Jacobian Jω(px, py), k ∈ R is a positive control gain, and ud(t) ∈ R² is a robust feedback element designed to compensate for the function approximation error δvr'(t, t0) and the bounded disturbance d(t) as (Corless and Leitmann [1981])

$$u_d(t) = \begin{cases} \left(\beta\|J_v'\| + \gamma_0\right)\dfrac{e}{\|e\|}, & \|e\| \ge \epsilon_0 \\[2mm] \dfrac{1}{\epsilon_0}\left(\beta\|J_v'\| + \gamma_0\right)e, & \|e\| < \epsilon_0 \end{cases} \qquad (25)$$

where ε0 ∈ R is a positive scalar, and γ0, β ∈ R are defined in (9) and (16), respectively. The advantages of the robust control term given in (25) are that it is a continuous feedback control, and that it can be tuned for precise target interception by selecting ε0 arbitrarily small. After substituting the control input in (24) into (17), the closed-loop error system can be obtained as

$$\dot e = -k e - u_d + Y\tilde\theta + J_v'\,\delta_{v_r'}(t, t_0) + d. \qquad (26)$$
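Putting (24) and (25) together, a sketch of the resulting guidance command is shown below. The gain values, the use of the induced 2-norm for ‖Jv'‖, and the function names are assumptions made for illustration only.

```python
import numpy as np

# Sketch of the guidance command (24) with the robust term (25); k, beta,
# gamma0, eps0 are illustrative gains/bounds, and Jv, Jw, Y come from (7), (8), (19).
def robust_term(e, Jv, beta, gamma0, eps0):
    gain = beta * np.linalg.norm(Jv, 2) + gamma0     # beta*||J_v'|| + gamma_0
    ne = np.linalg.norm(e)
    return gain * e / ne if ne >= eps0 else gain * e / eps0

def guidance_command(e, Jv, Jw, Y, theta_hat, k, beta, gamma0, eps0):
    u_d = robust_term(e, Jv, beta, gamma0, eps0)
    return -np.linalg.pinv(Jw) @ (k * e + Y @ theta_hat + u_d)   # omega_m in (24)
```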

5. STABILITY ANALYSIS

Theorem 1. The adaptive IBVS guidance law of (19), (24), and (25) ensures uniformly ultimately bounded target image tracking in the sense that

$$\|e(t)\| \le \zeta_0 \exp\{-\zeta_1 t\} + \zeta_2 \qquad (27)$$

where ζ0, ζ1, ζ2 ∈ R denote positive bounding constants.

Proof 1. Let V(t) be defined as the following non-negative function during each interval t ∈ [ti, ti+1):

$$V = \frac{1}{2} e^T e + \frac{1}{2}\tilde\theta^T \Gamma^{-1}\tilde\theta. \qquad (28)$$

Based on (20) and (23), the Lyapunov function in (28) can be upper and lower bounded as

$$\lambda_1\|e\|^2 + c_1 \le V(t) \le \lambda_2\|e\|^2 + c_2 \qquad (29)$$

where λ1, λ2, c1, c2 ∈ R are known positive bounding constants. After using (19) and (26), the time-derivative of V(t) can be expressed as

$$\dot V = -e^T k e - e^T u_d + e^T\left(J_v'\,\delta_{v_r'}(t, t_0) + d\right). \qquad (30)$$

Thus, from (25), if ‖e‖ ≥ ε0, V̇(t) can be upper bounded as

$$\dot V \le -k\|e\|^2 - \left(\beta\|J_v'\| + \gamma_0\right)\|e\| + \left(\|\delta_{v_r'}(t, t_0)\|\,\|J_v'\| + \|d\|\right)\|e\|. \qquad (31)$$

After canceling common terms and using the bounds on d(t) and δvr'(t, t0) defined in (9) and (16), respectively, the upper bound in (31) can be rewritten as

$$\dot V \le -k\|e\|^2. \qquad (32)$$

If ‖e‖ ≤ ε0, the expression in (30) can be upper bounded as

$$\dot V \le -k\|e\|^2 - \frac{1}{\epsilon_0}\left(\beta\|J_v'\| + \gamma_0\right)\|e\|^2 + \left(\|\delta_{v_r'}(t, t_0)\|\,\|J_v'\| + \|d\|\right)\|e\|. \qquad (33)$$

Using the upper bounds in (9) and (16), and after completing the squares in (33), the upper bound on V̇(t) can be expressed as

$$\dot V \le -k\|e\|^2 + \frac{1}{4}\left(\beta\|J_v'\| + \gamma_0\right)\epsilon_0. \qquad (34)$$

Consequently, for all (e, t) ∈ R² × R, (29) can be used to express the inequality in (34) as

$$\dot V \le -\frac{k}{\lambda_2} V(t) + \zeta_1 \qquad (35)$$

where $\zeta_1 = \frac{k c_2}{\lambda_2} + \frac{1}{4}\left(\beta\|J_v'\| + \gamma_0\right)\epsilon_0$ is a constant. The linear differential inequality in (35) can be solved as

$$V(t) \le \exp\left(-\frac{k}{\lambda_2}t\right)V(0) + \zeta_1\frac{\lambda_2}{k}\left(1 - \exp\left(-\frac{k}{\lambda_2}t\right)\right). \qquad (36)$$

The expressions in (28), (29), and (35) can be used to conclude that e(t) ∈ L∞. Given that e(t) ∈ L∞, (20), (24), and (25) can be used along with Assumption 1 to prove that ud(t), ωm(t) ∈ L∞. Since δvr'(t, t0), d(t) ∈ L∞, (20), (23), and (26) can be used to conclude that ė(t) ∈ L∞. The inequalities in (29) and (36) can be used to conclude that

$$\|e\|^2 \le \frac{\lambda_2\|e(0)\|^2 + c_2}{\lambda_1}\exp\left(-\frac{k}{\lambda_2}t\right) + \frac{c_2 - c_1}{\lambda_1} + \frac{\zeta_1\lambda_2}{k\,\lambda_1} < \gamma_1(e(0)) \qquad (37)$$

where γ1 ∈ R is a positive constant. The result in (27) can now be directly obtained from (37). Thus, the robust term designed in (25) guarantees uniform ultimate boundedness of the target image tracking error, where the error bound can be reduced by reducing the size of ε0 (Corless and Leitmann [1981]). Hence, the target image coordinates are regulated within a small region centered at the principal point coordinates $p_d = [\,p_{x0}\ \ p_{y0}\,]^T$.
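For completeness, one way to read off the constants of (27) from (37) is sketched below, using √(a+b) ≤ √a + √b for a, b ≥ 0; this is an illustrative choice of constants, not necessarily the one intended by the authors (note that ζ1 below denotes the constant of (27), while ζ1 in (35) is a different quantity):

$$\|e(t)\| \le \underbrace{\sqrt{\frac{\lambda_2\|e(0)\|^2 + c_2}{\lambda_1}}}_{\zeta_0}\exp\left\{-\underbrace{\frac{k}{2\lambda_2}}_{\zeta_1}t\right\} + \underbrace{\sqrt{\frac{c_2 - c_1}{\lambda_1} + \frac{\lambda_2}{k\,\lambda_1}\left(\frac{k c_2}{\lambda_2} + \frac{1}{4}\left(\beta\|J_v'\| + \gamma_0\right)\epsilon_0\right)}}_{\zeta_2},$$

where the exponential rate in (27) is half of k/λ2 because of the square root, and ζ2 shrinks as ε0 is made smaller, consistent with the discussion above.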


6. SIMULATION RESULTS

A numerical simulation was performed to demonstrate the performance of the proposed adaptive visual kinematic guidance law. The initial position tT ∈ R³ and orientation Rt ∈ R^{3×3} of the target coordinate frame Ft with respect to the origin of the Earth-fixed reference frame Fe are considered to be

$$t_T = [\,1200\ \ 2400\ \ -3500\,]^T\ \mathrm{m}, \qquad R_t = I_{3\times 3} \qquad (38)$$

where I_{3×3} represents an identity matrix. The time-varying linear and angular velocities of Ft with respect to Fe are assumed to be given by

$$v_{te} = [\,20\ \ 50\ \ 30\,]^T\ \mathrm{m/s} \qquad (39)$$

$$\omega_{te} = [\,0.1\ \ -0.8\cos(t)\ \ 0.5\,]^T\ \mathrm{rad/s} \qquad (40)$$

and the unknown linear velocity of the missile produced by the propulsion system is assumed to be $v_{me} = [\,80\ \ 0\ \ 0\,]^T$ m/s. The initial position tM and orientation RM of the missile body frame Fm with respect to Fe were selected as

$$t_M = [\,500\ \ 1000\ \ -4500\,]^T\ \mathrm{m} \qquad (41)$$

$$R_M = \begin{bmatrix} 0.7660 & -0.4924 & 0.4132 \\ 0.6428 & 0.5868 & -0.4924 \\ 0 & 0.6428 & 0.7660 \end{bmatrix}. \qquad (42)$$

The simulation results are based on the assumption that the image measurements corresponding to the target pixel coordinates p(t) are affected by additive white Gaussian noise of standard deviation 0.1 pixels. A nonlinear bounded disturbance dω(t), e.g., due to a wind gust, is assumed to perturb the angular rates of the missile during the time interval 0.4 s ≤ t < 1 s, which causes the disturbance d(t) in terms of feature velocity in the image plane. The disturbance function is assumed to be of the following form:

$$d_\omega(t) = \begin{bmatrix} 3\sin(10t) + 1.5 \\ 5\cos(20t) + 1.2 \\ 2\sin(40t) + 0.8 \end{bmatrix}. \qquad (43)$$

The mathematical models for the target maneuvers, missile linear velocity, and disturbances given in (38)-(43) are used to generate the plant model only; they are not used in the guidance law. The robust and adaptive feedback elements in the guidance law compensate for these unmodeled effects.
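For reference, a short Python sketch of the plant-side models (39), (40), and (43) and of the pixel-noise assumption is given below. It is illustrative only and is not the authors' simulation code; the function names and the integration details are assumptions.

```python
import numpy as np

# Illustrative sketch of the plant-side simulation models (39), (40), (43);
# these feed the plant only and are never used inside the guidance law.
def target_velocities(t):
    v_te = np.array([20.0, 50.0, 30.0])               # (39), m/s
    w_te = np.array([0.1, -0.8 * np.cos(t), 0.5])     # (40), rad/s
    return v_te, w_te

def gust_disturbance(t):
    """Angular-rate disturbance d_omega(t) in (43), active for 0.4 s <= t < 1 s."""
    if 0.4 <= t < 1.0:
        return np.array([3 * np.sin(10 * t) + 1.5,
                         5 * np.cos(20 * t) + 1.2,
                         2 * np.sin(40 * t) + 0.8])
    return np.zeros(3)

def noisy_pixels(p_true, sigma=0.1, rng=np.random.default_rng(0)):
    """Additive white Gaussian pixel noise with 0.1-pixel standard deviation."""
    return p_true + rng.normal(0.0, sigma, size=2)
```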

Fig. 1 shows the position of the target in the camera image plane. The effect of the disturbance is visible in the form of scatter around the equilibrium point, where it can be seen that the disturbance rejection property of the proposed controller resulted in improved target tracking compared to the uncompensated IBVS guidance law (i.e., without the robust and adaptive feedback elements). In addition, Fig. 1 shows the target image position p_final at the time of impact using a) the proposed guidance law, $p_{final} = [\,0.3055\ \ -0.3042\,]^T$, and b) without compensating for the target motion, $p_{final} = [\,-2.0202\ \ 2.2321\,]^T$. The results demonstrate that 'reactive' vision-based missile guidance laws can incur large miss distances or might completely miss a highly maneuvering target. A plot of the commanded angular velocity control input is shown in Fig. 2. The Euclidean trajectories of the missile and the target are shown in Fig. 3. At the time of impact, the Euclidean position of the target with respect to the camera coordinate frame was observed to be $\bar m = [\,-0.0008\ \ 0.0010\ \ 0.4940\,]^T$ m, where it was assumed that the impact occurs when z(t) ≤ 0.5 m.

Fig. 1. Image coordinates p(t) of the maneuvering target as seen by the on-board camera, where the image position of the target T at the interception is indicated by 1) '•' for the proposed controller and 2) '△' for the uncompensated controller.

Fig. 2. Plot showing the commanded angular velocity control input ωm (t) to the missile airframe for the proposed controller.



Fig. 3. Euclidean trajectory of the missile airframe Fm along with the time-varying target trajectory Ft.

7. CONCLUSION

A missile guidance problem is formulated in a vision-based adaptive control framework in the presence of unknown missile-target relative velocity and unknown depth measurements. The missile-target kinematics are expressed in terms of an image Jacobian, and the scaled relative velocity is estimated using a direct adaptive parameter update law. A continuous robust feedback element is included in the guidance law structure to compensate for the parameter estimation errors and the bounded external disturbances. A Lyapunov-based stability analysis proves that the proposed control law guarantees uniform ultimate boundedness (UUB) of the error system around the equilibrium point. Numerical simulation results demonstrate that using the relative velocity estimates and robust elements in the control structure can significantly improve the error tracking performance.

REFERENCES

M. R. Ananthasayanam, A. K. Sarkar, A. Bhattacharya, P. Tiwari, and P. Vorha. Nonlinear observer state estimation from seeker measurements and seeker-radar measurements fusion. In Proceedings of the AIAA Guidance, Navigation, and Control Conference, pages AIAA 2005-6066, Aug. 2005.
Herbert Bay, Andreas Ess, Tinne Tuytelaars, and Luc Van Gool. Speeded-up robust features (SURF). Computer Vision and Image Understanding, 110(3):346-359, 2008.
B. Carnahan, H. A. Luther, and J. O. Wilkes. Applied Numerical Methods. John Wiley & Sons Inc., 1969.
P. I. Corke and S. A. Hutchinson. A new partitioned approach to image-based visual servo control. IEEE Transactions on Robotics and Automation, 17(4):507-515, Aug. 2001.
M. Corless and G. Leitmann. Continuous state feedback guaranteeing uniform ultimate boundedness for uncertain dynamic systems. IEEE Transactions on Automatic Control, 26(5):1139-1144, Oct. 1981.
W. E. Dixon. Adaptive regulation of amplitude limited robot manipulators with uncertain kinematics and dynamics. IEEE Transactions on Automatic Control, 52(3):488-493, Mar. 2007.
P. Garnell and D. J. East. Guided Weapon Control Systems. Oxford: Pergamon Press, 1977.

J. Langelaan. State estimation for autonomous flight in cluttered environments. Journal of Guidance, Control, and Dynamics, 30(5):1414-1426, 2007.
Zhe Lin, Yu Yao, and Ke-Mao Ma. The design of LOS reconstruction filter for strap-down imaging seeker. In Proceedings of the 2005 International Conference on Machine Learning and Cybernetics, volume 4, pages 2272-2277, Aug. 2005.
Veerachai Malyavej, Ian R. Manchester, and Andrey V. Savkin. Precision missile guidance using radar/multiple-video sensor fusion via communication channels with bit-rate constraints. Automatica, 42(5):763-769, 2006.
Prabhakar R. Pagilla and Yongliang Zhu. Adaptive control of mechanical systems with time-varying parameters and disturbances. Journal of Dynamic Systems, Measurement, and Control, 126(3):520-530, 2004.
Ma Pei-bei, Zhang You-an, Ji Jun, and Zhang Xiao-jie. Three-dimensional guidance law with terminal impact angle constraint. In Proceedings of the International Conference on Mechatronics and Automation, pages 4162-4166, Aug. 2009.
S. Saripalli, J. F. Montgomery, and G. S. Sukhatme. Visually guided landing of an unmanned aerial vehicle. IEEE Transactions on Robotics and Automation, 19(3):371-380, Jun. 2003.
G. Seetharaman, A. Lakhotia, and E. P. Blasch. Unmanned vehicles come of age: The DARPA grand challenge. Computer, 39(12):26-29, Dec. 2006.
Jianbo Shi and C. Tomasi. Good features to track. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 593-600, Jun. 1994.
Yuri B. Shtessel, Ilya A. Shkolnikov, and Arie Levant. Smooth second-order sliding modes: Missile guidance application. Automatica, 43(8):1470-1476, 2007.
G. M. Siouris. Missile Guidance and Control Systems. Springer, 2004.
M. Sznaier and O. I. Camps. Control issues in active vision: open problems and some answers. In Proceedings of the 37th IEEE Conference on Decision and Control, volume 3, pages 3238-3244, 1998.
R. T. Yanushevsky and W. J. Boord. A new approach to guidance law design. Journal of Guidance, Control, and Dynamics, 28(1):162-166, 2005.
Rafael Yanushevsky. Modern Missile Guidance. CRC Press, 2008.
R. Zaeim, M. A. Nekoui, and A. Zaeim. Integration of imaging seeker control in a visually guided missile. In Proceedings of the 8th IEEE International Conference on Control and Automation, pages 46-51, Jun. 2010.
P. Zarchan. Tactical and Strategic Missile Guidance. Progress in Astronautics and Aeronautics, volume 176. New York: AIAA, 1998.
E. Zergeroglu, W. Dixon, A. Behal, and D. Dawson. Adaptive set-point control of robotic manipulators with amplitude-limited control inputs. Robotica, 18(2):171-181, 2000.
Hong Zhang and James P. Ostrowski. Visual servoing with dynamics: Control of an unmanned blimp. In Proceedings of the IEEE International Conference on Robotics and Automation, pages 618-623, 1998.
