Homography-Based Coordinate Relationships for Unmanned Air Vehicle Regulation

S. S. Mehta†‡, K. Kaiser†, N. Gans†, W. E. Dixon†

† Department of Mechanical and Aerospace Engineering, University of Florida, Gainesville, FL 32611-6250



‡ Department of Agricultural and Biological Engineering, University of Florida, Gainesville, FL 32611-0570

The Euclidean position and orientation (i.e., pose) of an unmanned air vehicle (UAV) is typically required for autonomous navigation and control. In this paper, a collaborative visual servo controller is developed with the objective of regulating an UAV to a desired pose utilizing feedback from a moving airborne monocular camera system. In contrast to typical camera configurations used for visual servo control problems, the control objective in this paper is developed using a moving on-board camera viewing a moving target. Multi-view photogrammetric methods are used to develop relationships between the different camera frames and UAV coordinate systems.

I. Introduction

The use of unmanned air vehicles (UAVs) has received growing attention in the last decade. UAVs must have a high level of autonomy and preferably work in groups. In this context, an intensive research effort has been conducted in recent years on the development of cooperative control algorithms. Scenarios of particular interest are the wide area search and destroy (WASD) (Ref. 23) and combat intelligence, surveillance, and reconnaissance (ISR) missions, which have similar characteristics except that the combat ISR scenario typically has a longer duration. In such scenarios, powered vehicles are released in the target area and are independently capable of searching, classifying, and attacking targets, along with subsequent battle damage verification. Exchange of information within the group can improve the group's capability to meet performance requirements related to fast and reliable execution of such tasks.

The Euclidean position and orientation (i.e., pose) of an unmanned air vehicle (UAV) is typically required for autonomous navigation and control. Often the pose of an UAV is determined by a global positioning system (GPS) or an inertial measurement unit (IMU). However, GPS may not be available in many environments, and IMUs can drift and accumulate errors over time in a manner similar to dead reckoning. Given recent advances in image extraction/interpretation technology, an interesting approach to overcome the pose measurement problem is to utilize a vision system. Specifically, rather than obtain an inertial measurement of the UAV, vision systems can be used to recast the navigation and control problem in terms of the image space, where the goal pose is compared to the relative pose via multiple images. Some examples of image-based visual servo control of mobile vehicles are given in Refs. 12, 15, and 17-21. Previous pure image-based visual servo control results have a known problem with potential singularities in the image-Jacobian, and, since the feedback is only in the image space, these methods may require impossible Euclidean motions.

Motivated by the desire to eliminate these issues, some efforts have combined reconstructed Euclidean information and image-space information in the control design. The Euclidean information can be reconstructed by decoupling the interaction between the translation and rotation components of a homography matrix. This homography-based method yields an invertible triangular image-Jacobian with realizable Euclidean motion. Homography-based visual servo control results for unmanned ground vehicles (UGVs) have been developed in Refs. 13, 14, and 22. In Ref. 22, a visual servo controller was developed to asymptotically regulate the pose of an UGV to a constant pose defined by a goal image, where the camera was mounted on-board the UGV (i.e., the camera-in-hand problem). The camera on-board result in Ref. 22 was extended in Ref. 13 to address the more general tracking problem. In Ref. 14, a stationary overhead camera (i.e., the camera-to-hand or fixed-camera configuration) was used to regulate an UGV to a desired pose.


In this paper, a moving airborne monocular camera (e.g., a camera attached to a mothership aircraft flying at high altitude or mounted on a satellite) is used to provide pose measurements of an UAV drone with limited sensor capabilities relative to a goal configuration. The relative velocity between the moving UAV and the moving camera presents a significant challenge. The contribution of this paper is the development of multi-view geometry concepts (i.e., photogrammetry) to relate coordinate frames attached to the moving camera, the moving UAV, and the desired UAV pose specified by an a priori image. Geometric constructs developed for traditional camera-in-hand problems are fused with fixed-camera geometry to develop a set of Euclidean homographies. One of the resulting Euclidean homographies is not measurable through a set of spatiotemporal images (i.e., a corresponding projective homography cannot be developed as in previous results). Hence, new geometric relationships are formulated to solve for this homography so that a measurable error system for the UAV can be developed. The resulting open-loop error system is expressed in a form that is amenable to a variety of UAV controllers. An outcome of this paper is a new geometric framework to relate the pose of a moving object to a stationary object via a moving camera. Applications that can build on this framework include cooperative combat ISR, electronic attack, urban warfare, wide area search/attack, and persistent area denial.

II. Geometric Model

Consider a single camera that is navigating above the airspace of an unmanned air vehicle (UAV) as depicted in Fig. 1. The moving coordinate frame I is attached to the airborne camera and the moving coordinate frame F is attached to the UAV. The UAV is represented in the camera image by four^a feature points that are coplanar and not collinear. The position (i.e., $s_{1i} \in \mathbb{R}^3\ \forall i = 1, 2, 3, 4$) of each feature point relative to the origin of F is assumed to be known. The plane defined by the xy-axis of F and the UAV feature points is denoted by π. While viewing the feature points of the UAV, the camera is assumed to also view four additional coplanar and noncollinear feature points of a stationary reference object. The four additional feature points define the plane π* in Fig. 1. The stationary coordinate frame F* is attached to this object, where the position (i.e., $s_{2i} \in \mathbb{R}^3\ \forall i = 1, 2, 3, 4$) of each feature point relative to the origin of the coordinate frame is assumed to be known. The feature points that define π* are also assumed to be visible when the camera is a priori located coincident with the position and orientation (i.e., pose) of the stationary coordinate frame I_R. When the camera is coincident with I_R, the desired pose of the UAV is assumed to be in the camera's field-of-view. When the UAV is located at the desired pose, the coordinate frame F is coincident with the coordinate frame F_d.

To relate the coordinate systems, let $R(t), R^*(t), R_r(t), R_{rd}, R_r^* \in SO(3)$ denote the rotations from F to I, F* to I, I to I_R, F_d to I_R, and F* to I_R, respectively; let $x_f(t), x_f^*(t) \in \mathbb{R}^3$ denote the respective time-varying translations from F to I and from F* to I with coordinates expressed in I; and let $x_{fr}(t), x_{fr}'(t), x_{frd}, x_{fr}^* \in \mathbb{R}^3$ denote the respective translations from I to I_R, F to I_R, F_d to I_R, and F* to I_R expressed in the coordinates of I_R. From the geometry between the coordinate frames depicted in Fig. 1, the following relationships can be developed:

$\bar{m}_i = x_f + R\, s_{1i}, \qquad \bar{m}_i^* = x_f^* + R^* s_{2i}$    (1)

$\bar{m}_{rdi} = x_{frd} + R_{rd}\, s_{1i}, \qquad \bar{m}_{ri}^* = x_{fr}^* + R_r^* s_{2i}$    (2)

$\bar{m}_i'(t) = x_{fr}'(t) + R_r^* R^{*T} R\, s_{1i}$    (3)

where $\bar{m}_i(t), \bar{m}_i^*(t) \in \mathbb{R}^3$ denote the Euclidean coordinates of the feature points of the UAV and of the feature points on the plane π*, respectively, expressed in I as

$\bar{m}_i(t) \triangleq \begin{bmatrix} x_i(t) & y_i(t) & z_i(t) \end{bmatrix}^T$    (4)

$\bar{m}_i^*(t) \triangleq \begin{bmatrix} x_i^*(t) & y_i^*(t) & z_i^*(t) \end{bmatrix}^T,$    (5)

$\bar{m}_i'(t), \bar{m}_{rdi} \in \mathbb{R}^3$ denote the actual time-varying and the constant desired Euclidean coordinates, respectively, of the feature points attached to the UAV expressed in I_R as

$\bar{m}_i'(t) \triangleq \begin{bmatrix} x_i'(t) & y_i'(t) & z_i'(t) \end{bmatrix}^T$    (6)

$\bar{m}_{rdi} \triangleq \begin{bmatrix} x_{rdi} & y_{rdi} & z_{rdi} \end{bmatrix}^T,$    (7)

and $\bar{m}_{ri}^* \in \mathbb{R}^3$ denotes the constant Euclidean coordinates of the feature points on the plane π* expressed in I_R as

$\bar{m}_{ri}^* \triangleq \begin{bmatrix} x_{ri}^* & y_{ri}^* & z_{ri}^* \end{bmatrix}^T.$    (8)

^a It should be noted that if four coplanar target points are not available, then the subsequent development can exploit the classic eight-point algorithm (Ref. 7), with no four of the eight target points being coplanar.

Figure 1. Camera Coordinate Frame Relationships.

After some algebraic manipulation, the expressions for $\bar{m}_i(t)$, $\bar{m}_{rdi}$, $\bar{m}_{ri}^*$, and $\bar{m}_i'(t)$ in (1)-(3) can be rewritten as

$\bar{m}_i = \bar{x}_f + \bar{R}\, \bar{m}_i^*, \qquad \bar{m}_{rdi} = \bar{x}_{frd} + \bar{R}_{rd}\, \bar{m}_{ri}^*, \qquad \bar{m}_{ri}^* = x_{fr} + R_r\, \bar{m}_i^*$    (9)

$\bar{m}_i'(t) = x_{fr} + R_r\, \bar{m}_i$    (10)

where $\bar{R}(t), \bar{R}_{rd}, R_r(t) \in SO(3)$ and $\bar{x}_f(t), \bar{x}_{frd}, x_{fr}(t) \in \mathbb{R}^3$ are new rotational and translational variables, respectively, defined as

$\bar{R} = R R^{*T}, \qquad \bar{R}_{rd} = R_{rd} R_r^{*T}, \qquad R_r = R_r^* R^{*T}$    (11)

$\bar{x}_f = x_f - \bar{R}\left[x_f^* + R^*(s_{2i} - s_{1i})\right]$    (12)

$\bar{x}_{frd} = x_{frd} - \bar{R}_{rd}\left[x_{fr}^* + R_r^*(s_{2i} - s_{1i})\right]$    (13)

$x_{fr} = x_{fr}^* - R_r x_f^* = x_{fr}^* - R_r^* R^{*T} x_f^*.$    (14)

By using the projective relationships (see Fig. 1)

$d(t) = n^{*T} \bar{m}_i, \qquad d^*(t) = n^{*T} \bar{m}_i^*, \qquad d_r^* = n^{*T} \bar{m}_{ri}^*$    (15)

the relationships in (9) and (10) can be expressed as

$\bar{m}_i = \left(\bar{R} + \frac{\bar{x}_f}{d^*}\, n^{*T}\right) \bar{m}_i^*$    (16)

$\bar{m}_{rdi} = \left(\bar{R}_{rd} + \frac{\bar{x}_{frd}}{d_r^*}\, n^{*T}\right) \bar{m}_{ri}^*$    (17)

$\bar{m}_{ri}^* = \left(R_r + \frac{x_{fr}}{d^*}\, n^{*T}\right) \bar{m}_i^*$    (18)

$\bar{m}_i' = \left(R_r + \frac{x_{fr}}{d}\, n^{*T}\right) \bar{m}_i.$    (19)

In (15)-(19), $d(t), d^*(t), d_r^* > \varepsilon$ for some positive constant $\varepsilon \in \mathbb{R}$, and $n^* \in \mathbb{R}^3$ denotes the constant unit normal to the planes π and π*.

Remark 1. As in Ref. 1, the subsequent development requires that the constant rotation matrix $R_r^*$ be known. The constant rotation matrix $R_r^*$ can be obtained a priori using various methods (e.g., a second camera, Euclidean measurements, or inertial measurements by the mothership).
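For illustration, the relationships in (1)-(3) and (9)-(14) can be checked numerically. The following Python sketch uses assumed example poses and feature offsets (none of the numeric values come from the paper) to verify that the new variables defined in (11)-(14) reproduce (9) and (10).

```python
# Minimal numerical check of (1)-(3) and (9)-(14); all poses, translations, and
# feature offsets below are assumed example values, not data from the paper.
import numpy as np

def rot_z(theta):
    """Rotation about the z-axis, used only to build example rotation matrices."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

R = rot_z(0.3)                      # rotation from F to I
x_f = np.array([1.0, 2.0, 10.0])    # translation from F to I
R_star = rot_z(-0.2)                # rotation from F* to I
x_f_star = np.array([0.5, -1.0, 12.0])
R_r_star = rot_z(0.1)               # rotation from F* to I_R (assumed known, Remark 1)
x_fr_star = np.array([2.0, 0.0, 15.0])

s1 = np.array([0.4, 0.2, 0.0])      # a UAV feature point expressed in F
s2 = np.array([0.3, -0.1, 0.0])     # a reference feature point expressed in F*

# Euclidean coordinates from (1) and (2)
m_bar = x_f + R @ s1                      # UAV point in I
m_bar_star = x_f_star + R_star @ s2       # reference point in I
m_bar_r_star = x_fr_star + R_r_star @ s2  # reference point in I_R

# New rotation and translation variables from (11)-(14)
R_bar = R @ R_star.T
R_r = R_r_star @ R_star.T
x_bar_f = x_f - R_bar @ (x_f_star + R_star @ (s2 - s1))
x_fr = x_fr_star - R_r @ x_f_star

# Check the first and third relationships in (9)
assert np.allclose(m_bar, x_bar_f + R_bar @ m_bar_star)
assert np.allclose(m_bar_r_star, x_fr + R_r @ m_bar_star)

# Check (10) against (3): the UAV point expressed in I_R
m_bar_prime = x_fr + R_r @ m_bar
x_fr_prime = x_fr + R_r @ x_f             # translation from F to I_R
assert np.allclose(m_bar_prime, x_fr_prime + R_r_star @ R_star.T @ R @ s1)
print("relationships (9)-(14) verified")
```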

III. Euclidean Reconstruction

The relationships given by (16) and (19) provide a means to quantify the translation and rotation error between the different coordinate systems. Since the pose of F, F_d, and F* cannot be directly measured, a Euclidean reconstruction is developed in this section to obtain the position and rotation error information by comparing multiple images acquired from the hovering monocular vision system. Specifically, comparisons are made between the current UAV image and the reference image in terms of I, and between the a priori known UAV image and the reference image in terms of I_R. To facilitate the subsequent development, the normalized Euclidean coordinates of the feature points for the current UAV image and the reference image, expressed in terms of I, are denoted by $m_i(t), m_i^*(t) \in \mathbb{R}^3$ and defined as

$m_i \triangleq \frac{\bar{m}_i}{z_i}, \qquad m_i^* \triangleq \frac{\bar{m}_i^*}{z_i^*}.$    (20)

Similarly, the normalized Euclidean coordinates of the feature points for the current, goal, and reference images, expressed in terms of I_R, are denoted by $m_i'(t), m_{rdi}, m_{ri}^* \in \mathbb{R}^3$ and defined as

$m_i'(t) \triangleq \frac{\bar{m}_i'(t)}{z_i'(t)}, \qquad m_{rdi} \triangleq \frac{\bar{m}_{rdi}}{z_{rdi}}, \qquad m_{ri}^* \triangleq \frac{\bar{m}_{ri}^*}{z_{ri}^*}.$    (21)

From the expressions given in (16) and (20), the rotation and translation between the coordinate systems F and F* can now be related in terms of the normalized Euclidean coordinates as

$m_i = \underbrace{\frac{z_i^*}{z_i}}_{\alpha_i}\, \underbrace{\left(\bar{R} + x_h n^{*T}\right)}_{H}\, m_i^*.$    (22)

In a similar manner, (17)-(21) can be used to relate the rotation and translation between $m_{ri}^*$ and $m_{rdi}$ as

$m_{rdi} = \underbrace{\frac{z_{ri}^*}{z_{rdi}}}_{\alpha_{rdi}}\, \underbrace{\left(\bar{R}_{rd} + x_{hrd} n^{*T}\right)}_{H_{rd}}\, m_{ri}^*$    (23)

and between $m_i^*(t)$ and $m_{ri}^*$ as

$m_{ri}^* = \underbrace{\frac{z_i^*}{z_{ri}^*}}_{\alpha_{ri}}\, \underbrace{\left(R_r + x_{hr} n^{*T}\right)}_{H_r}\, m_i^*.$    (24)

The development provided in the appendix can be used to relate $m_i(t)$ to $m_i'(t)$ as

$m_i' = \frac{z_i}{z_i'}\, \underbrace{\left(R_r + x_{hr}\, \alpha_i \frac{n^{*T} m_i^*}{n^{*T} m_i}\, n^{*T}\right)}_{H_r'}\, m_i.$    (25)

In (22)-(25), $\alpha_i(t), \alpha_{rdi}, \alpha_{ri}(t) \in \mathbb{R}$ denote depth ratios, $H(t), H_{rd}, H_r(t), H_r'(t) \in \mathbb{R}^{3\times 3}$ denote Euclidean homographies (Ref. 2), and $x_h(t), x_{hrd}, x_{hr}(t) \in \mathbb{R}^3$ denote scaled translation vectors defined as

$x_h = \frac{\bar{x}_f}{d^*}, \qquad x_{hrd} = \frac{\bar{x}_{frd}}{d_r^*}, \qquad x_{hr} = \frac{x_{fr}}{d^*}.$    (26)
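As a concrete illustration of the homography relationship in (24) and the scaled translation in (26), the short Python sketch below constructs four coplanar reference feature points, expresses them in I and I_R, and verifies that the normalized coordinates satisfy (24). The plane, poses, and point locations are assumed example values.

```python
# Numerical check of the planar homography relation (24) between the reference
# feature points expressed in I and in I_R.  Poses and points are example values.
import numpy as np

def rot_xyz(rx, ry, rz):
    """Compose rotations about x, y, z (example helper)."""
    cx, sx, cy, sy, cz, sz = np.cos(rx), np.sin(rx), np.cos(ry), np.sin(ry), np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

# Four coplanar points of pi* expressed in I (plane z = 10, so n* = [0,0,1], d* = 10)
m_bar_star = np.array([[0.5, 0.5, 10.0], [-0.5, 0.5, 10.0],
                       [-0.5, -0.5, 10.0], [0.5, -0.5, 10.0]])
n_star = np.array([0.0, 0.0, 1.0])
d_star = float(n_star @ m_bar_star[0])

# Pose of I relative to I_R (rotation R_r, translation x_fr), example values
R_r = rot_xyz(0.05, -0.1, 0.2)
x_fr = np.array([1.0, -2.0, 3.0])

# Same points expressed in I_R, per the third relationship in (9)
m_bar_r_star = (R_r @ m_bar_star.T).T + x_fr

# Euclidean homography H_r = R_r + x_hr n*^T from (24) and (26)
x_hr = x_fr / d_star
H_r = R_r + np.outer(x_hr, n_star)

for mi_s, mri_s in zip(m_bar_star, m_bar_r_star):
    m_i = mi_s / mi_s[2]            # normalized coordinates in I, eq. (20)
    m_ri = mri_s / mri_s[2]         # normalized coordinates in I_R, eq. (21)
    alpha_ri = mi_s[2] / mri_s[2]   # depth ratio z_i*/z_ri*
    assert np.allclose(m_ri, alpha_ri * H_r @ m_i)
print("homography relation (24) verified")
```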



Also from (25), the depth ratio $z_i'/z_i$ can be obtained as

$\frac{z_i'}{z_i} = a H_r' m_i$    (27)

where $a \triangleq \begin{bmatrix} 0 & 0 & 1 \end{bmatrix}$ denotes a row vector. Each Euclidean feature point will have a projected pixel coordinate expressed in terms of I as

$p_i \triangleq \begin{bmatrix} u_i & v_i & 1 \end{bmatrix}^T, \qquad p_i^* \triangleq \begin{bmatrix} u_i^* & v_i^* & 1 \end{bmatrix}^T$    (28)

where $p_i(t), p_i^*(t) \in \mathbb{R}^3$ represent the image-space coordinates of the time-varying feature points of the UAV and the reference object, respectively, and $u_i(t), v_i(t), u_i^*(t), v_i^*(t) \in \mathbb{R}$. Similarly, the projected pixel coordinates of the Euclidean features in the reference image can be expressed in terms of I_R as

$p_{rdi} \triangleq \begin{bmatrix} u_{rdi} & v_{rdi} & 1 \end{bmatrix}^T, \qquad p_{ri}^* \triangleq \begin{bmatrix} u_{ri}^* & v_{ri}^* & 1 \end{bmatrix}^T$    (29)

where $p_{rdi}, p_{ri}^* \in \mathbb{R}^3$ represent the constant image-space coordinates of the goal UAV and the reference object, respectively, and $u_{rdi}, v_{rdi}, u_{ri}^*, v_{ri}^* \in \mathbb{R}$. To calculate the Euclidean homographies given in (22)-(25) from pixel information, the projected pixel coordinates are related to $m_i(t)$, $m_i^*(t)$, $m_{rdi}$, and $m_{ri}^*$ by the pin-hole camera model as

$p_i = A m_i, \qquad p_i^* = A m_i^*$    (30)

$p_{rdi} = A m_{rdi}, \qquad p_{ri}^* = A m_{ri}^*$    (31)

where $A \in \mathbb{R}^{3\times 3}$ is a known, constant, and invertible intrinsic camera calibration matrix. By using (22)-(25), (30), and (31), the following relationships can be developed:

$p_i = \alpha_i \underbrace{\left(A H A^{-1}\right)}_{G} p_i^*, \qquad p_{rdi} = \alpha_{rdi} \underbrace{\left(A H_{rd} A^{-1}\right)}_{G_{rd}} p_{ri}^*$    (32)

$p_{ri}^* = \alpha_{ri} \underbrace{\left(A H_r A^{-1}\right)}_{G_r} p_i^*$    (33)

where $G(t) = [g_{ij}(t)]$, $G_{rd} = [g_{rdij}]$, $G_r = [g_{rij}]$ $\forall i, j = 1, 2, 3$ denote projective homographies in $\mathbb{R}^{3\times 3}$. Sets of linear equations can be developed from (32) and (33) to determine the projective homographies up to a scalar multiple. Various techniques (e.g., see Refs. 3 and 4) can then be used to decompose the Euclidean homographies to obtain $\alpha_i(t)$, $\alpha_{rdi}$, $\alpha_{ri}(t)$, $x_h(t)$, $x_{hrd}$, $x_{hr}(t)$, $\bar{R}(t)$, $\bar{R}_{rd}$, and $R_r(t)$. Given that the constant rotation matrix $R_r^*$ is assumed to be known, the expressions for $\bar{R}_{rd}$ and $R_r(t)$ in (11) can be used to determine $R_{rd}$ and $R^*(t)$. Once $R^*(t)$ is determined, the expression for $\bar{R}(t)$ in (11) can be used to determine $R(t)$.
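To make the reconstruction steps in (30)-(33) concrete, the following Python sketch generates four synthetic correspondences, solves the linear equations implied by (32) for the projective homography G up to scale (a standard direct linear transform), and recovers the Euclidean homography $H = A^{-1} G A$. The intrinsic matrix, plane normal, and motion below are assumed example values; the unknown scale is fixed here using the common convention that the middle singular value of a Euclidean homography is one (cf. Refs. 3 and 4), and the full decomposition into $\bar{R}$, $x_h$, and $n^*$ would then follow the methods of Refs. 3 and 4.

```python
# Sketch of recovering the projective homography G in (32) from four pixel
# correspondences via a direct linear transform, then the Euclidean homography
# H = A^{-1} G A.  All numeric values are assumed example data.
import numpy as np

A = np.array([[800.0, 0.0, 320.0],   # example intrinsic calibration matrix
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])

# Ground-truth Euclidean homography H = R_bar + x_h n*^T (example values)
theta = 0.1
R_bar = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0, 0.0, 1.0]])
x_h = np.array([0.02, -0.01, 0.05])
n_star = np.array([0.0, 0.0, 1.0])
H_true = R_bar + np.outer(x_h, n_star)

# Four reference feature points (normalized coordinates m_i*) and their images
m_star = np.array([[0.05, 0.05, 1.0], [-0.05, 0.05, 1.0],
                   [-0.05, -0.05, 1.0], [0.05, -0.05, 1.0]])
p_star = (A @ m_star.T).T                 # eq. (30): p* = A m*
m_cur = (H_true @ m_star.T).T
m_cur = m_cur / m_cur[:, [2]]             # re-normalize so the third component is 1
p_cur = (A @ m_cur.T).T                   # eq. (30): p = A m

# DLT: each correspondence p = alpha G p* gives two linear equations in the
# entries of G (from the cross product p x (G p*) = 0).
rows = []
for (u, v, _), ps in zip(p_cur, p_star):
    rows.append(np.concatenate([ps, np.zeros(3), -u * ps]))
    rows.append(np.concatenate([np.zeros(3), ps, -v * ps]))
G = np.linalg.svd(np.array(rows))[2][-1].reshape(3, 3)   # null-space solution, scale unknown

H_est = np.linalg.inv(A) @ G @ A
# Fix the unknown scale: the middle singular value of a Euclidean homography is 1
# (a standard result used by the decomposition methods of Refs. 3 and 4).
H_est /= np.linalg.svd(H_est, compute_uv=False)[1]
H_est *= np.sign(H_est[2, 2])
assert np.allclose(H_est, H_true, atol=1e-6)
print("Euclidean homography recovered; decompose per Refs. 3 and 4 for R_bar, x_h, n*")
```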

IV. Control Objective

The objective considered in this paper is to develop a visual servo controller that ensures that the pose of an UAV is regulated to a desired pose. A challenging aspect of this problem is that the UAV pose information is supplied by a moving airborne monocular camera system. That is, unlike traditional camera-in-hand or fixed-camera configurations, the problem considered in this paper involves a moving airborne camera observing a moving air vehicle. Mathematically, the objective can be expressed as the desire to regulate $\bar{m}_i'(t)$ to $\bar{m}_{rdi}$ (or, stated otherwise, $x_{fr}'(t) \to x_{frd}$ and $R'(t) = R_r^* R^{*T}(t) R(t) \to R_{rd}$); that is, the visual servo controller must ensure that the position/orientation of F is regulated to F_d (i.e., $\bar{m}_i(t)$ measured in I regulates to $\bar{m}_{rdi}$ measured in I_R). To ensure that $\bar{m}_i'(t)$ is regulated to $\bar{m}_{rdi}$, the control objective can be stated, based on the Euclidean reconstruction given in (22)-(25), as follows^b: $R'(t) \to R_{rd}$, $m_1'(t) \to m_{rd1}$, and $z_1'(t) \to z_{rd1}$.

^b Any point $O_i$ can be utilized in the subsequent development; however, to reduce the notational complexity, we have elected to select the image point $O_1$, and hence, the subscript 1 is utilized in lieu of i in the subsequent development.


To eliminate the singularity resulting from the Euler angle-axis representation of the rotation matrix, the control objective is formulated in terms of the unit quaternion representation. The rotation matrices $R'(t)$ and $R_{rd}$ can be expressed in terms of the unit quaternion as follows (Ref. 24):

$R'(t) = \left(q_0^2 - q_v^T q_v\right) I_3 + 2 q_v q_v^T + 2 q_0 q_v^{\times}$    (34)

$R_{rd} = \left(q_{0rd}^2 - q_{vrd}^T q_{vrd}\right) I_3 + 2 q_{vrd} q_{vrd}^T + 2 q_{0rd} q_{vrd}^{\times}$    (35)

where the unit quaternions $q$ and $q_{rd}$ are defined as

$q(t) = \begin{bmatrix} q_0(t) \\ q_v(t) \end{bmatrix} = \begin{bmatrix} \cos\frac{\varphi(t)}{2} \\ k(t) \sin\frac{\varphi(t)}{2} \end{bmatrix}$    (36)

$q_{rd} = \begin{bmatrix} q_{0rd} \\ q_{vrd} \end{bmatrix} = \begin{bmatrix} \cos\frac{\varphi_{rd}}{2} \\ k_{rd} \sin\frac{\varphi_{rd}}{2} \end{bmatrix}$    (37)

where $(k(t), \varphi(t))$ and $(k_{rd}, \varphi_{rd})$ represent the angle-axis parameters for the rotations $R'(t)$ and $R_{rd}$, respectively, with $\varphi(t) \in \mathbb{R}$ and $\varphi_{rd} \in \mathbb{R}$ representing rotation angles about the unit vectors $k(t) \in \mathbb{R}^3$ and $k_{rd} \in \mathbb{R}^3$, respectively. Based on (36) and (37), the rotation regulation objective can be restated as the desire to regulate $q_v(t)$ in the sense that

$e_\omega \triangleq q_v(t) - q_{vrd} \to 0 \quad \text{as} \quad t \to \infty.$    (38)
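A short Python sketch of the rotation error signal in (38) is given below: the unit quaternions in (36) and (37) are extracted from $R'(t)$ and $R_{rd}$ with a standard angle-axis construction, and $e_\omega = q_v - q_{vrd}$ is formed. The example rotations are assumed values.

```python
# Sketch of the rotation error (38) using a standard rotation-matrix-to-quaternion
# conversion; the example rotations below are assumed values.
import numpy as np

def quat_from_rot(R):
    """Unit quaternion (q0, qv) with q0 >= 0 from a rotation matrix, cf. (36)."""
    q0 = 0.5 * np.sqrt(max(1.0 + np.trace(R), 0.0))
    if q0 > 1e-8:
        qv = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]]) / (4.0 * q0)
    else:
        # rotation angle near pi: fall back to the dominant diagonal entry
        i = int(np.argmax(np.diag(R)))
        j, k = (i + 1) % 3, (i + 2) % 3
        qv = np.zeros(3)
        qv[i] = np.sqrt(max(0.0, 1.0 + R[i, i] - R[j, j] - R[k, k])) / 2.0
        qv[j] = (R[i, j] + R[j, i]) / (4.0 * qv[i])
        qv[k] = (R[i, k] + R[k, i]) / (4.0 * qv[i])
    return q0, qv

def rot_axis(k, phi):
    """Rodrigues formula, used only to build example rotations."""
    k = np.asarray(k, dtype=float) / np.linalg.norm(k)
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(phi) * K + (1 - np.cos(phi)) * (K @ K)

R_prime = rot_axis([0.0, 0.0, 1.0], 0.4)   # current rotation R'(t) (example)
R_rd = rot_axis([0.0, 0.0, 1.0], 0.1)      # desired rotation R_rd  (example)

_, q_v = quat_from_rot(R_prime)
_, q_vrd = quat_from_rot(R_rd)
e_w = q_v - q_vrd                          # rotation error (38)
print(e_w)                                 # approximately [0, 0, sin(0.2) - sin(0.05)]
```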

To quantify the translation error between F expressed in I and F_d expressed in I_R, a new hybrid position tracking error, denoted by $e_v(t) \in \mathbb{R}^3$, is defined as

$e_v \triangleq m_e - m_{erd}$    (39)

where $m_e(t) = \begin{bmatrix} m_{e1}(t) & m_{e2}(t) & m_{e3}(t) \end{bmatrix}^T \in \mathbb{R}^3$ denotes the extended normalized Euclidean coordinates of the target point expressed in I_R, given by

$m_e \triangleq \begin{bmatrix} \dfrac{x_1'}{z_1'} & \dfrac{y_1'}{z_1'} & -\ln(\alpha_1') \end{bmatrix}^T.$    (40)

In (40), $\alpha_1'(t) = \dfrac{z_1}{z_1'} \dfrac{\alpha_1}{\alpha_{r1}}$, where $\alpha_1(t) \in \mathbb{R}$ and $\alpha_{r1}(t) \in \mathbb{R}$ are the depth ratios expressed in (22) and (24), respectively. Also in (39), $m_{erd} = \begin{bmatrix} m_{erd1} & m_{erd2} & m_{erd3} \end{bmatrix}^T \in \mathbb{R}^3$ denotes the extended normalized Euclidean coordinates of the corresponding desired image point, given by

$m_{erd} \triangleq \begin{bmatrix} \dfrac{x_{rd1}}{z_{rd1}} & \dfrac{y_{rd1}}{z_{rd1}} & -\ln(\alpha_{rd1}) \end{bmatrix}^T$    (41)

where $m_{erd1}$ and $m_{erd2}$ denote the normalized Euclidean coordinates of the target point given in (7), and $\alpha_{rd1} \in \mathbb{R}$ is the depth ratio expressed in (23). Based on the error system formulations in (38) and (39), the control objective can be stated as the desire to regulate the error signals $e_v(t)$ and $e_\omega(t)$ to zero. If the error signals $e_v(t)$ and $e_\omega(t)$ are regulated to zero, then the UAV can be proven to be regulated to the desired position/orientation.
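For illustration, the translation error in (39)-(41) can be assembled from the measurable quantities produced by the homography decompositions; the Python sketch below uses assumed example values for the normalized coordinates and depth ratios.

```python
# Sketch of the translation error signal (39)-(41).  The numeric inputs below
# (normalized coordinates and depth ratios) are assumed example values that would
# come from the homography relationships in (22)-(25) and (27).
import numpy as np

def extended_coords(m, ratio):
    """Extended normalized Euclidean coordinates [x/z, y/z, -ln(ratio)], cf. (40)-(41)."""
    return np.array([m[0], m[1], -np.log(ratio)])

# Current UAV feature point O_1 expressed in I_R (normalized), eq. (21), and depth ratios
m1_prime = np.array([0.04, -0.02, 1.0])            # m_1'(t)                      (example)
alpha_1, alpha_r1 = 0.8, 1.1                       # depth ratios from (22), (24) (example)
z1_over_z1p = 0.9                                  # z_1/z_1' obtained from (27)  (example)
alpha_1_prime = z1_over_z1p * alpha_1 / alpha_r1   # alpha_1'(t) used in (40)

# Desired UAV feature point expressed in I_R (normalized), eq. (21), and its depth ratio
m_rd1 = np.array([0.01, 0.00, 1.0])                # m_rd1                        (example)
alpha_rd1 = 0.75                                   # from (23)                    (example)

m_e = extended_coords(m1_prime, alpha_1_prime)     # eq. (40)
m_erd = extended_coords(m_rd1, alpha_rd1)          # eq. (41)
e_v = m_e - m_erd                                  # eq. (39)
print(e_v)   # regulation objective: drive e_v (and e_w) to zero
```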

V. Future Work and Conclusion

In this paper, geometric relationships have been developed between an UAV drone with limited sensor capabilities and a mothership navigating above the airspace of the UAV. To achieve the result, multiple views of a reference object were used to develop Euclidean homographies. By decomposing the Euclidean homographies into separate translation and rotation components, reconstructed Euclidean information was obtained for the control development. A quaternion-based control objective was formulated for regulating the current pose of the UAV to the desired pose. The contribution of this paper is a new geometric framework that relates the pose of an UAV to a desired pose through images acquired by a camera-equipped mothership. Our goal is that researchers can now build on this geometric construction to develop flight controllers for limited-sensor drones.

References

1. J. Chen, D. M. Dawson, W. E. Dixon, and A. Behal, "Adaptive Homography-Based Visual Servo Tracking for Fixed and Camera-in-Hand Configurations," IEEE Transactions on Control Systems Technology, accepted, to appear.
2. O. Faugeras, Three-Dimensional Computer Vision, The MIT Press, Cambridge, Massachusetts, 2001.
3. O. Faugeras and F. Lustman, "Motion and Structure From Motion in a Piecewise Planar Environment," International Journal of Pattern Recognition and Artificial Intelligence, Vol. 2, No. 3, pp. 485-508, 1988.
4. Z. Zhang and A. R. Hanson, "Scaled Euclidean 3D Reconstruction Based on Externally Uncalibrated Cameras," IEEE Symp. on Computer Vision, pp. 37-42, 1995.
5. E. Malis, "Contributions à la modélisation et à la commande en asservissement visuel," Ph.D. Dissertation, University of Rennes I, IRISA, France, Nov. 1998.
6. E. Malis and F. Chaumette, "Theoretical Improvements in the Stability Analysis of a New Class of Model-Free Visual Servoing Methods," IEEE Transactions on Robotics and Automation, Vol. 18, No. 2, pp. 176-186, April 2002.
7. E. Malis and F. Chaumette, "2 1/2 D Visual Servoing with Respect to Unknown Objects Through a New Estimation Scheme of Camera Displacement," International Journal of Computer Vision, Vol. 37, No. 1, pp. 79-97, June 2000.
8. E. Malis, F. Chaumette, and S. Bodet, "2 1/2 D Visual Servoing," IEEE Transactions on Robotics and Automation, Vol. 15, No. 2, pp. 238-250, April 1999.
9. J. J. E. Slotine and W. Li, Applied Nonlinear Control, Prentice Hall, Inc.: Englewood Cliffs, NJ, 1991.
10. M. W. Spong and M. Vidyasagar, Robot Dynamics and Control, John Wiley and Sons, Inc.: New York, NY, 1989.
11. M. de Queiroz, D. Dawson, S. Nagarkatti, and F. Zhang, Lyapunov-Based Control of Mechanical Systems, Birkhauser, New York, 2000.
12. D. Burschka and G. Hager, "Vision-Based Control of Mobile Robots," Proc. of the IEEE International Conference on Robotics and Automation, pp. 1707-1713, 2001.
13. J. Chen, W. E. Dixon, D. M. Dawson, and M. McIntire, "Homography-Based Visual Servo Tracking Control of a Wheeled Mobile Robot," Proc. of the IEEE International Conference on Intelligent Robots and Systems, Las Vegas, Nevada, pp. 1814-1819, October 2003.
14. J. Chen, W. E. Dixon, D. M. Dawson, and V. Chitrakaran, "Visual Servo Tracking Control of a Wheeled Mobile Robot with a Monocular Fixed Camera," Proc. of the IEEE Conference on Control Applications, Taipei, Taiwan, pp. 1061-1066, 2004.
15. A. K. Das, et al., "Real-Time Vision-Based Control of a Nonholonomic Mobile Robot," Proc. of the IEEE International Conference on Robotics and Automation, pp. 1714-1719, 2001.
16. C. A. Desoer and M. Vidyasagar, Feedback Systems: Input-Output Properties, Academic Press, New York, 1975.
17. W. E. Dixon, D. M. Dawson, E. Zergeroglu, and A. Behal, "Adaptive Tracking Control of a Wheeled Mobile Robot via an Uncalibrated Camera System," IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics, Vol. 31, No. 3, pp. 341-352, 2001.
18. Y. Ma, J. Kosecka, and S. Sastry, "Vision Guided Navigation for Nonholonomic Mobile Robot," IEEE Transactions on Robotics and Automation, Vol. 15, No. 3, pp. 521-536, June 1999.
19. K.-T. Song and J.-H. Huang, "Fast Optical Flow Estimation and Its Application to Real-Time Obstacle Avoidance," Proc. of the IEEE International Conference on Robotics and Automation, pp. 2891-2896, 2001.
20. H. Y. Wang, S. Itani, T. Fukao, and N. Adachi, "Image-Based Visual Adaptive Tracking Control of Nonholonomic Mobile Robots," Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1-6, 2001.
21. G. D. Hager, D. J. Kriegman, A. S. Georghiades, and O. Ben-Shahar, "Toward Domain-Independent Navigation: Dynamic Vision and Control," Proc. of the IEEE Conference on Decision and Control, pp. 3257-3262, 1998.
22. Y. Fang, D. M. Dawson, W. E. Dixon, and M. S. de Queiroz, "2.5D Visual Servoing of Wheeled Mobile Robots," Proc. of the IEEE Conference on Decision and Control, Las Vegas, NV, pp. 2866-2871, Dec. 2002.
23. C. J. Schumacher, P. R. Chandler, and S. J. Rasmussen, "Task Allocation for Wide Area Search Munitions," Proc. of the American Control Conference, Anchorage, Alaska, 2002.
24. G. Hu, W. E. Dixon, S. Gupta, and N. Fitz-Coy, "A Quaternion Formulation for Homography-Based Visual Servo Control," IEEE International Conference on Robotics and Automation, Orlando, Florida, 2006, accepted, to appear.

VI. Appendix

In order to find the relationship between the normalized Euclidean coordinates $m_i(t)$ and $m_i'(t)$, (19) is expressed as

$\bar{m}_i' = \left(R_r + \frac{x_{fr} n^{*T}}{d^*} \frac{d^*}{d}\right) \bar{m}_i.$    (42)

By substituting $x_{fr}(t)$ from (26) into (42), the following expression can be obtained:

$\bar{m}_i' = \left(R_r + x_{hr} n^{*T} \frac{d^*}{d}\right) \bar{m}_i.$    (43)

Utilizing (15), (20), and (22) and rearranging terms yields

$\bar{m}_i' = \left(R_r + x_{hr}\, \alpha_i \frac{n^{*T} m_i^*}{n^{*T} m_i}\, n^{*T}\right) \bar{m}_i.$    (44)

The above expression can be written in terms of the normalized Euclidean coordinates by using (20) and (21) as

$m_i' = \frac{z_i}{z_i'} \left(R_r + x_{hr}\, \alpha_i \frac{n^{*T} m_i^*}{n^{*T} m_i}\, n^{*T}\right) m_i.$    (45)
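As a check on the appendix derivation, the Python sketch below verifies (45) numerically for an assumed example geometry: the right-hand side, computed from quantities available in I together with $R_r$ and $x_{hr}$, matches the normalized coordinates of the UAV feature point expressed in I_R.

```python
# Numerical check of (42)-(45); all geometry below is an assumed example.
import numpy as np

n_star = np.array([0.0, 0.0, 1.0])           # constant unit normal
m_bar_star = np.array([0.5, 0.5, 10.0])      # reference point of pi* in I
d_star = float(n_star @ m_bar_star)          # d* in (15)

m_bar = np.array([0.2, -0.3, 6.0])           # UAV feature point in I
d = float(n_star @ m_bar)                    # d in (15)

theta = 0.15                                 # example pose of I relative to I_R
R_r = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
x_fr = np.array([1.0, -2.0, 3.0])
x_hr = x_fr / d_star                         # scaled translation from (26)

# Normalized coordinates and depth ratio, eqs. (20) and (22)
m_i = m_bar / m_bar[2]
m_i_star = m_bar_star / m_bar_star[2]
alpha_i = m_bar_star[2] / m_bar[2]

# Left-hand side of (45): the UAV point expressed and normalized in I_R
m_bar_prime = x_fr + R_r @ m_bar
m_i_prime = m_bar_prime / m_bar_prime[2]

# Right-hand side of (45)
scale = alpha_i * (n_star @ m_i_star) / (n_star @ m_i)
H_r_prime = R_r + np.outer(x_hr * scale, n_star)
rhs = (m_bar[2] / m_bar_prime[2]) * (H_r_prime @ m_i)    # (z_i / z_i') H_r' m_i
assert np.allclose(m_i_prime, rhs)
print("appendix relationship (45) verified")
```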

