A Theoretical Model for Vision-Based Localization of a Wheeled Mobile Robot in Greenhouse Applications: A Daisy-Chaining Approach

S. S. Mehta *, T. Burks **

* Department of Mechanical and Aerospace Engineering, University of Florida, Gainesville, FL 32611 USA (e-mail: [email protected])
** Department of Agricultural and Biological Engineering, University of Florida, Gainesville, FL 32611 USA (e-mail: [email protected])

Abstract: The Euclidean position and orientation (i.e., pose) of a wheeled mobile robot (WMR) is typically required for autonomous selection and control of agricultural operations in a greenhouse. A vision-based localization scheme is formulated to identify the position and orientation of a WMR navigating in a greenhouse by utilizing image feedback of the above-the-aisle targets from an on-board camera. An innovative daisy chaining strategy is used to find the local coordinates of the above-the-aisle targets in order to localize the WMR. In contrast to typical camera configurations used for visual servo control problems, the development in this paper uses a moving on-board camera viewing receding targets. Multi-view photogrammetric methods are used to develop relationships between the different camera frames and the WMR coordinate systems. Experimental results demonstrate the feasibility of the developed geometric model.

Keywords: Machine vision, mobile robots, automation, cameras, Euclidean reconstruction, homography.

1. INTRODUCTION

A greenhouse is defined as a house of glass, polycarbonate, or fiberglass construction used for the propagation, growth, and care of plants. The function of a greenhouse is to create the optimal growing conditions for the full life of the plants (2). The main application of robots in the commercial sector has been the substitution of manual human labour by robots or mechanised systems to make the work more time efficient, accurate, uniform, and less costly, while reducing or eliminating occupational health hazards to human labor.

With recent advances in camera technology, computer vision, and controls theory, the application of wheeled mobile robots (WMRs) has drawn growing interest towards the automation of greenhouse agricultural operations such as pesticide/fungicide spraying, health monitoring, de-leafing, etc. In (30), an autonomous mobile robot for use in pest control and disease prevention applications in a commercial greenhouse is described, where human health hazards are involved in spraying potentially toxic chemicals in a confined space. Van Henten et al. (35) describe the autonomous de-leafing of cucumber plants using a mobile robot.

1 This research is supported by research grant No. US-3715-05 from BARD, the United States - Israel Binational Agricultural Research and Development Fund, at the University of Florida.

A virtual prototype of a service robot for health monitoring and localized chemical, drug, and fertiliser dispensing to plants in greenhouses has been realized in (1). Acaccia et al. (1) described the functionality of a mobile robot as a service robot for health monitoring and localized treatment of plants, but the localization scheme of the robot was not presented. Dario et al. (7) described vision-based navigation of a wheeled mobile robot for operations such as picking ripe tomatoes and spraying flowers. Most of the past research focuses on a particular, spatially non-varying task to be performed by the autonomous system in a greenhouse. However, for a large greenhouse, spatial control over the agricultural operations might be desired (e.g., selective spraying of pesticides along different aisles of a greenhouse), requiring knowledge of the Euclidean position and orientation (i.e., pose) of a WMR. Often the pose of a WMR is determined by a global positioning system (GPS) or an inertial measurement unit (IMU). However, GPS may not be available in many environments, its positioning accuracy may not be sufficient, and IMUs can drift and accumulate errors over time in a manner similar to dead reckoning. Given recent advances in image extraction/interpretation technology, an interesting approach to overcome the pose measurement problem is to utilize a vision system. In this paper, an innovative daisy chaining approach (27) is utilized for localization of a WMR in a greenhouse while utilizing image feedback from a WMR on-board monocular camera. The moving on-board cameras attached to

Fig. 1. Camera coordinate frame relationships: The moving on-board monocular camera system (coordinate frame I_RF at the reference location, coordinate frame I_F, coordinate frame I_R) navigating below the calibrated reference target (coordinate frame F*) identifies the pose of a WMR. The position and orientation of the above-the-aisle target (coordinate frame F) is determined using image feedback from I_F and I_R.

a WMR are used to provide pose measurements of the WMR relative to the greenhouse inertial frame of reference while utilizing image feedback from the above-the-aisle targets. The contribution of this paper is the development of a localization scheme for a WMR using two monocular on-board cameras, resulting in a larger combined field-of-view. Geometric constructs developed for traditional camera-in-hand problems are fused with fixed-camera geometry to develop a set of Euclidean homographies, which can be decomposed using standard decomposition algorithms to determine the pose of a WMR.

2. GEOMETRIC MODEL

Consider a system of two monocular cameras on-board a WMR that undergoes planar motion while capturing image data of the overhead (i.e., above-the-aisle) target objects, as depicted in Fig. 1 and Fig. 2. The moving coordinate frames I_F and I_R are attached to the forward and reverse looking on-board cameras, respectively, and the coordinate frame F is attached to the above-the-aisle targets. The target is represented in the camera image by four² feature points that are coplanar and non-collinear. The constant offset (i.e., s_i ∈ R³ ∀ i = 1, 2, 3, 4) from the origin of F to each feature point is assumed to be known. The plane defined by the above-the-aisle target (i.e., the plane defined by the xy-axis of F) and the target feature points is denoted as π. The plane of motion of the WMR is defined by the xy-axis of I_WMR, which is coincident with the plane defined by the xy-axis of I_G, as depicted in Fig. 2.

2 Image analysis methods (e.g., based on color or texture differences) can be used to detect planar objects. These traditional computer vision methods can be used to help determine and isolate the four coplanar feature points. If four coplanar target points are not available, then the subsequent development can exploit the classic eight-point algorithm (25), with no four of the eight target points being coplanar.

Fig. 2. Localization of a WMR (coordinate frame I_WMR) with respect to the greenhouse inertial frame of reference (coordinate frame I_G) while viewing the calibrated target (coordinate frame F*).

The stationary coordinate frame F* is attached to the calibrated beginning-of-the-aisle target object, as depicted in Fig. 1, where the offset (i.e., s_i ∈ R³ ∀ i = 1, 2, 3, 4) from the origin of the coordinate frame to each feature point is assumed to be known. The four feature points define the plane π* in Fig. 1. The stationary reference coordinate frames I_RF and I_RR are defined by the forward and reverse looking cameras, respectively, corresponding to the instant when the calibrated target F* comes into the field-of-view of the forward looking camera (see Fig. 1). The calibrated target F* corresponds to a target at the beginning of the aisle, while the above-the-aisle targets F are located overhead along the aisle³. The position and orientation of the calibrated target F* are known with respect to the greenhouse inertial frame of reference I_G.

To relate the coordinate systems, let R*(t), R*_RF, R*_F(t), R_FRF(t), R*_R(t), R_F(t), R_RF, R_c, R*_G, R_WMRG(t) ∈ SO(3) denote the rotations from F* to F, F* to I_RF, F* to I_F, I_RF to I_F, F* to I_R, F to I_F, I_F to I_R, I_WMR to I_RF, I_G to F*, and I_G to I_WMR, respectively. Further, let x*_f, x*_fRF, x*_fR(t) ∈ R³ denote the respective translations from F* to F, F* to I_RF, and F* to I_R with coordinates expressed in F*; x_fFRF(t) ∈ R³ denotes the time-varying translation from I_RF to I_F expressed in I_RF; x*_fF(t) ∈ R³ denotes the translation from F* to I_F expressed in I_F; x_fF(t) ∈ R³ denotes the time-varying translation from F to I_F expressed in the coordinates of F; x_fRF ∈ R³ denotes the known constant translation from I_F to I_R expressed in I_F; x_fc = [x_c  y_c  z_c]^T ∈ R³ denotes the known constant translation from I_WMR to I_RF expressed in I_WMR; and x*_fG, x_fWMRG(t) ∈ R³ denote the translations from I_G to F* and from I_G to I_WMR expressed in I_G.

3 The placement of the above-the-aisle targets should be chosen according to the field-of-view of the monocular camera system, such that at any given instant at least one of the targets is in the field-of-view of the camera system.

For geometric simplicity in the subsequent analysis, the orientation of the calibrated target F* is assumed to be the same as the orientation of I_G, such that R*_G = I_3x3, and the position of F* with respect to I_G is given by x*_fG = [x*_G  y*_G  z*_G]^T. Also, the xy-axis of the camera frame I_RF is considered to be parallel to the plane of motion of the WMR. From the geometry between the coordinate frames depicted in Fig. 1 and Fig. 2, the following relationships can be developed:

    m̄*_RFi = x*_fRF + R*_RF s_i                                        (1)
    m̄*_Ri = x*_fR + R*_R s_i                                           (2)
    m̄*_Fi = x*_fF + R*_F s_i                                           (3)

where m̄*_RFi, m̄*_Fi(t), m̄*_Ri(t) ∈ R³ denote the Euclidean coordinates of the feature points on the plane π* expressed in I_RF, I_F, and I_R, respectively, as

    m̄*_RFi ≜ [x*_RFi  y*_RFi  z*_RFi]^T                                (4)
    m̄*_Fi(t) ≜ [x*_Fi(t)  y*_Fi(t)  z*_Fi(t)]^T                        (5)
    m̄*_Ri(t) ≜ [x*_Ri(t)  y*_Ri(t)  z*_Ri(t)]^T                        (6)

m̄_Fi(t) ∈ R³ denotes the time-varying Euclidean coordinates of the feature points on the plane π expressed in I_F as

    m̄_Fi(t) ≜ [x_Fi(t)  y_Fi(t)  z_Fi(t)]^T                            (7)

and m̄*'_Ri(t) ∈ R³ denotes the time-varying virtual Euclidean coordinates of the feature points on the plane π* expressed in I_F as

    m̄*'_Ri(t) ≜ [x*'_Ri(t)  y*'_Ri(t)  z*'_Ri(t)]^T                    (8)

The Euclidean coordinates m̄*'_Ri(t) are virtual coordinates since the feature points on the plane π* are now in the field-of-view of the reverse looking camera I_R and not of I_F. To facilitate the subsequent development, the normalized Euclidean coordinates of the feature points on the plane π* can be expressed in terms of I_RF, I_F, and I_R as m*_RFi, m*_Fi, m*_Ri ∈ R³ as follows:

    m*_RFi ≜ m̄*_RFi / z*_RFi,   m*_Fi ≜ m̄*_Fi / z*_Fi,   m*_Ri ≜ m̄*_Ri / z*_Ri      (9)

and the normalized virtual Euclidean coordinates as

    m*'_Ri ≜ m̄*'_Ri / z*'_Ri.                                          (10)

Similarly, the normalized Euclidean coordinates of the feature points on the plane π can be expressed in terms of I_F as follows

    m_Fi ≜ m̄_Fi / z_Fi.                                                (11)

Each Euclidean feature point on the plane π* will have a projected pixel coordinate expressed in terms of I_RF as

    p*_RFi ≜ [u*_RFi  v*_RFi  1]^T                                     (12)

where p*_RFi ∈ R³ represents the image-space coordinates of the feature points of the calibrated target F*, and u*_RFi, v*_RFi ∈ R. Similarly, the projected pixel coordinates of the Euclidean features on the planes π* and π can be expressed in terms of I_F as

    p*_Fi ≜ [u*_Fi  v*_Fi  1]^T,   p_Fi ≜ [u_Fi  v_Fi  1]^T            (13)

where p*_Fi, p_Fi ∈ R³ represent the image-space coordinates of the feature points of the calibrated target F* and of F, respectively, and u*_Fi, v*_Fi, u_Fi, v_Fi ∈ R. Also, the projected pixel coordinates of the Euclidean features on the plane π* can be expressed in terms of I_R as

    p*_Ri ≜ [u*_Ri  v*_Ri  1]^T                                        (14)

where p*_Ri ∈ R³ represents the image-space coordinates of the feature points of the calibrated target F*, and u*_Ri, v*_Ri ∈ R. The projected pixel coordinates are related to m*_RFi, m*_Fi, m*_Ri, and m_Fi by the pin-hole camera model as

    p*_RFi = A_F m*_RFi,   p*_Fi = A_F m*_Fi,   p*_Ri = A_R m*_Ri      (15)
    p_Fi = A_F m_Fi                                                    (16)

where A_F, A_R ∈ R^{3×3} are the known, constant, and invertible intrinsic camera calibration matrices of the forward looking and the reverse looking camera, respectively.
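To make the projection notation concrete, the following minimal sketch (Python with NumPy; the feature-point value is an illustrative assumption, and the intrinsics anticipate the experimental matrix in (47)) walks a single feature point through the normalization in (9) and the pin-hole projection in (15), and back:

```python
import numpy as np

# Illustrative forward-camera intrinsics; the experimental values appear in (47).
A_F = np.array([[837.9826,   0.0,    318.0],
                [  0.0,    771.3324, 247.0],
                [  0.0,      0.0,      1.0]])

# Euclidean coordinates of one feature point expressed in I_F (mm), cf. (5);
# the value is an assumption for illustration only.
m_bar = np.array([70.0, 30.0, 545.0])

# Normalized Euclidean coordinates, cf. (9): divide by the depth z.
m = m_bar / m_bar[2]

# Pin-hole projection, cf. (15): p = A_F m with p = [u, v, 1]^T.
p = A_F @ m

# Inverse map: given the pixel coordinates and the depth z, recover m_bar.
m_bar_recovered = m_bar[2] * (np.linalg.inv(A_F) @ p)
assert np.allclose(m_bar_recovered, m_bar)
print(p)  # pixel coordinates [u, v, 1]
```

The inverse map is the step used repeatedly below: whenever the depth z is known, pixel coordinates determine the full Euclidean coordinates.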

3. LOCALIZATION OF WMR

The localization problem is solved by considering four cases corresponding to the possible spatial arrangements of a WMR along the direction of motion in a greenhouse.

3.1 Case I: Calibrated target F* is in the field-of-view of the forward looking camera frame I_RF

This reference location of the WMR corresponds to the instant when the beginning-of-the-aisle calibrated target F* comes into the field-of-view of the forward looking on-board camera I_RF (i.e., a snapshot of F*); the target image captured at this instant is regarded as the reference image. The known distance of the coordinate system F* from the xy-axis of I_G (i.e., the depth of the target F* from the WMR plane of motion) is given by z*_G. Also, the distance of I_RF from the WMR plane of motion I_WMR is given by z_c. The unknown depth of the calibrated target frame F* from the forward looking camera frame I_RF can then be obtained as follows (see Fig. 1 and Fig. 2)

    z*_RFi = d*_RF = z*_G − z_c                                        (17)

where d*_RF denotes the distance of F* from I_RF along the unit normal n*_RF = [0  0  1]^T. By using the relationships given in (9), (12), (15), and (17), the Euclidean coordinates m̄*_RFi of the feature points on the plane π* expressed in I_RF can be determined.

Remark 1. As in (6), the subsequent development requires that the constant rotation matrix R*_RF be known. The constant rotation matrix R*_RF can be obtained a priori using various methods (e.g., a second camera, Euclidean measurements) or can be selected as an identity matrix.

Using the expression in (1) and Remark 1, the constant translation x*_fRF from F* to I_RF with coordinates expressed in F* can be computed. From the coordinate geometry depicted in Fig. 2, the position and orientation of the WMR coordinate frame I_WMR with respect to the greenhouse inertial frame of reference I_G can be determined as follows

    x_fWMRG = x*_fG − x*_fRF − x_fc                                    (18)
    R_WMRG = R*_RF R*_G                                                (19)
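A minimal numerical sketch of Case I follows (Python/NumPy). The feature offset s_1, the camera offset z_c, and the translation x_fc are illustrative assumptions; the intrinsics, reference pixel, and target location are patterned on (47), (50), and (49):

```python
import numpy as np

A_F = np.array([[837.9826,   0.0,    318.0],
                [  0.0,    771.3324, 247.0],
                [  0.0,      0.0,      1.0]])   # intrinsics, cf. (47)

z_G_star, z_c = 545.0, 0.0                      # target depth from (49); z_c illustrative
d_RF_star = z_G_star - z_c                      # depth of F* from I_RF, cf. (17)

p_RF1 = np.array([425.0, 289.0, 1.0])           # one reference pixel, cf. (50)
m_RF1 = np.linalg.inv(A_F) @ p_RF1              # normalized coordinates, cf. (9), (15)
m_bar_RF1 = d_RF_star * m_RF1                   # Euclidean coordinates in I_RF, cf. (51)

R_RF_star = np.eye(3)                           # Remark 1: R*_RF selected as identity
s_1 = np.array([-66.0, 60.0, 0.0])              # offset of feature 1 from the F* origin; illustrative
x_fRF_star = m_bar_RF1 - R_RF_star @ s_1        # translation F* -> I_RF, from (1)

x_fG_star = np.array([355.0, 244.0, 545.0])     # pose of F* in I_G, cf. (49)
x_fc = np.zeros(3)                              # I_WMR -> I_RF offset; illustrative
x_fWMRG = x_fG_star - x_fRF_star - x_fc         # WMR position, cf. (18)
R_WMRG = R_RF_star @ np.eye(3)                  # WMR orientation, cf. (19) with R*_G = I
print(x_fWMRG, R_WMRG)
```

With the assumed offsets set to zero, the computation reduces to the depth-scaled back-projection of a single reference pixel, which is exactly the structure of the experimental Case I in Section 4.1.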

3.2 Case II: Calibrated target F* is in the field-of-view of the forward looking camera frame I_F

This arrangement corresponds to a time-varying trajectory of a WMR when the calibrated target F* is in the field-of-view of the forward looking camera frame I_F. The relationship between the Euclidean coordinates m̄*_RFi and m̄*_Fi(t) can be obtained from (1) and (3) as follows

    m̄*_Fi = x_fFRF + R_FRF m̄*_RFi                                     (20)

where R_FRF(t) ∈ R^{3×3} and x_fFRF(t) ∈ R³, defined in Section 2, are given by

    R_FRF = R*_F R*_RF^T                                               (21)
    x_fFRF = x*_fF − R_FRF x*_fRF                                      (22)

By using the projective relationship (see Fig. 1 and Fig. 2)

    d*_RF = n*_RF^T m̄*_RFi                                             (23)

the relationship in (20) can be expressed as

    m̄*_Fi = (R_FRF + (x_fFRF / d*_RF) n*_RF^T) m̄*_RFi                 (24)

From the expressions given in (9) and (24), the rotation and translation between the coordinate systems I_RF and I_F can now be related in terms of the normalized Euclidean coordinates as follows:

    m*_Fi = (z*_RFi / z*_Fi) (R_FRF + x_hFRF n*_RF^T) m*_RFi           (25)

where α_FRFi ≜ z*_RFi / z*_Fi and H_FRF ≜ R_FRF + x_hFRF n*_RF^T. In (25), α_FRFi(t) ∈ R denotes the depth ratio, H_FRF(t) ∈ R^{3×3} denotes the Euclidean homography (18), and x_hFRF(t) ∈ R³ denotes the scaled translation vector that is defined as follows

    x_hFRF = x_fFRF / d*_RF                                            (26)

By using (15) and (25), the following relationship can be developed:

    p*_Fi = α_FRFi (A_F H_FRF A_F^{-1}) p*_RFi = α_FRFi G_FRF p*_RFi   (27)

where G_FRF(t) = [g_FRFij(t)] ∀ i, j = 1, 2, 3 ∈ R^{3×3} denotes the projective homography. Sets of linear equations can be developed from (27) to determine the projective homography up to a scalar multiple. Various techniques can then be used (e.g., see (19; 37)) to decompose the Euclidean homography to obtain α_FRFi(t), x_hFRF(t), and R_FRF(t). Given that the constant rotation matrix R*_RF is assumed to be known, the expression for R_FRF(t) in (21) can be used to determine R*_F(t). Further, x_hFRF(t) and the constant translation vector x*_fRF, computed in Section 3.1, can be used to compute the translation x*_fF(t) from F* to I_F expressed in I_F. From the coordinate geometry depicted in Fig. 2, the position and orientation of the WMR coordinate frame I_WMR with respect to the greenhouse inertial frame of reference I_G can be determined as follows

    x_fWMRG = x*_fG − x*_fF − x_fc                                     (28)
    R_WMRG = R*_F R*_G                                                 (29)

3.3 Case III: Calibrated target F* is in the field-of-view of the reverse looking camera frame I_R

This arrangement corresponds to a time-varying trajectory of a WMR such that the calibrated target F* is in the field-of-view of the reverse looking camera frame I_R. The relationship between the Euclidean coordinates m̄*_RFi and m̄*_Ri(t) can be obtained from (1) and (2) as follows

    m̄*_Ri = x_fRRF + R_RRF m̄*_RFi                                     (30)

where R_RRF(t) ∈ R^{3×3} and x_fRRF(t) ∈ R³ denote the rotation and translation, respectively, from I_RF to I_R, given by

    R_RRF = R*_R R*_RF^T                                               (31)
    x_fRRF = x*_fR − R_RRF x*_fRF                                      (32)

By using the projective relationship stated in (23), the relationship in (30) can be expressed as

    m̄*_Ri = (R_RRF + (x_fRRF / d*_RF) n*_RF^T) m̄*_RFi                 (33)

From the expressions given in (9), (10), and (33), the rotation and translation between the coordinate systems I_RF and I_R can now be related in terms of the normalized Euclidean coordinates as follows:

    m*_Ri = (z*_RFi / z*_Ri) (R_RRF + x_hRRF n*_RF^T) m*_RFi           (34)

where α_RRFi ≜ z*_RFi / z*_Ri and H_RRF ≜ R_RRF + x_hRRF n*_RF^T. In (34), α_RRFi(t) ∈ R denotes the depth ratio, H_RRF(t) ∈ R^{3×3} denotes the Euclidean homography (18), and x_hRRF(t) ∈ R³ denotes the scaled translation vector that is defined as follows

    x_hRRF = x_fRRF / d*_RF                                            (35)

By using (15), (16), and (34), the following relationship can be developed:

    p*_Ri = α_RRFi (A_R H_RRF A_F^{-1}) p*_RFi = α_RRFi G_RRF p*_RFi   (36)

where G_RRF(t) = [g_RRFij(t)] ∀ i, j = 1, 2, 3 ∈ R^{3×3} denotes the projective homography. Sets of linear equations can be developed from (36) to determine the projective homography up to a scalar multiple. Various techniques can then be used (e.g., see (19; 37)) to decompose the Euclidean homography to obtain α_RRFi(t), x_hRRF(t), and R_RRF(t). Given that the constant rotation matrix R*_RF is assumed to be known, the expression for R_RRF(t) in (31) can be used to determine R*_R(t). Further, x_hRRF(t) and the constant translation vector x*_fRF, computed in Section 3.1, can be used to compute the translation x*_fR(t) from F* to I_R expressed in I_R. From the coordinate geometry depicted in Fig. 2, the position and orientation of the WMR coordinate frame I_WMR with respect to the greenhouse inertial frame of reference I_G can be determined as follows

    x_fWMRG = x*_fG − x*_fR − x_fc                                     (37)
    R_WMRG = R*_R R*_G                                                 (38)
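For concreteness, a minimal sketch of the homography pipeline behind (27) and (36) is given below (Python with NumPy; OpenCV is used only for the decomposition step). The DLT estimator is a standard construction rather than the paper's own implementation, the point values are illustrative and patterned on (50) and (56), and cv2.decomposeHomographyMat returns up to four candidate (R, x_h, n*) solutions that must be pruned using visibility and planarity constraints, as in (19; 37):

```python
import numpy as np
import cv2

def dlt_homography(src, dst):
    """Estimate the projective homography G (up to scale) from pixel pairs, cf. (27)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1,  0,  0,  0, u * x, u * y, u])
        rows.append([ 0,  0,  0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    G = Vt[-1].reshape(3, 3)        # null vector of the stacked constraints
    return G / G[2, 2]

A_F = np.array([[837.9826,   0.0,    318.0],
                [  0.0,    771.3324, 247.0],
                [  0.0,      0.0,      1.0]])   # intrinsics from (47)

# Reference-image and current-image feature points (pixels); illustrative
# values patterned on (50) and (56).
p_ref = [(425, 289), (629, 288), (631, 161), (415, 152)]
p_cur = [(311, 286), (533, 287), (535, 155), (310, 150)]

G = dlt_homography(p_ref, p_cur)    # projective homography G_FRF of (27)

# Euclidean decomposition H = R + x_h n*^T; each candidate triple corresponds
# to one admissible (R_FRF, x_hFRF, n*_RF) in (25).
_, Rs, ts, ns = cv2.decomposeHomographyMat(G, A_F)
for R, t, n in zip(Rs, ts, ns):
    print(R, t.ravel(), n.ravel())
```

With exactly four coplanar points the stacked system is 8x9, so the homography is determined up to scale, matching the "up to a scalar multiple" statement above; additional points would simply turn the null-space computation into a least-squares fit.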

3.4 Case IV: Calibrated target F* is in the field-of-view of the reverse looking camera frame I_R and above-the-aisle target F is in the field-of-view of the forward looking camera frame I_F

This arrangement represents the position and orientation (i.e., pose) of a WMR such that the calibrated target F* is in the field-of-view of the reverse looking camera frame I_R and the above-the-aisle target F is in the field-of-view of the forward looking camera frame I_F. The daisy chaining concept (27) is utilized for self-calibration of the above-the-aisle targets, in which the position and orientation of the above-the-aisle targets (i.e., F) is determined with respect to the greenhouse inertial frame of reference I_G. Using the expressions given in (2) and (3), the relationship between the Euclidean coordinates m̄*_Ri(t) and m̄*'_Ri(t) can be stated as follows

    m̄*'_Ri = R_RF^T (m̄*_Ri − x_fRF)                                   (39)

In (39), the Euclidean coordinates m̄*_Ri(t) can be determined from (33), since the constant Euclidean coordinates m̄*_RFi are found in Section 3.1 and the decomposition of the homography H_RRF(t) yields R_RRF(t) and x_hRRF(t). Therefore, (39) can be utilized to compute the virtual Euclidean coordinates m̄*'_Ri(t) of the feature points on the plane π* expressed in the camera frame I_F. Also, the expression in (2) yields the following relationship

    m̄_Fi = x*_f + R* m̄*'_Ri                                           (40)

where R*(t) and x*_f(t) denote the time-varying rotation and translation from F* to F. Since m̄*'_Ri(t) as well as m̄_Fi(t) are expressed in I_F, the geometric relationship between the calibrated target F* and the above-the-aisle target F can be expressed as follows

    m̄_Fi = (R* + (x*_f / d*_R) n*_R^T) m̄*'_Ri                         (41)

where, analogous to (23), d*_R = n*_R^T m̄*'_Ri with n*_R denoting the unit normal to the plane π* expressed in I_F. From the expressions given in (10) and (11), the rotation and translation between the coordinate systems F* and F can now be related in terms of the normalized Euclidean coordinates as follows:

    m_Fi = (z*'_Ri / z_Fi) (R* + x*_h n*_R^T) m*'_Ri                   (42)

where α*_i ≜ z*'_Ri / z_Fi and H* ≜ R* + x*_h n*_R^T. In (42), α*_i(t) ∈ R denotes the depth ratio, H*(t) ∈ R^{3×3} denotes the Euclidean homography (18), and x*_h(t) ∈ R³ denotes the scaled translation vector that is defined as follows

    x*_h = x*_f / d*_R                                                 (43)

By using (16) and (42), the following relationship can be developed:

    p_Fi = α*_i (A_F H* A_F^{-1}) p*'_Ri = α*_i G* p*'_Ri              (44)

where p*'_Ri ≜ A_F m*'_Ri denotes the virtual pixel coordinates and G*(t) = [g*_ij(t)] ∀ i, j = 1, 2, 3 ∈ R^{3×3} denotes the projective homography. Sets of linear equations can be developed from (44) to determine the projective homography up to a scalar multiple. Various techniques can be used (e.g., see (19; 37)) to decompose the Euclidean homography to obtain α*_i(t), x*_h(t), and R*(t). From the coordinate geometry depicted in Fig. 2, the position and orientation of the above-the-aisle target F with respect to the greenhouse inertial frame of reference I_G can be determined as follows

    x_fG = x*_fG + x*_f                                                (45)
    R_G = R* R*_G                                                      (46)

Relationships (45) and (46) yield the position and orientation of the above-the-aisle target F with respect to the greenhouse inertial frame I_G. Hence, the above-the-aisle target F can now be regarded as a calibrated reference target for localization of the WMR.
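The daisy chaining step can be condensed into a few lines of code. The sketch below (Python/NumPy) mirrors (39) and (45)-(46); the feature-point value and the rotation R* are illustrative stand-ins, while the extrinsics, the translation, and the F* location are taken from (48), (71), and (49):

```python
import numpy as np

# Inter-camera extrinsics from (48) (mm).
R_RF = np.eye(3)
x_fRF = np.array([152.4, 0.0, 0.0])

# A pi* feature point expressed in I_R, e.g. recovered via (33); illustrative value.
m_bar_R = np.array([203.7, -61.1, 545.0])

# Virtual coordinates of the same point in I_F, cf. (39).
m_bar_R_virtual = R_RF.T @ (m_bar_R - x_fRF)

# Pose of F relative to F* as produced by the decomposition of (42);
# the rotation is an illustrative identity, the translation follows (71).
R_star = np.eye(3)
x_f_star = np.array([393.6373, 15.6168, -13.0245])

# Register the new above-the-aisle target in I_G, cf. (45)-(46).
x_fG_star = np.array([355.0, 244.0, 545.0])   # F* position in I_G, cf. (49)
R_G_star = np.eye(3)                          # Section 2 assumption: R*_G = I
x_fG = x_fG_star + x_f_star                   # (45)
R_G = R_star @ R_G_star                       # (46)
print(x_fG)   # F can now serve as the next calibrated reference target
```

With the translation from (71) and the target location from (49), x_fG evaluates to [748.6373, 259.6168, 531.9755], which reproduces the identified value reported later in (72).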

4. EXPERIMENTAL RESULTS

Experimental results are provided to illustrate the performance of the daisy chaining based WMR localization scheme. The intrinsic camera calibration parameters used for the forward looking and the reverse looking on-board cameras are as follows: u0 = 318 [pixels] and v0 = 247 [pixels] denote the pixel coordinates of the principal point; λ1 = 837.98259 and λ2 = 771.33238 denote the products of the focal length and the scaling factors along the two image axes; and φ = 0 [rad] is the skew angle for each camera. The intrinsic camera calibration matrix for the forward looking and the reverse looking camera is therefore given as

    A_f = A_r = [ 837.9826     0      318
                     0      771.3324  247
                     0         0        1 ]                            (47)

The constant known rotation R*_RF from F* to I_RF was considered to be an identity matrix; also, the constant known rotation R_RF and translation x_fRF from I_F to I_R measured in I_F were selected as follows

    R_RF = [ 1 0 0
             0 1 0
             0 0 1 ]        x_fRF = [152.4  0  0]^T                    (48)

The Euclidean coordinates of the calibrated beginning-of-the-aisle target F* with respect to the greenhouse inertial frame of reference I_G were selected as follows

    x*_fG = [355  244  545]^T                                          (49)

4.1 Case I: Calibrated target F* is in the field-of-view of the forward looking camera frame I_RF

The image space coordinates (all image space coordinates are in units of pixels) of the four constant reference target points on the plane π* as viewed by I_RF were selected as follows

    p*_RF1 = [425  289  1]^T
    p*_RF2 = [629  288  1]^T
    p*_RF3 = [631  161  1]^T
    p*_RF4 = [415  152  1]^T                                           (50)

Utilizing (47), (49), and (50), the Euclidean coordinates of the feature points were computed as

    m̄*_RF1 = [69.7439  29.3865  545]^T
    m̄*_RF2 = [202.4197  28.6800  545]^T
    m̄*_RF3 = [203.7205  −61.0544  545]^T
    m̄*_RF4 = [69.7439  −67.4135  545]^T                               (51)

From (1), (18), (19), and (51), the Euclidean position and orientation of the WMR with respect to the greenhouse inertial frame of reference I_G are computed as follows

    x_fWMRG = [215.2561  262.6135  0]^T                                (52)

    R_WMRG = [ 1 0 0
               0 1 0
               0 0 1 ]                                                 (53)

The actual position and orientation of the WMR with respect to the greenhouse inertial frame of reference was selected as

    x_fWMRG(actual) = [215  262  0]^T                                  (54)

    R_WMRG(actual) = [ 1 0 0
                       0 1 0
                       0 0 1 ]                                         (55)

From (52), (53), (54), and (55), the pose of the WMR has been successfully identified. Fig. 3 indicates the actual and the experimentally identified position and orientation of the WMR while the forward looking camera I_RF views the calibrated target F*.

Fig. 3. Euclidean plot indicating the identified pose of the WMR (denoted by '+') and the actual pose of the WMR I_RF (denoted by 'x') while viewing the target F* (denoted by '*') with the forward looking camera.
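Although the identified and actual poses are reported side by side, a compact error metric makes the comparison immediate. The following sketch (Python/NumPy; values taken from (52)-(55), the error function itself is a standard construction rather than part of the paper) computes the translation error norm and the rotation geodesic angle:

```python
import numpy as np

def pose_error(x_id, R_id, x_act, R_act):
    """Translation error (mm) and rotation angle error (rad) between two poses."""
    e_t = np.linalg.norm(x_id - x_act)
    # Geodesic distance on SO(3): angle of the relative rotation R_id R_act^T.
    R_rel = R_id @ R_act.T
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return e_t, np.arccos(cos_theta)

# Case I values from (52)-(55).
x_id = np.array([215.2561, 262.6135, 0.0])
x_act = np.array([215.0, 262.0, 0.0])
e_t, e_R = pose_error(x_id, np.eye(3), x_act, np.eye(3))
print(e_t, e_R)   # roughly 0.66 mm translation error, 0 rad rotation error
```

The same function applies verbatim to the Case II-IV results below.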

4.2 Case II: Calibrated target F* is in the field-of-view of the forward looking camera frame I_F

The image space coordinates (all image space coordinates are in units of pixels) of the four constant reference target points on the plane π* as viewed by I_F for the given instant were selected as follows

    p*_F1 = [311  286  1]^T
    p*_F2 = [533  287  1]^T
    p*_F3 = [535  155  1]^T
    p*_F4 = [310  150  1]^T                                            (56)

Decomposition of the homography given in (25) yields the rotation R_FRF(t), translation x_fFRF(t), unit normal to the plane π*, and depth ratios α_FRFi(t) as follows

    R_FRF = [  0.997305   0.017191  −0.021936
              −0.016461   0.999602  −0.021936
              −0.017390   0.021835   0.997198 ]

    x_fFRF = [−91  12  0]^T

    α_FRF1 = 1.000564    α_FRF2 = 1.040473
    α_FRF3 = 1.064294    α_FRF4 = 1.024555

    n*_RF = [0.00015591  0.00039507  1]^T                              (57)

Utilizing (1), (20), (21), (22), (51), and (57), the Euclidean pose of the WMR was identified as follows

    x_fWMRG = [297.6723  264.8610  4.3637]^T                           (58)

    R_WMRG = [  0.9973   0.0172   0.0170
               −0.0165   0.9996  −0.0219
               −0.0174   0.0218   0.9972 ]                             (59)

The actual position and orientation of the WMR with respect to the greenhouse inertial frame of reference was selected as

    x_fWMRG(actual) = [292  262  0]^T                                  (60)

    R_WMRG(actual) = [ 1 0 0
                       0 1 0
                       0 0 1 ]                                         (61)

From (58), (59), (60), and (61), the pose of the WMR has been successfully identified. Fig. 4 indicates the actual and the experimentally identified position and orientation of the WMR while the forward looking camera I_F views the calibrated target F*.

Fig. 4. Euclidean plot indicating the identified pose of the WMR (denoted by '+') and the actual pose of the WMR I_F (denoted by 'x') while viewing the target F* (denoted by '*') with the forward looking camera.

4.3 Case III: Calibrated target F* is in the field-of-view of the reverse looking camera frame I_R

The image space coordinates (all image space coordinates are in units of pixels) of the four constant reference target points on the plane π* as viewed by I_R for the given instant were selected as follows

    p*_R1 = [14  304  1]^T
    p*_R2 = [219  306  1]^T
    p*_R3 = [217  171  1]^T
    p*_R4 = [11  177  1]^T                                             (62)

Decomposition of the homography given in (34) yields the rotation R_RRF(t), translation x_fRRF(t), unit normal to the plane π*, and depth ratios α_RRFi(t) as follows

    R_RRF = [  0.998597   0.094310   0.077288
              −0.094355   0.995497  −0.018622
              −0.078687   0.011477   1.002996 ]

    x_fRRF = [−305.8  12  0]^T

    α_RRF1 = 1.000564    α_RRF2 = 1.040473
    α_RRF3 = 1.064294    α_RRF4 = 1.024555

    n*_RF = [8.7518 × 10^−6  0.00043127  1]^T                          (63)

Utilizing (2), (30), (31), (32), (51), and (63), the Euclidean pose of the WMR was identified as follows

    x_fWMRG = [480.8930  273.8639  9.5768]^T                           (64)

    R_WMRG = [  0.9986   0.0943   0.0773
               −0.0944   0.9955  −0.0186
               −0.0787   0.0115   1.0030 ]                             (65)

The actual position and orientation of the WMR with respect to the greenhouse inertial frame of reference was selected as

    x_fWMRG(actual) = [493  262  0]^T                                  (66)

    R_WMRG(actual) = [ 1 0 0
                       0 1 0
                       0 0 1 ]                                         (67)

From (64), (65), (66), and (67), the pose of the WMR has been successfully identified. Fig. 5 indicates the actual and the experimentally identified position and orientation of the WMR while the reverse looking camera I_R views the calibrated target F*.

Fig. 5. Euclidean plot indicating the identified pose of the WMR I_R (denoted by '+') and the actual pose of the WMR I_R (denoted by 'x') while viewing the target F* (denoted by '*') with the reverse looking camera.

4.4 Case IV: Calibrated target F* is in the field-of-view of the reverse looking camera frame I_R and above-the-aisle target F is in the field-of-view of the forward looking camera frame I_F

The image space coordinates (all image space coordinates are in units of pixels) of the four constant reference target points on the planes π* and π as viewed by I_R and I_F, respectively, for the given instant were selected as follows

    p*_R1 = [14  304  1]^T
    p*_R2 = [219  306  1]^T
    p*_R3 = [217  171  1]^T
    p*_R4 = [11  177  1]^T                                             (68)

    p_F1 = [397  306  1]^T
    p_F2 = [608  306  1]^T
    p_F3 = [612  179  1]^T
    p_F4 = [400  172  1]^T                                             (69)

Using (39), the virtual pixel coordinates of the target points on the plane π* were obtained for the forward looking on-board camera as follows

    p*'_R1 = [−214.0880  282.3448  1]^T
    p*'_R2 = [−15.5866  263.8270  1]^T
    p*'_R3 = [−27.5935  133.6519  1]^T
    p*'_R4 = [−229.3023  144.8639  1]^T                                (70)

Decomposition of the homography given in (42) yields the rotation R*, translation x*_f, and depth ratios α*_i as follows

    R* = [ 1.0683  −0.0816  −0.0249
           0.0885   0.9967   0.0026
           0.0246  −0.0063   1.0720 ]

    x*_f = [393.6373  15.6168  −13.0245]^T

    α*_1 = 0.969647    α*_2 = 0.971852
    α*_3 = 0.965629    α*_4 = 0.963069                                 (71)

Utilizing (45), (46), and (71), the Euclidean position and orientation of the above-the-aisle target F with respect to the greenhouse inertial frame of reference was identified as follows


    x_fG = [748.6373  259.6168  531.9755]^T                            (72)

    R_G = [ 1.0683  −0.0816  −0.0249
            0.0885   0.9967   0.0026
            0.0246  −0.0063   1.0720 ]                                 (73)

The actual position and orientation of the target F with respect to the greenhouse inertial frame of reference was selected as

    x_fG(actual) = [766  244  549]^T                                   (74)

    R_G(actual) = [ 1 0 0
                    0 1 0
                    0 0 1 ]                                            (75)

From (72), (73), (74), and (75), the position and orientation of the target F has been successfully identified. Fig. 6 indicates the experimentally identified position and orientation of the above-the-aisle target F while the forward looking camera I_F views the target F and the reverse looking camera I_R views the calibrated target F*.

Fig. 6. Euclidean plot indicating the identified position and orientation of the above-the-aisle target F (denoted by '*'), while the forward looking camera I_F (denoted by '+') views the target F and the reverse looking camera I_R (denoted by 'x') views the calibrated target F*.

5. CONCLUSION

In this paper, the position and orientation of a WMR are identified with respect to the greenhouse inertial frame of reference using a collaborative visual servo control strategy. To achieve this result, multiple views of reference objects were used to develop Euclidean homographies. By decomposing the Euclidean homographies into separate translation and rotation components, reconstructed Euclidean information was obtained for the calibration of unknown targets and the localization of a WMR. The impact of this paper is a new framework to identify the pose of a moving WMR through images acquired by a moving camera, which would be beneficial for spatial task selection and autonomous operations in a greenhouse. Experimental results verify the daisy chaining based localization scheme presented in this paper.

REFERENCES

[1] G. M. Acaccia, R. C. Michelini, R. M. Molfino, and R. P. Razzoli, "Mobile robots in greenhouse cultivation: inspection and treatment of plants," Proc. of the ASER 1st International Workshop on Advances in Service Robotics, Bardolino, Italy, March 2003.
[2] J. Badgery-Parker, Agnote DPI/249, Edition 1, pp. 1-2, 1999.
[3] R. Brockett, "Asymptotic Stability and Feedback Stabilization," in Differential Geometric Control Theory (R. Brockett, R. Millman, and H. Sussmann, Eds.), Birkhauser, Boston, 1983.
[4] D. Burschka and G. Hager, "Vision-Based Control of Mobile Robots," Proc. of the IEEE International Conference on Robotics and Automation, pp. 1707-1713, 2001.
[5] L. Chaimowicz, B. Grocholsky, J. F. Keller, V. Kumar, and C. J. Taylor, "Experiments in Multirobot Air-Ground Coordination," Proc. of the International Conference on Robotics and Automation, pp. 4053-4058, New Orleans, LA, April 2004.
[6] J. Chen, D. M. Dawson, W. E. Dixon, and A. Behal, "Adaptive Homography-Based Visual Servo Tracking for Fixed and Camera-in-Hand Configurations," IEEE Transactions on Control Systems Technology, accepted, to appear.
[7] P. Dario, G. Sandini, B. Allotta, A. Bucci, F. Buemi, M. Massa, F. Ferrari, M. Magrassi, L. Bosio, R. Valleggi, E. Gallo, A. Bologna, F. Cantatore, G. Torrielli, and A. Mannucci, "The Agribot Project for Greenhouse Automation," Proc. of the International Symposium on New Cultivation Systems in Greenhouse, pp. 361:85-92, Italy, 1994.
[8] J. Chen, W. E. Dixon, D. M. Dawson, and M. McIntire, "Homography-based Visual Servo Tracking Control of a Wheeled Mobile Robot," Proc. of the IEEE International Conference on Intelligent Robots and Systems, Las Vegas, Nevada, pp. 1814-1819, October 2003; see also IEEE Transactions on Robotics, accepted, to appear.
[9] J. Chen, W. E. Dixon, D. M. Dawson, and V. Chitrakaran, "Visual Servo Tracking Control of a Wheeled Mobile Robot with a Monocular Fixed Camera," Proc. of the IEEE Conference on Control Applications, Taipei, Taiwan, pp. 1061-1066, 2004.
[10] J. Chen, D. M. Dawson, W. E. Dixon, and A. Behal, "Adaptive Homography-Based Visual Servo Tracking," Proc. of the 2003 IEEE International Conference on Intelligent Robots and Systems, Las Vegas, Nevada, pp. 230-235, October 2003.
[11] A. K. Das, et al., "Real-Time Vision-Based Control of a Nonholonomic Mobile Robot," Proc. of the IEEE International Conference on Robotics and Automation, pp. 1714-1719, 2001.
[12] C. A. Desoer and M. Vidyasagar, Feedback Systems: Input-Output Properties, Academic Press, New York, 1975.
[13] W. E. Dixon, D. M. Dawson, E. Zergeroglu, and A. Behal, Nonlinear Control of Wheeled Mobile Robots, Springer-Verlag, 2001.
[14] W. E. Dixon, D. M. Dawson, E. Zergeroglu, and A. Behal, "Adaptive Tracking Control of a Wheeled Mobile Robot via an Uncalibrated Camera System," IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics, Vol. 31, No. 3, pp. 341-352, 2001.
[15] W. E. Dixon, D. M. Dawson, E. Zergeroglu, and A. Behal, "Adaptive Tracking Control of a Wheeled Mobile Robot via an Uncalibrated Camera System," IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics, Vol. 31, No. 3, June 2001.
[16] Y. Fang, W. E. Dixon, D. M. Dawson, and J. Chen, "Robust 2.5D Visual Servoing for Robot Manipulators," Proc. of the IEEE American Control Conference, Denver, Colorado, pp. 3311-3316, June 2003.
[17] Y. Fang, D. M. Dawson, W. E. Dixon, and P. Chawda, "Homography-Based Visual Servoing of Wheeled Mobile Robots," IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics, Vol. 35, No. 5, pp. 1041-1050, 2005.
[18] O. Faugeras, Three-Dimensional Computer Vision, The MIT Press, Cambridge, Massachusetts, 2001.
[19] O. Faugeras and F. Lustman, "Motion and Structure From Motion in a Piecewise Planar Environment," International Journal of Pattern Recognition and Artificial Intelligence, Vol. 2, No. 3, pp. 485-508, 1988.
[20] G. D. Hager, D. J. Kriegman, A. S. Georghiades, and O. Ben-Shahar, "Toward Domain-Independent Navigation: Dynamic Vision and Control," Proc. of the IEEE Conference on Decision and Control, pp. 3257-3262, 1998.
[21] B. H. Kim, et al., "Localization of a Mobile Robot using Images of a Moving Target," Proc. of the IEEE International Conference on Robotics and Automation, pp. 253-258, 2001.
[22] Y. Ma, J. Kosecka, and S. Sastry, "Vision Guided Navigation for a Nonholonomic Mobile Robot," IEEE Transactions on Robotics and Automation, Vol. 15, No. 3, pp. 521-536, June 1999.
[23] E. Malis, "Contributions à la modélisation et à la commande en asservissement visuel," Ph.D. Dissertation, University of Rennes I, IRISA, France, Nov. 1998.
[24] E. Malis and F. Chaumette, "Theoretical Improvements in the Stability Analysis of a New Class of Model-Free Visual Servoing Methods," IEEE Transactions on Robotics and Automation, Vol. 18, No. 2, pp. 176-186, April 2002.
[25] E. Malis and F. Chaumette, "2 1/2 D Visual Servoing with Respect to Unknown Objects Through a New Estimation Scheme of Camera Displacement," International Journal of Computer Vision, Vol. 37, No. 1, pp. 79-97, June 2000.
[26] E. Malis, F. Chaumette, and S. Bodet, "2 1/2 D Visual Servoing," IEEE Transactions on Robotics and Automation, Vol. 15, No. 2, pp. 238-250, April 1999.
[27] S. S. Mehta, W. E. Dixon, D. MacArthur, and C. D. Crane, "Visual Servo Control of an Unmanned Ground Vehicle via a Moving Airborne Monocular Camera," Proc. of the IEEE American Control Conference, Minneapolis, Minnesota, 2006, to appear.
[28] S. S. Mehta, W. E. Dixon, T. Burks, and S. Gupta, "Teach by Zooming Visual Servo Control for an Uncalibrated Camera System," Proc. of the AIAA Guidance, Navigation, and Control Conference, AIAA 2005-6095, San Francisco, August 2005.
[29] M. de Queiroz, D. Dawson, S. Nagarkatti, and F. Zhang, Lyapunov-based Control of Mechanical Systems, Birkhauser, New York, 2000.
[30] P. J. Sammons, T. Furukawa, and A. Bulgin, "Autonomous Pesticide Spraying Robot for use in a Greenhouse," Proc. of the Australasian Conference on Robotics & Automation, Australia, 2005.
[31] C. Samson, "Control of Chained Systems: Application to Path Following and Time-Varying Point-Stabilization of Mobile Robots," IEEE Transactions on Automatic Control, Vol. 40, No. 1, pp. 64-77, January 1995.
[32] J. J. E. Slotine and W. Li, Applied Nonlinear Control, Prentice Hall, Englewood Cliffs, NJ, 1991.
[33] K.-T. Song and J.-H. Huang, "Fast Optical Flow Estimation and Its Application to Real-time Obstacle Avoidance," Proc. of the IEEE International Conference on Robotics and Automation, pp. 2891-2896, 2001.
[34] M. W. Spong and M. Vidyasagar, Robot Dynamics and Control, John Wiley and Sons, New York, NY, 1989.
[35] E. J. Van Henten, B. A. J. Van Tuijl, G.-J. Hoogakker, M. J. Van Der Weerd, J. Hemming, J. G. Kornet, and J. Bontsema, "An Autonomous Robot for De-leafing Cucumber Plants Grown in a High-wire Cultivation System," Proc. of the ISHS International Conference on Sustainable Greenhouse Systems, Belgium, 2004.
[36] H. Y. Wang, S. Itani, T. Fukao, and N. Adachi, "Image-Based Visual Adaptive Tracking Control of Nonholonomic Mobile Robots," Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1-6, 2001.
[37] Z. Zhang and A. R. Hanson, "Scaled Euclidean 3D Reconstruction Based on Externally Uncalibrated Cameras," IEEE Symposium on Computer Vision, pp. 37-42, 1995.
