Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, October 9-15, 2006, Beijing, China
Robot Localization Sensor for Development of Wireless Location Sensing Network

Heesung Chae, Wonpil Yu, Jaeyeong Lee, and Young-Jo Cho
Intelligent Robot Research Division, Electronics and Telecommunications Research Institute
161 Gajeong-dong, Yuseong-gu, Daejeon, Korea
{hschae, ywp, jylee, youngjo}@etri.re.kr

Abstract - Localization is one of the most important issues for mobile robots. We describe a novel localization sensor suite for the development of a wireless location sensing network. The sensor suite comprises wirelessly controlled infrared landmarks and an image sensor which detects the pixel positions of infrared sources. The proposed sensor suite can operate irrespective of the illumination conditions in an indoor environment. We describe the operating principles of the developed sensor suite and report its performance for mobile robot localization and navigation. The advantages of the developed sensor suite lie in its robustness and the low cost of obtaining localization information, as well as the simplicity of deployment when building a robot location sensing network. Experimental results show that the developed sensor suite outperforms state-of-the-art localization sensors.

Index Terms - Localization, Navigation, Artificial landmark, Camera, Mobile robot.

I. INTRODUCTION

Navigation is the science of getting vehicles from place to place by determining the vehicle's position, course, and distance traveled [1]. Prompt and reasonably accurate information about the vehicle's position is a key component underpinning the success of a navigational application. In the field of mobile robotics, localization technology refers to a systematic approach to determining the current location of a mobile robot, namely the 2-D position and heading angle of the robot, by utilizing uncertain readings of the robot's sensors. Localization in mobile robotics has been well studied and a multitude of methods have been proposed; a good overview of robot localization technology can be found in [2].

Recently, with the advance of sensor and wireless connectivity technologies, it has become more feasible to design and build wireless sensing networks providing context-awareness; accordingly, there is an increasing need for accurate indoor location sensing in domestic, public, and military applications [3][4]. The most well-known location sensing technique using wireless communications may be GPS (Global Positioning System); GPS has dominated all other outdoor radio navigation techniques because of its high accuracy, worldwide availability, and low cost [5]. For indoor location sensing applications, however, environmental effects impose severe impediments to reliably obtaining location information. Multi-path fading, interference, and non-LOS conditions are among the well-known factors which make indoor localization a challenging task. Triangulation techniques based on RSSI (Received Signal Strength Indicator), TOA (Time-Of-Arrival), AOA (Angle-Of-Arrival), and TDOA (Time-Difference-Of-Arrival) are popular techniques for processing radio signals to estimate location information. As mentioned earlier, the performance of these techniques deteriorates severely depending on domain-specific RF factors or the receiver-transmitter geometric configuration. On these grounds, in order to overcome the RF problems, even pattern-recognition approaches have been reported which model the instantaneous spatial RF distribution inside a building [4][6]. More recently, ultra-wideband (UWB) systems have attracted considerable attention as a means of achieving accurate localization [7][8].

Along with these wireless applications, researchers have paid attention to designing intelligent systems which support a human or a robot inside a building environment [9][10][11]. For this kind of space technology, knowledge about spatial information is critical for an intelligent space to be meaningful. For example, in order for a robot to move freely and intelligently inside such a space, very good information about its current location is necessary, as evidenced in the traditional robotics literature [2]. For seamless provision of spatial information about a robot in a large space, cooperating networked sensors embedded in the space are a plausible solution. We think that there is not yet a realistic solution for robotic tasks such as intelligent navigation, as can be appreciated from [12].

In this paper, we describe a localization sensor suite for building a location sensing network; the location sensing network will also be a constituting element of a ubiquitous robotic space, in which a robot or human can receive information assistance from the space, or in which a specific robotic task can be carried out more efficiently than with traditional approaches. The minimal requirement of the proposed sensor suite for the robot localization task comprises two infrared landmark modules attached to the ceiling of the space in question and an image sensor mounted on top of a mobile robot. This configuration is, in fact, well recognized in the robotics community [13][14][15], but there seems to be no rigorous consideration of using such a sensor suite to build a location sensing network in a large indoor environment.

This paper is organized as follows. In Section II, we briefly introduce the design principles used to build a location
sensing network and describe how the proposed localization sensor suite satisfies the design constraints, as well as its advantages. In Sections III and IV, we describe how to obtain the position and heading angle of a mobile robot in the static and dynamic states, respectively. We show the robot localization and navigation performance based on the proposed sensor suite in Section V. Finally, in Section VI, we summarize the contributions and limitations of our proposed sensor suite and suggest future research directions.
II. DEVELOPMENT OF LOCATION SENSING NETWORK

In this section, we describe the design principles of a location sensing network as a key component of a ubiquitous robotic space. We introduce several factors to consider when building the location sensing network.

* Accuracy: for a mobile robot to navigate reliably, the localization error should be less than 10 cm.
* Coverage: for a location sensing network to be meaningful, its coverage should be unlimited, which means that the location sensing network must be highly scalable.
* Availability: location information about a robot should be available at any time of day and at any place in the indoor environment.
* Cost: the constituting elements should be cost effective so that the location sensing network can be deployed in a large indoor environment.

Although there are various other factors to be considered, we regard these four as the primary elements for building a location sensing network. As explained earlier, radio signal processing based on RSSI, TOA, or TDOA was excluded at an early stage due to its vulnerability to RF interference and multi-path effects. Ultrasonic and vision sensors were also considered, but they turned out not to satisfy all of the constraints above. We chose the proposed localization sensor suite as the basic element for inferring location information. First of all, it is very cheap, and by utilizing a wide-angle camera lens, the coverage provided by the localization sensor suite can be as large as 10 meters. More importantly, it is a well-established technology, so we could easily construct each network node. Finally, the robustness of the location data, as well as its accuracy, is guaranteed under modest environmental conditions. In the next section, we describe the developed localization sensor suite in more detail.

III. SELF-LOCALIZATION WITH THE PROPOSED SENSOR SUITE

The location sensing system using the proposed sensor suite is configured such that infrared landmark modules are attached to the ceiling of the space in question and an image sensor is mounted on top of a mobile robot, as shown in Fig. 1. The image sensor is a CCD camera with an infrared band-pass filter. It is oriented to look upward so that its optical axis is perpendicular to the ground. In order to obtain a maximal field of view, a wide-angle camera lens is utilized. Each landmark module contains an infrared LED, whose on-off status is controlled externally by wireless communication. In order to control the LEDs of the landmarks independently, a unique landmark ID is assigned to each infrared landmark module.

Fig. 1 Sensor configuration of the proposed location sensing system.

The location information of a robot can be obtained when at least two infrared LEDs are detected within the field of view of the camera. Therefore, more than two landmarks may be required to cover the whole area of a large indoor environment. The necessary number of infrared landmarks varies according to the 2-D dimensions of the space, the height of the ceiling, and the angle of the camera field of view. The optimal deployment of infrared landmarks is beyond the scope of this paper and will not be described further.

The localization is performed in two steps. In the first step, the image coordinates of the infrared LEDs, if any, are computed and the tag IDs are identified. In the second step, the 2-D location and heading angle of the robot are computed from the image coordinates and world coordinates of the detected LEDs.

A. Detection and Identification of Infrared Landmarks

The infrared band-pass filter attached to the CCD camera transmits only the infrared band, filtering out the visible band. Fig. 2 demonstrates the effect of infrared band-pass filtering. The left image (a) was captured with a normal CCD camera, and the right one (b) was captured with the same camera equipped with an infrared band-pass filter. The infrared LEDs are well discriminated as white spots in the filtered image. As is well known in the vision community, it is very difficult to robustly locate particular patterns in images under varying illumination conditions. Our infrared band-pass filtering solution reduces the detection problem to a simple thresholding problem and, more importantly, enables robust detection of the target landmarks at any time of day.

Fig. 2 Two sample images of a pair of infrared landmarks. (a) Image captured with an ordinary camera. (b) Image captured with an infrared band-pass filter.

Let I(x, y) be an image from which infrared spots are to be detected. First, the image is binarized by fixed thresholding
with a predetermined threshold value. The target blobs of infrared spots are then located by connected component analysis. Let $b_k$ be the $k$th blob located, for $k = 1, \ldots, n$. Finally, the mass center of each blob, $(x_k, y_k)$, is given by

$$x_k = \frac{1}{S_k} \sum_{(x,y) \in b_k} x\, I(x,y), \qquad (1)$$

$$y_k = \frac{1}{S_k} \sum_{(x,y) \in b_k} y\, I(x,y), \qquad (2)$$

where $S_k = \sum_{(x,y) \in b_k} I(x,y)$, for $k = 1, \ldots, n$.
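To make the blob-centroid computation concrete, the following is a minimal Python sketch, not the authors' implementation; the threshold value and the use of scipy for connected component labeling are illustrative assumptions. It binarizes the filtered image, labels connected components, and computes the intensity-weighted mass center of each blob as in Eqs. (1)-(2).

```python
import numpy as np
from scipy import ndimage

def detect_spot_centroids(image, threshold=200):
    """Locate infrared LED spots and return their intensity-weighted
    mass centers, following Eqs. (1)-(2). `image` is a 2-D grayscale
    array I(x, y); `threshold` is the fixed binarization threshold
    (its value here is an illustrative guess)."""
    # Step 1: binarize with a fixed threshold.
    mask = image >= threshold

    # Step 2: connected component analysis to find blobs b_1..b_n.
    labels, n = ndimage.label(mask)

    centroids = []
    for k in range(1, n + 1):
        ys, xs = np.nonzero(labels == k)        # pixels of blob b_k
        weights = image[ys, xs].astype(float)   # I(x, y) over b_k
        s_k = weights.sum()                     # S_k = sum of intensities
        x_k = (xs * weights).sum() / s_k        # Eq. (1)
        y_k = (ys * weights).sum() / s_k        # Eq. (2)
        centroids.append((x_k, y_k))
    return centroids
```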
In order to identify the emitting source of a detected LED spot, only one infrared LED is turned on and the others are turned off before capturing the image. If any spot is detected in the image, it is identified as originating from the landmark that was set to be turned on. This detection and identification procedure is iterated over the landmark modules until two spots are detected.

Camera calibration is a necessary step in this location sensing system because a wide-angle lens is used. We used established calibration methods [16][17][18]; in particular, the initialization of the camera calibration phase was partially inspired by Zhang's algorithm [19].
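The one-LED-at-a-time identification procedure can be sketched as a short loop. This is an illustrative sketch, not the paper's code: `camera.capture()` and `lm.set_led(on)` are hypothetical interfaces standing in for the wireless landmark control, and `detect_spot_centroids` is the routine above.

```python
def identify_landmarks(camera, landmarks, max_spots=2):
    """Stationary identification: switch on one landmark LED at a
    time and attribute any detected spot to that landmark, iterating
    until two spots have been found (assumed interfaces)."""
    found = {}
    for lm in landmarks:
        for other in landmarks:
            other.set_led(other is lm)       # only lm's LED is on
        spots = detect_spot_centroids(camera.capture())
        if spots:
            found[lm.id] = spots[0]          # spot must come from lm
        if len(found) >= max_spots:
            break
    return found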
B. Landmark Image Coordinate Projection for an Uneven Ceiling

When the ceiling to which the landmark modules are attached has a uniform height, the location of the robot can be computed directly from the landmark image coordinates obtained from Eqs. (1) and (2). However, when the landmark modules are attached to ceiling sections of different heights, as shown in Fig. 3, a correction step is needed to revise the image coordinates of the tags. In this paper, the image coordinates are revised using only the height information of the uneven ceiling sections where the landmark modules are attached. In Fig. 3, the actual location of landmark $j$ is converted to the projected position $j'$ through the correction process, and the image coordinate of $j'$ is expressed as $P'(u_j, v_j)$. Among the parameters needed to calculate $P'(u_j, v_j)$, the focal length $f$ and the ceiling height parameters $h_1$ and $h_2$ do not change as the robot moves. However, since $d_1$ changes its value according to the movement of the robot, it has to be calculated as

$$f : P = h_1 : d_1, \qquad d_1 = \frac{P}{f}\, h_1. \qquad (3)$$

Using the calculated parameter $d_1$,

$$f : P' = h_2 : d_1, \qquad P' = \frac{f}{h_2}\, d_1, \qquad (4)$$

and the image coordinate of landmark $j$ is thereby calibrated.

Fig. 3 Landmark diagram for image calibration with ceiling sections of different heights.
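As a worked sketch of this correction (an illustrative reading of Eqs. (3)-(4), not the authors' code): combining the two proportions cancels the focal length, giving $P' = P\,h_1/h_2$, i.e., the observed image offset of a landmark is rescaled by the height ratio. The sketch below assumes the offset is measured from the principal point $(u_0, v_0)$ and that both axes share the same scale factor.

```python
def correct_landmark_coordinate(u, v, u0, v0, h1, h2):
    """Rescale the image coordinate of a landmark attached at ceiling
    height h2 to the reference height h1, following Eqs. (3)-(4):
    d1 = (P / f) * h1 and P' = (f / h2) * d1, hence P' = P * h1 / h2.
    (u0, v0) is the principal point (assumed for this sketch)."""
    scale = h1 / h2
    return (u0 + (u - u0) * scale, v0 + (v - v0) * scale)
```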
C. Computation of 2-D Location and Heading Angle

For the description of the robot localization procedure, we use three coordinate systems: the world coordinate system, the image coordinate system, and an extra coordinate system, denoted by subscripts w, i, and e, respectively. Fig. 4 shows the relationship between the three coordinate systems. The world coordinate system represents the real position in the space in question, and the image coordinate system refers to pixel coordinates in the image. The extra coordinate system is a landmark-oriented coordinate system, introduced for convenience in transforming image coordinates to world coordinates.

In the previous step, we located two LEDs in the image and determined their corresponding landmark IDs. Let $L_i^1$ and $L_i^2$ be the image coordinates of the two detected LEDs, and let $L_w^1$ and $L_w^2$ be the corresponding world coordinates. We assume that the world coordinate of each landmark is already known. The extra coordinate system is constructed from $L_i^1$ and $L_i^2$ so that $L_i^1$ becomes the origin and the y-axis spans from $L_i^1$ to $L_i^2$, as shown in Fig. 4. Let $P_i$ be the image coordinates of the robot. We can set $P_i$ to be the center of the image, since the optical axis of the camera is aligned with the vertical axis of the mobile robot platform and is perpendicular to the ground. The localization task is to compute the world coordinates of the robot, denoted by $P_w$, and the heading angle, denoted by $\theta_r$. This requires two coordinate transformations: first from image coordinates to extra coordinates, and then from extra coordinates to world coordinates. Let $\theta_1$ be the rotation angle between the x-axes of the image and extra coordinate systems.

Fig. 4 Three coordinate systems used in robot localization.

The extra coordinates of the robot, denoted by $P_e$, are then given by
$$P_e = R_{ie}(P_i - T_{ie}), \qquad (5)$$

where the translation $T_{ie}$ and rotation matrix $R_{ie}$ are given by

$$T_{ie} = L_i^1, \qquad (6)$$

$$R_{ie} = \begin{bmatrix} \cos(-\theta_1) & -\sin(-\theta_1) \\ \sin(-\theta_1) & \cos(-\theta_1) \end{bmatrix} = \begin{bmatrix} \cos\theta_1 & \sin\theta_1 \\ -\sin\theta_1 & \cos\theta_1 \end{bmatrix}. \qquad (7)$$

We define a scale factor $s$ to unify the scales of the extra coordinate system and the world coordinate system, given by

$$s = \frac{\lVert L_w^1 - L_w^2 \rVert}{\lVert L_i^1 - L_i^2 \rVert}, \qquad (8)$$

where $\lVert \cdot \rVert$ denotes the L2 norm. Note that, thanks to the scale factor $s$, we do not need the focal length $f$ or the ceiling height parameters when the ceiling to which the landmark modules are attached has a uniform height. The world coordinates of the robot, $P_w$, are then given by

$$P_w = R_{we}\, s P_e + T_{we}, \qquad (9)$$

where

$$R_{we} = \begin{bmatrix} \cos\theta_2 & -\sin\theta_2 \\ \sin\theta_2 & \cos\theta_2 \end{bmatrix}, \qquad (10)$$

$$T_{we} = L_w^1, \qquad (11)$$

and $\theta_2$ is the rotation angle between the x-axes of the world and extra coordinate systems. The heading angle of the robot, $\theta_r$, can then be simply calculated as

$$\theta_r = \theta_1 + \theta_2. \qquad (12)$$

The computed $P_w$ and $\theta_r$ give the final location information of the robot.
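The chain of Eqs. (5)-(12) can be condensed into a short routine. The following Python sketch is illustrative, not the authors' implementation; in particular, the recovery of $\theta_1$ and $\theta_2$ from the landmark baselines assumes a right-handed convention with the extra y-axis running from $L^1$ to $L^2$, and ignores the image-axis flip common in pixel coordinates.

```python
import numpy as np

def localize(p_i, l1_i, l2_i, l1_w, l2_w):
    """Compute the robot's world position P_w and heading angle theta_r
    from the image coordinates of two identified landmarks, following
    Eqs. (5)-(12). A sketch under the paper's assumptions: P_i is the
    image center, landmark world coordinates are known, and the ceiling
    height is uniform (so the scale factor s absorbs f and h).
    All points are 2-D arrays; angles are in radians."""
    p_i, l1_i, l2_i = map(np.asarray, (p_i, l1_i, l2_i))
    l1_w, l2_w = np.asarray(l1_w), np.asarray(l2_w)

    def rot(a):
        return np.array([[np.cos(a), -np.sin(a)],
                         [np.sin(a),  np.cos(a)]])

    # theta_1: rotation between image and extra x-axes. The extra
    # y-axis runs from L1 to L2, so its x-axis is that direction
    # rotated by -90 degrees (a convention assumed for this sketch).
    d_i = l2_i - l1_i
    theta_1 = np.arctan2(d_i[1], d_i[0]) - np.pi / 2

    # Eqs. (5)-(7): image -> extra coordinates.
    p_e = rot(-theta_1) @ (p_i - l1_i)

    # Eq. (8): scale factor between extra and world coordinates.
    s = np.linalg.norm(l1_w - l2_w) / np.linalg.norm(l1_i - l2_i)

    # theta_2: rotation between world and extra x-axes, from the
    # landmark baseline expressed in world coordinates.
    d_w = l2_w - l1_w
    theta_2 = np.arctan2(d_w[1], d_w[0]) - np.pi / 2

    # Eqs. (9)-(11): extra -> world coordinates.
    p_w = rot(theta_2) @ (s * p_e) + l1_w

    # Eq. (12): heading angle.
    theta_r = theta_1 + theta_2
    return p_w, theta_r
```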
IV. DYNAMIC LOCALIZATION USING LANDMARK TRACKING

The proposed location sensing system needs a landmark's ID and image coordinates to measure the position of the robot. When the robot changes direction and moves toward its destination, it is impossible to obtain a landmark's ID and image coordinates dynamically with the stationary-state method described in Section III, since that method turns each landmark's LED on and off and takes images of the light-emitting landmark in both states in order to difference the two images. What concerns us in this section is how to detect a landmark's ID and image coordinates while the robot moves around. In the proposed location sensing system, the landmark IDs and image coordinates are measured from images obtained by turning on the landmark LEDs in order, and the position of the robot is computed from these measurements. However, when the robot moves, the discrepancy between the speed of the robot and the delay resulting from image processing makes it impossible to measure a landmark's ID and image coordinates in this way. Therefore, in order to solve these problems, the following tracking algorithm is proposed in this paper (a code sketch of the window-search update appears at the end of this section).

(1) As shown in Fig. 5 (a), obtain the landmark IDs and image coordinates needed for measuring the position of the robot in a stationary state.
(2) Set a fixed-size searching area centered on the obtained image coordinate of each landmark. The size of a searching area is determined from the linear velocity of the robot.
(3) Turn on all the landmark LEDs.
(4) After the robot moves, measure the position by searching only the masked searching area that was set around the image coordinate of each landmark (shown in Fig. 5 (b)).
(5) Repeat step (4).

By applying the proposed algorithm, landmark image coordinates can be measured in real time without switching the landmark LEDs on and off.

Fig. 5 (a) Detecting the landmarks in a stationary state and establishing the searching area. (b) Tracking method using the searching area.
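The update step (4) can be sketched as follows. This is an illustrative Python sketch, not the paper's code; the pixels-per-centimeter scale, the threshold, and the window padding are assumed constants, and landmark identity is maintained purely by window association since all LEDs are lit.

```python
import numpy as np
from scipy import ndimage

def track_landmarks(image, prev_coords, v_max, dt, threshold=200):
    """One search-window tracking update: for each landmark, look for
    the brightest blob only inside a window centered on its previous
    image coordinate. The window half-size grows with the distance the
    robot can travel between frames (v_max * dt in cm, scaled by an
    assumed pixels-per-cm factor)."""
    px_per_cm = 2.0                       # assumed image scale
    half = int(np.ceil(v_max * dt * px_per_cm)) + 5
    h, w = image.shape
    new_coords = {}
    for lid, (u, v) in prev_coords.items():
        u0, u1 = max(int(u) - half, 0), min(int(u) + half, w)
        v0, v1 = max(int(v) - half, 0), min(int(v) + half, h)
        window = image[v0:v1, u0:u1]
        mask = window >= threshold
        if not mask.any():
            continue                      # landmark left the window
        labels, n = ndimage.label(mask)
        # keep the blob with the largest total intensity
        sums = ndimage.sum(window, labels, index=range(1, n + 1))
        k = int(np.argmax(sums)) + 1
        vy, ux = ndimage.center_of_mass(window, labels, k)
        new_coords[lid] = (u0 + ux, v0 + vy)
    return new_coords
```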
V. EXPERIMENTAL SYSTEMS AND RESULTS

To show the feasibility of the proposed location sensing suite, we implemented an experimental system, as shown in Fig. 6. The robot platform used for the experiments is the ActivMedia Pioneer, equipped with a camera and wireless communication devices to control the landmarks. In order to increase the precision of the robot's position estimate, the wide-angle camera is mounted perpendicular to the ground on top of the robot. An infrared band-pass filter was installed in front of the camera. For the evaluation of the localization performance
of the proposed sensor suite, we conducted a couple of localization experiments by changing the location and heading angle of the mobile robot. Finally, we applied our sensing suite to navigation using this experimental system.

Fig. 6 Experimental equipment for evaluation of the proposed sensing system: the mobile robot platform with a wide-angle camera and infrared filter, the robot controller (notebook), and landmarks with infrared LEDs.

For static localization, the experiment was performed in an indoor environment with two infrared landmark modules attached to the ceiling. For navigation, five landmarks were attached to the ceiling to cover the whole area of the environment and its uneven ceiling heights. The image sensor used is a CCD board camera with a 2.3 mm fisheye lens.

A. Experimental Results of Static Localization

We conducted experiments focusing on the accuracy and repeatability of the position estimated with the sensor suite discussed above. The height of the ceiling is 2.44 m and the distance between the infrared landmarks is 1.20 m. The location information was computed for grid points spaced regularly at 60 cm intervals. Fig. 7 shows the localization results using the proposed sensor suite at the fixed points when the heading angle is zero. The estimated positions are represented by asterisks and the true positions by circles. The mean position error is 4.1 cm with a standard deviation of 2.9 cm. The maximum position error of 17.1 cm occurs at (180, 60) because of uneven ground conditions. The experimental results show that the proposed sensor suite gives acceptable localization performance in an indoor environment.

Table I shows the repeatability results when the robot stayed at the same position, (0, 0) or (200, 200). Compared with the position (0, 0), the standard deviation at (200, 200) is considerably larger. The diminished intensity of the landmark LEDs and the increased camera distortion at the larger distance between the landmarks and the mobile base cause the difference in error and standard deviation. Nevertheless, the standard deviations are small enough to be neglected.

TABLE I
REPEATABILITY RESULTS AT GROUND-TRUTH (0, 0) AND (200, 200)

Index  | Ground-truth (0, 0): estimated (= error) position (x, y) cm | Ground-truth (200, 200): estimated position (x, y) cm | Error (x, y) cm
1      | (-2.1, -2.3) | (198.1, 196.7) | (1.9, 3.3)
2      | (-2.1, -2.3) | (198.5, 196.4) | (1.5, 3.6)
3      | (-2.1, -2.2) | (196.8, 197.1) | (3.2, 2.9)
4      | (-2.1, -2.3) | (197.5, 196.8) | (2.5, 3.2)
5      | (-2.1, -2.3) | (198.2, 196.7) | (1.8, 3.3)
6      | (-2.1, -2.3) | (198.5, 196.4) | (1.5, 3.6)
7      | (-2.1, -2.3) | (197.5, 196.8) | (2.5, 3.2)
8      | (-2.1, -2.2) | (198.1, 196.7) | (1.9, 3.3)
9      | (-2.0, -2.3) | (196.7, 195.3) | (3.3, 4.7)
10     | (-2.1, -2.3) | (197.5, 196.8) | (2.5, 3.2)
Mean^a | (-2.08, -2.28) | (197.74, 196.57) | (2.26, 3.43)
SD^b   | (0.042, 0.042) | (0.65, 0.49) |

^a Mean position and error. ^b Standard deviation.
Fig. 7 Static localization results at the fixed points (estimated positions vs. ground-truth positions).

Fig. 8 Repeatability test results at the same position. (a) Ground-truth position (0, 0). (b) Ground-truth position (200, 200).

B. Experimental Results of Navigation
The navigation performance of the proposed sensor suite was tested in a synthetic home environment measuring 10 m x 6 m. The height of the ceiling varies as follows: 2.44 m from 0 to 5.0 m, and 4.12 m from 5.0 m to 10 m along the x-axis. Five landmarks were attached to the ceiling to cover the whole area of the environment and its uneven ceiling heights, and the speed of the robot was fixed at 30 cm/sec. We navigated the same planned path three times; an A* algorithm generated the path, consisting of a start point, turning points, and an end point. With odometry-based navigation, the accumulated errors increase in proportion to the number of runs, as shown in Fig. 9. In contrast, the proposed sensor suite eliminates the accumulated error and allows the robot to follow the same path repeatedly. The measured location error arising from the change in ceiling height around (5.0, 4.0) affects the smoothness of the robot's movement. We observed that the proposed sensor suite provides real-time location measurement without accumulated error, in contrast to odometry-based navigation.
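For illustration, a closed-loop waypoint follower driven by the sensor suite's pose estimates could look like the sketch below. This is a hypothetical sketch, not the paper's controller: `robot.set_velocity(v, w)` and `localize_fn` returning (x, y, theta) in meters/radians are assumed interfaces, and the gains are arbitrary.

```python
import math

def follow_path(robot, localize_fn, waypoints,
                v=0.30, tol=0.05, k_turn=1.5):
    """Drive through waypoints using the pose (P_w, theta_r) from the
    location sensing network instead of odometry (assumed interfaces)."""
    for wx, wy in waypoints:
        while True:
            x, y, theta = localize_fn()          # P_w and theta_r
            dx, dy = wx - x, wy - y
            if math.hypot(dx, dy) < tol:         # waypoint reached
                break
            # steer toward the waypoint; wrap heading error to [-pi, pi]
            err = math.atan2(dy, dx) - theta
            err = math.atan2(math.sin(err), math.cos(err))
            robot.set_velocity(v, k_turn * err)
    robot.set_velocity(0.0, 0.0)
```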
Fig. 9 Navigation results of the proposed sensor suite and odometry.

VI. CONCLUSION

In this paper, we have proposed a localization sensor suite for the development of a robotic location sensing network. The proposed sensor can operate irrespective of illumination conditions, which is not the case for most vision-based approaches. Experimental results confirm the robustness and accuracy of the location data. As future work, we plan to minimize the position error that arises when landmark modules are attached to ceiling sections of different heights. Furthermore, we are planning to fuse the sensor suite with other localization techniques to minimize the number of required infrared landmark modules and to support error recovery.

ACKNOWLEDGEMENT

This work was supported by the Ministry of Information and Communication, Korea.

REFERENCES

[1] I. J. Cox and G. T. Wilfong, Autonomous Robot Vehicles, Springer-Verlag, 1990.
[2] J. Borenstein, H. R. Everett, and L. Feng, "Where am I? Sensors and Methods for Mobile Robot Positioning," Technical Report, Univ. of Michigan, Apr. 1996.
[3] Fuji-Keizai USA, Inc., "Wireless Sensing Networks: Market, R&D and Commercialization Activities," Market Research Report, Feb. 2004.
[4] K. Pahlavan, X. Li, and J. Makela, "Indoor Geolocation Science and Technology," IEEE Communications Magazine, pp. 112-118, Feb. 2002.
[5] T. S. Rappaport, J. H. Reed, and B. D. Woerner, "Position Location Using Wireless Communications on Highways of the Future," IEEE Communications Magazine, pp. 33-41, Oct. 1996.
[6] K. Yamano et al., "Self-localization of Mobile Robots with RFID System by Using Support Vector Machine," IEEE Int. Conf. Intell. Robots and Systems, pp. 3756-3761, 2004.
[7] J. Y. Lee and R. A. Scholtz, "Ranging in a Dense Multipath Environment Using an UWB Radio Link," IEEE J. Selected Areas in Communications, vol. 20, no. 9, Dec. 2002.
[8] The Ubisense Smart Space Platform, http://www.ubisense.net.
[9] J. H. Lee, K. Morioka, N. Ando, and H. Hashimoto, "Cooperation of Distributed Intelligent Sensors in Intelligent Environment," IEEE/ASME Trans. Mechatronics, vol. 9, no. 3, Sep. 2004.
[10] N. Y. Chong, H. Hongu, K. Ohba, S. Hirai, and K. Tanie, "A Distributed Knowledge Network for Real World Robot Applications," IEEE Int. Conf. Intell. Robots and Systems, pp. 187-192, 2004.
[11] D. Haehnel, W. Burgard, D. Fox, K. P. Fishkin, and M. Philipose, "Mapping and Localization with RFID Technology," IEEE Int. Conf. Robotics and Automation, pp. 1015-1020, 2004.
[12] J. Hightower and G. Borriello, "Location Systems for Ubiquitous Computing," IEEE Computer, vol. 34, no. 8, pp. 57-66, Aug. 2001.
[13] W. Lin, S. Jia, T. Abe, and K. Takase, "Localization of Mobile Robot Based on ID Tag and WEB Camera," IEEE Int. Conf. Robotics and Mechatronics, pp. 851-856, 2004.
[14] Y. Nagumo and A. Ohya, "Human Following Behavior of an Autonomous Mobile Robot Using Light-Emitting Device," IEEE Int. Workshop on Robot and Human Interaction, pp. 18-21, 2001.
[15] J. S. Park and M. J. Chung, "Path Planning with Uncalibrated Stereo Rig for Image-Based Visual Servoing under Large Pose Discrepancy," IEEE Trans. Robotics and Automation, vol. 19, no. 2, pp. 250-258, Apr. 2003.
[16] R. Y. Tsai, "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses," IEEE J. Robotics and Automation, vol. RA-3, no. 4, pp. 323-344, 1987.
[17] T. A. Clarke and J. G. Fryer, "The Development of Camera Calibration Methods and Models," Photogrammetric Record, vol. 16, no. 91, pp. 51-66, Apr. 1998.
[18] P. Sturm and S. Maybank, "On Plane-Based Camera Calibration: A General Algorithm, Singularities, Applications," CVPR, 1999.
[19] Z. Zhang, "Flexible Camera Calibration by Viewing a Plane from Unknown Orientations," ICCV, 1999.