Robot Localization Network for Development of Ubiquitous Robotic Space

Wonpil Yu, Heesung Chae, Jae-Yeong Lee, Nakju Lett Doh, Young-Jo Cho
Intelligent Robot Research Division, ETRI
161 Gajeong-dong, Yuseong-gu, Daejeon, Korea, 305-700
E-mail: [email protected]

Abstract— In this paper, we describe a new localization sensor suite for the development of a location sensing network. The sensor suite comprises wirelessly controlled infrared beacons and an image sensor that detects the pixel positions of infrared sources. We describe the operating principles of the developed sensor suite and report its performance for mobile robot localization. The advantages of the sensor suite lie in the robustness and low cost with which localization information is obtained, as well as the simplicity of deployment when building a robotic location sensing network. Experimental results show that the developed sensor suite outperforms the state-of-the-art localization sensor.

Keywords—Localization, sensor network, mobile robot, navigation

I. INTRODUCTION

Navigation is the science of getting a vehicle from place to place by determining the vehicle's position, course, and distance traveled [1]. Prompt and reasonably accurate information about the vehicle's position is a key component underpinning the success of a navigational application. In the field of mobile robotics, localization refers to a systematic approach to determining the current location of a mobile robot, namely its 2-D position and heading angle, from the robot's uncertain sensor readings. Localization has been well studied in mobile robotics and a multitude of methods have been proposed; a good overview of robot localization technology can be found in [2].

Recently, with the advance of sensor and wireless connectivity technologies, it has become more feasible to design and build wireless sensing networks providing context-awareness; accordingly, there is an increasing need for accurate indoor location sensing in domestic, public, and military applications [3], [4]. The best-known location sensing technique using wireless communications may be GPS (Global Positioning System); GPS has dominated other outdoor radio navigation techniques because of its high accuracy, worldwide availability, and low cost [5]. For indoor location sensing applications, however, environmental effects impose severe impediments to reliably obtaining location information. Multipath fading, interference, and non-LOS conditions are well-known factors, among others, that make indoor localization a challenging task.

Triangulation techniques based on RSSI (Received Signal Strength Indicator), TOA (Time-Of-Arrival), AOA (Angle-Of-Arrival), and TDOA (Time-Difference-Of-Arrival) are popular approaches for processing radio signals to estimate location information. As mentioned earlier, the performance of these techniques can deteriorate severely depending on domain-specific RF factors or the geometric configuration of receiver-transmitter pairs. To overcome RF interference, even pattern-recognition approaches have been reported that model the instantaneous spatial RF distribution of areas of interest inside a building [4], [6]. More recently, ultra-wideband (UWB) systems have attracted considerable attention due to their improved robustness to environmental effects in achieving accurate localization [7], [8].

Along with these wireless applications, researchers have turned their attention to designing intelligent spaces that support a human or a robot inside a building environment [9], [10], [11]. For this kind of space technology, knowledge of spatial information is crucial for an intelligent space to be meaningful. For example, in order for a robot to move freely and intelligently inside such a space, accurate information about its current location is necessary, as evidenced in the traditional robotics literature [2]. For seamless provision of spatial information about a robot in a large space, cooperating networked sensors embedded in the space are a plausible solution. However, there is not yet a realistic solution for a robotic task such as intelligent navigation, as can be appreciated from [12].

In this paper, we describe a localization sensor suite for building a robot localization network; the localization network will also be a constituting element of a ubiquitous robotic space, in which a robot or human can receive information assistance from the space and a specific robotic task can be carried out more efficiently than with traditional approaches [2].
The minimal configuration of the proposed sensor suite for the robot localization task comprises two infrared beacon modules attached to the ceiling of the space in question and an image sensor mounted on top of a mobile robot. This configuration is, in fact, well recognized in the robotics community [13], [14], [15]. As will be described in Section II, several constraints should be satisfied if any localization technique is to be a practical solution for real-world robotic applications, particularly for building an intelligent space incorporating robot navigation.

This paper is organized as follows. In Section II, we briefly introduce the design principles for building a robot localization network and how it relates to building a ubiquitous robotic space. In Section III, we describe the operating principles of the proposed sensor suite. We report the robot localization performance of the proposed localization network in Section IV. An overall description of a currently implemented robotic service is also provided, which is based on the proposed localization network, a sensor network, and existing communications networks. Finally, we conclude the paper and suggest future research directions in Section V.

II. DEVELOPMENT OF A ROBOT LOCALIZATION NETWORK AND ITS EXTENSION TO UBIQUITOUS ROBOTIC SPACE

In this section, we describe the design principles of the proposed localization network as a key component for building a ubiquitous robotic space. We introduced the following constraints for building a realistic robot localization network.
• Accuracy: for the convenience of developing robot navigation tasks, the positional accuracy should be better than 20 cm in each of x and y, with an orientation accuracy better than 5°.
• Repeatability: for a mobile robot to navigate reliably, jitter in the location information should be bounded by 1 cm of positional error variance and 1° of orientation error variance.
• Coverage: for a localization network to be meaningful, its coverage should be unlimited, which means that the localization network must be highly scalable.
• Response time: the update frequency for locating a robot should be high enough; it was set to more than 10 Hz for providing the 3-D localization information (x, y, θ).
• Availability: location information about a robot should be available at any time of day and at any place in an indoor environment.
• Deployment: the localization network should be capable of wireless operation except for the power supply.
• Cost: the constituent elements should be cost-effective so that the proposed localization network can be deployed in a large indoor environment.

As explained earlier, radio signal processing based on RSSI, TOA, or TDOA was excluded at an early stage because of its vulnerability to RF interference and multipath effects. Conventional robotic sensors such as ultrasonic or vision sensors were also considered, but were ruled out because they did not satisfy all of the constraints above.

The conceptual structure of a ubiquitous robotic space (URS) is illustrated in Fig. 1. The proposed URS comprises three spaces: physical, semantic, and virtual. The information from the proposed localization network goes to the semantic space, where contextual information based on the robot's current location is extracted along with sensor data provided by sensor networks. The virtual space provides a user-centric display of the URS as well as remote monitoring and control of the URS through the Internet or a communications network such as CDMA. The localization network plays a key role underlying the whole ubiquitous robotic space. As can be noticed, in order to build a URS as a serious business solution based on robotic technology, the constraints above should be completely satisfied.

Fig. 1. Conceptual structure of a ubiquitous robotic space.

Fig. 2. Sensor configuration for estimating robot location.

We chose an optical tracking scheme to realize a localization network satisfying the constraints above. Fig. 2 shows the basic sensor configuration for building the localization network. First of all, each beacon can be made very cheaply, and by using a wide-angle camera lens, the coverage provided by the localization sensor suite can be as large as 10 meters, depending on the ceiling height. More importantly, localization based on triangulation is a well-established technology, so we could easily construct each network node, which is an IR-emitting beacon containing a communication module. Finally, the repeatability of the location data as well as its accuracy is guaranteed under modest environmental conditions. In the next section, we describe the basic localization sensor suite developed for building a localization network in more detail.

III. SELF-LOCALIZATION WITH THE PROPOSED LOCALIZATION SENSOR SUITE

The proposed sensor suite is configured such that infrared beacon modules are attached to the ceiling of the space in question and an image sensor is mounted on top of a mobile robot, as shown in Fig. 2. The image sensor is a CCD camera with an infrared band-pass filter. It is oriented to look upward so that its optical axis is perpendicular to the ground. To maximize the field of view, a wide-angle camera lens is used. Each beacon module contains an infrared LED whose on/off status is controlled externally by wireless communication. In order to control the LEDs of the beacons independently, a unique beacon ID is assigned to each infrared beacon module.

The location of the robot can be obtained when at least two infrared LEDs are detected within the field of view of the camera. Therefore, more than two beacons are required to cover the whole area of a large indoor environment. The required number of infrared beacons varies with the 2-D geometry of the space, the height of the ceiling, and the field of view of the camera. The optimal distribution of infrared beacons is beyond the scope of this paper and will not be described further.

The localization is performed in two steps: in the first step, the image coordinates of the infrared LEDs are computed and the beacon IDs are identified. In the second step, the 2-D position and heading angle of the robot are computed from the image coordinates and world coordinates of the detected LEDs.
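The first step, combined with the one-LED-at-a-time identification protocol described in Section III-A, can be sketched as follows. This is an illustrative outline, not the paper's implementation; the beacon-control and spot-detection interfaces (`set_beacon`, `capture`, `find_spot`) are hypothetical placeholders.

```python
# Hypothetical interfaces: set_beacon(beacon_id, on) switches one beacon's IR LED
# over the wireless link, capture() grabs a filtered image, and find_spot(image)
# returns the (x, y) image coordinate of a detected infrared spot, or None.

def identify_two_beacons(beacon_ids, set_beacon, capture, find_spot):
    """Turn on one IR LED at a time and record the image coordinate of each
    detected spot under its beacon ID, until two beacons have been seen."""
    detections = {}
    for bid in beacon_ids:
        for other in beacon_ids:            # only one LED lit per frame
            set_beacon(other, other == bid)
        spot = find_spot(capture())         # thresholding + blob detection
        if spot is not None:                # any spot must come from beacon bid
            detections[bid] = spot
        if len(detections) == 2:            # two beacons suffice for a pose fix
            break
    return detections
```

A real system would also handle the case where fewer than two beacons are visible, e.g., by falling back on odometry until the robot re-enters a covered area.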

A. Detection and Identification of Infrared Beacons

An optical filter attached to the front of the camera lens transmits only infrared light narrowly confined around a predetermined wavelength. Fig. 3-(a) and (b) demonstrate the effect of infrared band-pass filtering. The top image was captured with a normal CCD camera and the bottom one with the same camera fitted with an infrared band-pass filter. As can be noticed from Fig. 3-(b), the infrared LEDs are clearly discriminated as white spots in the bottom image.

Fig. 3. Two sample images of a pair of infrared beacons. The top image (a) was captured without a filter and the bottom (b) with an infrared band-pass filter.

Generally speaking, it is very difficult to robustly locate particular patterns in images captured under varying illumination conditions. Infrared band-pass filtering reduces the detection problem to simple thresholding; more importantly, it enables robust detection of the target beacons at any time of day. Owing to these properties, the developed optical tracking algorithm is simple enough to implement as an embedded imaging solution, which in turn guarantees low cost and robustness against ambient lighting conditions.

Let I(x, y) be an image from which infrared spots are to be detected. First, the image is dichotomized by thresholding with a predetermined threshold value. The target blobs of infrared spots are then located by connected component analysis. Let b_k be the set of pixels belonging to the k-th blob found by connected component analysis. The center of mass of each blob, (x_k, y_k), is given by

x_k = Σ_{(x,y)∈b_k} x I(x, y),   (1)

y_k = Σ_{(x,y)∈b_k} y I(x, y),   (2)

where k = 1, . . . , n.
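The thresholding and connected-component step can be sketched as follows. This is a minimal pure-Python illustration (function and parameter names are ours, not the paper's); here each centroid is normalized by the blob size so that it is a pixel coordinate.

```python
def detect_ir_spots(image, threshold=200):
    """Binarize `image` (a 2-D list of gray values), group above-threshold
    pixels into 4-connected blobs, and return each blob's center of mass."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y0 in range(h):
        for x0 in range(w):
            if image[y0][x0] > threshold and not seen[y0][x0]:
                # Flood-fill one connected component of bright pixels.
                stack, blob = [(x0, y0)], []
                seen[y0][x0] = True
                while stack:
                    x, y = stack.pop()
                    blob.append((x, y))
                    for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                        if 0 <= nx < w and 0 <= ny < h \
                                and image[ny][nx] > threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                # Center of mass (x_k, y_k) of the k-th blob, cf. Eqs. (1)-(2).
                n = len(blob)
                centroids.append((sum(p[0] for p in blob) / n,
                                  sum(p[1] for p in blob) / n))
    return centroids
```

In an embedded deployment the same logic would typically run over the raw sensor buffer; the narrow band-pass filter is what keeps a fixed threshold reliable across lighting conditions.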

In order to distinguish each LED spot, only one infrared LED is turned on while the others remain off before capturing the image. If a spot is detected in the image, it is identified as originating from the beacon that was set to be turned on. This detection and identification procedure is repeated for each beacon module until two spots have been detected in the current image.

B. Computation of 2-D Position and Heading Angle of a Robot

For the description of the robot localization procedure, we introduce three coordinate systems: the world, image, and extra coordinate systems, denoted by subscripts w, i, and e, respectively. Fig. 4 shows the relationship among the three coordinate systems. The world coordinate represents the real position in the space in question, and the image coordinate refers to the pixel coordinate in the image. The extra coordinate system is a beacon-oriented coordinate system introduced for the sake of transforming from image coordinates to world coordinates.

Fig. 4. Three coordinate systems used for robot localization.

In the previous step, we located two LEDs in the image and determined their corresponding beacon IDs. Let L1i and L2i be the image coordinates of the two LEDs, and let L1w and L2w be the corresponding world coordinates. We assume that the world coordinate of each beacon is already known. The extra coordinate system is constructed from L1i and L2i so that L1i becomes the origin and the y-axis spans from L1i to L2i, as shown in Fig. 4. Let Pi be the image coordinate of the robot. We can set Pi to be the center of the image by assuming that the optical center of the camera is coincident with the rotation axis of the robot.

The localization task is to compute the world coordinate of the robot's position, denoted by Pw, and its heading angle, denoted by θr. This requires two coordinate transformations: from image coordinates to extra coordinates, and then from extra coordinates to world coordinates. Let θ1 be the rotation angle between the x-axes of the image and extra coordinate systems. The extra coordinate of the robot, denoted by Pe, is then given by

Pe = Rie (Pi − Tie),   (3)

where Tie is a translation vector and Rie is a rotation matrix, given by

Rie = | cos(−θ1)  −sin(−θ1) |
      | sin(−θ1)   cos(−θ1) |,   (4)

Tie = L1i.   (5)
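The transformation begun in Eqs. (3)-(5), completed by the scale factor and world-coordinate transform of Eqs. (6)-(10) that follow, can be sketched end-to-end as below. This is an illustrative Python sketch, not the paper's code; the variable names are ours, and it assumes the image axes share the handedness of the world frame.

```python
import math

def localize(L1i, L2i, L1w, L2w, Pi):
    """Compute the robot's world position Pw and heading angle from two
    detected beacons, following Eqs. (3)-(10): image -> extra -> world.

    L1i, L2i : image coordinates of the two identified LEDs
    L1w, L2w : known world coordinates of the same beacons
    Pi       : image coordinate of the robot (the image center, assuming the
               optical center coincides with the robot's rotation axis)
    """
    # theta1: angle between the image and extra x-axes; the extra y-axis
    # points from L1i to L2i, so its x-axis lies 90 degrees clockwise of it.
    th1 = math.atan2(L2i[1] - L1i[1], L2i[0] - L1i[0]) - math.pi / 2
    # theta2: angle between the world and extra x-axes, defined analogously.
    th2 = math.atan2(L2w[1] - L1w[1], L2w[0] - L1w[0]) - math.pi / 2

    # Eqs. (3)-(5): extra coordinate of the robot, Pe = R_ie (Pi - T_ie).
    dx, dy = Pi[0] - L1i[0], Pi[1] - L1i[1]
    pe = (math.cos(-th1) * dx - math.sin(-th1) * dy,
          math.sin(-th1) * dx + math.cos(-th1) * dy)

    # Eq. (6): scale factor s = ||L1w - L2w|| / ||L1i - L2i||.
    s = math.hypot(L2w[0] - L1w[0], L2w[1] - L1w[1]) / \
        math.hypot(L2i[0] - L1i[0], L2i[1] - L1i[1])

    # Eqs. (7)-(9): world coordinate Pw = R_we^{-1} * s * Pe + T_we.
    pw = (math.cos(th2) * s * pe[0] - math.sin(th2) * s * pe[1] + L1w[0],
          math.sin(th2) * s * pe[0] + math.cos(th2) * s * pe[1] + L1w[1])

    # Eq. (10): heading angle theta_r = theta2 - theta1 + pi/2.
    th_r = th2 - th1 + math.pi / 2
    return pw, th_r
```

As a sanity check, if the two beacons lie along the y-axis in both frames and are twice as far apart in world units as in pixels, a robot seen at image offset (1, 1) from the first beacon comes out at world position (2, 2) with heading π/2.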

We define a scale factor s that scales extra coordinates to the units of the world coordinate system:

s = ||L1w − L2w|| / ||L1i − L2i||,   (6)

where || · || denotes the L2-norm. The world coordinate of the robot, Pw, is then given by

Pw = Rwe⁻¹ · s Pe + Twe,   (7)

Rwe⁻¹ = | cos(θ2)  −sin(θ2) |
        | sin(θ2)   cos(θ2) |,   (8)

Twe = L1w,   (9)

where θ2 is the rotation angle between the x-axes of the world and extra coordinate systems. The heading angle of the robot, θr, can then be simply calculated as

θr = θ2 − θ1 + π/2.   (10)

The computed Pw and θr give the final localization information of the robot.

IV. EXPERIMENTAL RESULTS

To evaluate the localization performance of the proposed sensor suite, we conducted an experiment in which the position and heading angle of a mobile robot were varied. The experiment was performed in an indoor environment with three infrared beacon modules attached to the ceiling, spaced 120 cm apart. The image sensor was a CCD board camera with a 1.9 mm fisheye lens. The location information was computed at grid points regularly spaced 60 cm apart. Fig. 5 shows the localization results using the proposed sensor suite. The estimated positions are represented by circles and the true positions by asterisks. The mean position error is 4.1 cm with a standard deviation of 2.9 cm. The maximum position error of 17.1 cm, observed at (180, 60), is due to an uneven ground condition; errors caused by uneven surfaces can be easily detected and corrected by incorporating odometry readings. Position jitter was measured to be less than 0.5 cm and 0.3°, which well satisfies the repeatability constraint.

The developed localization suite and its extension to a localization network were used to build a ubiquitous robotic space. A prototype URS was implemented on the ground floor of our building, an area measuring 22.8 m × 21.6 m. A sensor network based on the ZigBee protocol was also installed in the same space to gather environmental data, which are delivered to the semantic space (see Fig. 1) along with the localization information to infer contextual information. Fig. 6 shows the metric map of the space. Thick blue lines represent the trajectory along which the robot carries out a routine monitoring task. Small dots represent the positions of the developed localization sensor suites, and large green dots represent the positions of the ZigBee sensor nodes.

When an irregular situation is detected by the contextual information processing in the semantic space, a new navigation task is issued to the robot to visit the spot in question. The whole system can be monitored on a remote PC or PDA; irregular situations are promptly reported to the user through the CDMA communications network. By integrating all of the elements above, a robotic security application was experimentally implemented, and the proposed localization network was found to provide information of acceptable quality for carrying out robotic tasks according to contextual information in the space.

V. CONCLUSION

In this paper, we have proposed a robot localization network for the development of a ubiquitous robotic space, in which a robot or human can receive information assistance from the space and a specific robotic task can be carried out efficiently. Experimental results confirmed the robustness and accuracy of the localization data. By implementing a prototype ubiquitous robotic space, we confirmed that the proposed localization network satisfies the design constraints imposed for developing a realistic space technology. One of our future works includes enlarging the area of the developed ubiquitous
robotic space to tackle the issues of large-scale map building assisted by sensor networks and the localization network; dynamic network routing considering the mobility of the robot; and integration of other radio perception technologies such as RFID to further enhance the perception capability of the robot.

Fig. 5. Localization results with the proposed sensor suite.

Fig. 6. Metric map of a developed ubiquitous robotic space.

ACKNOWLEDGMENT

This work was supported by the Ministry of Information and Communication, Korea.

REFERENCES

[1] I. J. Cox and G. T. Wilfong, Autonomous Robot Vehicles. Springer-Verlag, 1990.
[2] J. Borenstein, H. R. Everett, and L. Feng, "Where am I? Sensors and methods for mobile robot positioning," Univ. of Michigan, Tech. Rep., Apr. 1996.
[3] Wireless Sensing Networks: Market, R&D and Commercialization Activities. New York: Fuji-Keizai USA, Inc., Feb. 2004.
[4] K. Pahlavan, X. Li, and J. Makela, "Indoor geolocation science and technology," IEEE Communications Magazine, pp. 112–118, Feb. 2002.

[5] T. S. Rappaport, J. H. Reed, and B. D. Woerner, "Position location using wireless communications on highways of the future," IEEE Communications Magazine, pp. 33–41, Oct. 1996.
[6] K. Yamano, K. Tanaka, M. Hirayama, E. Kondo, Y. Kimuro, and M. Matsumoto, "Self-localization of mobile robots with RFID system by using support vector machine," in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Sendai, Japan, Sept. 2004, pp. 3756–3761.
[7] J. Y. Lee and R. A. Scholtz, "Ranging in a dense multipath environment using an UWB radio link," IEEE J. Selected Areas in Communications, vol. 20, no. 9, pp. 1677–1683, Dec. 2002.
[8] The Ubisense smart space platform. [Online]. Available: http://www.ubisense.net
[9] J. H. Lee, K. Morioka, N. Ando, and H. Hashimoto, "Cooperation of distributed intelligent sensors in intelligent environment," IEEE/ASME Trans. Mechatronics, vol. 9, no. 3, pp. 535–543, Sept. 2004.
[10] N. Y. Chong, H. Hongu, K. Ohba, S. Hirai, and K. Tanie, "A distributed knowledge network for real world robot applications," in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Sendai, Japan, Sept. 2004, pp. 187–192.
[11] D. Haehnel, W. Burgard, D. Fox, K. P. Fishkin, and M. Philipose, "Mapping and localization with RFID technology," in Proc. IEEE Int. Conf. on Robotics and Automation (ICRA), New Orleans, USA, Apr. 2004, pp. 1015–1020.
[12] J. Hightower and G. Borriello, "Location systems for ubiquitous computing," IEEE Computer, vol. 34, no. 8, pp. 57–66, Aug. 2001.
[13] W. Lin, S. Jia, T. Abe, and K. Takase, "Localization of mobile robot based on ID tag and WEB camera," in Proc. IEEE Int. Conf. on Robotics, Automation and Mechatronics, Singapore, Dec. 2004, pp. 851–856.
[14] Y. Nagumo and A. Ohya, "Human following behavior of an autonomous mobile robot using light-emitting device," in Proc. IEEE Int. Workshop on Robot and Human Interactive Communication, Bordeaux-Paris, Sept. 2001, pp. 225–230.
[15] J. S. Park and M. J. Chung, "Path planning with uncalibrated stereo rig for image-based visual servoing under large pose discrepancy," IEEE Trans. Robotics and Automation, vol. 19, no. 2, pp. 250–258, Apr. 2003.
