International Conference on Mobile Ubiquitous Computing, Systems, Services and Technologies
An Ambient Robot System Based on Sensor Network: Concept and Contents of Ubiquitous Robotic Space

Kyuseo Han, Jaeyoung Lee, Sangik Na, and Wonpil Yu
Electronics and Telecommunications Research Institute, Daejeon, Korea
{kyuseo, jylee, nsi, ywp}@etri.re.kr
Abstract. In this paper, we demonstrate a mobile robot application built on a ubiquitous sensor network. Sensor network systems embedded in the environment enable robots to navigate and localize themselves more precisely and to respond more intelligently to events occurring in the space where robots and the environment cooperate with each other, which we call the "Ubiquitous Robotic Space". The ubiquitous robotic space consists of three conceptual spaces: physical, semantic, and virtual. These spaces are seamlessly connected and provide suitable information to robots and users. In the experiments, we show one possible future application based on a sensor network embedded in a real-life environment.
1. Introduction

Recent advances in sensor network technology have played a key role in various applications, such as remote inspection of electricity and water, remote health care, one-stop payment, and intelligent buildings. More applications will increasingly use information from sensor networks to support additional services. One of the more realizable near-future solutions supported by a USN (Ubiquitous Sensor Network) focuses on LBS (Location Based Services), such as intelligent car navigation, accurate people or event detection and localization, rescue activity, and emergency management. A USN also provides valuable information on static as well as dynamic changes of the environment. Therefore, how to cooperate closely with an intelligent environment will be of great importance.

There has been recent research on developing robot technology with ambient intelligence. The TAG project at AIST proposed a distributed-knowledge robot control scheme in which every object carries an RFID tag holding a network address and the knowledge a robot needs in order to handle that object [1]. The Intelligent Space project aimed to construct an intelligent environment that can monitor what is happening inside it and communicate with the people present, using only vision sensors [2]. The GUIDE project aims to observe and understand human activities by attaching RFID tags to all objects of interest [3]. These previous works have mainly concentrated on how to manage objects in the environment or how to build an intelligent environment.

In this paper, we propose a revised robotic system that deals with information from the environment through a sensor network, which we call the URS (Ubiquitous Robotic Space). The URS aims to enhance conventional navigation techniques for mobile robots and to convert increasingly immense environmental data into meaningful information, so that robots can provide more flexible and adaptive services. The URS consists of three major conceptual spaces combined into an intelligent distributed robotic space: the physical, semantic, and virtual robotic spaces. The rest of this paper is organized as follows. Section 2 describes the URS in terms of the characteristics of each conceptual space. Section 3 presents the sensor network configurations used in the experimental stage of the URS. Section 4 explains the details of the experiment, including how the robot and the sensor network are interconnected. The last section discusses the current state of the URS and future research.
Figure 1. Conceptual structure of the ubiquitous robotic space: physical, semantic, and virtual URS.
2. URS (Ubiquitous Robotic Space)

The close connection between environmental information and robot functionality is what characterizes the URS. A conventional mobile robot acquires information by itself using its own built-in sensors, such as sonar, infrared sensors, and laser range finders. The main shortcomings of the information used by traditional mobile robots are the irregularity and instability of its quality; in particular, such robots have shown little robustness to dynamic changes of the environment. The recently evolving ubiquitous sensor network overcomes these disadvantages by providing rich information about the environment.

Fig. 1 shows the conceptual structure of the URS, which comprises three major sub-URSs: the physical URS, the semantic URS, and the virtual URS. From the sensor network point of view, the physical URS is the most important with respect to obtaining information from the environment. In fact, the sensor network, the robot, and the objects in a given environment are located in the physical URS. In other words, all environmental information, including the location of each embedded object and the various sensing data, comes from the physical URS. The physical URS is therefore both a source provider for the other two sub-URSs and a self-contained player, meaning that robots can carry out their tasks with the physical URS alone.

Meanwhile, the semantic URS manages the domain knowledge database and processes raw sensing data from the physical URS into logically contextual information describing the states of the physical world. The result of the semantic URS is propagated back into the physical URS as chunks of actions the robot will perform in response to actual events in the real world.

The role of the virtual URS is closely tied to communicating with users. The virtual URS receives information from the physical URS or the semantic URS and presents it to users in a form that is as easily readable and recognizable as possible. Additionally, to display information more conveniently, the virtual URS is able to perform sensor data fusion. Within the URS, each sub-URS can exchange its own information with interoperability support: information generated in the physical URS is seamlessly delivered to the other two URSs, and information modified in those two URSs can be sent back and forth without conflict.
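To make the flow among the three sub-URSs concrete, the following Python sketch mirrors one sensing-interpretation-action cycle as described above. It is only an illustration: all class names, fields, and the example door event are our own and are not part of the URS implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SensorReading:
    """Raw sensing datum produced in the physical URS (illustrative fields)."""
    node_id: str
    kind: str          # e.g. "position", "door_switch", "temperature"
    value: float

@dataclass
class RobotAction:
    """Chunk of action propagated back into the physical URS."""
    name: str
    target: tuple      # e.g. a goal position (x, y)

class PhysicalURS:
    """Holds the sensor network, the robot, and the objects; yields raw data."""
    def sense(self) -> List[SensorReading]:
        return [SensorReading("door-1", "door_switch", 1.0)]
    def execute(self, action: RobotAction) -> None:
        print(f"robot executes {action.name} toward {action.target}")

class SemanticURS:
    """Maps raw readings to contextual events and to actions for the robot."""
    def interpret(self, readings: List[SensorReading]) -> List[RobotAction]:
        actions = []
        for r in readings:
            if r.kind == "door_switch" and r.value > 0.5:
                actions.append(RobotAction("move_to", (3.0, 1.5)))  # door position (assumed)
        return actions

class VirtualURS:
    """Presents the current state of the physical/semantic URS to users."""
    def display(self, readings: List[SensorReading], actions: List[RobotAction]) -> None:
        print(f"map view: {len(readings)} readings, {len(actions)} pending actions")

# One cycle of the information flow described above.
physical, semantic, virtual = PhysicalURS(), SemanticURS(), VirtualURS()
readings = physical.sense()
actions = semantic.interpret(readings)
virtual.display(readings, actions)
for a in actions:
    physical.execute(a)
```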
3. Sensor network configuration for the URS

The URS currently uses two different sensor network systems, one for robot navigation and localization and one for processing environmental events. For robot localization, we developed a new positioning sensor, called StarLITE, consisting of wirelessly controlled infrared LED tags and a detector: the tags emit light in response to a control signal, while the detector transmits the control signal and captures the image [4]. Fig. 2 shows the detector and the tag used to localize the robot. The tags are typically attached to the ceiling of the environment, and the detector is mounted on top of the robot and produces localization information by optically tracking the tags. Its position and orientation errors are bounded by 5 cm and 1 degree, respectively. The update frequency is 30 Hz, and the localization coverage is 5 x 5 square meters when the ceiling height is three meters. When the positioning sensor network is installed in an environment, the coverage of the network should be considered so that its stability and accuracy are satisfied. Basically, no specific topology is required to construct the localization sensor network, because the nodes do not need to be connected to each other. Topologically, this localization network resembles a star network, but no message or information is forwarded from one node to another through a central switch; communication occurs only on the links between the central switch (the detector on top of the robot) and the nodes (the tags on the ceiling).

Figure 2. StarLITE localization sensor based on infrared light: (a) detector and (b) tag.
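As a side note on installation planning, the coverage figure quoted above can be turned into a rough tag-layout estimate. The sketch below assumes each tag provides a square, non-overlapping 5 m x 5 m coverage cell laid out on a regular grid; this layout rule is our own simplification and is not part of StarLITE.

```python
import math

def tag_grid(floor_w_m: float, floor_d_m: float, coverage_m: float = 5.0):
    """Estimate how many ceiling tags are needed and where to place them,
    assuming each tag covers a coverage_m x coverage_m square at 3 m ceiling
    height and the tags are placed on a regular grid (our simplification)."""
    cols = math.ceil(floor_w_m / coverage_m)
    rows = math.ceil(floor_d_m / coverage_m)
    positions = [((c + 0.5) * floor_w_m / cols, (r + 0.5) * floor_d_m / rows)
                 for r in range(rows) for c in range(cols)]
    return positions

# Example: a 12 m x 8 m office zone needs a 3 x 2 grid of six tags.
print(tag_grid(12.0, 8.0))
```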
For processing environmental events, the physical URS includes a motion detector with a magnetic bar and a temperature/humidity detector built on the ZigBee standard stack [5] as an event-oriented sensor network. The ZigBee network supports autonomous addition of new sensors and flexible configuration of the network topology. The ZigBee sensor suite consists of two node types: fixed reference nodes and moving nodes; the former provide the reference coordinate frame, while the latter report the distance and the data used to calculate two-axis velocity. The topology of this sensor network is mainly a star network: each central switch, which is a ZigBee reference node of one star network, allows moving nodes to join or leave the network flexibly. Since ZigBee is only a wireless transmission specification, we can freely mount as many sensors as a task requires on top of the ZigBee stack. In addition, ZigBee itself can be used as a positioning sensor by analyzing signal strength. The structure of the ZigBee localization sensor network is depicted in Fig. 6. However, it suffers from low resolution of the positioning data and from signal interference caused by environmental structures. For localization, a coarse-to-fine approach is therefore needed, using ZigBee sensors to obtain a low-resolution position and StarLITE to obtain a high-resolution one.
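A minimal way to express this coarse-to-fine idea is to prefer the StarLITE fix whenever it is available and fall back to the ZigBee signal-strength estimate otherwise. The selection rule and function below are our own sketch, not the deployed localization module.

```python
from typing import Optional, Tuple

Position = Tuple[float, float]

def coarse_to_fine(zigbee_estimate: Optional[Position],
                   starlite_estimate: Optional[Position]) -> Optional[Position]:
    """Combine the two positioning sources in the coarse-to-fine spirit
    described above: the low-resolution ZigBee (signal-strength) estimate is
    used only when no high-resolution StarLITE fix is available, e.g. when the
    robot is outside optical coverage. The selection rule is our assumption."""
    if starlite_estimate is not None:
        return starlite_estimate          # fine: position error bounded by ~5 cm
    return zigbee_estimate                # coarse: RSSI-based, low resolution

print(coarse_to_fine(None, (2.10, 3.42)))      # fine fix available
print(coarse_to_fine((2.0, 3.5), None))        # fall back to coarse estimate
```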
4. Experiments

To validate the URS concept, we constructed the three sub-URSs in a real environment. The test space is an office environment consisting of three navigation zones on the first level of our building and another navigation zone on the second level; a further URS is constructed in another building at a remote site. The reason for having several navigation zones is to perform zone-to-zone navigation, covering a large working area by dividing it into a few smaller areas [6].

The physical URS is composed of one mobile robot that we developed, called ROMI (Fig. 4), and two sensor networks: one for localization and the other for checking environmental events, such as unexpected intrusion. The event-checking network is closely related to the semantic URS, which interprets sensing data into meaningful information for understanding the current state of the environment or for detecting events in it. The sync node of the ZigBee sensor network is embedded in ROMI and connects to the semantic URS installed on ROMI's main control board. The output of the semantic URS consists of the type of event occurring in the environment and the actions to be performed by ROMI, which in this experiment is the position to move to.

The virtual URS manages the map and the robot position displayed in a web-based client, so that users can see where the robot currently is on the map together with subsidiary images of what the robot currently sees. Fig. 3 shows the working web-based client displaying the map, the robot position, and the images. Additionally, the 3D virtual map is aligned with the real map and the real robot position to make it easier for users to understand the current situation and environment, as depicted in Fig. 3.

Figure 3. Web-based client system for the virtual URS: (a) client displaying the 2-D map, robot position, and images; (b) engaging the 3D map via VRML.
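As an illustration of the kind of interface such a web-based client relies on, the following sketch publishes the robot's current pose as JSON over HTTP so that a browser page could poll and render it. The endpoint path, port, and state fields are our assumptions; the actual client described above is not built on this code.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Current robot state as the virtual URS might publish it (values are dummies).
robot_state = {"x": 2.1, "y": 3.4, "theta_deg": 90.0, "zone": "first-level-1"}

class StateHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robot":                      # client polls this URL
            body = json.dumps(robot_state).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("", 8080), StateHandler).serve_forever()
```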
In Fig. 5, the two sub-URSs, the physical and the semantic URS, are coherently interconnected to perform the following action: when the door is opened unexpectedly, the semantic URS creates an action to move to the door and sends the corresponding message to the robot through the sensor network in the physical URS. In addition, the images shown on the client are also sent to a mobile phone over the 3G mobile network for remote monitoring.
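The door-open scenario can be summarized as a single event-to-action rule in the semantic URS. The sketch below is our own illustration of that rule; the place table, event names, and message format are assumptions, not the deployed system.

```python
from dataclasses import dataclass

# Assumed positions of interesting places in the test zone (illustrative values).
PLACES = {"door": (3.0, 1.5)}

@dataclass
class Event:
    source: str    # sensor node id
    kind: str      # e.g. "door_opened"

def semantic_response(event: Event):
    """Rule corresponding to the scenario above: an unexpected door-open event
    produces a move-to-door action for ROMI. The rule table and message format
    are our illustration of the idea, not the deployed semantic URS."""
    if event.kind == "door_opened":
        return {"action": "move_to", "target": PLACES["door"], "notify_client": True}
    return None

msg = semantic_response(Event(source="door-node-7", kind="door_opened"))
print(msg)   # would be sent to the robot via the sync node in the physical URS
```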
5. Conclusion

In this paper, we introduced the concept and structure of the URS. As described above, the URS has three sub-URSs: the physical, semantic, and virtual URS. The functionality of each sub-URS is summarized as follows.

The physical URS maintains the sensor network, the robot, and the objects in the environment. Its key role is to produce sensing data through the sensor network and to make the robot perform its own tasks based on the actions generated by the semantic URS or by the physical URS itself. The semantic URS manages domain knowledge and converts raw sensing data into contextual information based on intelligence such as ontologies; its results are propagated to the physical URS as a set of actions or to the virtual URS as logical expressions. The virtual URS provides the communication interface between users and the URS; through it, users can easily understand what is happening in the environment and what actions the robot will need to take.

Finally, a robot combined with a sensor network is a good example of how to integrate USN technology with other technologies, and it suggests new services produced by a new paradigm: communicating with the environment.

6. References

[1] http://staff.aist.go.jp/k.ohba/index_en.htm
[2] J.-H. Lee and H. Hashimoto, "Intelligent Space – Its Concept and Contents", Advanced Robotics, Vol. 16, No. 4, 2002.
[3] M. Philipose, K. P. Fishkin, D. Fox, H. Kautz, D. Patterson, and M. Perkowitz, "Guide: Towards Understanding Daily Life via Auto-Identification and Statistical Analysis", UbiHealth 2003, Seattle, WA, USA, Oct. 2003.
[4] B. Sohn, J. Lee, H. Chae, and W. Yu, "An Embedded Localization Sensor Based on IR Landmark for Indoor Mobile Robot", URAI 2006, Oct. 2006, pp. 452-453.
[5] IEEE 802.15.4-2006: Wireless Medium Access Control (MAC) and Physical Layer (PHY) Specifications for Low-Rate Wireless Personal Area Networks (WPANs).
[6] J. Lee, H. Chae, H. Ahn, W. Yu, and Y. Cho, "Development of Ubiquitous Robotic Space for Networked Robot", URAI 2006, Oct. 2006, pp. 172-176.
Figure 4. ROMI (Robo-Mate for Information and Communication), developed by ETRI.
Figure 5. A scenario interconnecting the physical URS and the semantic URS. When the door is opened unexpectedly, the sensor node on the door fires and sends a signal into the sensor network; the signal is then delivered to the sync node embedded in the mobile robot. The semantic URS generates actions and sends them to the robot in the physical URS in response to the request from the sensor network in the physical URS.
Figure 6. Structure of the ZigBee localization sensor network, with fixed reference nodes and a moving reference tag on the robot, for localizing the robot in the physical URS. (In the figure, the moving tag with a motion sensor measures motion and signal strength, the fixed reference nodes relay the signal strength over ZigBee to the localization network server, and the server computes the real-time tag position and interconnects with the robot.)