ISSNIP Biosignals and Biorobotics Conference 2010 Theme: Biosignals and Robotics for Better and Safer Living Vitoria, Brazil - 4-6 January 2010

A Brain-Machine Interface Model Based on EEG for Automated Navigation of a Mobile Robotic Device

Mufti Mahmud a,*

a NeuroChip Laboratory, Department of Information Engineering, University of Padova, 35131, Italy
Email: [email protected]

Once generated, these signals must be conditioned and processed to perform the specific task for which they were produced. In other words, the interface should be able to adapt to and understand the meaning of these signals and act accordingly. If this can be done successfully, a brain-machine interface can be used to improve the mobility of individuals with motor dysfunction, using, for example, a robotic wheelchair to carry them [5]. Previous work on the development of BMIs shows a gradually increasing degree of difficulty in signal acquisition and processing [5, 7, 8, 9]. The work by Mahmud et al. [10] is a simpler approach with reduced complexity, but it used two subjects to control the two wheels of the robotic device, which in a practical sense would be impossible for a disabled person. The current work proposes a simple brain-machine interface model based on EEG signals recorded from the scalp with conventional EEG devices. The model makes use of the event-related evoked response (as in the case of saccadic eye movement); thanks to existing sophisticated signal analysis and processing technologies that can detect microvolt-scale changes in the EEG traces, this response is used to control a mobile robot able to navigate a predefined path. The schematic diagram of the model is shown in Figure 1. The boxes represent processes and the arrowed lines show the direction of data flow.

Abstract— Highly parallel neurophysiological recordings and the increasing number of signal processing tools open up new avenues for connecting technologies directly to neuronal processes. As the understanding of neuronal signals takes better shape, much work remains to properly interpret and use these signals for brain-machine interfaces. A simple brain-machine interface may be able to re-establish the broken loop for persons with motor dysfunction. Over time, brain-machine interfacing has grown more complex due to the increased availability of instruments and processes for implementation. In this work, the author proposes a brain-machine interface model built from a few simple processes for the automated navigation and control of a robotic device, using features extracted from EEG signals during saccadic eye movement tasks.

Keywords- Brain-Machine Interface; Electroencephalogram; Neurophysiological recording; Signal processing; Saccadic eye movement.

I. INTRODUCTION

These days, the availability of efficient computational resources and improving knowledge about brain dysfunction have turned many researchers' interest toward using large-scale neurophysiological recordings for therapeutic and replacement strategies [1, 2]. Many patients with physiological disorders such as Amyotrophic Lateral Sclerosis (ALS), or with injuries such as high-level spinal cord injury, suffer from disruption of the communication path between the brain and the body. People with severe motor disabilities may lose all voluntary muscle control. Arriving at a solution to this type of problem requires sophisticated processing of brain signals. Disabled people with the above-mentioned problems are forced to accept a reduced quality of life, resulting in dependence on caretakers and escalating social costs [3]. Most existing assistive technology devices are not usable by these patients because the devices depend on motor activity from specific parts of the body. Alternative control paradigms for these individuals are thus desirable [4]. The electrophysiological signals generated by the brain can be used to command different devices, provided that the person controlling the device is also able to control the generation of these signals.
___________________
* Corresponding Author

Figure 1. The schematic of a Brain-Machine interface system where the devices can be controlled only with conditioned brain signals.


II. THE BRAIN-MACHINE INTERFACE SYSTEM

A. The Electroencephalogram (EEG)

The Electroencephalogram (EEG) is produced by the neuronal firing of the brain, which propagates through the cortex and is recorded along the scalp. This revolutionary phenomenon, which transformed the understanding of many critical diseases in the early twentieth century, was discovered by the German psychiatrist Hans Berger, who assessed the electrical activity of the cerebral cortex during the late 1920s. The signals are usually recorded by placing electrodes on the scalp at predefined positions; a mapping of these electrode positions is shown in Figure 2. Depending on the position from which the signal is recorded, it provides information related to the corresponding part of the brain. For example, the visual cortex is located at the caudal portion (occipital lobe) of the brain; thus, placing an electrode on this part of the scalp records the brain activity caused by visual stimuli. Since its discovery, the EEG has been employed to answer many questions about brain function and to diagnose many diseases clinically. The spontaneous activity of the brain produces a signal with an amplitude usually under 100 μV and a frequency ranging from just above DC up to about 100 Hz. The EEG signal recorded from the scalp is extracellular in nature and consists of several component rhythms with different frequency bands: delta (δ) from 0-4 Hz, theta (θ) from 4-8 Hz, alpha (α) from 8-12 Hz, beta (β) from 12-30 Hz, and gamma (γ) from 30-100 Hz. These different rhythms have their own clinical implications in disease diagnosis.

B. Devices and methods

1) Data acquisition

The EEG signals are recorded using a four-channel commercial EEG recording device, g.MOBIlab®, manufactured by g.tec medical engineering GmbH, Austria (http://www.gtec.at/). Only three of the four channels are used during recording: two channels record simultaneous signals from two electrode positions on the scalp ('O1' and 'O2'), and a third electrode, placed in the F or FP region of the electrode map, serves as the reference for the recorded signals. The manufacturer provides an acquisition module and a framework for further processing of the EEG signals, implemented on the Matlab® (MathWorks) platform using Matlab® scripting and Simulink. This module works as the interface between the recording device and the recording computer, and provides a framework that can be extended to suit the needs of online signal processing for different applications. The module was therefore re-engineered to analyze and process the EEG signal acquired by the two electrodes placed on the occipital region of the subject's scalp; the occipital region generates the signals related to visual activity. After the data are read from the scalp through the electrodes, they are amplified with a certain gain and sent to the computer for on-the-fly processing. Figure 3 shows the schematic of the signal acquisition process: the signals detected by the recording electrodes with respect to a reference electrode are fed into a preamplifier, where the channels are separated and amplified by a predefined gain (shown as right-facing triangles). The amplified signals are then fed into the computer at a sampling rate of 256 Hz through a universal serial bus (USB) port using a serial-to-USB converter, after which the recorded EEG signals are ready for processing.
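The amplify-and-digitize step described above can be sketched in a few lines. Only the 256 Hz sampling rate comes from the text; the gain, ADC resolution, and reference voltage below are illustrative assumptions, not the g.MOBIlab's actual specification:

```python
SAMPLING_RATE_HZ = 256  # stated in the text


def sample_times(n_samples, fs=SAMPLING_RATE_HZ):
    """Timestamps (in seconds) for n_samples acquired at sampling rate fs."""
    return [i / fs for i in range(n_samples)]


def counts_to_microvolts(raw_counts, gain=10000.0, adc_full_scale=2**15, v_ref=5.0):
    """Convert raw ADC counts back to scalp-referred microvolts.

    The amplifier gain, the 16-bit ADC, and the 5 V reference are
    hypothetical values for illustration only.
    """
    volts_at_amplifier = [c / adc_full_scale * v_ref for c in raw_counts]
    # dividing out the preamplifier gain recovers the scalp-level signal
    return [v / gain * 1e6 for v in volts_at_amplifier]
```

With these assumed constants, a half-full-scale count corresponds to 250 μV at the scalp, which is within the amplitude range quoted for spontaneous EEG.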

Figure 2. The electrode positions on the scalp for EEG recording according to the standard 10-20 international system.
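The conventional frequency bands listed in the previous subsection can be captured in a small lookup table. The helper below is a hypothetical illustration (the function name and the half-open boundaries are mine, not from the paper):

```python
# Conventional EEG frequency bands (Hz), following the ranges quoted above.
EEG_BANDS = [
    ("delta", 0.0, 4.0),
    ("theta", 4.0, 8.0),
    ("alpha", 8.0, 12.0),
    ("beta", 12.0, 30.0),
    ("gamma", 30.0, 100.0),
]


def band_of(freq_hz):
    """Return the name of the EEG band containing freq_hz, or None if outside 0-100 Hz."""
    for name, lo, hi in EEG_BANDS:
        if lo <= freq_hz < hi:
            return name
    return None
```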

In this work, the EEG signals are processed to extract information related to saccadic eye movement from the occipital region of the scalp (indicated by 'O1' and 'O2' in the surface view of the electrode map, Fig. 2). Based on the existing work by Ohno et al. [11], it is evident that saccadic eye movement is well represented in the EEG during memory tasks prior to the actual eye movement. This change in the EEG can be visualized by observing the signals recorded from electrodes 'O1' and 'O2': depending on the direction of the eye movement, the recording from the contralateral electrode changes sharply, making these signals suitable inputs for the BMI.
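As a rough sketch of this idea, the direction of a saccade could be inferred from which occipital channel shows the larger sharp deflection. The function below and its 20 μV threshold are hypothetical illustrations, not the paper's actual detector:

```python
def saccade_direction(o1_delta_uv, o2_delta_uv, threshold_uv=20.0):
    """Infer saccade direction from peak amplitude changes (µV) on O1 and O2.

    The contralateral electrode shows the sharper change, so a larger
    left-hemisphere (O1) deflection is read as a rightward saccade and
    vice versa. The 20 µV threshold is an illustrative assumption.
    """
    if max(abs(o1_delta_uv), abs(o2_delta_uv)) < threshold_uv:
        return None  # no saccade-related change detected on either channel
    return "right" if abs(o1_delta_uv) > abs(o2_delta_uv) else "left"
```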

Figure 3. Schematic of EEG signal acquisition and conditioning process.

2) Signal processing
The EEG signal detected from the occipital region of the subject's scalp is scanned channel by channel; each channel is filtered using a band-pass filter with high-pass and low-pass cut-off frequencies of 15 Hz and 100 Hz


respectively. This filtering removes the alpha wave, which is also generated by the occipital region of the brain. The signals are then scanned for sharp changes in amplitude based on a dynamic threshold. Because EEG amplitude varies from person to person, the system can be trained before the actual experiment. The threshold is renewed by calculating the ratio between the usual electrical activity and the existing threshold, which helps avoid triggering on smaller peak-like activity that may have contaminated the recordings. Extra care has been paid to artifacts such as eye blinks, cardiac rhythms, 50-60 Hz power-line noise, and body movement. These artifacts can modify the detected signal and were removed using the processes proposed in [12] and [13]. In Figure 3, the grey rectangular boxes at the centre represent the processing described above. The processed individual channels are then multiplexed and sent, through a user datagram protocol (UDP) port over a crossover Ethernet cable, to another computer running Linux for interfacing with the robot.
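The dynamic-threshold scan might be sketched as follows. The update rule here (a threshold proportional to a running estimate of the "usual" activity) is one reading of the description above; the gain factor, the averaging constant, and the initial baseline are illustrative assumptions:

```python
def detect_sharp_changes(samples, k=3.0, alpha=0.01, init_baseline=10.0):
    """Scan one filtered EEG channel (µV) for sharp amplitude changes.

    The detection threshold is k times a running estimate of the usual
    activity (an exponential moving average of |sample|), so it adapts
    to each subject, as described in the text. k, alpha, and the initial
    baseline are hypothetical constants for illustration.
    """
    baseline = init_baseline
    events = []
    for i, s in enumerate(samples):
        if abs(s) > k * baseline:
            events.append(i)  # sharp change: candidate saccade marker
        else:
            # update the baseline only with "usual" (sub-threshold) activity,
            # so large transients do not inflate the threshold
            baseline = (1 - alpha) * baseline + alpha * abs(s)
    return events
```

Because the baseline tracks only sub-threshold activity, an isolated large deflection is flagged without raising the threshold for subsequent samples.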

(a)

(b)

(c)

Figure 5. EEG communication interface devices: (a) cap and electrodes; (b) connector and/or multiplexer; (c) preamplifier.

An open-source software package, 'iqr' [14], is used for interfacing with and commanding the robot. iqr provides a very flexible platform for simulating neural networks and developing robust applications that interface with robotic devices. To achieve the goal of steering the wheels of the robot (shown as number 8 in the schematic diagram) using the extracted saccade information, three modules are developed in iqr, as shown in Figure 6 (a), (b), and (c). Two mother processes perform the two major tasks: receiving the input signal from the UDP port and sending a control signal, based on that input, to the robot (Figure 6 (a)). These mother processes in turn contain a number of neuronal populations. Figure 6 (c) shows the Simulink process group, where the input signals are received from the two channels so that each can control a wheel of the robot. The control signals are sent from the computer to the robot through the Bluetooth® communication protocol, shown as number 7 in the schematic diagram. The Robot process group controls the movement of the robot's wheels. The input control signals take the form of synapses, either excitatory or inhibitory, making a population of neurons active or inactive.
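The steering logic implied by this excitatory/inhibitory wiring can be caricatured as a binary toy model; this is not iqr's actual neuron dynamics, only the net effect of the synapses described in the text (how simultaneous firing is resolved is my assumption):

```python
def drive_command(left_fires, right_fires):
    """Map wheel-population activity to a net drive command.

    A firing wheel population excites the motor and inhibits both the
    opposite wheel and the forward-movement population; stopping a wheel
    turns the robot toward that wheel's side. The robot's-will population
    keeps the motor driving forward when neither wheel fires.
    """
    if left_fires and right_fires:
        return "forward"  # assumption: conflicting commands cancel out
    if left_fires:
        return "turn_right"  # left wheel inhibits the right wheel
    if right_fires:
        return "turn_left"   # right wheel inhibits the left wheel
    return "forward"         # robot's will drives the motor via forward movement
```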

3) Interfacing with the robotic device
Figure 4 shows the schematic diagram of the interfaced system, where each numbered box and element represents an interface through which communication with the robotic system is made; the boxes represent the computers used for this communication. Number 1 represents the EEG recording cap and electrodes (Figure 5 (a)), which detect the signal from the scalp and send it to number 2, the electrode connector and/or multiplexer (Figure 5 (b)). The multiplexed signal from each channel is then sent to the preamplifier, number 3 (Figure 5 (c)), where the signal is amplified with a predefined gain and transferred to the computer (number 4) for digitization at 256 Hz and further processing to extract the saccade information from the raw EEG signal. Number 5 shows the UDP transfer of the conditioned signal to another computer (number 6) running Linux, which generates the command signals for the mobile robotic device.
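The UDP hand-off between the acquisition machine and the Linux command machine might look like the sketch below. The port number and the little-endian float packing are assumptions; the paper does not specify a wire format:

```python
import socket
import struct

PORT = 9999  # illustrative port; the paper does not specify one


def send_samples(samples, host="127.0.0.1", port=PORT):
    """Pack multiplexed channel samples as little-endian 32-bit floats
    and send them as a single UDP datagram."""
    payload = struct.pack("<%df" % len(samples), *samples)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))


def recv_samples(sock):
    """Receive one datagram and unpack it back into per-channel floats."""
    payload, _addr = sock.recvfrom(4096)
    n = len(payload) // 4  # 4 bytes per 32-bit float
    return list(struct.unpack("<%df" % n, payload))
```

UDP suits this one-way streaming hand-off: a lost datagram merely drops one conditioned sample rather than stalling the control loop, which matters for on-the-fly robot commands.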

(a)

(b)

(c)

Figure 6. iqr modules for generating the command signal to control the robotic device.

Figure 4. The schematic diagram of the Brain-machine interface system.


Five neuronal populations are used to perform this task. The input control signal from each channel is fed to a neuronal population representing a wheel. These two populations are connected to the motor neuronal population (representing the robot's motor), which initiates the movement of the wheels. Each time a wheel fires (in the rest of this section, the parts of the robot denote their corresponding neuronal populations), it sends an excitatory synapse to the motor, an inhibitory synapse to the other wheel to suppress that wheel's activity, and an inhibitory synapse to the forward-movement neuronal population, stopping the movement initiated by the other wheel. Finally, a robot's-will neuronal population continuously sends an excitatory synapse to the motor through the forward-movement neuronal population, which keeps the robotic device moving. The synapse from one wheel is required to stop the other wheel so that the robot steers along a predefined course.

III. DISCUSSION

A healthy subject is fitted with the equipment necessary for EEG signal detection (see Figure 5). A predefined course is set for the mobile robotic device to follow in response to the subject's saccadic eye movements. The excitatory synapse from the forward-movement neuronal population keeps the robot moving forward; to make it take a left turn, the left wheel is stopped through an inhibitory synapse generated by the right wheel, and vice versa for a right turn. Figure 7 shows the simulated result of the robot's navigation during an experiment using this model.

Figure 7. Simulated navigation result of the robotic device during an experiment to follow a course.

IV. CONCLUSION

This EEG-based brain-machine interface is presented as a proposed model and a proof of principle, aimed at reducing the complexity that is gradually prevailing in the very promising field of applying this technique to help disabled people compensate for their disabilities. The work may be extended to control assistive robotic devices (e.g., a robotic wheelchair) for the disabled.

REFERENCES

[1] Mason, S.G., Birch, G.E., "A general framework for brain-computer interface design", IEEE Trans. Neural Syst. Rehabil. Eng., vol. 11, no. 1, March 2003, pp. 70-85.
[2] Millán, J., Renkens, F., Mouriño, J., Gerstner, W., "Non-Invasive Brain-Actuated Control of a Mobile Robot", Proceedings of the 18th International Joint Conference on Artificial Intelligence, Acapulco, Mexico, August 9-15, 2003, pp. 1121-1126.
[3] Vaughan, T., et al., "Brain-computer interface technology: a review of the second international meeting", IEEE Trans. Neural Syst. Rehabil. Eng., vol. 11, no. 2, 2003, pp. 94-109.
[4] Fatourechi, M., "Design of a Self-Paced Brain Computer Interface System using Features Extracted from Three Neurological Phenomena", PhD dissertation, The University of British Columbia, Canada, January 2008, pp. 1-2.
[5] Ferreira, A., et al., "Human-machine interfaces based on EMG and EEG applied to robotic systems", Journal of NeuroEngineering and Rehabilitation, vol. 5, no. 1, 2008, p. 10.
[6] Auat Cheein, F.A., Postigo, J., "A Fast Finite State Machine Design for a Brain Computer Interface", XI Reunión de Trabajo en Procesamiento de la Información y Control, Argentina, 2005.
[7] Mouriño, J., "EEG-based Analysis for the Design of Adaptive Brain Interfaces", PhD thesis, Universitat Politècnica de Catalunya, Barcelona, Spain, 2003.
[8] Rani, P., Sarkar, M., "EMG-based high level human-robot interaction system for people with disability", Proceedings of the 2005 IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN 2005), Nashville, Tennessee, 2005, pp. 280-285.
[9] Moon, I., Lee, M., Chu, J., Mun, M., "Wearable EMG-based HCI for Electric-Powered Wheelchair Users with Motor Disabilities", Proceedings of the 2005 IEEE International Conference on Robotics and Automation (ICRA 2005), Barcelona, Spain, 2005, pp. 2649-2654.
[10] Mahmud, M., Hawellek, D., Valjamae, A., "A Brain-Machine Interface Based on EEG: Extracted Alpha Waves Applied to Mobile Robot", Proceedings of the 2009 ECSIS Symposium on Advanced Technologies for Enhanced Quality of Life (AT-EQUAL 2009), Iasi, Romania, 22-26 July 2009, pp. 28-31.
[11] Ohno, K., Funase, A., Cichocki, A., Takumi, I., "Analysis of EEG Signals in Memory Guided Saccade Tasks", IFMBE Proceedings of the World Congress on Medical Physics and Biomedical Engineering 2006, vol. 14/4, August 27 - September 1, 2006, COEX, Seoul, Korea, pp. 2664-2667.
[12] Haas, S.M., Frei, M.G., Osorio, I., Pasik-Duncan, B., Radel, J., "EEG Ocular Artifact Removal Through ARMAX Model System Identification Using Extended Least Squares", Communications in Information and Systems, vol. 3, 2003, pp. 19-40.
[13] Rohalova, M., Sykacek, P., Koska, M., Dorffner, G., "Detection of the EEG Artifacts by the Means of the (Extended) Kalman Filter", Measurement Science Review, vol. 1, 2001, pp. 59-62.
[14] Bernardet, U., Blanchard, M., Verschure, P.F.M.J., "IQR: a distributed system for real-time real-world neuronal simulation", Neurocomputing, vols. 44-46, 2002, pp. 1043-1048.
