

A Prototype of Electric Wheelchair Controlled
by Eye-Only for Paralyzed User
Kohei Arai and Ronny Mardiyanto

Saga University
1 Honjo, Saga 840-8502, Japan
E-mail: [email protected]

Institut Teknologi Sepuluh Nopember
Keputih, Sukolilo, Surabaya, Indonesia
E-mail: ronny [email protected]
[Received March 22, 2010; accepted June 25, 2010]
The number of persons who are paralyzed and dependent on others due to loss of self-mobility is growing as the population ages. We have developed a wheelchair prototype controlled exclusively by eye movement, usable by different users, and robust against vibration, illumination change, and user movement. The keys to this flexibility are the camera mounted on the user's glasses and the use of pupil detection. Image processing analyzes the user's gaze for wheelchair control. Comparison with other pupil detection methods showed that ours is superior, and comparison of camera placements with other systems regarding the influence of vibration showed that our placement reduces vibration almost completely. The influence of illumination change has also been evaluated. Experiments involving five different users riding the wheelchair along a 9.73-meter track recorded an average travel time of 85.8 seconds, demonstrating the feasibility and reliability of our proposal to provide eye-only computer input for paralyzed users to control a wheelchair.
Keywords: wheelchair, eye gaze, paralysis, computer input by eye-only, hands-free controller
1. Introduction
The development of wheelchairs for paralyzed users is surprisingly recent, starting with conventional manually powered wheelchairs and advancing to electric wheelchairs [1]. Conventional wheelchair design tends to focus exclusively on manual use, which assumes users are still able to use their hands and excludes those who are not. Diseases and accidents injuring the nervous system also frequently cause people to lose the ability to move their voluntary muscles. Because voluntary muscle is the main actuator enabling people to move their bodies, paralysis may leave a person unable to move locomotor organs such as the arms and legs. Paralysis may be local, global, or follow specific patterns. Most paralysis is constant; other forms include periodic paralysis (mostly caused by genetic diseases) and sleep paralysis (which occurs when the brain awakes from REM (Rapid Eye Movement) sleep [2] but the body cannot be moved for several seconds or minutes).
Scientist Stephen W. Hawking is perhaps the most well-known victim of major paralysis: diagnosed with incurable Amyotrophic Lateral Sclerosis (ALS) in 1962, he actively uses a wheelchair [3].
Many of those suffering near-complete or complete paralysis, however, can still control their eye movement, which inspired us to develop an eye-controlled electric wheelchair.
Hands-free wheelchairs as assistive mobility devices can be broadly categorized as follows.
1. Biosignal-based [4, 5]. Electrooculography (EOG), electroencephalography (EEG), and electromyography (EMG) adapt user biosignals for wheelchair control. An example is an electric wheelchair controlled using EOG [4], which analyzed user eye movement via electrodes placed directly around the eye to obtain horizontal and vertical eye-muscle activity; signal recognition analyzed omnidirectional eye movement patterns. Another approach proposed wheelchair control using muscle and brain signals [5]: user intent was analyzed using EMG and EEG via electrodes on the head, whose output signals were analyzed and converted to wheelchair control commands.
2. Voice-based [6]. Wheelchair control has also been guided by voice commands delivered through speech recognition, motor control, user interface, and central processor modules. Such systems usually require the user to record functional oral commands, e.g., "Forward" makes the wheelchair move forward and "Stop" makes it stop.
3. Vision-based [7-10]. A camera acquires user images and analyzes user intent. Ref. [7] proposed a wheelchair controlled by head gestures: Viola-Jones face detection recognizes the face profile and omnidirectional head gestures, which are used to control speed and turning. A similar approach [8] uses horizontal gaze direction and blinking. Gaze direction is derived from the triangle formed by the centers of the eyes and the location of the nose. User gaze and blinking provide the direction and timing commands: the direction command sets the movement direction of the electric wheelchair, and the timing command sets when the wheelchair should move. Still another proposal involves one indoor camera for monitoring wheelchair movement and another camera on the wheelchair to detect obstacles [9]. A similar proposal uses gaze control [10], in which stereo CCD cameras determine user gaze and head pose and a range finder recognizes the surrounding environment.
Many different types of wheelchairs have also been proposed by Prof. Kuno of Saitama University in Japan [11-14]. These include a wheelchair with a caregiver [12] who assists it through actions such as pushing a button to call an elevator or opening a door. Another [13] involves a robotic wheelchair with a speech interface that follows user speech commands using environmental range-sensor information. Still another [14] is a robotic wheelchair that monitors the user, pedestrians, and caregivers via multiple sensors. All of these still require manual control.
The problem remains that signal-based systems require direct contact with the user, e.g., electrodes attached to the skin, making these systems expensive, invasive, and inconvenient. Although voice-based systems are easy and simple to develop, the voices of surrounding people remain a problem in practical application. For these reasons, we propose a wheelchair using a vision-based system whose objectives are to be usable by different paralyzed users and to be robust against the vibration, user movement, and illumination change that caused problems for previous systems.
Although paralyzed users may use their eyes, we de-
cided to focus on gaze, rather than blinking, to show in-
tent because long-term blinking makes users inordinately
tired, which could result in erroneous communication.
Our proposal is basically the same as that in [10] in that both use user gaze to capture user information. Whereas [10] uses a stereo CCD camera to analyze the gaze, we use only a single camera on the user's glasses. A camera mounted on the user's glasses has also been developed by NAC Image Technology: its EMR-9 tracks eye movement using a camera on the user's head and a Purkinje-image-based method. This product allows the user to move freely and records data onto a memory card or transfers it over wireless LAN; the data can then be read and analyzed by a computer. The difference in our system is that the camera mounted on the user's glasses detects eye movement using pupil knowledge, and increasing pupil detection accuracy improves robustness across different users.
Our system is designed for indoor use with: (1) illumination of less than 1,500 lux (i.e., no direct sunlight), (2) a flat travel surface with no slopes, on a path at least 1.2 m wide and without stairs, and (3) a minimum rotation space of 2 m².
Although designed for paralyzed users who can at least move their eyes, it is not designed for those with vision problems such as squinting or for those who wear excessive eye makeup such as mascara.
When using this wheelchair, the user must be accompanied by an assistant (a nurse or family member). The assistant helps the user sit in the chair, turns on the power supply and the computer, puts on the glasses with the camera, and unlocks the chair's hand brake. The assistant must also reverse all of this when the user finishes using the wheelchair.
Our system consists of a single infrared camera mounted on the user's glasses, a netbook PC, a microcontroller, and a modified Yamaha JW-II wheelchair. Infrared LEDs adjust illumination during environmental changes, and the camera position follows head movement, keeping up with user movement while eliminating vibration thanks to the body's stability in the chair. Once the user's image is acquired by the camera, Adaboost-classifier eye detection (as proposed by Viola and Jones), adaptive thresholding, and pupil detection determine the gaze. A single ultrasonic sensor on the front of the wheelchair is used for collision avoidance.
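The following is a minimal sketch of how such a pipeline could be realized with OpenCV's Haar-cascade (Adaboost) eye detector, adaptive thresholding, and contour-based pupil localization; the cascade file and all parameter values are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of the gaze-detection pipeline: Haar-cascade (Adaboost) eye
# detection, adaptive thresholding, and pupil localization via contours.
# Parameters are illustrative assumptions, not the authors' settings.
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_pupil(frame):
    """Return the pupil center (x, y) in frame coordinates, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) == 0:
        return None
    ex, ey, ew, eh = eyes[0]                  # take the first detected eye
    eye_roi = gray[ey:ey + eh, ex:ex + ew]
    # Adaptive threshold isolates the dark pupil under changing illumination.
    binary = cv2.adaptiveThreshold(
        eye_roi, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
        cv2.THRESH_BINARY_INV, 21, 10)
    contours, _ = cv2.findContours(
        binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Pick the largest dark blob as the pupil candidate.
    pupil = max(contours, key=cv2.contourArea)
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (ex + int(m["m10"] / m["m00"]), ey + int(m["m01"] / m["m00"]))
```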
Wheelchair control uses an invisible command layout for turning left, turning right, and going forward; the user selects a command by looking at the corresponding key for one second. The layout being invisible (it contains several keys) means the user knows the key positions without any visible mark. For safety reasons, our wheelchair has no stop key: when the user changes gaze direction, the wheelchair stops automatically, and it also stops whenever the system fails to analyze the user's gaze. With this system, the wheelchair moves safely and helps paralyzed users regain their mobility.
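A hypothetical sketch of this dwell-based selection and automatic-stop behavior is given below; the key regions, the one-second dwell constant, and the command names are assumptions for illustration only.

```python
# Hypothetical sketch of dwell-based command selection with automatic stop.
import time

DWELL_SECONDS = 1.0  # gaze must stay on a key this long to trigger it

def key_for_gaze(gaze_x):
    """Map a horizontal gaze position (0..1) to an invisible key."""
    if gaze_x < 0.33:
        return "LEFT"
    if gaze_x > 0.67:
        return "RIGHT"
    return "FORWARD"

def control_loop(read_gaze, send_command):
    """read_gaze() -> gaze x or None; send_command(cmd) drives the motors."""
    current_key, dwell_start = None, None
    while True:
        gaze = read_gaze()
        if gaze is None:                      # gaze lost: stop for safety
            send_command("STOP")
            current_key, dwell_start = None, None
            continue
        key = key_for_gaze(gaze)
        if key != current_key:                # gaze moved: stop, restart dwell
            send_command("STOP")
            current_key, dwell_start = key, time.time()
        elif time.time() - dwell_start >= DWELL_SECONDS:
            send_command(key)                 # dwell complete: issue command
```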
This paper is organized as follows: Section 2 proposes our hardware configuration, gaze estimation, eye model, microcontroller circuit, and wheelchair control. Section 3 reviews experimental pupil detection performance for different users and under illumination change, as well as experimental vibration performance and wheelchair feasibility results. Section 4 presents conclusions.
2. Proposal
Of prime importance in our proposal is that the prototype works for all types of users under real circumstances of vibration, illumination change, and user movement. The system must also move safely. The infrared camera on the user's glasses accommodates user movement and reduces vibration. The infrared LED automatically adjusts illumination and stabilizes the image. Our gaze estimation uses pupil knowledge such as size, color, shape, sequential location, and movement to locate the pupil. The simple model then converts pupil location to user gaze.
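As an illustration of how such pupil knowledge might be combined, the following hypothetical scoring function weighs a candidate blob's size, darkness, roundness, and distance from the previous pupil location; all weights and ranges are assumptions, not the authors' values.

```python
# Hypothetical scoring of pupil candidates using the "pupil knowledge"
# listed above; weights and ranges are illustrative assumptions.
import math

def score_candidate(area, mean_intensity, circularity, center, prev_center):
    """Higher score = more pupil-like candidate blob."""
    score = 0.0
    if 50 <= area <= 2000:                    # size: pupils fall in a known range
        score += 1.0
    score += (255 - mean_intensity) / 255.0   # color: pupils are dark
    score += circularity                      # shape: 1.0 for a perfect circle
    if prev_center is not None:               # sequential location: small jumps
        dist = math.dist(center, prev_center)
        score += max(0.0, 1.0 - dist / 50.0)
    return score
```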
A microcontroller circuit connects