Sensors: ARBIN: Augmented Reality Based Indoor Navigation System
Article
ARBIN: Augmented Reality Based Indoor
Navigation System
Bo-Chen Huang 1 , Jiun Hsu 2 , Edward T.-H. Chu 1, * and Hui-Mei Wu 2
1 Department of Computer Science and Information Engineering, National Yunlin University of Science and
Technology, Yunlin 64002, Taiwan; [email protected]
2 National Taiwan University Hospital YunLin Branch, Yunlin 640203, Taiwan; [email protected] (J.H.);
[email protected] (H.-M.W.)
* Correspondence: [email protected]; Tel.: +886-05-534-2601-4519
Received: 27 August 2020; Accepted: 12 October 2020; Published: 17 October 2020
Abstract: Due to the popularity of indoor positioning technology, indoor navigation applications
have been deployed in large buildings, such as hospitals, airports, and train stations, to guide visitors
to their destinations. A commonly-used user interface, shown on smartphones, is a 2D floor map
with a route to the destination. The navigation instructions, such as turn left, turn right, and go
straight, pop up on the screen when users come to an intersection. However, owing to the restrictions
of a 2D navigation map, users may feel mental pressure and become confused while making the
connection between the real environment and the 2D navigation map before moving forward. For this
reason, we developed ARBIN, an augmented reality-based navigation system, which overlays navigation
instructions on the view of the real-world environment shown on the screen for ease of use. Thus, there is no need for users
to make a connection between the navigation instructions and the real-world environment. In order to
evaluate the applicability of ARBIN, a series of experiments were conducted in the outpatient area of
the National Taiwan University Hospital YunLin Branch, which is nearly 1800 m², with 35 destinations
and points of interest, such as a cardiovascular clinic, X-ray examination room, pharmacy, and so on.
Four different types of smartphone were adopted for evaluation. Our results show that ARBIN can
achieve 3 to 5 m accuracy, and provide users with correct instructions on their way to the destinations.
ARBIN proved to be a practical solution for indoor navigation, especially for large buildings.
Keywords: augmented reality; Bluetooth; indoor positioning system; indoor navigation system;
smart hospital
1. Introduction
Due to advances in the Internet of Things and emerging business opportunities, indoor navigation systems
have been deployed in many large buildings, such as big train stations, shopping malls, hospitals,
and government buildings. After installing a navigation mobile app, users can select a point of interest
on a menu list. Then, the app will determine a route to the destination, which is usually the shortest
path. Nowadays, the most commonly used user interface (UI) of navigation applications is a 2D map
with a route. Users are provided with navigation instructions, such as turn left, turn right, and go
straight, when they are close to an intersection. However, due to the limitations of a 2D navigation
map, constructing the relationship between the map and the real environment imposes an additional
cognitive load on users. Extra mental pressure may also be induced, leaving users
confused [1]. Therefore, eliminating possible user confusion is important for navigator UI design.
In order to create a good user experience, several research efforts have been devoted to developing
an indoor navigation system by utilizing augmented reality (AR) technology. A. Mulloni et al. [2,3]
and L. C. Huey et al. [4] deployed markers as location anchors in the environment. A user can know
their location by matching the markers with the associated location information stored at a remote
server or on a user’s phone. However, the angle of the camera must be in proper alignment with
markers before the matching process can start. In addition, the markers could get dirty easily and
become unrecognizable, therefore increasing maintenance costs. S. Kasprzak et al. [5] and J. Kim et al. [6]
first performed an image search for pre-tagged objects, such as billboards and trademarks, in the
environment, and then determined the user’s location based on the obtained objects. However, the more
complicated the environment is, the more difficult it will be to identify pre-tagged objects. The image
matching process becomes even more challenging when the layout and decoration of different parts of
the space are similar. Feature matching is another method to determine a user's location [1]. However,
constructing point clouds of a real indoor environment is time consuming and costly, especially for a
large building.
In this paper, we designed ARBIN, an augmented reality-based navigation system, by extending
our previous work, WPIN [7]. WPIN utilized Bluetooth low energy (BLE) beacons, named Lbeacons
version 1 (BiDaE Technology, Taipei, Taiwan), deployed at each intersection and point of interest
(POI), to get the coordinates of the current position. 2D images, such as turn left, turn right, and go
straight, were provided to users as direction indicators along the route to the destination. Unlike WPIN,
ARBIN uses AR technology that combines virtual objects and the real world. Navigation instructions,
as well as AR 3D models, are posted on the screen over the surrounding environment through the
smartphone camera. Therefore, there is no need for users to make a connection between the navigation
instructions and the real-world environment. In our implementation, Google ARCore (Google Inc.,
Mountain View, California, United States) [8] is adopted to create AR 3D models, obtain gyroscope
sensor readings, and determine where to put the models. Accuracy is the key factor for the success of
an AR-based indoor navigator. The difficulties of achieving accurate indoor positioning and accurate
AR 3D model placement are described as follows.
In WPIN [7], Lbeacons were adopted at waypoints to periodically broadcast their own coordinates
to smartphones nearby. A waypoint can be an intersection, a point of interest (POI), or the middle
of a corridor. After receiving a broadcast message sent from a Lbeacon, the positioning module,
running on the user’s smartphone, starts to estimate the distance between itself and the Lbeacon
according to a RSSI (received signal strength indicator) distance model. The stronger the received signal
is, the closer the user is to the Lbeacon. When the user and the Lbeacon are close enough, for example
less than 5 m, a new direction indicator will pop up to guide the user to the next waypoint. The above
process continues until the user arrives at the destination. However, because of machine cutting errors,
the size of the antenna board of each Lbeacon may not be identical, which could affect its capability for
transmitting and receiving signals. Furthermore, the characteristics of the RF (radio frequency) circuit
of each Lbeacon may also be different due to the nature of an analog circuit. Therefore, the RSSI
distance model of each Lbeacon is not exactly identical according to our experience. In our previous
work, to achieve the required positioning accuracy, we constructed a RSSI model for each Lbeacon,
which was time consuming and unscalable. To overcome this unavoidable and challenging hardware
problem, a novel RSSI modeling method that handles the heterogeneity of Lbeacons was developed,
which is given in Section 3.2.
The AR 3D models, such as a left arrow or a right arrow, should be placed properly in a real-world
environment to avoid possible user confusion. An inaccurate placement of a 3D model, for example,
one displayed in the wrong orientation or at an incorrect elevation or depression angle, may confuse
users and discourage them from using the system. Many parameters should be carefully
considered before having the correct placement of a 3D model, such as the face orientation of a user,
the location, and orientation of the smartphone. Constructing a relationship between these parameters
and the coordinates of a 3D model is challenging. The detailed method is presented in Sections 3.4
and 3.5.
In order to investigate the applicability of ARBIN, we first evaluated the responsiveness of the
positioning module of ARBIN. We then set up a field trial in a hospital. For the former, we conducted a
series of experiments in the engineering building No. 5 of the National Yunlin University of Science
and Technology, crossing three floors with a total area of around 250 m². The experiment results
showed that the adopted RSSI (received signal strength indicator) model could accurately determine
the distance between a Lbeacon and a smartphone. Thus, the AR models could be displayed correctly
on the smartphone screen. Furthermore, a field trial was conducted at the outpatient area of the
National Taiwan University Hospital YunLin Branch, which is nearly 1800 m², with 35 destinations
and points of interest, such as a cardiovascular clinic, X-ray examination room, pharmacy, and so on.
Four different types of smartphone were adopted for evaluation. Our results show that ARBIN can
achieve 3 to 5 m accuracy and give users correct instructions on their way to the destinations. ARBIN
proved to be a practical solution for indoor navigation, especially for large buildings.
2. Related Work
Due to the promise of providing a user-friendly interface to users, several researchers have utilized
AR technologies to develop indoor navigation applications. Based on the positioning technologies
they used, the existing AR-based navigation systems can be classified into three types: marker-based
methods, 2D image recognition-based methods, and 3D space recognition-based methods. Each of them
is described as follows.
image recognition, such as the view angle of camera, the distance between the objects and the user,
the number of moving objects in the surrounding environment, and so on.
3. Methodology
Figure 1. System architecture of ARBIN.
3.2. Indoor Positioning Module
The purpose of the positioning module is to determine the user's location. As Figure 3a shows,
Lbeacons are deployed at waypoints. In this work, a waypoint is defined as an intersection, a point of
interest, or the middle of a corridor. Each Lbeacon periodically broadcasts its coordinate information
to smartphones nearby. From the viewpoint of the user, his or her smartphone continuously receives
the coordinate information sent by Lbeacons nearby, while determining how far the smartphone is
from the closest Lbeacon. If the distance between the smartphone and a Lbeacon is close enough,
for example 3 m, the navigation app provides the user with a directional indicator to guide him or her
to the next waypoint. An illustrated example is shown in Figure 3b: the route starts from waypoint A
and ends at waypoint C. The user first receives a "go straight" command when entering the area of
waypoint A, and then a "turn left" command at waypoint B. The coverage size of a waypoint depends
on the size of the intersection or the point of interest. The larger the coverage area is, the larger the
range of a waypoint is. In our implementation, the coverage size of a waypoint is a 3-m, 5-m, or 7-m
radius circle. The key factor for waypoint-based navigation success is accurately determining the
distance between the user and the Lbeacons. For this, in our previous work [7], RSSI distance models
stored on the smartphone were adopted to estimate the distance. However, because of machine cutting
errors and the characteristics of the RF circuit, the RSSI distance model of each Lbeacon is not identical.
To achieve the required positioning accuracy, we constructed a RSSI model for each Lbeacon, but it was
time consuming and unscalable.
Figure 3. The positioning method of waypoint-based navigation [7]. (a) User and Lbeacon; (b) waypoint
and Lbeacons.
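The distance-triggered indicator logic above can be sketched in a few lines of Python. This is an illustrative sketch, not the ARBIN implementation: the polynomial coefficients and the numeric inversion are hypothetical stand-ins for the per-type regression models introduced in Section 3.2.

```python
# Hypothetical RSSI-to-distance model: a cubic polynomial fitted per Lbeacon
# type (coefficients are made up for illustration; highest power first).
MODEL_COEFFS = [-0.05, 0.2, -6.0, -55.0]  # rssi = f(distance in meters)

def predicted_rssi(d, coeffs=MODEL_COEFFS):
    """Evaluate the polynomial RSSI model at distance d."""
    return sum(c * d ** i for i, c in enumerate(reversed(coeffs)))

def rssi_to_distance(rssi, coeffs=MODEL_COEFFS, max_range=10.0, step=0.1):
    """Invert the model numerically: scan candidate distances and return the
    one whose predicted RSSI is closest to the measured value."""
    candidates = [round(i * step, 1) for i in range(int(max_range / step) + 1)]
    return min(candidates, key=lambda d: abs(predicted_rssi(d, coeffs) - rssi))

def should_show_indicator(rssi, waypoint_radius_m=3.0):
    """Show a directional indicator once the estimated distance to the
    Lbeacon falls inside the waypoint's coverage radius (3, 5, or 7 m)."""
    return rssi_to_distance(rssi) <= waypoint_radius_m
```

A real deployment would smooth the RSSI over a time window before inverting the model, since single readings fluctuate heavily indoors.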
To overcome this problem, in this work we first analyzed the characteristics of the RSSI models of
about 24 Lbeacons randomly selected from 70 Lbeacons. We then classified them into four types.
For each type of Lbeacon, only one RSSI model was used. Because the navigator must give a user a
directional indicator when he/she enters the coverage of a waypoint, we mainly focused on the behavior
of the RSSI curve in the ranges of 0 to 3 m, 3 to 5 m, and 5 to 7 m. As Figure 4 shows, we measured
the RSSI values at locations 1 m to 7 m away from a Lbeacon. For each location, we collected
one minute of RSSI samples (i.e., 240 samples) and took the average as the result. The measurement
stops when all seven locations have been measured.
As shown in Figure 5a, four Lbeacons, numbered A1, A2, A3, and A4, were classified as Type 1,
in which the RSSI value drops inversely to the distance in the ranges of 0–3 m and 5–7 m. Therefore,
Type 1 Lbeacons are suitable to cover a waypoint with a radius of 3 m or 7 m. Similarly, Type 2
Lbeacons are only suitable to cover a waypoint with a radius of 3 m. Meanwhile, Type 3 Lbeacons are
suitable for a waypoint with a radius of 3, 5, or 7 m, since the RSSI value drops inversely over all the
distances we measured. Type 4 Lbeacons are suitable for a waypoint with a radius of 5 m. Based on
the measurement, for each type of Lbeacon, we adopted a polynomial function as a regression model
to represent the relationship between the distance and the RSSI values. Results are shown in Figure 6.
Given a new and unknown type of Lbeacon, we first classified it into one of the four types based
on the characteristic of its RSSI curve. A RSSI model was then picked from the RSSI models shown in
Figure 6. In Section 5, the correctness of the proposed models shown in Figure 6 is evaluated.

Figure 4. Collecting received signal strength indicator (RSSI) samples at different distances.
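The type classification above can be approximated with a simple monotonicity check on the averaged curve. The band boundaries chosen here are our own simplifying assumption for illustration, not the authors' exact criterion:

```python
def decreasing(vals):
    """True if the averaged RSSI strictly decreases across the samples."""
    return all(a > b for a, b in zip(vals, vals[1:]))

def suitable_radii(avg_rssi_by_m):
    """avg_rssi_by_m: averaged RSSI (one minute, ~240 samples each) measured
    at 1 m, 2 m, ..., 7 m from a Lbeacon. A coverage radius r is considered
    usable when the curve is monotonically decreasing around the r-metre
    boundary, so crossing the boundary produces an unambiguous drop."""
    bands = {3: avg_rssi_by_m[0:4],   # 1-4 m, around the 3 m boundary
             5: avg_rssi_by_m[2:6],   # 3-6 m, around the 5 m boundary
             7: avg_rssi_by_m[4:7]}   # 5-7 m, around the 7 m boundary
    return [r for r, vals in sorted(bands.items()) if decreasing(vals)]
```

Under this simplification, a curve that decreases everywhere (every band usable) behaves like a Type 3 Lbeacon, while one whose curve is clean only around the 3 m and 7 m boundaries behaves like Type 1.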
Figure 5. Four types of RSSI distance models. (a) Type 1; (b) Type 2; (c) Type 3; (d) Type 4.
All Lbeacons were classified into four types. As Figure 6a–d shows, each type has its own RSSI
model. The RSSI model used to determine the distance between a user's smartphone and a Lbeacon
depends on the type of the Lbeacon. When getting close to a Lbeacon, the smartphone uses the received
UUID to look up the type of the Lbeacon and its associated RSSI model, pre-stored in the smartphone.
Figure 6. The regression model of the distance and the RSSI values. (a) The regression model of a type
1 Lbeacon; (b) the regression model of a type 2 Lbeacon; (c) the regression model of a type 3 Lbeacon;
(d) the regression model of a type 4 Lbeacon.

To tolerate the variability of RSSI values, we considered the RSSI values of Lbeacons nearby. Let Si
and Sj represent the highest and the second-highest RSSI detected by the smartphone, where Si is the
RSSI of waypoint i and Sj is that of waypoint j. Since Si is the highest, the user is regarded as being
at waypoint i. Based on the RSSI models, we can obtain the theoretical values of Si and Sj at waypoint i;
that is, Śi and Śj. If Si − Sj >= Śi − Śj, the user's location is updated to waypoint i. On the other hand,
if Si − Sj < Śi − Śj, the Si is considered a signal surge and will be filtered out.

3.3. Route Planning Module

After receiving the information of the user's location and destination, shown in Figure 1, the route
planning (RP) module determines a route to the destination by the well-known Dijkstra's shortest
path algorithm. Based on the route, the RP module updates the AR model placement module with a
direction indicator and an expected face orientation when the user comes to a waypoint. The two pieces
of information are then used for placing a 3D model on the real-world environment. For example,
as Figure 3b shows, the user starts at waypoint A and moves to waypoint B. When the user enters
the coverage of waypoint B, the expected face orientation is east. After the user turns left and moves
forward, his/her expected face orientation at waypoint C is north. For ARBIN, at each waypoint,
if the user's orientation is not the same as the expected face orientation, the associated directional
indicator will not show in the real-world environment. A warning message will pop up to remind
the user, when needed. If this happens, possible reasons are that the user is going the wrong way,
or that the user does not face the expected orientation. The route will be recalculated if the user
is found at an unexpected waypoint. In our implementation, the orientation is obtained by the IMU
(inertial measurement unit) sensors of the smartphone. ARBIN uses the getOrientation() of Android
Sensor Manager [26] to obtain the orientation. In the above-mentioned example, if B and C are
not detected when the user arrives at D, ARBIN will recalculate the route. Then D will be a new
starting point.
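The waypoint-update rule of Section 3.2 (accept the strongest beacon only when the measured gap Si − Sj reaches the theoretical gap Śi − Śj) can be sketched directly. The theoretical RSSI table below is hypothetical; in ARBIN these values would come from the per-type regression models:

```python
# Hypothetical theoretical RSSI values: THEORETICAL[w][b] is the RSSI of
# beacon b predicted by the regression models when standing at waypoint w.
THEORETICAL = {"A": {"A": -62.0, "B": -75.0},
               "B": {"B": -62.0, "A": -75.0}}

def update_location(current, measured, theoretical=THEORETICAL):
    """measured: {waypoint_id: rssi} for nearby Lbeacons. Let Si and Sj be
    the highest and second-highest measured RSSI. The user is moved to
    waypoint i only if the measured gap Si - Sj is at least the theoretical
    gap at waypoint i; otherwise Si is treated as a signal surge."""
    ranked = sorted(measured.items(), key=lambda kv: kv[1], reverse=True)
    (wi, si), (wj, sj) = ranked[0], ranked[1]
    expected = theoretical[wi]
    if si - sj >= expected[wi] - expected[wj]:
        return wi        # accept: the user is at waypoint i
    return current       # reject Si as a transient surge; stay put
```

With the table above, a reading of −60 dBm for B against −80 dBm for A clears the theoretical 13 dB gap and moves the user to B, whereas −60 dBm against −65 dBm does not and is filtered out.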
Let R denote the expected orientation at a waypoint. The R is an integer between 0 and 7, each value
representing an orientation, as shown in Figure 7a. For example, R = 1 is northeast, while R = 2
is east. After the user passes through a waypoint, the R is updated by how many degrees the user
turns to the new orientation. For example, for a turn right instruction, the R is updated by adding
90°; for a turn left instruction, the R is updated by adding 270°. Since there are only 8
types of directional indicator in our implementation, we use L, an integer between 0 and 7, to represent
the turning angle of a directional indicator. The definition of each value of L is given in Figure 7b.
When the user enters a waypoint, R is updated by (R + L) mod 8. For example, in Figure 3b, at the
beginning, the user faces east and R is 2. When the user comes to waypoint B, the expected face
orientation is east. After the user turns left and moves forward, the expected face orientation R at
waypoint C is updated to 0 (= (2 + 6) mod 8), which is north.
Figure 7. The definition of orientation and directional indicators. (a) R values; (b) L values.
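The orientation bookkeeping above reduces to modular arithmetic; the direction labels follow Figure 7a, and the L values follow Figure 7b (45° per step, so a 90° right turn is L = 2 and a 270° left turn is L = 6):

```python
# R: expected facing orientation at a waypoint, 0 = north, 1 = northeast,
# 2 = east, ..., 7 = northwest (Figure 7a).
DIRECTIONS = ["north", "northeast", "east", "southeast",
              "south", "southwest", "west", "northwest"]

def next_orientation(r, l):
    """After the user passes a waypoint whose indicator turns by L steps,
    the expected orientation becomes (R + L) mod 8."""
    return (r + l) % 8
```

For the example of Figure 3b: facing east (R = 2) and turning left (L = 6) yields (2 + 6) mod 8 = 0, i.e., north.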
3.4. Motion Tracking Module

The motion tracking module aims to determine the direction (azimuth) and the pitch of the
smartphone based on the magnetic sensor and the acceleration sensor of the smartphone. Since the
coordinate systems of the smartphone and the earth are different, a transformation is needed before the
sensor readings can be used. As shown in Figure 8a, in our usage scenario, the smartphone should
be kept upright so that a 3D model can be properly put onto the real environment. If the smartphone
is laid flat, as shown in Figure 8b, a warning message will be provided to remind the user. Let vector V
be the heading direction of the smartphone. As Figure 8a shows, V is a vector on the X-Z plane.
ARBIN uses the orientation of V as the expected face orientation. Moreover, the pitch of the smartphone
should be greater than 80° before a 3D model can be displayed. The definition of pitch
is shown in Figure 8c.
Figure 8. The orientation of a smartphone. (a) Phone is kept upright; (b) phone is lying flat; (c) the
definition of pitch.
Let X, Y, and Z represent the three axes of the smartphone coordinate system S. In addition, X́,
Ý, and Ź represent those of the earth coordinate system G. The θij is the angle between the i axis of
S and the j axis of G, in which i = X, Y, or Z, and j = X́, Ý, or Ź. The angles can be obtained by the
IMU sensor built into a smartphone. Let (x, y, z) be a point on S and its associated coordinate in G be
(x́, ý, ź). Therefore, we have

x́ = x cos θXX́ + y cos θYX́ + z cos θZX́,

and similarly for ý and ź. The (x́, ý, ź) can be represented by (x́, ý, ź)T = R(x, y, z)T, where R is the
rotation matrix.
Therefore, when the smartphone has a rotation around a different axis, we have a different
rotation matrix [27]. For the rotation around the X axis by the pitch angle P:

RP = | 1     0         0       |   | 1     0        0      |
     | 0   cos θYÝ   cos θYŹ   | = | 0   cos P    sin P    |
     | 0   cos θZÝ   cos θZŹ   |   | 0  −sin P    cos P    |,
in which RP is the rotation matrix when the smartphone has a rotation around the X axis, and P is the
pitch angle. In addition, RA is the rotation matrix when the smartphone has a rotation around the Y
axis, and A is the azimuth angle. Furthermore, RO is the rotation matrix when the smartphone has a
rotation around the Z axis, and O is the roll angle. By using the rotation matrices and rotation angles,
we can transform a coordinate between S and G.
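A minimal numeric sketch of the pitch rotation RP above, applied to a smartphone coordinate:

```python
import math

def rotation_about_x(pitch_rad):
    """R_P: rotation about the smartphone's X axis by the pitch angle P,
    matching the matrix given in the text."""
    c, s = math.cos(pitch_rad), math.sin(pitch_rad)
    return [[1.0, 0.0, 0.0],
            [0.0,   c,   s],
            [0.0,  -s,   c]]

def apply(matrix, v):
    """Transform a coordinate (x, y, z) by a 3x3 rotation matrix."""
    return [sum(matrix[i][k] * v[k] for k in range(3)) for i in range(3)]
```

For example, a 90° pitch maps (0, 1, 0) to (0, 0, −1): the Y axis of one frame lines up with the negative Z axis of the other.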
In our implementation, the Android sensor manager (Google Inc., Mountain View, California,
United States) [26] is adopted to transfer the V vector from the smartphone coordinate system, S,
to the earth coordinate system, G, and obtain the pitch of the smartphone. ARBIN invokes the
getRotationMatrix() function to get a rotation matrix, by feeding the sensor readings of the magnetic
sensor and the acceleration sensor. The rotation matrix transformation is used to transform the vectors
and coordinates from the smartphone coordinate system to the earth coordinate system. Based on
the rotation matrix, ARBIN then uses getOrientation() to obtain the orientation, azimuth, and pitch
of the smartphone. Environmental noise could affect the correctness of the IMU readings of the
smartphone. For this reason, ARBIN can be integrated with advanced noise filters or probability
models to reduce the interference. Since sensor calibration and compensation are not the major focus
of this work, readers who are interested in this topic may refer to [28–30].
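The Android calls above can be mirrored in a few lines to show what getRotationMatrix() and getOrientation() compute: an earth-aligned basis (east = E × A, north = A × east) is built from the accelerometer vector A and geomagnetic vector E, and the azimuth and pitch are then read out of the rotation matrix. This is a simplified illustration (no free-fall handling, no magnetic-interference checks), not a replacement for the Android API:

```python
import math

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def orientation(gravity, geomagnetic):
    """Return (azimuth, pitch) in radians from raw accelerometer and
    magnetometer vectors, following the same construction as Android's
    SensorManager: rows of the rotation matrix are east, north, and up."""
    up = normalize(gravity)                    # accelerometer points away from earth
    east = normalize(cross(geomagnetic, up))   # H = E x A
    north = cross(up, east)                    # M = A x H
    azimuth = math.atan2(east[1], north[1])
    pitch = math.asin(-up[1])
    return azimuth, pitch
```

For a phone lying flat with its top edge pointing north, this yields azimuth 0 and pitch 0, as expected.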
Figure 9. The coordinates of a 3D model.
4. Experiment

Our experiment included two parts: in-house experiments and a field trial. The in-house experiments were undertaken in the Engineering Building (EB) No. 5 of Yunlin University of Science and Technology (Yuntech). The purpose was to evaluate the orientation determination of a smartphone, and the correctness of the RSSI model proposed in Section 4.2. After the in-house experiments were completed, we then conducted a field trial at the National Taiwan University Hospital YunLin Branch (NTUH-Yunlin), where 35 Lbeacons were deployed in the outpatient area, over 1800 m², covering two floors. Volunteers were invited to evaluate the responsiveness of ARBIN.
Figure 11 shows the experiment setup for measuring responsiveness. The tester walked around in the building at a normal speed of about one meter per second. As Figure 12 shows, for each Lbeacon, the tester walked back and forth five times. When receiving the directional indicator shown by ARBIN, he stopped and measured the responsiveness distance, L, between the smartphone and the Lbeacon. As Figure 11 shows, L is measured by an infrared rangefinder. In other words, for each Lbeacon, the tester first walked forward to the Lbeacon, then recorded the L after he was notified by ARBIN's directional instruction. Then, the tester kept walking. After leaving the range of the Lbeacon (i.e., a circle with a radius of 3 m), he turned around and walked toward the Lbeacon again from the opposite direction. The same measurement was conducted again when the tester was close to the Lbeacon. For each Lbeacon, the tester repeated the above-mentioned experiment for five rounds. The average values of L in both the forward direction and the backward direction are shown in Table 3. For the Lbeacons placed on a north-south corridor, the forward direction pointed north and the backward direction pointed south. In addition, for the Lbeacons deployed on an east-west corridor, the forward direction pointed east and the backward direction pointed west.
In our in-house experiment, the responsiveness distance should have been less than 3 m in order to create a good user experience. As Table 3 shows, in 92.5% (=37/40) of the test cases, ARBIN could properly notify the tester when the tester was close to a Lbeacon. However, there was an exception at Lbeacon A3. When the tester held a Samsung Galaxy smartphone and moved toward Lbeacon A3, the smartphone notified the tester earlier than expected. In other words, the smartphone received a relatively strong signal when it was 5 m away from Lbeacon A3. A possible solution was to slightly raise the software threshold of RSSI for Lbeacon A3 to defer the notification. We also found that the responsiveness distance, L, of the same Lbeacon differed in the forward and backward directions. The possible reason is that the directional antennas of the Lbeacons may have different abilities to send out signals in different directions. Although the responsiveness distance may depend on the user's arrival direction, it does not affect the ability of ARBIN to provide users a proper directional indicator. Furthermore, the results showed that there were no cases of notifying the tester after he had already passed the Lbeacon, which met our requirements.
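The per-beacon threshold adjustment suggested above can be sketched as follows. This is a hypothetical illustration, not ARBIN's actual code: the beacon IDs follow the paper, but the default threshold and the override value are made-up dBm figures:

```python
# Default notification threshold (hypothetical value, in dBm):
# notify when the received RSSI rises to or above this level.
DEFAULT_THRESHOLD = -70

# Per-beacon overrides. Lbeacon A3 announced itself too early (a strong
# signal was received 5 m away), so its threshold is raised to defer
# the notification until the user is closer.
THRESHOLD_OVERRIDES = {"A3": -60}

def should_notify(beacon_id, rssi):
    """Return True when the received RSSI indicates the user is close
    enough to the beacon to show the directional indicator."""
    threshold = THRESHOLD_OVERRIDES.get(beacon_id, DEFAULT_THRESHOLD)
    return rssi >= threshold
```

With these assumed values, an RSSI of -65 dBm triggers a notification for an ordinary beacon but not for A3, which is the deferral behavior the paragraph describes.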
Figure 11. The experimental setup for measuring responsiveness.

Figure 12. The process of measuring the responsiveness.
The Lbeacon is equipped with a directional antenna with conical beams [32]. It can generate a 3 m range and a 60° radiation pattern to provide 3 m horizontal accuracy. However, according to our experience, the shape of the conical beams is not as perfect as claimed. Therefore, the responsiveness distance, L, of the same Lbeacon may differ in the forward and backward directions.
Figure 13. The deployment maps of Lbeacons in NTUH-Yunlin.

Figure 14. The installation of Lbeacons in the outpatient area of NTUH-Yunlin.
As Table 4 shows, each volunteer passed through nine Lbeacons on their way to the destinations. They judged the responsiveness whenever they reached a waypoint. The results show that 97% (35/36) of the user feedbacks were "moderate", which indicates that ARBIN can properly notify users when they approach a waypoint. Volunteer C marked A18 as slow because he experienced a deferred notification when he approached A18. A possible solution could be slightly decreasing the threshold of A18 to make ARBIN react properly. In our implementation, there was no 3D AR model placed at the destination. Therefore, we asked volunteers to check whether ARBIN could guide them to the destinations successfully. The results show that ARBIN could successfully guide the volunteers to their destinations.
In addition to the user experience evaluation, we also measured the responsiveness of ARBIN in the hospital. A Samsung Galaxy S10e and a Sony Xperia XZ Premium were used for the evaluation. We evaluated the responsiveness of ARBIN along the route: Entrance Hall (A6) -> Registration counter (A11) -> Outpatient clinic (A22) -> X-ray examination room (B3) -> Pharmacy (A25) -> Exit (C1). There were in total 14 waypoints on the route. As shown in Figure 15, five different walking patterns of pedestrians were evaluated. The walking speed was around one meter per second. In our experiment, the area of a waypoint was set at a radius of 5 m. We used (Slow, Moderate, Fast) to represent a responsiveness distance of (~0 m, 0 m~5 m, >5 m). For example, if the responsiveness distance was larger than 5 m, we marked it as fast. "No signal" indicates that ARBIN had no response when the user entered the waypoint. We repeated the experiment five times and summarized the results in Table 5. In the single mode and side-by-side mode, the responsiveness distance of both smartphones was always in the range of 0 m to 5 m at every waypoint we measured. In the triangle mode, the response distance of the Sony Xperia XZ Premium became slow at waypoint B3. Additionally, in the line-up mode and stagger mode, the Sony Xperia XZ Premium had no response at waypoints B3 and A10. The possible reason may have been the poor antenna design of the Sony Xperia XZ Premium. The situation could be improved by deploying one more Lbeacon at these waypoints to reduce the possibility of no reaction or slow reaction.
Figure 15. Different walking patterns of users.
5. Conclusions

In this paper, we presented ARBIN, an augmented reality-based indoor navigation system, to guide users to their destinations in an indoor environment. When users enter the range of a waypoint, ARBIN posts a 3D directional indicator into the real-world surrounding environment. With the support of augmented reality, it is easier for users to determine their locations when walking inside a building. To address the heterogeneity problems of Lbeacons, four types of RSSI models are proposed. Experiences in correctly placing a 3D model in a real-world environment were also explained. Further, we conducted both in-house experiments and a field trial to verify the responsiveness and practicality of ARBIN. The in-house experiments showed that in 92.5% of the test cases, ARBIN could provide users with a proper directional indicator when they came close to a Lbeacon. For the field trial, four volunteers were invited; 97% (35/36) of the user feedbacks were moderate. Our results show that ARBIN can achieve 3 to 5 m accuracy and provide users with correct instructions on their way to their destinations. ARBIN proved to be a practical solution for indoor navigation, especially for large buildings. To further enhance the user experience, in the future we plan to extend the capability of ARBIN by adding landmark objects into real-world environments and showing advertisement messages provided by a surrounding information system.
Author Contributions: Conceptualization, B.-C.H., J.H., E.T.-H.C. and H.-M.W.; Data curation, B.-C.H.;
Formal analysis, B.-C.H.; Funding acquisition, J.H. and E.T.-H.C.; Investigation, J.H., E.T.-H.C. and H.-M.W.;
Methodology, B.-C.H., J.H. and E.T.-H.C.; Project administration, J.H., E.T.-H.C. and H.-M.W.; Resources, J.H. and
E.T.-H.C.; Software, B.-C.H., J.H. and H.-M.W.; Supervision, J.H. and E.T.-H.C.; Validation, J.H. and E.T.-H.C.;
Writing—original draft, B.-C.H.; Writing—review and editing, E.T.-H.C. All authors have read and agreed to the
published version of the manuscript.
Funding: This work was supported by National Taiwan University Hospital YunLin Branch Project NTUHYL
109.C027 and Ministry of Science and Technology, Taiwan.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Rehman, U.; Cao, S. Augmented-Reality-Based Indoor Navigation: A Comparative Analysis of Handheld
Devices Versus Google Glass. IEEE Trans. Hum. Mach. Syst. 2017, 47, 140–151. [CrossRef]
2. Mulloni, A.; Wagner, D.; Barakonyi, I.; Schmalstieg, D. Indoor positioning and navigation with camera
phones. IEEE Pervasive Comput. 2009, 8, 22–31. [CrossRef]
3. Mulloni, A.; Seichter, H.; Schmalstieg, D. Handheld augmented reality indoor navigation with activity-based
instructions. In Proceedings of the 13th International Conference on Human Computer Interaction with
Mobile Devices and Services, Stockholm, Sweden, 30 August–2 September 2011; pp. 211–220.
4. Huey, L.C.; Sebastian, P.; Drieberg, M. Augmented reality based indoor positioning navigation tool.
In Proceedings of the IEEE Conference on Open Systems, Langkawi, Malaysia, 25–28 September 2011;
pp. 256–260.
5. Kasprzak, S.; Komninos, A.; Barrie, P. Feature-based indoor navigation using augmented reality.
In Proceedings of the 9th International Conference on Intelligent Environments, Athens, Greece, 18–19 July
2013; pp. 100–107.
6. Kim, J.; Jun, H. Vision-based location positioning using augmented reality for indoor navigation. IEEE Trans.
Consum. Electron. 2008, 54, 954–962. [CrossRef]
7. Chu, E.T.-H.; Wang, S.C.; Liu, J.W.S.; Hsu, J.; Wu, H.M. WPIN: A waypoint-based indoor navigation system.
In Proceedings of the IEEE 10th International Conference on Indoor Positioning and Indoor Navigation,
Pisa, Italy, 30 September–3 October 2019.
8. ARCore. Available online: https://developers.google.com/ar (accessed on 25 August 2020).
9. Reitmayr, G.; Schmalstieg, D. Location based applications for mobile augmented reality. In Proceedings of
the 4th Australasian User Interface Conference, Adelaide, SA, Australia, 4–7 February 2003; pp. 65–73.
10. Sato, F. Indoor Navigation System Based on Augmented Reality Markers. In Proceedings of the 11th
International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing, Torino, Italy,
10–12 July 2017; pp. 266–274.
11. Feng, C.; Kamat, V.R. Augmented reality markers as spatial indices for indoor mobile AECFM applications.
In Proceedings of the 12th International Conference on Construction Applications of Virtual Reality, Taipei,
Taiwan, 1–2 November 2012; pp. 235–242.
12. Al Delail, B.; Weruaga, L.; Zemerly, M.J. CAViAR: Context aware visual indoor augmented reality for a
university campus. In Proceedings of the IEEE/WIC/ACM International Conferences on Web Intelligence
and Intelligent Agent Technology, Macau, China, 4–7 December 2012; pp. 286–290.
13. Koch, C.; Neges, M.; König, M.; Abramovici, M. Natural markers for augmented reality-based indoor
navigation and facility maintenance. Autom. Constr. 2014, 48, 18–30. [CrossRef]
14. Romli, R.; Razali, A.F.; Ghazali, N.H.; Hanin, N.A.; Ibrahim, S.Z. Mobile Augmented Reality(AR)
Marker-based for Indoor Library Navigation. IOP Conf. Ser. Mater. Sci. Eng. 2020, 767, 012062.
[CrossRef]
15. Al Delail, B.; Weruaga, L.; Zemerly, M.J.; Ng, J.W.P. Indoor localization and navigation using smartphones
augmented reality and inertial tracking. In Proceedings of the IEEE International Conference on Electronics
Circuits and Systems, Abu Dhabi, UAE, 8–11 December 2013; pp. 929–932.
16. Nam, G.H.; Seo, H.S.; Kim, M.S.; Gwon, Y.K.; Lee, C.M.; Lee, D.M. AR-based Evacuation Route Guidance
System in Indoor Fire Environment. In Proceedings of the 25th Asia-Pacific Conference on Communications
(APCC), Ho Chi Minh City, Vietnam, 6–8 November 2019; pp. 316–319.
17. Wu, J.; Huang, C.; Huang, Z.; Chen, Y.; Chen, S. A Rapid Deployment Indoor Positioning Architecture based
on Image Recognition. In Proceedings of the IEEE 7th International Conference on Industrial Engineering
and Applications (ICIEA), Bangkok, Thailand, 16–21 April 2020; pp. 784–789.
18. Gerstweiler, G.; Vonach, E.; Kaufmann, H. HyMoTrack: A mobile AR navigation system for complex indoor
environments. Sensors 2015, 16, 17. [CrossRef] [PubMed]
19. Metaio SDK. Available online: http://www.metaio.com/products/sdk (accessed on 25 August 2020).
20. Rustagi, T.; Yoo, K. Indoor AR navigation using tilesets. In Proceedings of the 24th ACM Symposium on
Virtual Reality Software and Technology, Tokyo, Japan, 28 November–1 December 2018; pp. 1–2.
21. MapBox. Available online: https://docs.mapbox.com/api/ (accessed on 25 August 2020).
22. Unity. Available online: https://unity.com/ (accessed on 25 August 2020).
23. Koc, I.A.; Serif, T.; Gören, S.; Ghinea, G. Indoor Mapping and Positioning using Augmented Reality.
In Proceedings of the 7th International Conference on Future Internet of Things and Cloud (FiCloud),
Istanbul, Turkey, 26–28 August 2019; pp. 335–342.
24. ARKit. Available online: https://developer.apple.com/documentation/arkit (accessed on 25 August 2020).
25. Choi, H.; Lim, K.; Ko, Y. Improved Virtual Anchor Selection for AR-assisted Sensor Positioning in Harsh
Indoor Conditions. In Proceedings of the Global Internet of Things Summit (GIoTS), Dublin, Ireland,
3 June 2020; pp. 1–6.
26. Android Sensor Manager. Available online: https://developer.android.com/reference/android/hardware/
SensorManager (accessed on 25 August 2020).
27. Shuster, M.D. A survey of attitude representations. Navigation 1993, 8, 439–517.
28. Kok, M.; Hol, J.D.; Schön, T.B.; Gustafsson, F.; Luinge, H. Calibration of a magnetometer in combination with
inertial sensors. In Proceedings of the 15th International Conference on Information Fusion, Singapore, 9–12
July 2012; pp. 787–793.
29. Wang, Y.; Zhang, J.Y.; Zhang, D.W. Error Analysis and Algorithm Validation of Geomagnetic Sensor.
Appl. Mech. Mater. 2015, 742, 21–26. [CrossRef]
30. Zhou, Y.; Zhang, X.; Xiao, W. Calibration and compensation method of three-axis geomagnetic sensor based
on pre-processing total least square iteration. J. Instrum. 2018, 13, T04006. [CrossRef]
31. ViroCore SDK. Available online: https://viromedia.com/virocore (accessed on 25 August 2020).
32. Li, C.C.; Su, J.; Chu, E.T.-H.; Liu, J.W.S. Building/environment Data/information Enabled Location Specificity
and Indoor Positioning. IEEE Internet Things J. 2017, 6, 2116–2128. [CrossRef]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional
affiliations.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).