Article
ARBIN: Augmented Reality Based Indoor
Navigation System
Bo-Chen Huang 1 , Jiun Hsu 2 , Edward T.-H. Chu 1, * and Hui-Mei Wu 2
1 Department of Computer Science and Information Engineering, National Yunlin University of Science and
Technology, Yunlin 64002, Taiwan; [email protected]
2 National Taiwan University Hospital YunLin Branch, Yunlin 640203, Taiwan; [email protected] (J.H.);
[email protected] (H.-M.W.)
* Correspondence: [email protected]; Tel.: +886-05-534-2601-4519

Received: 27 August 2020; Accepted: 12 October 2020; Published: 17 October 2020

Abstract: Due to the popularity of indoor positioning technology, indoor navigation applications
have been deployed in large buildings, such as hospitals, airports, and train stations, to guide visitors
to their destinations. A commonly-used user interface, shown on smartphones, is a 2D floor map
with a route to the destination. The navigation instructions, such as turn left, turn right, and go
straight, pop up on the screen when users come to an intersection. However, owing to the restrictions
of a 2D navigation map, users may face mental pressure and get confused while they are making a
connection between the real environment and the 2D navigation map before moving forward. For this
reason, we developed ARBIN, an augmented reality-based navigation system, which overlays navigation
instructions on the camera view of the real-world environment for ease of use. Thus, there is no need for users
to make a connection between the navigation instructions and the real-world environment. In order to
evaluate the applicability of ARBIN, a series of experiments were conducted in the outpatient area of
the National Taiwan University Hospital YunLin Branch, which is nearly 1800 m², with 35 destinations
and points of interest, such as a cardiovascular clinic, x-ray examination room, pharmacy, and so on.
Four different types of smartphone were adopted for evaluation. Our results show that ARBIN can
achieve 3 to 5 m accuracy, and provide users with correct instructions on their way to the destinations.
ARBIN proved to be a practical solution for indoor navigation, especially for large buildings.

Keywords: augmented reality; Bluetooth; indoor positioning system; indoor navigation system;
smart hospital

1. Introduction
Due to advances in the internet of things and new business opportunities, indoor navigation systems
have been deployed in many large buildings, such as big train stations, shopping malls, hospitals,
and government buildings. After installing a navigation mobile app, users can select a point of interest
on a menu list. Then, the app will determine a route to the destination, which is usually the shortest
path. Nowadays, the most commonly used user interface (UI) of navigation applications is a 2D map
with a route. Users are provided with navigation instructions, such as turn left, turn right, and go
straight, when they are close to an intersection. However, due to the limitations of a 2D navigation
map, it could add an additional cognitive load for users to construct the relationship between the 2D
navigation map and the real environment. Extra mental pressure may also be induced, leaving users
confused [1]. Therefore, eliminating possible user confusion is important for navigator UI design.
In order to create a good user experience, several research efforts have been devoted to developing
an indoor navigation system by utilizing augmented reality (AR) technology. A. Mulloni et al. [2,3]
and L. C. Huey et al. [4] deployed markers as location anchors in the environment. A user can know

Sensors 2020, 20, 5890; doi:10.3390/s20205890 www.mdpi.com/journal/sensors



their location by matching the markers with the associated location information stored at a remote
server or on a user’s phone. However, the angle of the camera must be in proper alignment with
markers before the matching process can start. In addition, the markers could get dirty easily and
become unrecognizable, therefore increasing maintenance costs. S. Kasprzak et al. [5] and J. Kim et al. [6]
first performed an image search for pre-tagged objects, such as billboards and trademarks, in the
environment, and then determined the user’s location based on the obtained objects. However, the more
complicated the environment is, the more difficult it will be to identify pre-tagged objects. The image
matching process becomes even more challenging when the layout and decoration of different parts of
the space are similar. Feature matching is another method to determine user’s location [1]. However,
constructing point clouds of a real indoor environment is time consuming and costly, especially for a
large building.
In this paper, we designed ARBIN, an augmented reality-based navigation system, by extending
our previous work, WPIN [7]. WPIN utilized Bluetooth low energy (BLE) beacons, named Lbeacons
version 1 (BiDaE Technology, Taipei, Taiwan), deployed at each intersection and point of interest
(POI), to get the coordinates of the current position. 2D images, such as turn left, turn right, and go
straight, were provided to users as direction indicators along the route to the destination. Unlike WPIN,
ARBIN uses AR technology that combines virtual objects and the real world. Navigation instructions,
as well as AR 3D models, are overlaid on the camera view of the surrounding environment on the
smartphone screen. Therefore, there is no need for users to make a connection between the navigation
instructions and the real-world environment. In our implementation, Google ARCore (Google Inc.,
Mountain View, California, United States) [8] is adopted to create AR 3D models, obtain gyroscope
sensor readings, and determine where to put the models. Accuracy is the key factor for the success of
an AR-based indoor navigator. The difficulties in achieving accuracy of indoor positioning, and that of
AR 3D model placement are described as follows.
In WPIN [7], Lbeacons were adopted at waypoints to periodically broadcast their own coordinates
to smartphones nearby. A waypoint can be an intersection, a point of interest (POI), or the middle
of a corridor. After receiving a broadcast message sent from a Lbeacon, the positioning module,
running on the user’s smartphone, starts to estimate the distance between itself and the Lbeacon
according to a RSSI (received signal strength indicator) distance model. The stronger the received signal
is, the closer the user is to the Lbeacon. When the user and the Lbeacon are close enough, for example
less than 5 m, a new direction indicator will pop up to guide the user to the next waypoint. The above
process continues until the user arrives at the destination. However, because of machine cutting errors,
the size of the antenna board of each Lbeacon may not be identical, which could affect its capability for
transmitting and receiving signals. Furthermore, the characteristics of the RF (Radiofrequency) circuit
of each Lbeacon may also be different due to the nature of an analog circuit. Therefore, the RSSI
distance model of each Lbeacon is not exactly identical according to our experience. In our previous
work, to achieve the required positioning accuracy, we constructed a RSSI model for each Lbeacon,
which was time consuming and unscalable. To address this unavoidable and challenging hardware
problem, a novel RSSI modeling method that handles the heterogeneity of Lbeacons was developed,
as described in Section 3.2.
The AR 3D models, such as a left arrow or a right arrow, should be placed properly in a real-world
environment to avoid possible user confusion. An inaccurate placement of the 3D model, such as
displaying it in the wrong orientation or at an incorrect elevation or depression angle, may confuse
users and discourage them from using the system. Many parameters should be carefully considered
before the correct placement of a 3D model can be determined, such as the face orientation of the user,
and the location and orientation of the smartphone. Constructing a relationship between these parameters
and the coordinates of a 3D model is challenging. The detailed method is presented in Sections 3.4 and 3.5.
In order to investigate the applicability of ARBIN, we first evaluated the responsiveness of the
positioning module of ARBIN. We then set up a field trial in a hospital. For the former, we conducted a

series of experiments in the engineering building No. 5 of the National Yunlin University of Science
and Technology, crossing three floors with a total area of around 250 m². The experiment results
showed that the adopted RSSI (received signal strength indicator) model could accurately determine
the distance between a Lbeacon and a smartphone. Thus, the AR models could be displayed correctly
on the smartphone screen. Furthermore, a field trial was conducted at the outpatient area of the
National Taiwan University Hospital YunLin Branch, which is nearly 1800 m², with 35 destinations
and points of interest, such as a cardiovascular clinic, x-ray examination room, pharmacy, and so on.
Four different types of smartphone were adopted for evaluation. Our results show that ARBIN can
achieve 3 to 5 m accuracy and give users correct instructions on their way to the destinations. ARBIN
proved to be a practical solution for indoor navigation, especially for large buildings.

2. Related Work
Due to the promise of a user-friendly interface, several researchers have utilized
AR technologies to develop indoor navigation applications. Based on the positioning technologies
they used, the existing AR-based navigation systems can be classified into three types: marker-based
methods, 2D image recognition-based methods, and 3D space recognition-based methods. Each of them is
described as follows.

2.1. Marker-Based Methods


For marker-based methods, markers are first deployed in an indoor environment, and function
as location anchors. A marker can be a QR-Code or a specially-designed pattern. The universally
unique identifier (UUID) and coordinate of each marker are pre-stored in either the local storage of
a smartphone or a remote database for future queries. When finding a marker, the user points the
smartphone camera at the marker and scans it. The scanned image is then used for determining
the user’s location and the place to put a 3D AR model. A. Mulloni et al. [2,3], L. C. Huey et al. [4],
G. Reitmayr et al. [9], F. Sato [10], and C. Feng et al. [11] used highly recognizable pictures as markers.
Each marker is regarded as a node of the navigation route. When a user comes to a marker and aims the
camera at it, a 3D arrow model will be shown on the screen of smartphone to guide the user to the next
node. Although markers are easy to deploy, extra user training may be needed to make a marker-based
navigation system successful. For example, users should be able to identify markers in the surrounding
environment before getting location information and moving forward. It is particularly difficult for a
user who has no idea of what a marker looks like. In addition, the camera should be aligned with the
marker to ensure the correctness of marker recognition, which makes it user-unfriendly. Maintenance
could also be a critical issue when markers get dirty and become unrecognizable. Owing to the above
limitations, marker-based navigation is not considered in our work.

2.2. 2D Image Recognition-Based Methods


Unlike marker-based methods, 2D image recognition-based methods search for pre-annotated
objects, such as billboards, trademarks, and signs in the environment, and then determine the user’s
location based on the obtained objects [5,6,12–15]. With the development of image recognition
technology, there is a trend for replacing marker-based navigation with image recognition-based
navigation. Although the image recognition-based methods may not have the problem of maintenance,
they could fail when two objects are similar and undifferentiable. For example, in a large shopping
mall, chairs, signboards, and decorations are usually designed in a similar way, which makes it difficult
for image recognition. Many other factors can also affect the accuracy of image recognition, such as the
view angle of the camera, the distance between the objects and the user, the number of moving objects
in the surrounding environment, and so on. Since all these issues should be well addressed before an
indoor navigation system can well function in a large and crowded building, we decided not to use
an image recognition-based method. Both G. H. Nam et al. [16] and J. Wu et al. [17] also adopted image
recognition technology for indoor navigation, and their systems are subject to the same accuracy
factors listed above.

2.3. 3D Space Recognition-Based Methods


Compared to 2D image recognition-based methods, 3D space recognition-based methods collect
features of the entire space rather than the features of 2D objects. During runtime, a feature matching
mechanism is first used to determine a user’s location, and then an AR engine is adopted to draw a 3D
model for direction indication. For the implementation, G. Gerstweiler et al. [18] and U. Rehman et al. [1]
used Metaio SDK (Metaio, Munich, Germany) [19] to create a 3D point cloud for indoor positioning.
T. Rustagi et al. [20] used the MapBox API (Mapbox, San Francisco, California, United States) [21]
to collect vector data of the indoor environment, and create a corresponding 3D model. The Unity
(Unity Technologies, San Francisco, California, United States) [22] game engine was adopted in their
system to show AR models. Although 3D methods could deliver better indoor positioning accuracy,
constructing 3D models is costly and time consuming, especially for large buildings. In addition, re-modeling is
required when indoor layout is changed, which usually happens to shopping malls and exhibition
halls. Furthermore, moving objects inside the space could significantly reduce positioning accuracy.
The more crowded the environment is, the lower the positioning accuracy will be. A. Koc et al. [23]
adopted ARKit (Apple Inc., Cupertino, California, United States) [24] to build an AR-based indoor
navigation. Similar to ARCore, ARKit constructs an AR world with associated 3D information.
However, the proposed method requires the user to create coordinate information of all corners in the
indoor environment, which makes it time consuming and error prone. In addition, their experimental
results showed that the accumulated error becomes significant when the place is large. H. Choi et al. [25]
proposed a virtual anchor (VA) point selection method for AR-assisted sensor positioning. According to
their definition, a VA is a positioning reference point used by a UWB (Ultra-Wideband) positioning
mechanism. The more VAs are selected, the more time is required to determine the user’s location.
Since UWB devices are required, the issues of hardware cost and energy consumption should also be
addressed before a deployment.
Unlike existing feature-matching methods, ARBIN utilizes Lbeacons deployed at each intersection
and POI to get the coordinates of the current position. ARBIN performs well in crowded spaces due
to the advantage of the directional antennas built into Lbeacons, which can adjust transmission power
and beam width to properly cover navigation areas. Finally, ARBIN is easy to configure and maintain.

3. Methodology

3.1. System Overview


As Figure 1 shows, ARBIN consists of four modules: indoor positioning, route planning, motion
tracking, and AR 3D model placement. At the beginning, the destination selected by the user is sent to
the route planning module to determine a route to the destination (Step 1). The underlying indoor
positioning module continuously updates the user’s location based on the received BLE advertisement
messages and the associated RSSI (Step 2). When the user comes to a waypoint, the route planning
module sends a message including the expected face orientation and directional indicator to the AR
placement module (Step 3). The AR placement module relies on the motion tracking module to obtain the
direction (azimuth) and the pitch of the smartphone from the IMU (Inertial Measurement Unit) (Step 4).
Based on the collected information, the placement module overlays a 3D arrow model, such as turn
left or turn right, on the real-world image (Step 5).
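
As a concrete reading of this pipeline, the sketch below (in Java, with illustrative names; ARBIN's actual class names are not published in this paper) shows one way the four modules and the five steps could be wired together. It is an assumption-laden outline, not the authors' implementation.

    // Hypothetical interfaces mirroring the four modules of Figure 1.
    interface RoutePlanningModule {
        void onDestinationSelected(String destination);   // Step 1
        void onWaypointReached(String waypointId);        // triggers Step 3
    }
    interface IndoorPositioningModule {
        void onBleAdvertisement(String uuid, int rssi);   // Step 2
    }
    interface MotionTrackingModule {
        float azimuthDegrees();                           // Step 4 (from the IMU)
        float pitchDegrees();
    }
    interface ArPlacementModule {
        // Step 3 input (expected orientation + indicator); Step 5 output
        // (overlaying the 3D arrow on the camera image).
        void placeIndicator(int expectedOrientation, int directionalIndicator);
    }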
Figure 2 lists the user interface of the ARBIN App. Frequently asked destinations are shown on
the main page (Figure 2a). After a user selects a destination on the list (Figure 2b), ARBIN determines
the user’s current location and a route with the shortest distance to the destination. At the beginning,
the user is asked to face a specific direction before the navigation service starts (Figure 2c). In other
words, the navigation service will not start until the user faces the expected orientation. On the way to
the destination, a 3D indicator will be placed in the real-world environment when the user approaches
an intersection or a point of interest, such as stairs or elevators (Figure 2d–g). The navigation service
stops when the user arrives at the destination. Finally, a message pops up to remind the user that the
navigation service is finished (Figure 2h).

Figure 1. System architecture of ARBIN.

Figure 2. The user interface of ARBIN. (a) Main page of ARBIN; (b) Destination list; (c) Start
navigation service; (d–g) 3D indicator of navigation instruction; (h) Arrival message.

3.2. Indoor Positioning Module
The purpose of the positioning module is to determine the user's location. As Figure 3a shows,
Lbeacons are deployed at waypoints. In this work, a waypoint is defined as an intersection, a point of
interest, or the middle of a corridor. Each Lbeacon periodically broadcasts its coordinate information
to smartphones nearby. From the viewpoint of the user, his or her smartphone continuously receives
the coordinate information sent by nearby Lbeacons, while determining how far the smartphone is
from the closest Lbeacon. If the distance between the smartphone and a Lbeacon is close enough,
for example 3 m, the navigation app provides the user with a directional indicator to guide him or her
to the next waypoint. An illustrated example is shown in Figure 3b: the route starts from waypoint A
and ends at waypoint C. The user first receives a "go straight" command when entering the area of
waypoint A, and then a "turn left" command at waypoint B. The coverage size of a waypoint depends
on the size of the intersection or the point of interest. The larger the coverage area is, the larger the
range of a waypoint is. In our implementation, the coverage size of a waypoint is a 3-m, 5-m, or 7-m
radius circle. The key factor for waypoint-based navigation success is accurately determining the
distance between the user and the Lbeacons. For this, in our previous work [7], RSSI distance models
stored on the smartphone were adopted to estimate the distance. However, because of machine
cutting errors and the characteristics of the RF circuit, the RSSI distance model of each Lbeacon is not
identical. To achieve the required positioning accuracy, we constructed a RSSI model for each Lbeacon,
but it was time consuming and unscalable.

Figure 3. The positioning method of waypoint-based navigation [7]. (a) User and Lbeacon; (b) waypoint
and Lbeacons.
To overcome this problem, in this work we first analyzed the characteristics of the RSSI models of
24 Lbeacons randomly selected from the 70 deployed, and classified them into four types. For each
type of Lbeacon, only one RSSI model was used. Because the navigator must give a user a directional
indicator when he/she enters the coverage of a waypoint, we mainly focused on the behavior of the
RSSI curve in the ranges of 0 to 3 m, 3 to 5 m, and 5 to 7 m. As Figure 4 shows, we measured the RSSI
values at locations 1 m to 7 m away from a Lbeacon. For each location, we collected one minute of
RSSI samples (i.e., 240 samples) and took the average as the result. The measurement stops when all
seven locations have been measured.
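
For clarity, the per-location averaging amounts to the following minimal sketch (a hypothetical helper, not the authors' tooling; at the 4 Hz advertisement rate reported in Section 4.2, one minute yields the 240 samples mentioned above).

    // Average one minute of RSSI samples (about 240 at 4 Hz) into a single
    // calibration point for one measured distance (1 m .. 7 m).
    static double averageRssi(java.util.List<Integer> oneMinuteOfSamples) {
        double sum = 0;
        for (int rssi : oneMinuteOfSamples) sum += rssi;   // RSSI in dBm (negative values)
        return sum / oneMinuteOfSamples.size();
    }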
Figure 4. Collecting received signal strength indicator (RSSI) samples at different distances.

As shown in Figure 5a, four Lbeacons, numbered A1, A2, A3, and A4, were classified as type 1,
in which the RSSI value drops inversely with the distance in the ranges of 0–3 m and 5–7 m. Therefore,
type 1 Lbeacons are suitable to cover a waypoint with a radius of 3 m or 7 m. Similarly, type 2
Lbeacons are only suitable to cover a waypoint with a radius of 3 m. Meanwhile, type 3 Lbeacons are
suitable for a waypoint with a radius of 3, 5, or 7 m, since the RSSI value drops inversely with the
distance over all the ranges we measured. Type 4 Lbeacons are suitable for a waypoint with a radius
of 5 m. Based on the measurements, for each type of Lbeacon we adopted a polynomial function as a
regression model to represent the relationship between the distance and the RSSI values. Results are
shown in Figure 6. Given a new Lbeacon of unknown type, we first classify it into one of the four
types based on the characteristic of its RSSI curve. A RSSI model is then picked from the RSSI models
shown in Figure 6. In Section 5, the correctness of the proposed models shown in Figure 6 is evaluated.

Figure 5. Four types of RSSI distance models. (a) Type 1; (b) Type 2; (c) Type 3; (d) Type 4.

All Lbeacons were classified into four types. As Figure 6a–d show, each type has its own RSSI
model. The RSSI model used to determine the distance between a user's smartphone and a Lbeacon
depends on the type of the Lbeacon. When getting close to a Lbeacon, the smartphone uses the received
UUID to look up the type of the Lbeacon and its associated RSSI model, pre-stored in the smartphone.

Figure 6. The regression model of the distance and the RSSI values. (a) The regression model of the
type 1 Lbeacon; (b) the regression model of the type 2 Lbeacon; (c) the regression model of the type 3
Lbeacon; (d) the regression model of the type 4 Lbeacon.
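
In code, the lookup described above (UUID to type, type to model) can be sketched as follows. This assumes the regressions are stored as quadratic coefficients; the UUIDs and numbers below are placeholders, not the fitted values of Figure 6.

    import java.util.Map;

    public class RssiModelTable {
        // Polynomial regression: expected RSSI (dBm) = c0 + c1*d + c2*d*d, d in meters.
        record Model(double c0, double c1, double c2) {
            double expectedRssi(double d) { return c0 + c1 * d + c2 * d * d; }
        }

        // Pre-stored on the smartphone: UUID -> Lbeacon type (illustrative entries).
        static final Map<String, Integer> TYPE_OF_UUID = Map.of("uuid-A1", 1, "uuid-B2", 3);

        // One shared model per type (placeholder coefficients).
        static final Model[] MODEL_OF_TYPE = {
            null,                          // index 0 unused
            new Model(-55, -4.1, 0.12),    // type 1
            new Model(-58, -3.2, 0.10),    // type 2
            new Model(-52, -4.8, 0.15),    // type 3
            new Model(-57, -3.9, 0.11),    // type 4
        };

        static Model modelFor(String uuid) {
            return MODEL_OF_TYPE[TYPE_OF_UUID.getOrDefault(uuid, 1)];
        }
    }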
To tolerate the variability of RSSI values, we considered the RSSI values of nearby Lbeacons.
Let Si and Sj represent the highest and the second-highest RSSI detected by the smartphone, where Si
is the RSSI of waypoint i and Sj is that of waypoint j. Since Si is the highest, the user is regarded as
being at waypoint i. Based on the RSSI models, we can obtain the theoretical values of Si and Sj at
waypoint i; that is, Śi and Śj. If Si – Sj >= Śi – Śj, the user's location is updated to waypoint i. On the
other hand, if Si – Sj < Śi – Śj, the Si is considered a signal surge and is filtered out.
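
Expressed as code, the filter reduces to one comparison between the measured and theoretical RSSI margins (a minimal sketch under the notation above, not the published source):

    // Accept waypoint i only when the measured margin between the strongest (si)
    // and second-strongest (sj) RSSI is at least the theoretical margin predicted
    // by the RSSI models; otherwise treat si as a signal surge and ignore it.
    static boolean updateToWaypoint(double si, double sj,
                                    double siTheoretical, double sjTheoretical) {
        return (si - sj) >= (siTheoretical - sjTheoretical);
    }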
3.3. Route Planning Module

After receiving the information of the user's location and destination, shown in Figure 1, the route
planning (RP) module determines a route to the destination by the well-known Dijkstra's shortest
path algorithm. Based on the route, the RP module updates the AR model placement module with a
direction indicator and an expected face orientation when the user comes to a waypoint. The two
pieces of information are then used for placing a 3D model on the real-world environment.
For example, as Figure 3b shows, the user starts at waypoint A and moves to waypoint B. When the
user enters the coverage of waypoint B, the expected face orientation is east. After the user turns left
and moves forward, his/her expected face orientation at waypoint C is north. For ARBIN, at each
waypoint, if the user's orientation is not the same as the expected face orientation, the associated
directional indicator will not show in the real-world environment. A warning message will pop up
to remind the user, when needed. If this happens, possible reasons are that the user is going the
wrong way, or that the user does not face the expected orientation. The route will be recalculated
if the user is found at an unexpected waypoint. In our implementation, the orientation is obtained
by the IMU (inertial measurement unit) sensors of the smartphone. ARBIN uses the getOrientation()
of the Android Sensor Manager [26] to obtain the orientation. In the above-mentioned example, if B
and C are not detected when the user arrives at D, ARBIN will recalculate the route. Then D will be a
new starting point.
Let R denote the expected orientation at a waypoint. The R is an integer between 0 and 7, each of
which represents a type of orientation, shown in Figure 7a. For example, R = 1 is northeast while R = 2
is east. After the user passes through a waypoint, the R is updated by how many degrees the user
turns to the new orientation. For example, for a turn right instruction, the R is updated by adding
90°. Additionally, for a turn left instruction, the R is updated by adding 270°. Since there are only 8
types of directional indicator in our implementation, we use L, an integer between 0 and 7, to represent
the turning angle of a directional indicator. The definition of each value of L is given in Figure 7b.
When the user enters a waypoint, R is updated by (R + L) mod 8. For example, in Figure 3b, at the
beginning, the user faces to the east and R is 2. When the user comes to waypoint B, the expected face
orientation is east. After the user turns left and moves forward, the expected face orientation R at
waypoint C is updated to 0 (= (2 + 6) mod 8), which is north.
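
In code, this update is one line of modular arithmetic (a minimal sketch; R and L are the 45° steps defined above):

    // R, L in 0..7 (45° steps, Figure 7): expected orientation at the next waypoint.
    static int nextOrientation(int r, int l) {
        return (r + l) % 8;   // e.g. east (2) + left turn (6) -> (2 + 6) % 8 = 0 = north
    }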

Figure 7. The definition of orientation and directional indicators. (a) R values; (b) L values.
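
Closing out this section: the paper names Dijkstra's shortest path algorithm for the route computation but does not list code, so the following is a generic sketch over a waypoint graph (an adjacency map of Lbeacon IDs with corridor lengths in meters; all identifiers are ours, not ARBIN's).

    import java.util.*;

    // Dijkstra's shortest path over a waypoint graph. graph.get(u) maps each
    // neighbor of waypoint u to the corridor length between them, in meters.
    static List<String> shortestRoute(Map<String, Map<String, Double>> graph,
                                      String start, String goal) {
        Map<String, Double> dist = new HashMap<>();
        Map<String, String> prev = new HashMap<>();
        PriorityQueue<String> queue =
                new PriorityQueue<>(Comparator.comparingDouble((String n) -> dist.get(n)));
        dist.put(start, 0.0);
        queue.add(start);
        while (!queue.isEmpty()) {
            String u = queue.poll();
            if (u.equals(goal)) break;                          // shortest route found
            for (Map.Entry<String, Double> e : graph.getOrDefault(u, Map.of()).entrySet()) {
                double alt = dist.get(u) + e.getValue();
                if (alt < dist.getOrDefault(e.getKey(), Double.POSITIVE_INFINITY)) {
                    dist.put(e.getKey(), alt);
                    prev.put(e.getKey(), u);
                    queue.remove(e.getKey());                   // refresh the priority
                    queue.add(e.getKey());
                }
            }
        }
        LinkedList<String> route = new LinkedList<>();          // walk back goal -> start
        for (String n = goal; n != null; n = prev.get(n)) route.addFirst(n);
        return route;
    }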

3.4. Motion Tracking Module

The motion tracking module aims to determine the direction (azimuth) and the pitch of the
smartphone based on the magnetic sensor and the acceleration sensor of a smartphone. Since the
coordinate systems of the smartphone and the earth are different, a transformation is needed before the
sensor readings can be used. As shown in Figure 8a, in our usage scenario, the smartphone should be
kept upright so that a 3D model can be properly put onto a real environment. If the smartphone is laid
flat, shown in Figure 8b, a warning message will be provided to remind the user. Let vector V be the
heading direction of the smartphone. As Figure 8a shows, V is a vector on the X-Z plane. ARBIN uses
the orientation of V as the expected face orientation. Moreover, the pitch of the smartphone should be
greater than 80° before a 3D model can be displayed. The definition of pitch is shown in Figure 8c.

Figure 8. The orientation of a smartphone. (a) Phone is kept upright; (b) Phone is lying flat; (c) The
definition of pitch.

Let X, Y, and Z represent the three axes of the smartphone coordinate system S. In addition, X́,
Ý, and Ź represent those of the earth coordinate system G. The θij is the angle between the i axis of
S and the j axis of G, in which i = X, Y, or Z, and j = X́, Ý, or Ź. The angles can be obtained by the
IMU sensor built into a smartphone. Let (x, y, z) be a point on S and its associated coordinate in G be
(x́, ý, ź). Therefore, we have

    x́ = x cos θxx́ + y cos θxý + z cos θxź,
    ý = x cos θyx́ + y cos θyý + z cos θyź,
    ź = x cos θzx́ + y cos θzý + z cos θzź.

The (x́, ý, ź) can be represented by (x́, ý, ź)T = R (x, y, z)T, where R is the rotation matrix:

            | cos θxx́   cos θxý   cos θxź |
        R = | cos θyx́   cos θyý   cos θyź |
            | cos θzx́   cos θzý   cos θzź |

Therefore, when the smartphone has a rotation around a different axis, we have a different
rotation matrix [27]. They are:

             | 1       0          0        |   | 1       0        0     |
        RP = | 0   cos θyý   cos θyź |  =  | 0    cos P   sin P |,
             | 0   cos θzý   cos θzź |     | 0   –sin P   cos P |

             | cos θxx́   0   cos θxź |    | cos A   0   –sin A |
        RA = |    0      1      0     |  = |   0     1      0   |,
             | cos θzx́   0   cos θzź |    | sin A   0    cos A |

             | cos θxx́   cos θxý   0 |    |  cos O   sin O   0 |
        RO = | cos θyx́   cos θyý   0 |  = | –sin O   cos O   0 |,
             |    0         0      1 |    |    0       0     1 |

in which RP is the rotation matrix when the smartphone has a rotation around the X axis, and P is the
pitch angle. In addition, RA is the rotation matrix when the smartphone has a rotation around the
Y axis, and A is the azimuth angle. Furthermore, RO is the rotation matrix when the smartphone has a
rotation around the Z axis, and O is the roll angle. By using the rotation matrices and rotation angles,
we can transform a coordinate between S and G.

In our implementation, the Android sensor manager (Google Inc., Mountain View, California,
United States) [26] is adopted to transfer the V vector from the smartphone coordinate system, S,
to the earth coordinate system, G, and obtain the pitch of the smartphone. ARBIN invokes the
getRotationMatrix() function to get a rotation matrix, by feeding the sensor readings of the magnetic
sensor and the acceleration sensor. The rotation matrix transformation is used to transform the vectors
and coordinates from the smartphone coordinate system to the earth coordinate system. Based on
the rotation matrix, ARBIN then uses getOrientation() to obtain the orientation, azimuth, and pitch
of the smartphone. Environment noises could affect the correctness of the IMU of the smartphones.
For this reason, ARBIN can be integrated with advanced noise filters or probability models to reduce
the interference. Since sensor calibration and compensation is not the major focus of this work, for the
readers who are interested in this topic, please refer to [28–30].
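
For reference, the standard calling pattern of the two Android Sensor Manager functions named above is sketched below; the wrapper class is ours, while the SensorManager calls are the platform API.

    import android.hardware.SensorManager;

    public final class OrientationReader {
        private final float[] rotationMatrix = new float[9];
        private final float[] orientation = new float[3];  // azimuth, pitch, roll (radians)

        // gravity: latest TYPE_ACCELEROMETER values; geomagnetic: TYPE_MAGNETIC_FIELD values.
        public float[] azimuthAndPitchDegrees(float[] gravity, float[] geomagnetic) {
            if (!SensorManager.getRotationMatrix(rotationMatrix, null, gravity, geomagnetic)) {
                return null;                                // matrix undefined (e.g., free fall)
            }
            SensorManager.getOrientation(rotationMatrix, orientation);
            return new float[] {
                (float) Math.toDegrees(orientation[0]),     // azimuth
                (float) Math.toDegrees(orientation[1]),     // pitch
            };
        }
    }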

3.5. AR 3D Model Placement Module


The purpose of the AR 3D model placement module (APM) is to overlay a 3D model on a
real-world image. The process includes three steps: pitch check, face orientation check, and placement.
Each of the steps is described as follows. First, APM checks whether the smartphone is kept upright. The larger
the pitch angle is, the better the camera view is. In our implementation, the pitch angle is set in the
range of 80 to 90°. If the pitch angle does not meet the requirement, a warning message is displayed
to remind the user to adjust the pitch angle of the smartphone. Second, APM examines whether the
orientation of the smartphone is the same as the expected face orientation. If both the pitch angle
and the orientation of the smartphone meet the required conditions, a 3D model is placed onto a
real environment.
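
The first two checks amount to a simple gate before placement (a sketch; the 80–90° band and the orientation comparison follow the text above):

    // Place the 3D model only when the phone is upright and the user faces
    // the expected direction (orientations are the R values of Section 3.3).
    static boolean mayPlaceModel(float pitchDegrees,
                                 int userOrientation, int expectedOrientation) {
        boolean upright = pitchDegrees >= 80f && pitchDegrees <= 90f;
        return upright && userOrientation == expectedOrientation;
    }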
The 3D model placement relies on visual-inertial odometry (VIO), which first uses a camera
to extract special feature points of the surrounding environment, such as the corners, boundaries,
and blocks. It then continuously matches the features in the contiguous frames to estimate the
movement of the camera. Based on the movement of the camera, the 3D model can then be kept at the
place we expected until the 3D model is not in the field of view of a camera. In our implementation,
we used ViroCore SDK (Viro Media, Inc., Seattle, Washington, United States) [31] to implement the
model placement module. ViroCore is a tool package built on top of AndroidARCore (Google Inc.,
Mountain View, California, United States) [8]. We used getLastCameraPositionRealtime () to get
the camera coordinates, and getLastCameraForwardRealtime () to get the camera shooting direction.
The calibration of camera depth is done by the smartphone itself. In our configuration, the 3D model
is placed at 1 m away from the camera along the camera shooting direction. To have a better view,
the 3D model is further put 30 cm below the camera shooting direction. For example, as Figure 9
shows, the camera coordinate is (0, 0, 0) and the camera shooting direction vector is (0, 0, –1). Taking
the above-mentioned coordinates, ARBIN determines the coordinates of the 3D model by (0, 0, 0) + (0, 0, –1)
+ (0, –0.3, 0) = (0, –0.3, –1), in which the unit is meters. The ARCore then takes (0, –0.3, –1) as input and
adopts VIO technology to place the 3D model in the place we expect.

Figure 9. The coordinates of a 3D model.
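
The placement arithmetic is plain vector addition on the camera pose returned by the two ViroCore getters named above; here is a sketch with the vector type flattened to float arrays:

    // Anchor the 3D model 1 m along the camera's shooting direction and 30 cm
    // below it (Section 3.5); camForward is assumed to be a unit vector.
    static float[] modelPosition(float[] camPos, float[] camForward) {
        return new float[] {
            camPos[0] + camForward[0],
            camPos[1] + camForward[1] - 0.3f,
            camPos[2] + camForward[2],
        };
    }
    // Example: camPos (0, 0, 0), camForward (0, 0, -1) -> (0, -0.3, -1).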

4. Experiment

Our experiment included two parts: in-house experiments and a field trial. The in-house
experiments were undertaken in the Engineering Building (EB) No. 5 of Yunlin University of Science
and Technology (Yuntech). The purpose was to evaluate the orientation determination of a smartphone,
and the correctness of the RSSI models proposed in Section 3.2. After the in-house experiments were
completed, we then conducted a field trial at the National Taiwan University Hospital YunLin Branch
(NTUH-Yunlin), where 35 Lbeacons were deployed in the outpatient area, over 1800 m², covering two
floors. Volunteers were invited to evaluate the responsiveness of ARBIN.

4.1. In-House Experiments

4.1.1. Evaluation of Azimuth of Smartphones


ARBIN relies on information of face orientation to correctly place a 3D model in the real-world
environment. Therefore, it is important to ensure that the azimuth value provided by a smartphone is
correct. According to the Android sensor manager [26], azimuth is the angle between the smartphone’s
current compass direction and magnetic north. If the smartphone faces magnetic north, the azimuth is
0°; if it faces south, the azimuth is 180°. Additionally, if it faces west, the azimuth is 270°, and if it faces
east, the azimuth is 90°.
In our experiment, we investigated two selected smartphones, shown in Table 1, and checked
if they could determine the azimuth correctly. We tested all possible orientations used by ARBIN.
They are north, northeast, east, southeast, south, southwest, west, and northwest. For each direction,
we kept the smartphones upright and recorded the readings of azimuth provided by the smartphone
for 10 s. The average value was taken for evaluation. As Table 2 shows, both the Samsung S10e
(Samsung, Seoul, South Korea) and the SONY Xperia XZ Premium (Sony, Tokyo, Japan) could keep
the error in determining the azimuth below 5°. The ground truth of each measurement was obtained
by a real compass. For example, the azimuth error of the Samsung S10e when facing north ranged
from +2.66° to –3.32°. The maximum error was 4.67°, when it faced northeast. The larger the azimuth
error is, the higher the possibility that the user will be confused. According to our experience, a user's
maximum tolerance level is 20°. The azimuth errors of the two smartphones were small enough to be
tolerated, and will not affect the placement of a 3D model in a real-world environment. No further
calibration of the azimuth angle was required.
Sensors 2020, 20, 5890 13 of 20

Table 1. The smartphones used for the experiment.

Model                     CPU              Memory   OS            Bluetooth

Samsung Galaxy S10e       Exynos 9820      6 GB     Android 9.0   5.0
Sony Xperia XZ Premium    Snapdragon 835   4 GB     Android 9.0   5.0

Table 2. The accuracy of azimuth.

Azimuth (Degree)      Samsung S10e              SONY Xperia XZ Premium
                      Max (°)     Min (°)       Max (°)     Min (°)

North (0)             +2.66       –3.32         +2.35       –1.46
Northeast (45)        +4.67       –1.35         +0.98       –2.18
East (90)             +3.26       –2.54         +1.23       –3.35
Southeast (135)       +2.09       –2.52         +2.33       –3.04
South (180/–180)      –0.03       +0.01         –0.01       +0.01
Southwest (–135)      +1.94       –3.48         +1.74       –3.05
West (–90)            +2.28       –3.77         +3.44       –4.31
Northwest (–45)       +2.49       –3.64         +1.28       –1.65

4.1.2. Responsiveness of ARBIN


This experiment investigated the ability of ARBIN to have a proper reaction when a user comes
close to a Lbeacon. In order to provide a good user experience, we defined 3 m as the responsiveness
distance. ARBIN needs to provide the user with a directional indicator when he or she enters the
area of a circle with a radius of 3 m, where the Lbeacon is at the center of the circle. In other words, it is
meaningless to notify the user when he or she is not in the area, because the distance between the user
and the Lbeacon is too far away. Similarly, notifying the user after he or she has already passed through
the Lbeacon is also useless. The better the responsiveness of ARBIN, the better the user experience
we create. The results of responsiveness can also represent the correctness of the four RSSI models
presented in Section 3.2, because ARBIN relies on the four models to estimate the distance between the
user and a Lbeacon.
The in-house experiment was conducted in EB-No. 5 of Yuntech, where 10 Lbeacons were deployed
on three floors, covering around 250 m². Figure 10 shows the deployment maps. The height of the ceiling
was about 3 m. When a user holds a smartphone for indoor navigation, the distance between the
smartphone and the ground (i.e., 1.5 m) is almost the same as the distance between the smartphone
and the ceiling. Hence, there is no significant difference in RSSI by putting Lbeacons on the ground
or mounting them on the ceiling according to our experiment. To have a quick deployment without
affecting the interior decoration, Lbeacons were put on the ground for the in-house experiment.

Figure 10. The deployment maps of Lbeacons in EB-No. 5 at Yuntech.

Figure 11 shows the experiment setup for measuring responsiveness. The tester walked around in
the building at a normal speed of about one meter per second. As Figure 12 shows, for each Lbeacon,
the tester walked back and forth five times. When receiving the directional indicator indicated by
ARBIN, he stopped and measured the responsiveness distance, L, between the smartphone and the
Lbeacon. As Figure 11 shows, the L is measured by an infrared rangefinder. In other words, for each
Lbeacon, the tester first walked forward to the Lbeacon, then recorded the L after he was notified by
ARBIN's directional instruction. Then, the tester kept walking. After leaving the range of the Lbeacon
(i.e., a circle with radius of 3 m), he turned around and walked forward to the Lbeacon again from
the opposite direction. The same measurement was conducted again when the tester was close to
the Lbeacon. For each Lbeacon, the tester repeated the above-mentioned experiment for five rounds.
The average values of L in both the forward direction and backward direction are shown in Table 3.
For the Lbeacons placed on a north-south corridor, the forward direction pointed to north, and the
backward direction pointed to south. In addition, for the Lbeacons deployed on an east-west corridor,
the forward direction pointed to east, and the backward direction pointed to west.
In our in-house experiment, the responsiveness distance should have been less than 3 m in order
to create a good user experience. As Table 3 shows, in 92.5% (=37/40) of the test cases, ARBIN could
properly notify the tester when the tester was close to a Lbeacon. However, there was an exception at
Lbeacon A3. When the tester held a Samsung Galaxy smartphone and moved forward to Lbeacon A3,
the smartphone notified the tester earlier than was expected. In other words, the smartphone received
a relatively strong signal when it was 5 m away from Lbeacon A3. A possible solution was to slightly
raise the software threshold of RSSI for Lbeacon A3 to defer the notification. We also found that the
responsiveness distance, L, of the same Lbeacon differed in the forward and backward directions.
The possible reason is that the directional antennas of the Lbeacons may have different abilities to
send out signals in different directions. Although the responsiveness distance may depend on the
user’s arrival direction, it does not affect the ability of ARBIN, in providing users a proper directional
indicator. Furthermore, the results showed that there were no cases of notifying the tester after he had
already passed the Lbeacon, which met our requirements.

Figure 11. The experimental setup for measuring responsiveness.

Figure 12. The process of measuring the responsiveness.

Table 3. The results of responsiveness distance.

Lbeacon    Samsung Galaxy S10e (L)                    Sony Xperia XZ Premium (L)
           Forward Direction   Backward Direction     Forward Direction   Backward Direction

A1         2.0 m               0.8 m                  0.2 m               0.4 m
A2         2.0 m               1.1 m                  0.3 m               0.4 m
A3         3.0 m               5.3 m                  1.4 m               0.4 m
B1         2.1 m               1.4 m                  1.1 m               0.4 m
B2         1.0 m               3.1 m                  2.2 m               0.6 m
B3         0.1 m               1.8 m                  0.2 m               0.2 m
B4         2.2 m               2.0 m                  1.2 m               0.2 m
C1         1.5 m               1.6 m                  0.5 m               1.2 m
C2         1.3 m               3.1 m                  1.1 m               1.2 m
C3         2.7 m               3.0 m                  1.5 m               1.9 m
Average    1.7 m               2.3 m                  1.0 m               0.7 m

The Lbeacon is equipped with a directional antenna with conical beams [32]. It can generate a
3 m range and 60° radiation pattern to provide a 3 m horizontal accuracy. However, according to our
experience, the shape of the conical beams is not as perfect as is claimed. Therefore, the responsiveness
distance, L, of the same Lbeacon may differ in forward and backward directions.
4.2. Field Trial
The purpose of the field trial was to evaluate the responsiveness of ARBIN in NTUH-Yunlin by
collecting user feedback. The deployment maps are shown in Figure 13, in which 35 Lbeacons were
deployed in the outpatient area (i.e., B1 and 1F) of the new medical building. Figure 14 shows the
installation of Lbeacons mounted on the ceiling. Each Lbeacon periodically broadcasted its UUID at
4 Hz to nearby smartphones. The starting point was unknown to ARBIN; based on the UUID the
smartphone received, ARBIN automatically determined the starting point. We invited four volunteers,
with an average age of 25, who had neither used ARBIN nor been to the hospital before. They were
asked to judge the responsiveness of ARBIN whenever they arrived at a waypoint. The acceptable
responsiveness distance was set at 5 m because the coverage area of a Lbeacon deployed in the hospital
was a circle with a radius of around 5 m. If ARBIN notified a volunteer after he or she entered the
coverage range of a Lbeacon, the responsiveness was moderate. On the other hand, if ARBIN notified
a user before he or she entered the coverage range (i.e., the responsiveness distance was larger than
5 m), the responsiveness was fast. In addition, the responsiveness was slow when the responsiveness
distance approached zero. These criteria were explained to the volunteers before they started the
testing. A calibration was required for each new smartphone: users stood at a specific location,
for example the entrance of the building, for 5 to 10 s, and ARBIN automatically adjusted its thresholds
based on the received RSSI values (a minimal sketch of this step is given after this paragraph).
Since the purpose of the field trial was to evaluate the user experience, the volunteers judged the
responsiveness visually rather than with an infrared range finder. To simulate an outpatient flow,
the volunteers were asked to visit the following destinations in order: registration counter (A11),
X-ray examination room (B3), pharmacy (A25), and exit (C1). Detailed information on each route and
the Lbeacons on that route is listed in Table 4.
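As a rough illustration of this calibration step, the sketch below averages the RSSI samples collected
while the user stands at the entrance and derives a per-device offset that can be applied to every beacon
threshold. All names (DeviceCalibrator, REFERENCE_RSSI_DBM) are hypothetical, and the reference
value is an assumed constant rather than a figure from our implementation.

    import java.util.ArrayList;
    import java.util.List;

    /**
     * A sketch of the per-device calibration step. All names and the
     * reference constant are assumptions for illustration only.
     */
    public class DeviceCalibrator {

        /** Assumed RSSI (dBm) that a reference phone observes at the calibration spot. */
        private static final double REFERENCE_RSSI_DBM = -70.0;

        private final List<Integer> samples = new ArrayList<>();

        /** Collect RSSI readings while the user stands still for 5 to 10 s. */
        public void addSample(int rssiDbm) {
            samples.add(rssiDbm);
        }

        /**
         * Offset (dBm) added to every beacon threshold so that a phone with a
         * stronger or weaker antenna behaves like the reference device.
         */
        public double thresholdOffsetDbm() {
            if (samples.isEmpty()) {
                return 0.0; // no data: leave the default thresholds untouched
            }
            double sum = 0.0;
            for (int s : samples) {
                sum += s;
            }
            return sum / samples.size() - REFERENCE_RSSI_DBM;
        }
    }

The starting point can be resolved in a similar spirit: the first UUID whose offset-corrected RSSI
exceeds its threshold identifies the Lbeacon, and hence the waypoint, at which the user is standing.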
Figure 13. The deployment maps of Lbeacons in NTUH-Yunlin.
Figure 14. The installation of Lbeacons in the outpatient area of NTUH-Yunlin.
Table 4. The volunteers' feedback on responsiveness.

Actions                          Lbeacons   Volunteer A           Volunteer B           Volunteer C           Volunteer D
                                            (Samsung)             (Asus)                (Oppo)                (Sony)
Task 1: Go to the registration counter and then the clinic.
Go to the registration counter   A6–A11     Moderate              Moderate              Moderate              Moderate
Go to the clinic                 A11–A20    Moderate              Moderate              Moderate              Moderate
                                 A20–A22    Arrive successfully   Arrive successfully   Arrive successfully   Arrive successfully
Task 2: Go to the X-ray examination room.
Go to the stairs                 A22–A20    Moderate              Moderate              Moderate              Moderate
                                 A20–A19    Moderate              Moderate              Moderate              Moderate
                                 A19–A18    Moderate              Moderate              Moderate              Moderate
Down the stairs                  A18–B1     Moderate              Moderate              Moderate              Moderate
Go to the examination room       B1–B3      Arrive successfully   Arrive successfully   Arrive successfully   Arrive successfully
Task 3: Go to the pharmacy.
Go to the stairs                 B3–B1      Moderate              Moderate              Moderate              Moderate
Go up the stairs                 B1–A18     Moderate              Slow                  Moderate              Moderate
Go to the pharmacy               A18–A25    Arrive successfully   Arrive successfully   Arrive successfully   Arrive successfully
Task 4: Go to the exit.
Go to the exit                   A25–A31    Moderate              Moderate              Moderate              Moderate
                                 A31–C1     Arrive successfully   Arrive successfully   Arrive successfully   Arrive successfully
As Table 4 shows, each volunteer passed through nine Lbeacons on their way to the destinations.
They judged the responsiveness whenever they reached a waypoint. The results show that 97% (35/36) of
the user feedbacks were "moderate", which indicates that ARBIN can properly notify users when they
approach a waypoint. Volunteer C marked A18 as slow because he experienced a deferred notification
when approaching A18. A possible solution could be to slightly decrease the threshold of A18 so that
ARBIN reacts properly. In our implementation, there was no 3D AR model placed at the destination.
Therefore, we asked the volunteers to check whether ARBIN could guide them to the destinations
successfully. The results show that ARBIN could successfully guide the volunteers to their destinations.
In addition to the user experience evaluation, we also measured the responsiveness of ARBIN in
the hospital. A Samsung Galaxy S10e and a Sony Xperia XZ Premium were used for the evaluation.
We evaluated the responsiveness of ARBIN along the route: Entrance Hall (A6) -> Registration
counter (A11) -> Outpatient clinic (A22) -> X-ray examination room (B3) -> Pharmacy (A25) -> Exit (C1).
There were 14 waypoints in total on the route. As shown in Figure 15, five different walking patterns of
pedestrians were evaluated. The walking speed was around one meter per second. In our experiment,
the area of a waypoint was set to a radius of 5 m. We used Slow, Moderate, and Fast to represent
responsiveness distances of approximately 0 m, between 0 m and 5 m, and greater than 5 m, respectively;
for example, if the responsiveness distance was larger than 5 m, we marked it as fast. No signal indicates
that ARBIN had no response when the user entered the waypoint. We repeated the experiment five
times and summarize the results in Table 5. In the single mode and side-by-side mode, the responsiveness
distance of both smartphones was always in the range of 0 m to 5 m at every waypoint we measured.
In the triangle mode, the response of the Sony Xperia XZ Premium became slow at waypoint B3.
Additionally, in the line-up mode and stagger mode, the Sony Xperia XZ Premium had no response at
waypoints B3 and A10, respectively. The possible reason may have been the poor antenna design of the
Sony Xperia XZ Premium. The situation could be improved by deploying one more Lbeacon at these
waypoints to reduce the possibility of no reaction or slow reaction.
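To make the labeling rule concrete, the following Java sketch maps a measured responsiveness
distance L to the categories used in Table 5. The enum and the 0.5 m cutoff separating slow from
moderate are our own assumptions for illustration; strictly, slow corresponds to a responsiveness
distance approaching zero.

    /**
     * A hypothetical mapping from the measured responsiveness distance L to the
     * labels used in Table 5; the 0.5 m cutoff for "slow" is an assumption.
     */
    public enum Responsiveness {
        NO_SIGNAL, SLOW, MODERATE, FAST;

        /**
         * @param distanceMeters responsiveness distance L, or null when ARBIN
         *                       never reacted inside the 5 m waypoint area
         */
        public static Responsiveness classify(Double distanceMeters) {
            if (distanceMeters == null) {
                return NO_SIGNAL;   // no reaction at all
            }
            if (distanceMeters > 5.0) {
                return FAST;        // reacted before entering the waypoint area
            }
            if (distanceMeters > 0.5) {
                return MODERATE;    // reacted inside the 0-5 m area
            }
            return SLOW;            // reacted only when almost at the beacon
        }
    }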
Figure 15. Different walking patterns of users.
Table 5. The responsiveness of ARBIN under different walking patterns.

Model                     Walking Pattern   No Signal   Slow     Moderate   Fast
Samsung Galaxy S10e       Single            0           0        14         0
                          Line up           0           0        14         0
                          Triangle          0           0        14         0
                          Side by side      0           0        14         0
                          Stagger           0           0        14         0
Sony Xperia XZ Premium    Single            0           0        14         0
                          Line up           1 (B3)      0        13         0
                          Triangle          0           1 (B3)   13         0
                          Side by side      0           0        14         0
                          Stagger           1 (A10)     0        13         0
5. Conclusions
In this paper, we presented ARBIN, an augmented reality-based indoor navigation system, to guide
users to their destinations in an indoor environment. When users enter the range of a waypoint,
ARBIN posts a 3D directional indicator into the real-world surrounding environment. With the support
of augmented reality, it is easier for users to determine their locations when walking inside a building.
To address the heterogeneity problem of Lbeacons, four types of RSSI model were proposed.
Our experiences in correctly placing a 3D model in a real-world environment were also explained.
Further, we conducted both in-house experiments and a field trial to verify the responsiveness and
practicality of ARBIN. The in-house experiments showed that in 92.5% of the test cases, ARBIN could
provide users with a proper directional indicator when they came close to a Lbeacon. For the field trial,
four volunteers were invited; 97% (35/36) of their feedbacks were moderate. Our results show that
ARBIN can achieve 3 to 5 m accuracy and provide users with correct instructions on their way to their
destinations. ARBIN proved to be a practical solution for indoor navigation, especially for large
buildings. To further enhance the user experience, we plan in the future to extend the capability of
ARBIN by adding landmark objects into real-world environments and showing advertisement messages
provided by a surrounding information system.
Author Contributions: Conceptualization, B.-C.H., J.H., E.T.-H.C. and H.-M.W.; Data curation, B.-C.H.;
Formal analysis, B.-C.H.; Funding acquisition, J.H. and E.T.-H.C.; Investigation, J.H., E.T.-H.C. and H.-M.W.;
Methodology, B.-C.H., J.H. and E.T.-H.C.; Project administration, J.H., E.T.-H.C. and H.-M.W.; Resources, J.H. and
E.T.-H.C.; Software, B.-C.H., J.H. and H.-M.W.; Supervision, J.H. and E.T.-H.C.; Validation, J.H. and E.T.-H.C.;
Writing—original draft, B.-C.H.; Writing—review and editing, E.T.-H.C. All authors have read and agreed to the
published version of the manuscript.
Funding: This work was supported by National Taiwan University Hospital YunLin Branch Project NTUHYL
109.C027 and Ministry of Science and Technology, Taiwan.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Rehman, U.; Cao, S. Augmented-Reality-Based Indoor Navigation: A Comparative Analysis of Handheld
Devices Versus Google Glass. IEEE Trans. Hum. Mach. Syst. 2017, 47, 140–151. [CrossRef]
2. Mulloni, A.; Wagner, D.; Barakonyi, I.; Schmalstieg, D. Indoor positioning and navigation with camera
phones. IEEE Pervasive Comput. 2009, 8, 22–31. [CrossRef]
3. Mulloni, A.; Seichter, H.; Schmalstieg, D. Handheld augmented reality indoor navigation with activity-based
instructions. In Proceedings of the 13th International Conference on Human Computer Interaction with
Mobile Devices and Services, Stockholm, Sweden, 30 August–2 September 2011; pp. 211–220.
4. Huey, L.C.; Sebastian, P.; Drieberg, M. Augmented reality based indoor positioning navigation tool.
In Proceedings of the IEEE Conference on Open Systems, Langkawi, Malaysia, 25–28 September 2011;
pp. 256–260.
5. Kasprzak, S.; Komninos, A.; Barrie, P. Feature-based indoor navigation using augmented reality.
In Proceedings of the 9th International Conference on Intelligent Environments, Athens, Greece, 18–19 July
2013; pp. 100–107.
6. Kim, J.; Jun, H. Vision-based location positioning using augmented reality for indoor navigation. IEEE Trans.
Consum. Electron. 2008, 54, 954–962. [CrossRef]
7. Chu, E.T.-H.; Wang, S.C.; Liu, J.W.S.; Hsu, J.; Wu, H.M. WPIN: A waypoint-based indoor navigation system.
In Proceedings of the IEEE 10th International Conference on Indoor Positioning and Indoor Navigation,
Pisa, Italy, 30 September–3 October 2019.
8. ARCore. Available online: https://developers.google.com/ar (accessed on 25 August 2020).
9. Reitmayr, G.; Schmalstieg, D. Location based applications for mobile augmented reality. In Proceedings of
the 4th Australasian User Interface Conference, Adelaide, SA, Australia, 4–7 February 2003; pp. 65–73.
10. Sato, F. Indoor Navigation System Based on Augmented Reality Markers. In Proceedings of the 11th
International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing, Torino, Italy,
10–12 July 2017; pp. 266–274.
11. Feng, C.; Kamat, V.R. Augmented reality markers as spatial indices for indoor mobile AECFM applications.
In Proceedings of the 12th International Conference on Construction Applications of Virtual Reality, Taipei,
Taiwan, 1–2 November 2012; pp. 235–242.
12. Al Delail, B.; Weruaga, L.; Zemerly, M.J. CAViAR: Context aware visual indoor augmented reality for a
university campus. In Proceedings of the IEEE/WIC/ACM International Conferences on Web Intelligence
and Intelligent Agent Technology, Macau, China, 4–7 December 2012; pp. 286–290.
13. Koch, C.; Neges, M.; König, M.; Abramovici, M. Natural markers for augmented reality-based indoor
navigation and facility maintenance. Autom. Constr. 2014, 48, 18–30. [CrossRef]
14. Romli, R.; Razali, A.F.; Ghazali, N.H.; Hanin, N.A.; Ibrahim, S.Z. Mobile Augmented Reality (AR)
Marker-based for Indoor Library Navigation. IOP Conf. Ser. Mater. Sci. Eng. 2020, 767, 012062. [CrossRef]
15. Al Delail, B.; Weruaga, L.; Zemerly, M.J.; Ng, J.W.P. Indoor localization and navigation using smartphones
augmented reality and inertial tracking. In Proceedings of the IEEE International Conference on Electronics
Circuits and Systems, Abu Dhabi, UAE, 8–11 December 2013; pp. 929–932.
16. Nam, G.H.; Seo, H.S.; Kim, M.S.; Gwon, Y.K.; Lee, C.M.; Lee, D.M. AR-based Evacuation Route Guidance
System in Indoor Fire Environment. In Proceedings of the 25th Asia-Pacific Conference on Communications
(APCC), Ho Chi Minh City, Vietnam, 6–8 November 2019; pp. 316–319.
17. Wu, J.; Huang, C.; Huang, Z.; Chen, Y.; Chen, S. A Rapid Deployment Indoor Positioning Architecture based
on Image Recognition. In Proceedings of the IEEE 7th International Conference on Industrial Engineering
and Applications (ICIEA), Bangkok, Thailand, 16–21 April 2020; pp. 784–789.
18. Gerstweiler, G.; Vonach, E.; Kaufmann, H. HyMoTrack: A mobile AR navigation system for complex indoor
environments. Sensors 2015, 16, 17. [CrossRef] [PubMed]
19. Metaio SDK. Available online: http://www.metaio.com/products/sdk (accessed on 25 August 2020).
20. Rustagi, T.; Yoo, K. Indoor AR navigation using tilesets. In Proceedings of the 24th ACM Symposium on
Virtual Reality Software and Technology, Tokyo, Japan, 28 November–1 December 2018; pp. 1–2.
21. MapBox. Available online: https://docs.mapbox.com/api/ (accessed on 25 August 2020).
22. Unity. Available online: https://unity.com/ (accessed on 25 August 2020).
23. Koc, I.A.; Serif, T.; Gören, S.; Ghinea, G. Indoor Mapping and Positioning using Augmented Reality.
In Proceedings of the 7th International Conference on Future Internet of Things and Cloud (FiCloud),
Istanbul, Turkey, 26–28 August 2019; pp. 335–342.
24. ARKit. Available online: https://developer.apple.com/documentation/arkit (accessed on 25 August 2020).
25. Choi, H.; Lim, K.; Ko, Y. Improved Virtual Anchor Selection for AR-assisted Sensor Positioning in Harsh
Indoor Conditions. In Proceedings of the Global Internet of Things Summit (GIoTS), Dublin, Ireland,
3 June 2020; pp. 1–6.
26. Android Sensor Manager. Available online: https://developer.android.com/reference/android/hardware/
SensorManager (accessed on 25 August 2020).
27. Shuster, M.D. A survey of attitude representations. Navigation 1993, 8, 439–517.
28. Kok, M.; Hol, J.D.; Schön, T.B.; Gustafsson, F.; Luinge, H. Calibration of a magnetometer in combination with
inertial sensors. In Proceedings of the 15th International Conference on Information Fusion, Singapore, 9–12
July 2012; pp. 787–793.
29. Wang, Y.; Zhang, J.Y.; Zhang, D.W. Error Analysis and Algorithm Validation of Geomagnetic Sensor.
Appl. Mech. Mater. 2015, 742, 21–26. [CrossRef]
30. Zhou, Y.; Zhang, X.; Xiao, W. Calibration and compensation method of three-axis geomagnetic sensor based
on pre-processing total least square iteration. J. Instrum. 2018, 13, T04006. [CrossRef]
31. ViroCore SDK. Available online: https://viromedia.com/virocore (accessed on 25 August 2020).
32. Li, C.C.; Su, J.; Chu, E.T.-H.; Liu, J.W.S. Building/environment Data/information Enabled Location Specificity
and Indoor Positioning. IEEE Internet Things J. 2017, 6, 2116–2128. [CrossRef]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional
affiliations.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).