Abstract
Augmented Reality (AR) is a technology that has been used in solutions to reduce the difficulties of human navigation. In this paper we survey a collection of such solutions, focusing on those with visualization elements meant to guide users. We propose a holistic framework for categorizing and understanding these solutions, inspired by the Model-View-Controller software design pattern. The collected solutions are analyzed with our framework, and trends in the research are identified along with unsolved problems and potential future work.
Categories and subject descriptors: augmented reality, virtual reality, software development, human navigation, human-computer interaction
Keywords: navigation, survey
1. INTRODUCTION
Human navigation is dependent on a mixture of mobility and orientation skills. People use either path integration, orienting themselves relative to a starting position using proprioceptive information, or landmark-based navigation, which relies on perceptual cues along with an external or cognitive map. Path integration and landmark-based navigation can also be used together. Decreasing skill requirements for mobility and orientation would decrease the barriers to human navigation; if both barriers were reduced or eliminated, people could easily move to wherever they wished without risk of failure or delays.
This paper surveys a particular method of reducing orientation skill requirements: visual Augmented Reality (AR) navigation. Using AR technology, it is possible to provide more visual cues than occur naturally in the environment, allowing for improved landmark-based navigation. This additional information reduces the orientation skill requirements of the user and allows them to navigate more efficiently. However, most papers involving a visual AR navigation solution focus on the localization aspect: the methods by which the AR device orients itself. The visualization element is often an afterthought, with little or no justification for why one visualization was chosen over another. This paper seeks to provide a more comprehensive framework for such AR solutions, one that takes into account a holistic view of the elements of a human navigation system. This framework is applied to a wide selection of gathered papers, and notable weaknesses and strengths in the visual AR navigation research are identified. Visualizations are not the only possible AR aid for navigation; verbal directions or haptic feedback can also act as navigation solutions. However, with recent developments in industry related to AR devices, such as the Apple Vision Pro or the Magic Leap 2, the likelihood of widespread adoption of the technology appears increasingly dependent on optimizing the user’s experience. The visualization element of user interfaces is important in most Human-Computer Interaction (HCI) scenarios, and especially with AR devices. For this reason, this survey focuses on that element, with the goal of contributing a bird’s-eye perspective on the field of AR visual navigation for other researchers, industry leaders, and developers. Specifically, this paper aims to answer two research questions. First, what is the current state of research in the field of AR visual navigation? Second, what challenges does the field of AR visual navigation face?
The methods for paper collection, following the PRISMA guidelines Page et al. [2021], started with defining inclusion and exclusion criteria. The main inclusion criterion was that the paper contain either a screenshot of a visual AR navigation solution in action or a detailed description of a proposed visual AR navigation solution. No exclusion criteria were necessary, as the inclusion criterion drastically reduced the number of eligible papers. Three databases were used in the search: IEEE, ACM, and Google Scholar. The same search strategy was used for all three, with the search terms “AR indoor navigation” and “augmented reality indoor navigation,” as well as shortened versions of those phrases, “augmented reality navigation” and “indoor navigation.” No publication date constraints or other filters were used. Papers with thorough references were also used as sources for the ‘snowballing’ Molléri et al. [2016] method of paper collection. Once all candidate papers were selected, an abstract screening Polanin et al. [2019] weeded out unsuitable candidates, and a thorough review of the remaining papers further reduced the number of relevant papers. This paper first discusses related work, including similar survey attempts, then proposes a new model for understanding AR navigation. Each of the three components of the model is examined in turn, with the bulk of the collected papers examined in relation to the third element: visualization. Trends and observations on the collected work are discussed and current problems in the field are identified, with suggestions of potential future work.
2. RELATED WORK
There are few attempts at surveying the field of AR navigation research, and they are typically less robust than similar surveys in related fields. The work by Bhorkar [2017], for example, provides a limited analysis and categorization of different types of AR navigation, such as pedestrian versus car navigation or indoor versus outdoor navigation. The paper by Sharma et al. [2020] similarly categorizes AR navigation research, but only provides surface-level analysis. There are more comprehensive survey papers on navigation generally, most notably the survey by Fallah et al. [2013], which provides an excellent baseline for understanding navigation conceptually, as well as useful examples of each type of navigation.
3. MODEL FOR HUMAN NAVIGATION
A common design pattern in software is the Model-View-Controller (MVC) Gamma et al. [1994]. It is useful in its separation of the different aspects of software, dividing it into the data, the representation of that data to the user, and the means by which the user interacts with the software. We propose a similar design paradigm for visual AR navigation solutions: the Model is the method of localization, the View is the visualization element that directs the user, and the Controller is the action the user takes to proceed to their destination, along with any peripheral actions related to the navigation. These functions are shown in Figure 1. The MVC is useful in describing visual AR navigation solutions because it makes clear distinctions between the three components, and can provide methods to study each. For example, the localization (Model) can be tested via accuracy tests, infrastructure costs, or computation rates, while one visualization type (View) can be tested against other visualization types. The Controller can be tested via user studies, with both objective feedback like user movement data and subjective survey questions. Finally, the MVC indicates a spectrum on which a visual AR navigation solution can be placed: navigation automation versus human control. On the navigation automation end, the Model does the majority of the navigation work, requiring the user to only follow very specific directions from the View. On the human control side, the user does some navigation work, like identifying signs or interpreting feedback from the View. Looking at the gathered papers through the prism of this framework, two of the three aspects of the MVC are understudied: the View and the Controller. First, in reviewing the collected papers, we formed the impression that the researchers generally used whatever default visualization element was available in the programming library with which they implemented their AR solution.
There are exceptions, but broadly the visualization element is neither discussed nor treated as an important facet of the solution. This neglect is unfortunate, because if a goal of AR navigation research is to prepare for the public to adopt AR navigation solutions, the visualization element may be important to their experience. This leads to the second trend: there is a worrying lack of user studies for testing visual AR navigation solutions, and those that exist often focus on the accuracy of localization. Moreover, these studies frequently involve only a few users, sometimes only members of the research team. There are exceptions; studies conducted by Möller et al. [2012], Cao et al. [2018], Lu et al. [2021], and Wen et al. [2014] feature high numbers of user study participants. Such studies demonstrate that localization need not be tested at the cost of analyzing the user experience. But for the most part, the human aspects of these solutions were taken for granted.
Figure 1.

AR MVC Diagram.
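As a concrete illustration of this separation of concerns, the three roles can be sketched as a minimal program skeleton. The class names (`Pose`, `Model`, `View`, `CompassArrowView`, `FixedModel`) and the turn-cue logic below are hypothetical, a sketch of the framework rather than the architecture of any surveyed system:

```python
import math
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Pose:
    """Position in metres and heading in degrees clockwise from north (+y)."""
    x: float
    y: float
    heading: float

class Model(ABC):
    """Model: localization, estimating the user's current pose."""
    @abstractmethod
    def locate(self) -> Pose: ...

class View(ABC):
    """View: the visualization element that directs the user."""
    @abstractmethod
    def render(self, pose: Pose, destination: Pose) -> str: ...

class CompassArrowView(View):
    """A floating 'compass' arrow, reduced here to a textual turn cue."""
    def render(self, pose: Pose, destination: Pose) -> str:
        bearing = math.degrees(math.atan2(destination.x - pose.x,
                                          destination.y - pose.y)) % 360
        turn = (bearing - pose.heading) % 360
        if turn < 45 or turn > 315:
            return "ahead"
        return "right" if turn < 180 else "left"

class FixedModel(Model):
    """A stand-in Model that reports a constant pose."""
    def __init__(self, pose: Pose):
        self._pose = pose
    def locate(self) -> Pose:
        return self._pose

# The Controller is the user's response to the cue; here we only print
# the cue the user would act on.
model = FixedModel(Pose(0.0, 0.0, 0.0))                  # facing north
view = CompassArrowView()
cue = view.render(model.locate(), Pose(5.0, 0.0, 0.0))   # target due east
print(cue)
```

For a user at the origin facing north with a destination due east, the printed cue is "right". Swapping in a different View subclass changes only the visualization, leaving the localization (Model) and the user's response (Controller) untouched, which is the testability argument made above.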
4. LOCALIZATION
Localization acts as the Model for visual AR navigation solutions. Localization techniques are varied, but are usefully classified into four categories by Fallah et al. [2013]: dead-reckoning, direct-sensing, triangulation, and pattern-recognition. Dead-reckoning uses estimates from a previously known position to update the current position; IMUs are the most relevant device type for this category. Direct-sensing involves using features of an environment to identify a location; examples often include WiFi signals or Bluetooth beacons. Triangulation is similar to direct-sensing, but uses three or more environmental features to calculate position. Pattern-recognition attempts to match environment features with previous sensor readings to establish position; computer vision is the most relevant technique in this category. With respect to visual AR navigation localization, direct-sensing and pattern-recognition tend to be common for indoor navigation, while triangulation tends to be the localization solution for outdoor navigation. This is because GPS, which is classified as a triangulation method, is sufficient for the lower accuracy requirements of outdoor navigation, especially navigation for driving. Indoors, however, GPS is not accurate enough, and is often blocked by walls and ceilings. Indoor navigation localization is a major focus of the current research for this reason; finding a method of localization as universally accepted as GPS is for outdoor navigation is an open research goal. However, the focus of this survey is not localization, as the noted work by Fallah et al. [2013] remains relevant and useful as a primer for such research.
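As a worked example of the first category, a pedometer-based dead-reckoning update can be expressed in a few lines. The stride length and the update rule here are illustrative assumptions, not drawn from any surveyed system:

```python
import math

def dead_reckon(x, y, steps, heading_deg, stride_m=0.7):
    """Update a previously known position (x, y) from a step count and a
    compass heading, the classic pedometer-plus-compass dead-reckoning
    step. Heading is degrees clockwise from north (+y); stride_m is an
    assumed average stride length in metres."""
    d = steps * stride_m
    rad = math.radians(heading_deg)
    return (x + d * math.sin(rad), y + d * math.cos(rad))

# Walk 10 steps due east from the origin, then 10 steps due north.
x, y = dead_reckon(0.0, 0.0, steps=10, heading_deg=90.0)
x, y = dead_reckon(x, y, steps=10, heading_deg=0.0)
print(round(x, 2), round(y, 2))
```

In practice, errors in the heading and stride estimates accumulate with every update, which is why dead-reckoning is typically paired with another localization technique rather than used alone.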
5. CONTROLLER
The action the user takes in response to the Visualization is the Controller for visual AR navigation solutions. Usually this action is to move as suggested by the Visualization, such as following a line or moving in the direction of an arrow. With some Visualizations, however, the actions taken in response can be more complex. In particular, the type of AR device can influence the types of actions a user takes. A smartphone might be held in a variety of positions, and a user might be less likely to ‘look around’ with the phone’s camera than to move their gaze with an AR headset. The AR devices used in the 71 surveyed papers are broken down as follows: 26 Android smartphones, 18 unspecified smartphones, 5 iPhone smartphones, 3 Oculus headsets, 7 unspecified or custom AR headsets, 3 Google Glass headsets, 2 HoloLens headsets, 1 Magic Leap headset, 1 HTC Vive Pro headset, 1 unspecified tablet, 1 iPad Air tablet, 1 Palm Pilot tablet, and 7 custom driving simulator AR displays. The smartphone is used in the majority of the surveyed papers, likely due to its relative cheapness compared to dedicated AR headsets and the ubiquity of smartphones in the general population. One undesirable response to a Visualization cue is confusion, which stops users in their tracks while they reorient themselves. Another possible action is to search the environment themselves, either to verify the accuracy of the Visualization or to try to memorize their route. In general, the response of the user to the Visualization is an unstudied aspect of the model. The papers cited in this survey rarely mention actions the user takes outside the assumed movement along a route. This is possibly because navigation and movement are natural to humans, so making note of the user’s responses is considered unimportant. It is possible that these responses are unimportant, but the ultimate success of AR navigation solutions may depend on a full understanding of the model.
Noting strange or unexpected user activity may uncover lurking problems interfering with the adoption of AR navigation.
6. VISUALIZATION
Visualization refers to the visual AR component guiding the user to their destination, and acts as the View for visual AR navigation solutions. Other visualization elements, like menu screens, buttons, and other miscellaneous UI elements, also fall under the View, but the primary focus is the visualization of the navigation, as it is the element unique to AR navigation. It should be noted that other types of AR besides visual, such as auditory and tactile, are also subjects of research. Assistive technology commonly uses these other senses; however, this is not the focus of this survey. This paper narrows the scope to the visual AR components, in an effort to understand the most common form of AR navigation. In surveying the collected papers, we observed ten recurring categories of AR visual components. A notable caveat in these categories is that some AR solutions use more than one form of visualization, such as the solution by Huang et al. [2012], which used both a line and a live map, or the solution by Zhang and Nakajima [2020], which used both a line and a virtual cat character. The papers are sorted by visualization and then year in Table 3, along with other details relevant to their AR solutions, and each paper is further examined in the rest of this section.
Table 3:
Collected Papers
| Paper Title and Reference | Year | Localization Method | Visualization Type | Device | Use Case |
|---|---|---|---|---|---|
| ARGuide Pro: An AR-based Indoor Navigation by Juile et al. [2024] | 2024 | SLAM | Arrow | Android | University Building |
| Improving Pedestrian Navigation in Urban Environment Using Augmented Reality and Landmark Recognition by Kumar et al. [2024] | 2024 | Pedometer and Feature Recognition | Arrow | Android | University Campus |
| Design and Development of Object Detection System in Augmented Reality Based Indoor Navigation Application by Yunardi et al. [2022] | 2022 | Bluetooth Beacons | Arrow | Android | University Building |
| Towards Underwater Augmented Reality Interfaces to Improve the Navigation Experience by Szyszka and Kunze [2022] | 2022 | GPS and SLAM | Arrow | AR Headset | Underwater |
| AR-Based Navigation Using RGB-D Camera and Hybrid Map by Chidsin [2021] | 2021 | SLAM | Arrow | Camera and Computer | Building Interior |
| Research and Application of Indoor Guide Based on Mobile Augmented Reality System by Yan et al. [2015] | 2020 | Feature Recognition | Arrows | Android | Building Interior |
| AR Navigation Application based on iOS Platform by Wang and Zhang [2019] | 2019 | GPS | Arrows | iPhone | Sidewalks |
| Older Pedestrians Navigating With AR Glasses and Bone Conduction Headset by Montuwy et al. [2018] | 2018 | GPS | Arrow | AR Glasses | Sidewalks |
| A Comparative Study of Simulated Augmented Reality Displays for Vehicle Navigation by Jose et al. [2016] | 2016 | Virtual Environment | Arrow | Oculus Rift | Virtual City Streets |
| A New Approach for Indoor Navigation Using Semantic Webtechnologies and Augmented Reality by Matuszka et al. [2013] | 2013 | Pedometer and Wifi | Arrow | Android | Research Building and Shopping Center |
| User Awareness of Tracking Uncertainties in AR Navigation Scenarios by Pankratz et al. [2013] | 2013 | Infrared | Arrow, Line, Virtual Character | AR Headset and Smartphone | Building Interior |
| A Mobile Indoor Navigation System Interface Adapted to Vision-Based Localization by Möller et al. [2012] | 2012 | Feature Recognition | Arrow | VR Headset and Phone | Virtual Building Interior and Building Interior |
| AR-Based Indoor Navigation System For Personal Locating by Kim et al. [2006] | 2006 | GPS | Arrow | Custom AR Headset | University Building |
| A Wearable Augmented Reality System for Navigation Using Positioning Infrastructures and a Pedometer by Tenmoku et al. [2003] | 2003 | RFID Tags and Pedometer | Arrow | Custom AR Headset | City Streets |
| A Hybrid Indoor Navigation System by Butz et al. [2001] | 2001 | Infrared | Arrow | Palm Pilot | Building Interior |
| Reliability and Accuracy of Indoor Warehouse Navigation Using Augmented Reality by Hořejší et al. [2024] | 2024 | Feature Recognition and QR Markers | Fishbone | Smartphone | Warehouse |
| CityAR: Augmented reality navigation in the smart cities infrastructure by Archangelskaya et al. [2022] | 2022 | Feature Recognition | Fishbone and Line | Smartphone | Museum |
| Indoor AR Navigation and Emergency Evacuation System Based on Machine Learning and IoT Technologies by Yoo and Choi [2022] | 2022 | Feature Recognition and Bluetooth Beacons | Fishbone | Smartphone | Disaster Evacuation and Building Interior |
| An ARCore-Based Augmented Reality Campus Navigation System by Lu et al. [2021] | 2021 | SLAM and IMUs | Fishbone and Map | Android | University Campus |
| Augmented Reality Navigation by Sharma et al. [2020] | 2020 | GPS and IMUs | Fishbone | Android | Sidewalks |
| Disha-Indoor Navigation App by Birla et al. [2020] | 2020 | Marker Recognition | Fishbone | Smartphone | Sidewalks |
| Indoor Navigation System for Evacuation Route in case of Fire by using Environment and Location Data by Lee et al. [2020] | 2020 | Bluetooth Beacons | Fishbone | Smartphone | Fire Evacuation and Building Interior |
| Navigation Based Application with Augmented Reality and Accessibility by Rocha and Lopes [2020] | 2020 | 3D Point Clouds and SLAM | Fishbone with Dots | Smartphone | Building Interior |
| A Field Study to Collect Expert Knowledge for the Development of AR HUD Navigation Concepts by Schneider et al. [2019] | 2019 | GPS, Accelerometer, Gyroscope, and FlexRay | Fishbone | Integrated Vehicle AR Interface | City Streets |
| A Smartphone Indoor Positioning System Using Hybrid Localization Technology by Gang and Pyun [2019] | 2019 | Bluetooth Beacons and IMUs | Fishbone | Android | Building Interior |
| Indoor AR Navigation Using Tilesets by Rustagi and Yoo [2018] | 2018 | Feature Recognition | Fishbone and Line | Smartphone | Museum |
| Really, It’s for Your Own Good... Making Augmented Reality Navigation Tools Harder to Use by Wen et al. [2014] | 2014 | Virtual Environment | Fishbone and Line | Computer Display | Virtual City Streets |
| If Reality Bites, Bite Back Virtually: Simulating Perfection in Augmented Reality Tracking by Wen et al. [2013] | 2013 | Virtual Environment | Fishbone and Line | Computer Display and Oculus Rift | Virtual City Streets |
| An Indoor Navigation Methodology for Mobile Devices by Integrating Augmented Reality and Semantic Web by Rubio-Sandoval et al. [2021] | 2021 | Semantic Web Feature Recognition | Line | Android | Building Interior |
| AR Based Indoor Navigation System by Nikam et al. [2021] | 2021 | QR Codes | Line | Android | Building Interior |
| AR Smart Navigation System by Sayapogu et al. [2021] | 2021 | SLAM and IMUs | Line | Android | Building Interior |
| Gamified Navigation System: Enhancing Resident User Experience in City Exploration by Zhang and Nakajima [2020] | 2020 | GPS | Line and Virtual Character | iPhone | Sidewalks |
| Introduction to AR-Bot, an AR system for robot navigation by Alleaume et al. [2020] | 2020 | LiDAR and 3D Point Clouds | Line | Google Pixel Smartphone | Building Interior |
| Guiding People in Complex Indoor Environments Using Augmented Reality by Gerstweiler [2018] | 2018 | SLAM | Line | Tablet | Airport |
| Augmented reality navigation systems by Narzt et al. [2006] | 2006 | GPS | Line | Integrated Vehicle AR Interface | City Streets |
| Amateur: Augmented Reality Based Vehicle Navigation System by Cao et al. [2018] | 2018 | GPS and Motion Sensors | Line with Arrow | Nexus 5X Smartphone | City Streets |
| Augmented Reality Based Smart Supermarket System with Indoor Navigation using Beacon Technology by Jayananda et al. [2018] | 2018 | Bluetooth Beacons | Line with Arrow | Smartphone | Supermarket |
| An Investigation of Augmented Reality Presentations of Landmark-Based Navigation using a Head-Up Display by Bolton et al. [2015] | 2015 | Virtual Environment | Line with Arrow and Highlighted Landmarks | Bespoke Virtual Driving Device | Virtual City Streets |
| Experimental Evaluation of User Interfaces for Visual Indoor Navigation by Möller et al. [2014] | 2014 | Feature Recognition | Line with Arrow | Smartphone | Building Interior |
| The Next Generation of GPS Navigation Systems by Huang et al. [2012] | 2012 | GPS and IMUs | Line with Arrow and Map | Smartphone | Sidewalks |
| An experimental virtual museum based on augmented reality and navigation by Mata et al. [2011] | 2011 | Feature Recognition | Line with Arrow | iPhone | Museum |
| Augmented Reality vs. Street Views: A Driving Simulator Study Comparing Two Emerging Navigation Aids by Medenica et al. [2011] | 2011 | Virtual Environment | Line with Arrow | Bespoke Virtual Driving Device | Virtual City Streets |
| Perception Thresholds for Augmented Reality Navigation Schemes in Large Distances by Tonnis et al. [2008] | 2008 | Virtual Environment | Line with Arrow | Fixed-base Driving Simulator | Virtual City Streets |
| Indoor Localisation and Navigation on Augmented Reality Devices by Gupta et al. [2016] | 2016 | SLAM and IMUs | Map | Google Glass | Building Interior |
| Campus Navigation System Based on Mobile Augmented Reality by Qin et al. [2013] | 2013 | Feature Recognition and GPS | Map | Android | University Campus |
| Indoor Navigation with Mobile Augmented Reality and Beacon Technology for Wheelchair Users by Oliveira et al. [2017] | 2017 | Feature Recognition and Bluetooth Beacons | Map and Floating Marker | Android | Building Interior |
| Design of a Mobile Augmented Reality-based Indoor Navigation System by Ng and Lim [2020] | 2020 | Wifi and IMUs | Virtual Character | Android | Building Interior |
| Outdoor Navigation with Handheld Augmented Reality by Wu and Yu [2018] | 2018 | GPS and Wifi | Virtual Character | Android | Sidewalks |
| An Evaluation of AR-Assisted Navigation for Search and Rescue in Underground Spaces by Demirkan and Duzgun [2020] | 2020 | Virtual Environment | Lit Surfaces | Magic Leap AR Headset | Virtual Underground Mine |
| Investigating Smartphones and AR Glasses for Pedestrian Navigation and their Effects in Spatial Knowledge Acquisition by Lakehal et al. [2021] | 2020 | GPS | Written Directions | AR glasses and Smartphone | Sidewalks |
| Quadcopter-Projected In-Situ Navigation Cues for Improved Location Awareness by Knierim et al. [2018] | 2018 | GPS | Projected Arrow | Smartphone | Sidewalks |
| Personal Navi: Benefits of an Augmented Reality Navigational Aid Using a See-Thru 3D Volumetric HUD by Bark et al. [2014] | 2014 | Virtual Environment | Paper Airplane | Driving Simulator | Virtual City Streets |
6.1. Arrow
The most common visualization in the collected papers was the arrow. It varied in implementation, however, including floating ‘compass’ arrows, which stay in the same position but point in the suggested direction; arrows that ‘stick’ to the floor or walls to act as ‘signposts’; and various groupings of arrows that act similarly to the line style but are not as tightly grouped as the fishbone style. Examples of the arrow visualization are shown in Figure 2. The arrow style is an intuitive visualization strategy, as an early study by Butz et al. [2001] demonstrates. They used a Palm Pilot, along with other devices, to project arrows with different levels of precision onto the environment, depending on the quality of the localization and device used. Their work proposed a framework for AR navigation that would be usable by a variety of devices and styles, a shift toward universality that has not been accomplished in the years since. Another older study by Tenmoku et al. [2003] used a combination of RFID tags and pedometers in a basic AR navigation scheme, using red and blue arrows for navigation in the user display. While early solutions like these usually focused on direct-sensing and triangulation, some proposed basic computer vision solutions, such as the solution presented by Kim et al. [2006], which analyzed images via a color histogram matching method with a custom AR headset. The compass arrow is typically a floating arrow that points in the direction of the destination, leaving the user to determine the path. A standard implementation of this visualization strategy was done by Yunardi et al. [2022], with the addition of an obstacle detection algorithm to help the user avoid running into objects while potentially looking at their phone for guidance. They also performed a user study with 20 participants that indicated favorable user perception of the solution, and mostly positive results for the obstacle avoidance algorithm.
VR is a useful tool for studying AR navigation solutions, as it ensures perfect localization. However, there may be differences in the manner users perceive AR and VR, as shown in the study by Möller et al. [2012], which compared standard AR navigation to a VR strategy that showed a 360 degree panorama on a mobile device. Their user study, with 81 participants, concluded that users strongly preferred the standard AR strategy when the system was working without localization or orientation errors. A notable result of this study is that users seemed not to notice when a localization or orientation error occurred with the VR strategy. Another comparison study, albeit with only 8 participants, was conducted by Pankratz et al. [2013], who compared the standard arrow with the line, ‘flying spirit,’ and virtual character visualization strategies. Their results were limited due to a study design failure: the task they had set up was too simple to properly discern differences between techniques. However, they do provide good advice on how to avoid the difficulties they encountered. The system proposed by Juile et al. [2024] used an arrow visualization that repeated towards a destination for use in an indoor context. They focused on the ease of development and implementation of such a system, to make its use feasible with a minimum of hardware and software. They chose Unity and the ARCore library as their development base, noting the wide variety of potential platforms for a final application, as well as the relative ease of switching to these from other development environments. The study by Rehman and Cao [2016] compared handheld phones to the Google Glass HMD for AR indoor navigation, using floating arrows and superimposed text for visualization on both devices. The study involved 39 participants, and concluded with several notable findings. First, the HMD was perceived to have better accuracy than the phone.
Second, the authors noted that under longer test conditions, continuously holding the phone in front of oneself was likely to tire the user. Third, they noted that the retention of spatial information took a significant hit for both AR solutions in comparison to their third, baseline condition of a paper map. The use of the arrow style is not limited to indoor AR navigation; in the study by Jose et al. [2016], arrows were used in a variety of AR displays in a driving simulation. They compared heads-up, heads-down, and head-mounted displays, all with arrow cues for a simulated driving route, and found that the displays were ranked in that order in terms of user performance and preference. Outdoor navigation is not limited to driving; in the solution presented by Montuwy et al. [2018], older users walked outdoor navigation routes, comparing an AR visual solution to an audio solution. The study found that the users preferred the audio solution, and made more navigation errors with the visual solution than with the audio. Kumar et al. [2024] developed an AR navigation system using the arrow visualization, focused on improving pathfinding outdoors without relying on GPS. The focus of their study was the localization aspect, where they used an Android phone’s step-counting pedometer along with feature recognition to inform their system of the user’s location. In particular they focused on landmark recognition, which they note is a particularly useful and relatively high-accuracy form of feature recognition in an urban environment. Some arrow styles are difficult to categorize; the solution by Wang and Zhang [2019] used a trail of arrows that bears similarities to the line and line-with-arrow styles, but the generous spacing between the floating arrows puts their study tentatively into the arrow category.
The paper goes into detail about the software design, particularly how they incorporated data from GPS and satellite views into the AR display. Beyond variations in style, some solutions are intended for exotic environments. A proposed solution by Szyszka and Kunze [2022] explores the concept of underwater AR navigation, using an arrow to guide divers or controlled robots through turbid or dark water.
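Signpost-style arrows of the kind surveyed above are anchored in the world rather than on the screen, so rendering one reduces to projecting its world position into the camera image. The following is a minimal sketch assuming an idealized, level pinhole camera; the focal length and image dimensions are arbitrary illustrative values, not drawn from any surveyed system:

```python
import math

def project(point_w, cam_pos, cam_yaw_deg, f_px=800, cx=640, cy=360):
    """Project a world point (x, y, z) into a level pinhole camera facing
    cam_yaw_deg (degrees clockwise from north, +y). Returns the pixel
    (u, v), or None if the point is behind the camera. World z is up and
    maps to the vertical image axis."""
    dx = point_w[0] - cam_pos[0]
    dy = point_w[1] - cam_pos[1]
    dz = point_w[2] - cam_pos[2]
    yaw = math.radians(cam_yaw_deg)
    # Rotate the offset into the camera frame: 'forward' is the heading
    # direction, 'right' is 90 degrees clockwise from it.
    forward = dx * math.sin(yaw) + dy * math.cos(yaw)
    right = dx * math.cos(yaw) - dy * math.sin(yaw)
    if forward <= 0:
        return None  # behind the camera: the arrow is not drawn
    u = cx + f_px * right / forward
    v = cy - f_px * dz / forward
    return (u, v)

# A floor arrow 4 m ahead of (and 1.5 m below) a camera at the origin
# facing north lands below the image centre, as expected.
print(project((0.0, 4.0, -1.5), (0.0, 0.0, 0.0), 0.0))
```

Frameworks such as ARCore or Unity perform this projection internally once an anchor is placed; the sketch only makes explicit why a world-anchored arrow, unlike a floating compass arrow, disappears from view when the user turns away from it.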
6.2. Fishbone
The fishbone is characterized by a series of ‘>’ markers grouped closely enough to resemble a line. It is distinct from the line, however, in that it leaves more empty space between the markers, and thus slightly more room for error when used in chronological solutions. The empty space also allows a user to see more of the environment compared to a line. Another difference is the explicit indication of direction, where a line implicitly indicates the direction by laying down a path. Examples of the fishbone visualization are shown in Figure 3. Several different styles of the fishbone strategy are shown in the short literature review by Sharma et al. [2020]. The paper points out the common use of IMUs in conjunction with other localization techniques, and makes interesting points about expanding AR navigation to include information like sales in nearby stores and warnings about potential obstacles along routes. An expansive user study with 120 participants conducted by Lu et al. [2021] utilized a fishbone strategy, combining it with a 2D map at the bottom of the display. The user study compared the experiences of an existing navigation solution and their AR navigation app, testing university freshmen against non-freshmen. A notable, if predictable, result was that the freshmen found the AR navigation more useful than the non-freshmen. The AR navigation system developed by Gang and Pyun [2019] used the fishbone strategy in a test case for different combinations of localization strategies. The focus of their work was limiting localization error for the purposes of different use cases: social marketing, navigation, and AR gaming. They were successful in showing that adding more localization techniques increased the overall localization accuracy. A study conducted by Singh et al. [2021] compared a fishbone strategy with a ‘virtual global landmark’ strategy, which highlighted a simulated landmark to follow.
The user study had only five participants, and they used a HoloLens 1, which was not designed for outdoor use. However, they did focus on the mental and physical effort of the navigation, which are useful metrics for determining the viability of different visual strategies. The fishbone strategy was used by Wen et al. [2014] in their study focusing on strategies for using AR navigation without sacrificing the spatial map humans build naturally when navigating. They found through a user study with 76 participants that there may be potential for gamifying AR navigation in a way that increases spatial learning and recall, but the tools they developed towards this end were not effective. The application developed by Rocha and Lopes [2020] uses both the fishbone strategy and displayed instructions in an effort to increase accessibility and decrease the likelihood of navigational errors. The fishbone visualization in the application is unorthodox, with dots alternating with the traditional ‘>’ symbols. The color alternates as well, with blue dots and green ‘>’ symbols. The application proposed by Lee et al. [2020] suggested the fishbone strategy for a fire evacuation routing system. Such use cases are a good example of where the fishbone could be useful, providing clear directional instructions while minimizing vision obstruction. Another application developed for an evacuation use case by Yoo and Choi [2022] used the fishbone strategy with curving symbols to indicate future directions. Their work focused on a deep learning algorithm to update directions in the case of obstacles, an important, and rarely addressed, aspect of AR navigation. The application developed by Rustagi and Yoo [2018] used the fishbone for navigation in a museum. Their fishbone was slightly unusual in that it shaded the normally empty space between the symbols, simulating a line. Another application by Archangelskaya et al.
[2022] also used this shading method to merge the line and fishbone strategies. Their paper focused on the implementation of an AR navigation system in a museum, and claimed a higher navigation efficiency than traditional methods, though it detailed no user study in support of those claims. The study by Wen et al. [2013] compared a standard desktop display and an HMD for navigating a virtual city area, testing for performance and spatial recall. It also tested the use of the fishbone against a standard 2D map. The study had 16 participants, and concluded that the AR navigation was preferable in most categories, but noted that without the perfect localization of the virtual environment the AR may not have performed as well. The proposed system by Birla et al. [2020] used QR codes for initial localization and then a simple fishbone for directions. It compares the AR solution to paper maps and standard application solutions, although it cites no user study in support of its claims. The user study by Schneider et al. [2019] used several types of fishbone visualizations, with varying degrees of symbol thickness and spacing. Their study, with 26 participants, used a driving simulator to test the different fishbone visualizations. Their results were largely qualitative, but overall the participants preferred the ‘solid’ fishbone design, which sported larger ‘>’ symbols with minimal spacing. A system designed and tested by Hořejší et al. [2024] used a combination of the fishbone and QR markers for navigation in the context of a warehouse. While they did not perform a user study, they did technical testing of their system, and performed this testing on multiple iterations of their system. They showed an improvement of navigation accuracy over the iterations, partially by limiting features such as the QR marker scanning and adjusting for their warehouse’s environmental lighting, which included a section lit by a skylight window.
Figure 3. Fishbone Visualization Examples from Schneider et al. [2019] (top-left and bottom) and Yoo and Choi [2022] (top-right)
6.3. Line
The line is typically a long streak superimposed above the ground or along a wall. There are differences within this category, such as the height of the line, whether the line is a solid color or has a pattern inside it, or how far the line stretches out along a given path. Examples of the line visualization are shown in Figure 4. The iPhone application developed by Zhang and Nakajima [2020] combined a standard line with a virtual character in an attempt to gamify the navigation process. They conducted a small user study with 5 participants, navigating the users through a series of city blocks. They found that while the users reported satisfaction with the experience, they felt a lack of safety and privacy, and felt fatigued after holding up the phone for long periods. The paper by Nikam et al. [2021] focuses mostly on an overview of localization technologies for AR navigation, but it does describe an experimental application that displays a path. Unfortunately it included no picture of the visualization, but it does mention that step-by-step instructions are also given on the smartphone. This is an unusual addition, as most navigation solutions have no form of ‘back-up’ localization or instructions. The work by Gerstweiler [2018] involved the comparison of two lines guiding users inside a larger tracking program. One line arced from the user in such a manner as to stay in the user’s field of vision, while the other line was simply overlaid on the ground. The paper included few details of a user study comparing these two lines, but claimed initial results indicated the former arcing line performed better than the latter. An older paper by Narzt et al. [2006] proposed different line solutions for driving AR navigation. A notable quality of the line solutions was their width compared with more modern solutions. The entire lane appears covered in most of the included pictures, and there is less transparency than in most AR visualizations of recent years. 
Another interesting note is that the authors pointed out the possibilities of context-sensitive messages in the AR app, such as hovering speed signs or toll road information. A detailed user study by Rubio-Sandoval et al. [2021] used a solid turquoise line in their AR navigation solution. The study, with 24 participants, primarily tested different localization strategies, but the authors made two interesting notes regarding the visualization. First, the lines were less precise in low illumination or feature-sparse conditions. Second, they noted that the primary complaint regarding visualization was due to incorrect localization–the users became frustrated when the visualization was clearly navigating them incorrectly. In the proposed AR navigation solution by Sayapogu et al. [2021], a blue, slightly transparent line is shown as part of the proposed system architecture. However, in the description of the AR path visualization, they describe an updating arrow solution. It is possible they decided to switch visualization strategies while writing their paper, an indication that they likely regarded the specific visualization as trivial in comparison to the localization, on which the paper focused. The user study by Medenica et al. [2011] tested different forms of navigational aids in a virtual city driving environment. The AR navigation aid used a yellow line visualization that hovered above the car. This is an unusual approach, but may avoid the danger of the AR visualization obscuring the road. The user study, which had 18 participants, concluded that the AR navigation aid allowed the users to spend more time focusing on the road than a dash-mounted screen navigation aid. They also found that users drove better with the AR aid, and tended to prefer the AR aid to the alternatives.
Figure 4. Line Visualization Examples from Medenica et al. [2011] (top), Zhang and Nakajima [2020] (bottom-left), and Alleaume et al. [2020] (bottom-right)
6.4. Line with Arrow
While analyzing the different visualizations, the authors noted a trend of lines ending with arrows. The number of solutions using this visualization almost equaled the number using the simple line, and there is enough of a distinction to separate them into different categories. There are two primary differences: the arrow element explicitly indicates direction, encouraging the user to shift their gaze accordingly, and the line ends before the horizon, which limits the necessary degree of localization accuracy. Examples of the line with arrow visualization are shown in Figure 5. The paper by Huang et al. [2012] considers potential futures for outdoor AR navigation, and shows the use of a largely transparent blue line ending in an arrow. They conducted a user study with 40 participants, comparing their AR navigation system with a conventional GPS navigation system, and concluded that the participants tended to prefer the AR navigation. The user study by Tonnis et al. [2008] tested three different types of lines with arrows: one flat, one with more defined ‘hard’ corners and edges, and one with rounded edges. They tested the visualizations in a virtual city street environment, and focused on perception thresholds at different speeds. They found that the rounded-edged visualization was significantly harder to perceive than the other two, at both speed conditions. The user study by Möller et al. [2014] used a line with arrow visualization in their comparison of AR and a VR ‘panorama’ user interface for navigation. Several notable insights arise from this study. First, the authors mention that the VR solution, which involved pre-recording scenes and rendering them in VR, faced issues with lighting discrepancies between the recordings and the actual data captured by the user. Second, the users tended to avoid holding their phones at chest height, which limited the camera’s field of vision. 
Third, while the users tended to prefer the VR interface, the authors noted that users enjoyed an automatic switching option between the two interfaces. The AR activated while the phone was held upright, and sported more precise navigation, while the VR was sufficient for the time in between. The application developed by Jayananda et al. [2018] used a line with arrow visualization in the context of shopping at a supermarket. Their application provided additional context for potential purchases in addition to the navigation. The user study by Bolton et al. [2015] compared different types of landmark highlighting and conventional AR line with arrow visualizations. Their study, with 20 participants, found that the highlighted landmarks were more useful to users than their line with arrow visualizations, although curiously the users still preferred the latter. They noted that the virtual city street environment in which the study took place may have made it difficult to judge distances accurately; this may have resulted in the line with arrow visualization being less useful than it might have been otherwise. The paper by Mata et al. [2011] shows an AR application as a guide for a museum. The navigation is only one part of the application, as museum objects are identified and suggested tour routes are also presented to the user. This is an example of route suggestion, further decreasing the responsibility of the user in favor of the navigation automation. The user study by Cao et al. [2018] compared their AR navigation application to traditional Google Maps guidance along certain taxi driver routes. The study, with 50 participants, found that the AR navigation, which used a line with arrow visualization with different coloring and offsets based on road conditions, resulted in less distraction and higher user satisfaction than Google Maps. One significant limitation the authors noted was that the AR navigation functioned very poorly at night, or in very dark road conditions.
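The automatic interface-switching behavior described above (AR while the phone is held upright, VR otherwise) can be sketched as a simple orientation check. This is a hypothetical illustration, not the implementation from Möller et al. [2014]; the pitch threshold and hysteresis values are assumptions.

```python
# Hypothetical sketch of orientation-based AR/VR mode switching.
# The pitch threshold and hysteresis band are assumed values,
# not taken from any surveyed system.

UPRIGHT_PITCH_DEG = 60.0  # treat the phone as "upright" past this pitch


def select_interface(pitch_deg: float, current: str,
                     hysteresis_deg: float = 5.0) -> str:
    """Return 'AR' when the phone is held upright, 'VR' otherwise.

    A small hysteresis band keeps the mode from flickering when the
    pitch hovers near the threshold.
    """
    if current == "AR":
        # Stay in AR until the pitch drops clearly below the threshold.
        return "AR" if pitch_deg > UPRIGHT_PITCH_DEG - hysteresis_deg else "VR"
    # Switch to AR only once the pitch rises clearly above the threshold.
    return "AR" if pitch_deg > UPRIGHT_PITCH_DEG + hysteresis_deg else "VR"
```

With these assumed thresholds, raising the phone past roughly 65 degrees activates AR, which then stays active until the phone drops below roughly 55 degrees, avoiding rapid mode flicker near the boundary.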
Figure 5. Line with Arrow Visualization Examples from Bolton et al. [2015] (top-left), Cao et al. [2018] (top-right), and Jayananda et al. [2018] (bottom)
6.5. Map
The map visualization is characterized by a 2D depiction of the surroundings, usually projected in a small part of the display. In the surveyed visualization solutions, it is often used in conjunction with other visualization techniques, as additional assistance for the user to localize themselves. However, there are a few solutions that exclusively utilize the map visualization. In the work by Gupta et al. [2016], a map displaying the user’s current position and a top-down overview of the path to their destination is shown on a Google Glass headset. Their solution mainly focused on the localization aspect, testing the accuracy of their localization methods and trying to limit the amount of processing necessary for their solution to work on lower-powered hardware. Their solution was tested experimentally by the authors in a large room environment, but had no user study. The work by Qin et al. [2013] used a map as a navigation solution for students on their university campus. The paper is limited in scope, mostly focusing on the technical aspects of the solution’s implementation. Their solution does contain a notable feature, however: pictures of the buildings can pop up on the display, along with some information about them, allowing the user to recognize their route and destination.
6.6. Floating Map
The floating map is distinct from the standard map in that it is projected on the environment in some fashion, usually as a floating model of the surroundings. It may have some interactivity with the user, such as manual scrolling via hand motions. Another notable difference from the standard map is that the floating map is often used without other visualization strategies. One advantage is its 3D nature, which allows for intuitive multi-floor navigation. Examples of the floating map visualization are shown in Figure 6. The user study by Patel and Grewal [2022] compared an AR navigation application that utilized a floating map and a standard 2D map for navigation. They found that the AR navigation reduced the time it took their 20 participants to reach their destination compared to the 2D map. Additionally, they noted that even for their relatively short paths, it was inconvenient for users to hold their smartphones. The user study by Thi Minh Tran and Parker [2020] compared three different floating map types in a virtual city street environment. Their user study, with 18 participants, determined that users preferred a flatter variant of the map positioned above the street. A horizontal version of the map was found to obstruct the user’s vision, while a version linked to the user’s hand was less intuitive for users than the other two. The paper by Izani et al. [2020] used a floating map of a historic fortress for navigation and other informational features. It is notable for the size of its map, and may indicate that the floating map is suitable for larger, multi-floor environments. The paper by Subakti and Jiang [2016] used a floating map with other information in an AR application. The map shown is fairly bare-bones, but it does continue the trend of floating maps being used alongside other informational features.
Figure 6. Floating Map Visualization Examples from Subakti and Jiang [2016] (top-left), Izani et al. [2020] (top-right), and Ayyanchira [2019] (bottom)
6.7. Floating Marker
The floating marker visualization overlays virtual objects to indicate location, such as those shown in Figure 7. The study by Oliveira et al. [2017] used floating markers to indicate localization beacons, along with a standard map. Their prototype was directed towards wheelchair users, and pointed out several navigation challenges unique to such users, including marking obstacles, ramps, and alternate routes. The study by Rochadiani et al. [2022] used floating markers to indicate points of interest and guide users along navigation routes in a mall environment. They performed a limited user study with 20 participants and received favorable responses to their application. Another feature of their application was providing educational information regarding various displays at the mall.
Figure 7. Floating Marker Visualization Examples from Wang [2020] (left) and Rochadiani et al. [2022] (right)
6.8. Virtual Character
The virtual character visualization uses a virtual character or object to indicate where the user should go by ‘leading’ them in the correct direction. Examples of the virtual character visualization are shown in Figure 8. In the study by Wu and Yu [2018], their AR navigation application used a small humanoid virtual character that ‘walked’ ahead of the user, turning to indicate the direction of the route. They noted two issues: first, the character sometimes turned to face the user at corners, which negatively impacted users’ trust of the system; second, excessive hand movement by the user shook the phone, making the character hard to track and follow. In the study by Ng and Lim [2020], a robot virtual character guided users in their AR navigation application. They performed a limited user study with 10 participants, finding that while the users generally reacted positively to the application, they also noted a serious design issue: the virtual character did not wait for users appropriately, and could outpace users or rush them to hurry after it.
Figure 8. Virtual Character Visualization Examples from Zhang and Nakajima [2020] (left) and Ng and Lim [2020] (right)
6.9. Highlighted Landmarks
The highlighted landmarks visualization is distinguished by virtually emphasizing real-world landmarks to help the user understand where they are in relation to the landmark. The examples of this visualization in the surveyed works were always alongside another, more prominent visualization; the highlighted landmarks function in a similar supporting fashion to the map visualization. The papers with highlighted landmarks are those by Bolton et al. [2015] and Singh et al. [2021], and are discussed in the line with arrow and fishbone sections respectively.
6.10. Miscellaneous
In the course of collecting papers, we discovered several papers that had atypical or unorthodox visualization techniques. In the study by Lakehal et al. [2021], written directions were overlaid on the displays of AR glasses and phones. They performed a user study with 20 participants to compare the same AR navigation design on the two mediums. The paper only detailed the study protocol, with minimal analysis of results, but the data collected was greater in variety than in many such studies, including voice transcripts of users during their walks and a memory retention test of the route one week after their participation in the study. In the study by Bark et al. [2014], a virtual paper airplane indicated the path a driver should take in a virtual city environment. Their AR display was designed for 3D environments, and as such depth perception could be used in addition to the airplane indicator. They performed 2 user studies, with 9 and 15 participants, related to various aspects of their application, and noted a few key results. First, the AR system was significantly more effective with 3D compared to 2D. Second, the system led to the participants spending more time looking ahead and spotting turn points more quickly. There was also a point in their study when the airplane ‘disappeared’ behind an overpass, momentarily disorienting the users–this sort of bug in a virtual environment illustrates the potential danger of AR driving navigation. In the study by Knierim et al. [2018], the authors employed a unique visualization strategy of arrow directions projected by a quadcopter onto sidewalks. They performed a user study with 16 participants comparing the quadcopter and a standard phone navigation application, and found that while users found the quadcopter interesting, it was too loud, and users did not want to walk too close to it, increasing their task completion times. 
This solution, while creative, is unlikely to be widely adopted due to the cost of the quadcopter and potential dangers of consistent usage. Such a solution could also only be used outdoors, with minimal overhead obstruction. In the paper by Sree et al. [2021] a visualization strategy of floating orbs is shown for their proposed AR navigation application. The size of the orbs and their luminosity may obscure the environment, however, and no user study is mentioned in the paper. In the study by Demirkan and Duzgun [2020] AR navigation is proposed for an underground search-and-rescue use case. Due to the unique use case, with minimal or no environmental lighting, the authors propose highlighting surfaces to assist users who might otherwise be unable to see. While this use case is very specific, feature and floor highlighting might be a useful feature for other use cases or for users with limited vision.
7. DISCUSSION
A notable feature of some research is the combination of feedback techniques, such as in the work by Zhang and Nakajima [2020] or Lu et al. [2021]. It seems that having a ‘back-up’ feedback technique, usually a floating map, is a popular method of mitigating the risk of failure for the primary feedback. This concept has merit, as a single error in localization or feedback could otherwise ruin a user’s navigation experience. However, an unstudied aspect of this idea is that combining several feedback techniques on a single display may cause information overload. Related to this concern is that the user might grow frustrated at the ‘repeated’ information being shown, causing a negative user experience. A potential solution is to have the ‘back-up’ feedback technique shown only when the main feedback technique is known to have unreliable localization or is repeatedly showing known visualization errors, such as clipping into walls or finding no spots for placing markers. However, this solution requires that such errors be detectable and measurable against some chosen threshold of tolerance. A common theme of the research is how the precision of localization and visualization are linked. Precise visualization strategies, like the line or fishbone, are frequently paired with a precise localization technique, such as SLAM or feature recognition. For the research which took place in a virtual environment, where localization is a non-issue as every position in the environment is known, precise visualization strategies are used frequently, such as in the work by Wen et al. [2013] or Medenica et al. [2011]. Conversely, two navigation solutions by Rochadiani et al. [2022] and Izani et al. [2020], which use GPS, a less precise localization method, also use the less precise visualization techniques of Floating Markers and Floating Maps respectively. While there are exceptions, this general pattern plays out through much of the research. 
A possible explanation is that precise visualization, like the line or fishbone, is more likely to have obvious errors when paired with less precise localization. A line leading into a wall or turning at the wrong point is more noticeable than an arrow pointing in a slightly off direction or a floating marker at an offset. The continuous nature of such visualization gives the user a constant guide for each step, but it also must be constantly correct or risk the user losing trust in the navigation. Less precise visualization provides less automation for the user, but also offers a degree of leniency for localization errors. In Table 1, we rank the different types of visualization strategies by the degree of guidance, or automation, they provide the user in an AR navigation context. AR navigation in the context of driving has some pronounced differences and patterns compared to a pedestrian use case. A primary difference is the lack of precision necessary for localization. While a pedestrian may be turning tight corners and maneuvering through narrow sidewalks, a vehicle typically has an entire, comparatively unobstructed lane. For example, the early work by Narzt et al. [2006] uses the line visualization, as the localization provided by the GPS proved sufficiently precise to lay a line ahead of the vehicle. The line visualization is a common strategy for driving navigation solutions; this is likely due to the natural association with the lines separating the lanes of a roadway. Visualization techniques that contribute high levels of automation tend to require precise localization. This is shown in Table 2, which categorizes each localization style by its level of automation. A possible reason for this link between high automation level and precise localization is that otherwise the automation may lead the individual the wrong way, or into an obvious obstacle, ultimately leading to the user distrusting the navigation solution. 
Visualizations with less automation, like the floating map, do not require localization that is as precise, since the user is already engaged with their environment in order to interpret the map.
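The fallback idea discussed above, displaying the ‘back-up’ visualization only when the primary feedback is known to be unreliable, could be sketched as a simple gating check. This is a minimal sketch under stated assumptions: the confidence score and render-error count are hypothetical signals a localization subsystem might expose, and the thresholds are assumed values, not taken from any surveyed work.

```python
# Hypothetical gating logic for a 'back-up' visualization (e.g. a 2D map).
# The thresholds below are assumed tolerances, not values from any
# surveyed system.

CONFIDENCE_THRESHOLD = 0.7    # minimum acceptable localization confidence
MAX_RECENT_RENDER_ERRORS = 3  # e.g. a line clipping into walls, or no
                              # valid anchor found for a floating marker


def show_backup_feedback(localization_confidence: float,
                         recent_render_errors: int) -> bool:
    """Show the back-up visualization only when the primary one is suspect."""
    return (localization_confidence < CONFIDENCE_THRESHOLD
            or recent_render_errors >= MAX_RECENT_RENDER_ERRORS)
```

Under such a scheme, the secondary display stays hidden during normal operation, which would mitigate the information-overload and repeated-information concerns raised above, at the cost of requiring that localization confidence and visualization errors be measurable.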
Table 1:
Visualization vs Navigation Automation
| Visualization Type | Navigation Automation |
|---|---|
| Arrow | Medium |
| Fishbone | Very High |
| Line | Very High |
| Line with Arrow | High |
| Map | Very Low |
| Floating Map | Low |
| Floating Marker | Low |
| Virtual Character | Very High |
| Highlighted Landmarks | Very Low |
Table 2:
Localization vs Level of Automation
| Localization Type | Level of Automation |
|---|---|
| Dead-Reckoning | None |
| GPS | Low |
| WLAN | High |
| RFID | High |
| Feature Recognition | Medium |
| SLAM | Medium |
| 3D Point Clouds | Medium |
| Marker Recognition | Low |
| QR/Bar Codes | Very Low |
| Bluetooth | High |
| WiFi | High |
8. CURRENT PROBLEMS
In the course of reviewing the papers collected for this survey, the authors identified several challenges in the field of AR navigation. First, there is a lack of comparative studies, particularly between devices and UI designs. There are a few studies comparing handheld devices to headsets, such as the work by Rehman and Cao [2016], but the vast majority of studies only test one device. Handhelds are the most common device tested–perhaps correctly so, given their prevalence in society compared to headsets. However, the lack of comparison between devices is a hole in the field of AR navigation research. There could be undiscovered differences between the way a user interacts with AR through a headset versus a handheld device. Similarly, very little research focuses on the UI design for AR navigation solutions. As a result, there is no standard UI across the screenshots provided in each paper. The colors, line patterns, and even placement of visualization types vary from paper to paper, with little or no discussion of why or how such decisions were made. For example, even for very similar solutions, like those of Chidsin [2021] and Pankratz et al. [2013], it is harder to find similarities between the UI designs than differences. This is not the fault of any developer–without clear, research-based guidelines, there is no reason to design an AR navigation solution UI one way over another, beyond the personal opinions of the developer. What color should an arrow visualization be? If it is possible, should a line visualization be displayed on the ground, on the left or right wall, or above the user? While these may seem like trivial issues, questions like these compound, and may determine the quality of the user’s experience. This observation, along with the previous one, shows a general trend in the field of AR navigation research: the user’s experience is largely ignored in favor of the technical aspects of the navigation. 
Specifically, research in the field focuses on localization, to the neglect of other avenues of study. Older papers focused on direct sensing and triangulation localization through the use of different types of hardware, such as the RFID tags and pedometer by Tenmoku et al. [2003] or the infrared sensing by Butz et al. [2001]. Such approaches are not uncommon in more recent work, but the focus of such research is rarely the localization itself. For pattern recognition localization, however, many papers focus on computer vision algorithms and techniques, such as the study of object detection by Yunardi et al. [2022] or the combination of IMU localization and natural marker localization by Neges [2021]. The SLAM algorithm and feature recognition are also increasingly popular as localization solutions. The focus on localization makes sense, since without accurate localization the entire solution fails. However, the focus on solving its problems to the exclusion of others limits the progress of the field as a whole. In addition, it is possible that researching the other two parts of the visual AR navigation model, the View and Controller, might give insight into the Model. While there are challenges in the field that need to be addressed, there are two trends that may indicate emerging best practices for certain types of visualization solutions. First, the arrow visualization tends to be used when there are distinct paths and low localization precision, because even if the arrow is pointing in a slightly incorrect direction, it is still likely to indicate the correct path for the user. In particular, it is often used for indoor navigation, due to the unavailability of GPS and the natural paths in indoor environments. Second, the line visualization, along with some of the line with arrow and fishbone visualizations that extend along the ground toward the horizon, is often used in environments with few visual obstructions, like roads or sidewalks. 
This is likely because visualizations extending further out draw the user’s eye upward to look forward, which is important for avoiding closer obstacles, like other pedestrians or traffic. However, if the localization is poor, its errors may be noticed more readily by the user with such an expansive visualization than with a less expansive type, leading them to distrust the navigation solution. Overall, the use case for a particular type of solution should be taken into consideration when making decisions on the visualization design.
9. FUTURE TRENDS
A variety of suggested future work appears in the surveyed papers, though not all of it is specifically relevant to visual AR navigation. Within the relevant future work, some suggestions are part of broader trends the authors have identified in the field, while others are specific to a given study. The latter have been mentioned in the Visualization section in the discussions of the various studies, and the former are summarized in this section, divided by category according to the MVC paradigm of Gamma et al. [1994].
9.1. Model
The localization of AR navigation solutions is the best studied element in the MVC paradigm, and a major trend is evident in the surveyed work. In general, earlier work focused on improving direct-sensing or triangulation techniques for localization, but as computer vision has improved in recent years, solutions have gradually shifted to pattern recognition for localization, particularly SLAM. Improvements in cameras and IMUs, and the inclusion of depth sensors on the hardware used in AR navigation solutions, have also likely contributed to this shift towards pattern recognition. The addition of these positive elements is accompanied by a reduced need for inconvenient ones: direct-sensing and triangulation solutions both often involve external hardware, like Bluetooth beacons, and pattern recognition eliminates the extra cost and installation of that additional hardware. The reduced cost, as well as the elimination of installation requirements, contributes to this trend away from direct-sensing and triangulation, especially for indoor contexts. However, even outdoor contexts, which traditionally use GPS, have seen attempts to replace GPS with computer vision, such as in the recent work by Kumar et al. [2024].
9.2. View
The visualization element of the MVC paradigm is often a secondary element of many studies and solutions, but there are a few trends worth noting. First, most studies do not compare visualizations, and those that do tend to compare a less common visualization with a more common one, such as the arrow or fishbone. As a result, many visualization types have not been directly compared with others. Metrics of user satisfaction and navigation accuracy are often components of analysis for single visualizations, but without comprehensive comparison against other visualizations in the same environment, these metrics hold limited value. Though implementing additional visualizations and testing them with users could be time-consuming, until such studies are done there is no scientific justification for choosing one visualization over another. This trend will ideally be diminished in future work in this field. Second, in a similar vein to the first, most studies with a single visualization technique seldom use variations of that visualization. The few studies that do use variations often do so in form, such as changing the space between arrows in the fishbone visualization, or in presentation, by changing the color of visualizations or shifting where they are displayed in the environment. This trend is as unfortunate as the first, since testing variations of the same visualization in different environments might yield insight for choosing certain visualization strategies for certain use cases. Third, with regard to use cases, a notable trend in recent work, such as the studies by Hořejší et al. [2024], Archangelskaya et al. [2022], and Kumar et al. [2024], is an attempt to expand AR navigation solutions to a variety of specific contexts. While early work often tested solutions in whichever environment was convenient for the researchers, often a university campus or building, recent work has seen greater specificity and variety in use cases. 
Warehouses, shopping malls, museums, and other specific use cases have been explored, and many of these studies note particular challenges the researchers encountered in their chosen context. While efforts to find generalized solutions for indoor and outdoor contexts have not abated, the study of specific contexts is revealing unique challenges that any universal solution must address. Fourth, given the lack of uniformity in UI displays across studies, a trend in recent work is the inclusion of additional information alongside the navigation visualization, in the form of text or pictures. This occurs in a variety of contexts, such as labeling important landmarks outdoors or rooms a user passes indoors. In the pursuit of best practices in UI design for AR navigation, research on whether such additional information is useful or distracting may be necessary. Is there a point of information saturation, past which the user finds additional feedback distracting or useless? Are there particular types of additional information that are more useful than others?
9.3. Controller
As previously noted, the user's behavior when using AR navigation is a neglected element in the MVC paradigm, though there are a few notable trends in peripheral components of user behavior. First, with regard to the hardware used in most AR navigation solutions, there is a clear focus on smartphones over AR headsets. This phone-headset gap is likely the result of the ubiquity of smartphones and the comparative lack of interest in, or higher cost of, AR headsets. However, smartphones have several apparent disadvantages for AR navigation compared to headsets: the display's field of view is often smaller, the hardware for processing localization is slower, and the simple physical exertion of holding up a phone for an extended period tires the user. It could also be argued that AR headsets, which can be bulky and inconvenient compared to smartphones, are unnecessary for the relatively simple task of navigation. Smartphones may be the future of AR navigation despite their limitations, but more attention directed toward headsets might reveal insights that improve both mediums. Second, while AR navigation for pedestrians is the focus of most studies, recent work includes an increasing number of AR navigation studies focused on vehicles. A user's behavior while driving likely differs greatly from their behavior while walking, so identifying those differences may be necessary to improve solutions in their respective contexts. Notably, while the relaxed constraints of vehicle navigation allow for leniency in AR solutions, vehicle AR navigation still suffers from the same lack of comparative studies as its pedestrian counterpart. Looking further ahead, other modes of transportation, such as biking or boating, may see research once pedestrian and vehicle AR navigation become more settled fields.
10. CONCLUSION
This paper proposed the Model-View-Controller design pattern as a lens through which to view AR visual navigation solutions, which are an important avenue of exploration for solving the problem of human navigation. We collected relevant papers and categorized them according to this model, with particular focus on the View component due to most papers' emphasis on the Model. The Controller was also a neglected component, to the point that most papers lacked any substance related to it at all. A summary of the wide variety of AR navigation solutions was presented, along with commentary on how each solution fits into the broader research field, and several research trends were identified and discussed. Problems and gaps in the research were identified, and several suggestions for future work to address those problems were made. In conclusion, while AR navigation solutions retain a great deal of promise, research on the subject needs to diversify its approaches to fully realize the technology's potential.
Figure 2. Arrow Visualization Examples from Yan et al. [2015] (top-left), Yunardi et al. [2022] (top-right), Liu et al. [2016] (bottom-left), and Matuszka et al. [2013] (bottom-right)
11. FUNDING
This research was supported by NSF grant 1911041 and NIH grant P30 GM145646.
12. AUTHOR BIOGRAPHIES
Hudson Lynam is a PhD student in the Department of Computer Science and Engineering at the University of Nevada, Reno, USA. He earned a BS in Computer Science from Georgia Tech, USA, in 2017. His primary focus of study is human-computer interaction, with an emphasis on AR and VR research. He is also the corresponding author for this paper; contact him at hlynam@unr.edu. Sergiu Dascalu is a professor in the Department of Computer Science and Engineering at the University of Nevada, Reno, USA. In 2001 he earned a PhD in Computer Science from Dalhousie University, Canada. His main interests are in human-computer interaction, software engineering, and data science. Eelke Folmer received his PhD in Software Engineering from the University of Groningen in 2005. He was a postdoc at the University of Alberta in 2005. Since 2006 he has been faculty in the Department of Computer Science and Engineering at the University of Nevada, Reno. He is currently Professor in Human-Computer Interaction with research interests in virtual/augmented reality, wearables, and accessibility.
REFERENCES
- Möller Andreas, Kranz Matthias, Huitl Robert, Diewald Stefan, and Roalter Luis. A mobile indoor navigation system interface adapted to vision-based localization. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, MUM ‘12, New York, NY, USA, 2012. Association for Computing Machinery. ISBN 9781450318150. doi: 10.1145/2406367.2406372. [DOI] [Google Scholar]
- Molléri Jefferson Seide, Petersen Kai, and Mendes Emilia. Survey Guidelines in Software Engineering: An Annotated Review. In Proceedings of the 10th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement, ESEM ‘16, Ciudad Real, Spain, 2016. Association for Computing Machinery. ISBN 9781450344272. doi: 10.1145/2961111.2962619. [DOI] [Google Scholar]
- Cao Chu, Li Zhenjiang, Zhou Pengfei, and Li Mo. Amateur: Augmented reality based vehicle navigation system. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol, 2(4), Dec 2018. doi: 10.1145/3287033. [DOI] [Google Scholar]
- Lu Fangfang, Zhou Hao, Guo Lingling, Chen Jingjing, and Pei Licheng. An ARCore-Based Augmented Reality Campus Navigation System. Applied Sciences, 11(16), 2021. ISSN 2076–3417. doi: 10.3390/app11167515. [DOI] [Google Scholar]
- Page Matthew, McKenzie Joanne, Bossuyt Patrick, Boutron Isabelle, and Hoffmann Tammy C. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ, 2021. doi: 10.1136/bmj.n71. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Qin Yong-xu, Liu Jia-min, Qiu Hui, Yan Bo, and Jin Chang-xu. Campus Navigation System Based on Mobile Augmented Reality. In 2013 6th International Conference on Intelligent Networks and Intelligent Systems, 2013. doi: 10.1109/ICINIS.2013.42. [DOI] [Google Scholar]
- Wen James, Deneka Agnes, Helton William, and Billinghurst Mark. Really, it’s for your own good...making augmented reality navigation tools harder to use. In CHI ‘14 Extended Abstracts on Human Factors in Computing Systems, CHI EA ‘14, page 1297–1302, New York, NY, USA, 2014. Association for Computing Machinery. ISBN 9781450324748. doi: 10.1145/2559206.2581156. [DOI] [Google Scholar]
- Fallah Navid, Apostolopoulos Ilias, Bekris Kostas, and Folmer Eelke. Indoor human navigation systems: A survey. Interacting with Computers, 25:21–33, 2013. [Google Scholar]
- Huang JY, Tsai CH, and Huang ST. The next generation of GPS navigation systems. Commun. ACM, 55(3):84–93, Mar 2012. ISSN 0001–0782. doi: 10.1145/2093548.2093570. [DOI] [Google Scholar]
- Matuszka T, Gombos G, and Kiss A. A New Approach for Indoor Navigation Using Semantic Web Technologies and Augmented Reality. In Lecture Notes in Computer Science, pages 202–210, Berlin, Germany, 2013. Springer. ISBN 978-3-642-39404-1. doi: 10.1007/978-3-642-39405-8_24. [DOI] [Google Scholar]
- Wen James, Helton William, and Billinghurst Mark. If Reality Bites, Bite Back Virtually: Simulating Perfection in Augmented Reality Tracking. In 14th Annual ACM SIGCHI Conference on Computer-Human Interaction. Association for Computing Machinery. doi: 10.1145/2542242.2542246. [DOI] [Google Scholar]
- Wen James, Helton William, and Billinghurst Mark. Introduction to AR-Bot, an AR System for Robot Navigation. In 26th ACM Symposium on Virtual Reality Software and Technology, VRST ‘20. New York, NY, USA, 2020. Association for Computing Machinery. ISBN 9781450376198. doi: 10.1145/3385956.3422112. [DOI] [Google Scholar]
- Zhang Yiyi and Nakajima Tatsuo. Gamified navigation system: Enhancing resident user experience in city exploration. In Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers, UbiComp-ISWC ‘20, page 180–183, New York, NY, USA, 2020. Association for Computing Machinery. ISBN 9781450380768. doi: 10.1145/3410530.3414405. [DOI] [Google Scholar]
- Butz Andreas, Baus Jörg, Krüger Antonio, and Lohse Marco. A hybrid indoor navigation system. In Proceedings of the 6th International Conference on Intelligent User Interfaces, IUI ‘01, page 25–32, New York, NY, USA, 2001. Association for Computing Machinery. ISBN 1581133251. doi: 10.1145/359784.359832. [DOI] [Google Scholar]
- Wang Zhaoyang, Tang Qirui, and Yang Zhou. AR Precision Navigation System Based on Baidu Map API. In Proceedings of the 2020 IEEE 3rd International Conference on Computer and Communication Engineering Technology, CCET ‘20, pages 314–317, 2020. doi: 10.1109/CCET50901.2020.9213134. [DOI] [Google Scholar]
- Ayyanchira Akshay. Cross-Platform Indoor Navigation using Mixed-Reality. Thesis, The University of North Carolina at Charlotte, Ann Arbor. URL https://ninercommons.charlotte.edu/islandora/object/etd%3A1963. [Google Scholar]
- Tenmoku Ryuhei, Kanbara Masayuki, and Yokoya Naokazu. A wearable augmented reality system for navigation using positioning infrastructures and a pedometer. In Proceedings of the 2nd IEEE/ACM International Symposium on Mixed and Augmented Reality, ISMAR ‘03, page 344, USA, 2003. IEEE Computer Society. ISBN 0769520065. [Google Scholar]
- Kim Jong-Bae, Lee Jeong-Mi, and Jun Hee-Sung. AR-based indoor navigation system for personal locating. In 2006 Digest of Technical Papers International Conference on Consumer Electronics, pages 217–218, 2006. doi: 10.1109/ICCE.2006.1598388. [DOI] [Google Scholar]
- Yunardi Dalila Husna, Saputra Kurnia, and Gunawan Budi. Design and development of object detection system in augmented reality based indoor navigation application (case study: Faculty of mathematics and sciences building, syiah kuala university). In 2022 International Conference on Electrical Engineering and Informatics (ICELTICs), pages 89–94, 2022. doi: 10.1109/ICELTICs56128.2022.9932093. [DOI] [Google Scholar]
- Pankratz Frieder, Dippon Andreas, Coskun Tayfur, and Klinker Gudrun Johanna. User awareness of tracking uncertainties in ar navigation scenarios. 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pages 285–286, 2013. [Google Scholar]
- Rehman Umair and Cao Shi. Augmented-reality-based indoor navigation: A comparative analysis of handheld devices versus Google Glass. IEEE Transactions on Human-Machine Systems, 47(1):1–12, 2016. [Google Scholar]
- Jose Richie, Lee Gun A., and Billinghurst Mark. A comparative study of simulated augmented reality displays for vehicle navigation. In Proceedings of the 28th Australian Conference on Computer-Human Interaction, OzCHI ‘16, page 40–48, New York, NY, USA, 2016. Association for Computing Machinery. ISBN 9781450346184. doi: 10.1145/3010915.3010918. [DOI] [Google Scholar]
- Montuwy Angélique, Cahour Béatrice, and Dommes Aurélie. Older pedestrians navigating with ar glasses and bone conduction headset. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, CHI EA ‘18, page 1–6, New York, NY, USA, 2018. Association for Computing Machinery. ISBN 9781450356213. doi: 10.1145/3170427.3188503. [DOI] [Google Scholar]
- Wang Xin and Zhang Risong. AR navigation application based on IOS platform. In 2019 2nd International Conference on Information Systems and Computer Aided Education (ICISCAE), pages 14–19, 2019. doi: 10.1109/ICISCAE48440.2019.221579. [DOI] [Google Scholar]
- Szyszka Ewa Anna and Kunze Kai. Towards underwater augmented reality interfaces to improve the navigation experience. In Augmented Humans 2022, AHs 2022, page 291–293, New York, NY, USA, 2022. Association for Computing Machinery. ISBN 9781450396325. doi: 10.1145/3519391.3524175. [DOI] [Google Scholar]
- Sharma Munesh Kumar, Chachaundiya Satya, and Vishal. Augmented reality navigation. International Journal of Engineering Research & Technology (IJERT), 9(6), 2020. [Google Scholar]
- Gang H-S and Pyun J-Y. A smartphone indoor positioning system using hybrid localization technology. Energies, 12(19), 2019. doi: 10.3390/en12193702. [DOI] [Google Scholar]
- Singh Avinash Kumar, Liu Jia, Tirado Cortes Carlos A., and Lin Chin-Teng. Virtual global landmark: An augmented reality technique to improve spatial navigation learning. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, CHI EA ‘21, New York, NY, USA, 2021. Association for Computing Machinery. ISBN 9781450380959. doi: 10.1145/3411763.3451634. [DOI] [Google Scholar]
- Rocha Sebastião and Lopes Arminda. Navigation based application with augmented reality and accessibility. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, CHI EA ‘20, page 1–9, New York, NY, USA, 2020. Association for Computing Machinery. ISBN 9781450368193. doi: 10.1145/3334480.3383004. [DOI] [Google Scholar]
- Lee Sanghoon, Park Sangmin, Kim SeungHwan, Lee Sang Hyeon, Lee Sooheng, and Park Sehyun. Indoor navigation system for evacuation route in case of fire by using environment and location data. In 2020 IEEE International Conference on Consumer Electronics - Taiwan (ICCE-Taiwan), pages 1–2, 2020. doi: 10.1109/ICCE-Taiwan49838.2020.9258143. [DOI] [Google Scholar]
- Yoo Sang-Jo and Choi Seung-Hee. Indoor AR navigation and emergency evacuation system based on machine learning and iot technologies. IEEE Internet of Things Journal, 9(21): 20853–20868, 2022. doi: 10.1109/JIOT.2022.3175677. [DOI] [Google Scholar]
- Rustagi Tara and Yoo Kyungjin. Indoor AR Navigation Using Tilesets. In Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, VRST ‘18, Article 54, pages 1–2, New York, NY, USA, 2018. Association for Computing Machinery. doi: 10.1145/3281505.3281575. [DOI] [Google Scholar]
- Archangelskaya Anastasia, Gerasimov Ivan, Sardar Mariam Al, and Abramova Anna. City AR: Augmented reality navigation in the smart cities infrastructure. In 2022 IEEE International Smart Cities Conference (ISC2), pages 1–7, 2022. doi: 10.1109/ISC255366.2022.9921884. [DOI] [Google Scholar]
- Birla Simran, Singh Gurveen, Kumhar Pratik, Gunjalkar Kshitij, Sarode Sambhaji, Choubey Sanskar, and Pawar Mohandas. Disha-indoor navigation app. In 2020 2nd International Conference on Advances in Computing, Communication Control and Networking (ICACCCN), pages 517–522, 2020. doi: 10.1109/ICACCCN51052.2020.9362984. [DOI] [Google Scholar]
- Schneider Matthias, Bruder Anna, Necker Marc, Schluesener Tim, Henze Niels, and Wolff Christian. A field study to collect expert knowledge for the development of ar hud navigation concepts. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings, AutomotiveUI ‘19, page 358–362, New York, NY, USA, 2019. Association for Computing Machinery. ISBN 9781450369206. doi: 10.1145/3349263.3351339. [DOI] [Google Scholar]
- Nikam Rutuja Ghanshyam, Patil Snehal Dilip, Shejawale Manasi Mukund, Patil Priyanka Rangrao, and Patil Nutan V. Ar based indoor navigation system. Journal of Emerging Technologies and Innovative Research, 8(7): 700–703, Jun 2021. [Google Scholar]
- Polanin Joshua R., Pigott Terri D., Espelage Dorothy L., and Grotpeter Jennifer K.. Best practice guidelines for abstract screening large-evidence systematic reviews and meta-analyses. Research Synthesis Methods, 10(3):330–342. ISSN 1759–2879. doi: 10.1002/jrsm.1354. [DOI] [Google Scholar]
- Yan Xingya, Liu Wei, and Cui Xiaoyun. Research and Application of Indoor Guide Based on Mobile Augmented Reality System. In 2015 International Conference on Virtual Reality and Visualization, pages 308–311, 2015. doi: 10.1109/ICVRV.2015.48. [DOI] [Google Scholar]
- Gamma E, Helm R, Johnson R, and Vlissides J. Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley Professional, 1994. [Google Scholar]
- Liu Kaixu, Motta Gianmario, and Ma Tianyi. Research and Application of Indoor Guide Based on Mobile Augmented Reality System. In 2016 IEEE International Conference on Services Computing, pages 299–306, 2016. doi: 10.1109/SCC.2016.46. [DOI] [Google Scholar]
- Gerstweiler G. Guiding people in complex indoor environments using augmented reality. In 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pages 801–802, 2018. doi: 10.1109/VR.2018.8446138. [DOI] [Google Scholar]
- Bhorkar Gaurav. A Survey of Augmented Reality Navigation. ArXiv. URL https://api.semanticscholar.org/CorpusID:6660049. [Google Scholar]
- Narzt Wolfgang, Pomberger Gustav, Ferscha Alois, Kolb Dieter, Müller Reiner, Wieghardt Jan, Hörtner Horst, and Lindinger Christopher. Augmented reality navigation systems. Universal Access in the Information Society, 4:177–187, 2006. doi: 10.1007/s10209-005-0017-5. [Google Scholar]
- Rubio-Sandoval J, Martinez-Rodriguez J, Lopez-Arevalo I, Rios-Alvarado A, Rodriguez-Rodriguez A, and Vargas-Requena D. An indoor navigation methodology for mobile devices by integrating augmented reality and semantic web. Sensors, 21(16):5435, 2021. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Sayapogu T, Dsa K, and Kaul P. AR smart navigation system. In 2nd International Conference for Emerging Technology (INCET), pages 1–4, 2021. [Google Scholar]
- Medenica Zeljko, Kun Andrew L., Paek Tim, and Palinko Oskar. Augmented Reality vs. Street Views: A driving simulator study comparing two emerging navigation aids. In Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, MobileHCI ‘11, page 265–274, New York, NY, USA, 2011. Association for Computing Machinery. ISBN 9781450305419. doi: 10.1145/2037373.2037414. [DOI] [Google Scholar]
- Tonnis Marcus, Klein Leslie, and Klinker Gudrun. Perception thresholds for augmented reality navigation schemes in large distances. In Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, ISMAR ‘08, page 189–190, USA, 2008. IEEE Computer Society. ISBN 9781424428403. doi: 10.1109/ISMAR.2008.4637360. [DOI] [Google Scholar]
- Möller Andreas, Kranz Matthias, Diewald Stefan, Roalter Luis, Huitl Robert, Stockinger Tobias, Koelle Marion, and Lindemann Patrick A.. Experimental evaluation of user interfaces for visual indoor navigation. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ‘14, page 3607–3616, New York, NY, USA, 2014. Association for Computing Machinery. ISBN 9781450324731. doi: 10.1145/2556288.2557003. [DOI] [Google Scholar]
- Jayananda PKV, Seneviratne DHD, Abeygunawardhana Pradeep, Dodampege LN, and Lakshani AMB. Augmented reality based smart supermarket system with indoor navigation using beacon technology (easy shopping Android mobile app). In 2018 IEEE International Conference on Information and Automation for Sustainability (ICIAfS), pages 1–6, 2018. doi: 10.1109/ICIAFS.2018.8913363. [DOI] [Google Scholar]
- Bolton Adam, Burnett Gary, and Large David R. An investigation of augmented reality presentations of landmark-based navigation using a head-up display. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI ‘15, page 56–63, New York, NY, USA, 2015. Association for Computing Machinery. ISBN 9781450337366. doi: 10.1145/2799250.2799253. [DOI] [Google Scholar]
- Mata Felix, Claramunt Christophe, and Juarez Alberto. An experimental virtual museum based on augmented reality and navigation. In Proceedings of the 19th ACM SIGSPA-TIAL International Conference on Advances in Geographic Information Systems, GIS ‘11, page 497–500, New York, NY, USA, 2011. Association for Computing Machinery. ISBN 9781450310314. doi: 10.1145/2093973.2094058. [DOI] [Google Scholar]
- Patel Vishva and Grewal Ratvinder. Augmented reality based indoor navigation using point cloud localization. International Journal of Engineering Applied Sciences and Technology, 6(9):67–77, 2022. [Google Scholar]
- Tran Tram Thi Minh and Parker Callum. Designing exocentric pedestrian navigation for AR head mounted displays. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, CHI EA ‘20, page 1–8, New York, NY, USA, 2020. Association for Computing Machinery. ISBN 9781450368193. doi: 10.1145/3334480.3382868. [DOI] [Google Scholar]
- Izani M, Samad A, and Razak A. Augmented reality application based navigating the a famosa fortress site. In 2020 3rd International Conference on Intelligent Sustainable Systems (ICISS), pages 285–290, 2020. doi: 10.1109/ICISS49785.2020.9316019. [DOI] [Google Scholar]
- Subakti Hanas and Jiang Jehn-Ruey. A marker-based cyber-physical augmented-reality indoor guidance system for smart campuses. In 2016 IEEE 18th International Conference on High Performance Computing and Communications; IEEE 14th International Conference on Smart City; IEEE 2nd International Conference on Data Science and Systems (HPCC/Smart City/DSS), pages 1373–1379, 2016. doi: 10.1109/HPCC-SmartCity-DSS.2016.0194. [DOI] [Google Scholar]
- Oliveira LC, Andrade AO, Oliveira EC, Soares A, Cardoso A, and Lamounier E. Indoor navigation with mobile augmented reality and beacon technology for wheelchair users. In 2017 IEEE EMBS International Conference on Biomedical & Health Informatics (BHI), pages 37–40, 2017. doi: 10.1109/BHI.2017.7897199. [DOI] [Google Scholar]
- Rochadiani Theresia Herlina, Atmojo Wahyu Tisno, Bari Mohammad, Kristina Erika, Renaldi, and Setiawan Andres. Find: Mall navigation using augmented reality. 2022 8th International Conference on Virtual Reality (ICVR), pages 110–115, 2022. [Google Scholar]
- Wu Lianyao and Yu Xiaoqing. Outdoor navigation with handheld augmented reality. In 2018 International Conference on Audio, Language and Image Processing (ICALIP), pages 237–241, 2018. doi: 10.1109/ICALIP.2018.8455285. [DOI] [Google Scholar]
- Ng XH and Lim WN. Design of a mobile augmented reality-based indoor navigation system. In 4th International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT), pages 1–6, 2020. doi: 10.1109/ISMSIT50672.2020.9255121. [DOI] [Google Scholar]
- Lakehal Aymen, Lepreux Sophie, Efstratiou Christos, Kolski Christophe, and Nicolaou Pavlos. Investigating smartphones and ar glasses for pedestrian navigation and their effects in spatial knowledge acquisition. In 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI ‘20, New York, NY, USA, 2021. Association for Computing Machinery. ISBN 9781450380522. doi: 10.1145/3406324.3410722. [DOI] [Google Scholar]
- Bark Karlin, Tran Cuong, Fujimura Kikuo, and Ng-Thow-Hing Victor. Personal navi: Benefits of an augmented reality navigational aid using a see-thru 3D volumetric hud. In Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI ‘14, page 1–8, New York, NY, USA, 2014. Association for Computing Machinery. ISBN 9781450332125. doi: 10.1145/2667317.2667329. [DOI] [Google Scholar]
- Knierim Pascal, Maurer Steffen, Wolf Katrin, and Funk Markus. Quadcopter-projected in-situ navigation cues for improved location awareness. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, CHI ‘18, page 1–6, New York, NY, USA, 2018. Association for Computing Machinery. ISBN 9781450356206. doi: 10.1145/3173574.3174007. [DOI] [Google Scholar]
- Sree Satya, Kavya Shanmukha Sai, Sulthana Syed Reshma, Geethanjali S, and Manikanta Soma Sai. Indoor navigation using augmented reality. Journal of Advanced Research in Technology and Management Sciences, 3(4):33–37, Jul 2021. [Google Scholar]
- Demirkan DC and Duzgun S. An evaluation of AR-assisted navigation for search and rescue in underground spaces. In 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), pages 1–2, 2020. doi: 10.1109/ISMAR-Adjunct51615.2020.00017. [DOI] [Google Scholar]
- Chidsin Woranipit. AR-based navigation using RGB-D camera and hybrid map. Sustainability, 13(10):5585, 2021. doi: 10.3390/su13105585. [Google Scholar]
- Hoang MK, Schmitz S, Drueke C, Vu DHT, Schmalenstroeer J, and Haeb-Umbach R. Server based indoor navigation using RSSI and inertial sensor information. In 10th Workshop on Positioning, Navigation and Communication (WPNC), pages 1–6, 2013. doi: 10.1109/WPNC.2013.6533263. [DOI] [Google Scholar]
- Neges Matthias. Combining visual natural markers and IMU for improved AR based indoor navigation. Advanced Engineering Informatics, 31:18–31. [Google Scholar]
- Gupta Gaurav, Kejriwal Nishant, Pallav Prasun, Hassan Ehtesham, Kumar Swagat, and Hebbalaguppe Ramya. Indoor Localisation and Navigation on Augmented Reality Devices. In 2016 IEEE International Symposium on Mixed and Augmented Reality, pages 107–112, 2016. doi: 10.1109/ISMAR-Adjunct.2016.0052. [DOI] [Google Scholar]
- Hořejší Petr, Macháč Tomáš, and Šimon Michal. Reliability and Accuracy of Indoor Warehouse Navigation Using Augmented Reality. IEEE Access, 12:94506–94519, 2024. [Google Scholar]
- Juile J, Chandrasekaran Surendar, and Surya Pr. ARGuide Pro: An AR-based Indoor Navigation. In Proceedings of the 2024 International Conference on Communication, Computing and Internet of Things (IC3IoT), pages 1–4, 2024. doi: 10.1109/IC3IoT60841.2024.10550258. [DOI] [Google Scholar]
- Kumar Dhananjay, Iyer Shreayaas, Raja Easwar, Kumar Ragul, and Kafle Ved. Improving Pedestrian Navigation in Urban Environment Using Augmented Reality and Landmark Recognition. IEEE Communications Standards Magazine, 2024. doi: 10.1109/MCOMSTD.0003.2300017. [DOI] [Google Scholar]
