FlyAR: Augmented Reality Supported Micro Aerial Vehicle Navigation

Designing a Miniature Unmanned Aerial Vehicle Augmented Reality Application

2020

Augmented reality video games combine the elements of a virtual application with those of the real world. With the recent advances in technology, this type of digital content is easily available to users. The player experience is enhanced by bringing the virtual experience into the world around us, and new interaction techniques emerge. We propose a video game in which the user controls a virtual drone that interacts with objects situated in the real environment. We present the development methodology that we followed in order to bring this project to life, from the early stages of topic selection, to implementation details, and finally to the evaluation stage.

Augmented Reality System for the Assistance of Unmanned Aerial Vehicles

2020 15th Iberian Conference on Information Systems and Technologies (CISTI), 2020

This article proposes the development of a virtual and augmented reality environment to assist in the assembly and maintenance of an unmanned aerial vehicle. Taking the lack of technological knowledge as the research gap, the study provides researchers and practitioners with new solutions through familiarization and guided assistance for a UAV. The system tracks the user to locate the position of the hands and validate in real time whether the actions performed are correct. Through the interface, the user can also access a virtualized environment and define a route with hand movements; the UAV then immediately follows the marked path from beginning to end, and errors are displayed on the same screen to verify the stability of control.

Augmented Reality Tool for the Situational Awareness Improvement of UAV Operators

Sensors (Basel, Switzerland), 2017

Unmanned Aerial Vehicles (UAVs) are being extensively used nowadays. Therefore, pilots of traditional aerial platforms should adapt their skills to operate them from a Ground Control Station (GCS). Common GCSs provide information in separate screens: one presents the video stream while the other displays information about the mission plan and information coming from other sensors. To avoid the burden of fusing information displayed in the two screens, an Augmented Reality (AR) tool is proposed in this paper. The AR system has two functionalities for Medium-Altitude Long-Endurance (MALE) UAVs: route orientation and target identification. Route orientation allows the operator to identify the upcoming waypoints and the path that the UAV is going to follow. Target identification allows a fast target localization, even in the presence of occlusions. The AR tool is implemented following the North Atlantic Treaty Organization (NATO) standards so that it can be used in different GCSs. The e...
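The route-orientation overlay described above amounts to projecting the mission's 3D waypoints into the video stream. The paper does not publish its implementation; the following is a minimal sketch of that projection step under a standard pinhole camera model, with made-up intrinsics and function names:

```python
import numpy as np

def project_waypoints(waypoints_world, cam_pose, K):
    """Project 3D waypoints (world frame) into pixel coordinates.

    waypoints_world: (N, 3) array of points in the world frame.
    cam_pose: (R, t) with R (3x3) world-to-camera rotation and t (3,) translation.
    K: (3x3) camera intrinsics matrix.
    Returns (N, 2) pixel coordinates; points behind the camera become NaN.
    """
    R, t = cam_pose
    pts_cam = (R @ waypoints_world.T).T + t      # world -> camera frame
    uv = np.full((len(pts_cam), 2), np.nan)
    in_front = pts_cam[:, 2] > 0                 # only points ahead of the camera
    proj = (K @ pts_cam[in_front].T).T           # homogeneous pixel coordinates
    uv[in_front] = proj[:, :2] / proj[:, 2:3]    # perspective divide
    return uv

# Identity pose and simple intrinsics: a waypoint 10 m straight ahead
# lands on the principal point (320, 240).
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
pose = (np.eye(3), np.zeros(3))
uv = project_waypoints(np.array([[0.0, 0.0, 10.0]]), pose, K)
print(uv)  # [[320. 240.]]
```

A real GCS overlay would additionally need the UAV's estimated camera pose per frame and lens-distortion correction; both are omitted here.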

The state of Augmented Reality in aerospace navigation and engineering

Applications of Augmented Reality - Current State of the Art [Working Title]

The concept of Augmented Reality (AR) has existed in the field of aerospace for several decades in the form of Head-Up Display (HUD) or Head-Worn Display (HWD). These displays enhance Human-Machine Interfaces and Interactions (HMI2) and allow pilots to visualize the minimum required flight information while seeing the physical environment through a semi-transparent visor. Numerous research studies are still being conducted to improve pilot safety during challenging situations, especially during low visibility conditions and landing scenarios. Besides flight navigation, aerospace engineers are exploring many modern cloud-based AR systems to be used as remote and/or AI-powered assist tools for field operators, such as maintenance technicians, manufacturing operators, and Air Traffic Control Officers (ATCO). Thanks to the rapid advancement in computer vision and deep neural network architectures, modern AR technologies can also scan or reconstruct the 3D environment with high precision...

PinpointFly: An Egocentric Position-control Drone Interface using Mobile AR

Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems

Accurate drone positioning is challenging because pilots only have a limited position and direction perception of a flying drone from their perspective. This makes conventional joystick-based speed control inaccurate and more complicated and significantly degrades piloting performance. We propose PinpointFly, an egocentric drone interface that allows pilots to arbitrarily position and rotate a drone using position-control direct interactions on a see-through mobile AR where the drone position and direction are visualized with a virtual cast shadow (i.e., the drone's orthogonal projection onto the floor). Pilots can point to the next position or draw the drone's flight trajectory by manipulating the virtual cast shadow and the direction/height slider bar on the touchscreen. We design and implement a prototype of PinpointFly for indoor and visual line of sight scenarios, which comprises real-time and predefined motion-control techniques. We conduct two user studies with simple positioning and inspection tasks. Our results demonstrate that PinpointFly makes drone positioning and inspection operations faster, more accurate, and simpler, with a lower workload than a conventional joystick interface with a speed-control method.
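The cast-shadow idea above reduces 3D positioning to two simpler inputs: a 2D shadow position on the floor plus a height slider. A toy sketch of that mapping (not the authors' code; the function names are made up):

```python
def cast_shadow(drone_pos):
    """The virtual cast shadow is the drone's orthogonal projection
    onto the floor plane z = 0: simply drop the height coordinate."""
    x, y, _ = drone_pos
    return (x, y)

def shadow_to_target(shadow_xy, height):
    """Inverse mapping used when the pilot drags the shadow: the touched
    shadow position plus the height-slider value fully determine the
    3D setpoint sent to the drone."""
    x, y = shadow_xy
    return (x, y, height)

# Dragging the shadow to (2.0, 1.5) with the slider at 1.2 m:
print(shadow_to_target((2.0, 1.5), 1.2))  # (2.0, 1.5, 1.2)
```

The decoupling is the point: each touchscreen gesture controls an unambiguous subset of the drone's degrees of freedom, which is what makes position control more direct than joystick speed control.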

Autonomous, Vision-based Flight and Live Dense 3D Mapping with a Quadrotor Micro Aerial Vehicle

Journal of Field Robotics, 2015

The use of mobile robots in search-and-rescue and disaster-response missions has increased significantly in recent years. However, they are still remotely controlled by expert professionals on an actuator set-point level, and they would benefit, therefore, from any bit of autonomy added. This would allow them to execute high-level commands, such as "execute this trajectory" or "map this area." In this paper, we describe a vision-based quadrotor micro aerial vehicle that can autonomously execute a given trajectory and provide a live, dense three-dimensional (3D) map of an area. This map is presented to the operator while the quadrotor is mapping, so that there are no unnecessary delays in the mission. Our system does not rely on any external positioning system (e.g., GPS or motion capture systems) as sensing, computation, and control are performed fully onboard a smartphone processor. Since we use standard, off-the-shelf components from the hobbyist and smartphone markets, the total cost of our system is very low. Due to its low weight (below 450 g), it is also passively safe and can be deployed close to humans. We describe both the hardware and the software architecture of our system. We detail our visual odometry pipeline, the state estimation and control, and our live dense 3D mapping, with an overview of how all the modules work and how they have been integrated into the final system. We report the results of our experiments both indoors and outdoors. Our quadrotor was demonstrated over 100 times at multiple trade fairs, at public events, and to rescue professionals. We discuss the practical challenges and lessons learned. Code, datasets, and videos are publicly available to the robotics community.

BirdViewAR: Surroundings-aware Remote Drone Piloting Using an Augmented Third-person Perspective

Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems

We propose BirdViewAR, a surroundings-aware remote drone-operation system that provides significant spatial awareness to pilots through an augmented third-person view (TPV) from an autopiloted secondary follower drone. The follower drone responds to the main drone's motions and directions using our optimization-based autopilot, allowing the pilots to clearly observe the main drone and its imminent destination without extra input. To improve their understanding of the spatial relationships between the main drone and its surroundings, the TPV is visually augmented with AR-overlay graphics, where the main drone's spatial statuses are ...
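The core geometric task of the follower drone is keeping a third-person vantage behind and above the main drone as it turns. BirdViewAR solves this with an optimization-based autopilot; the sketch below is only a hand-rolled geometric stand-in (made-up offsets and names) that illustrates the pose being optimized for:

```python
import math

def follower_pose(main_pos, main_yaw, back=6.0, up=3.0):
    """Place the follower `back` metres behind and `up` metres above the
    main drone along its heading, yawed to look back at it.
    main_pos: (x, y, z); main_yaw: heading in radians.
    Returns the follower position and its look-at yaw.
    """
    x, y, z = main_pos
    fx = x - back * math.cos(main_yaw)    # step backwards along the heading
    fy = y - back * math.sin(main_yaw)
    fz = z + up
    look_yaw = math.atan2(y - fy, x - fx)  # aim the camera at the main drone
    return (fx, fy, fz), look_yaw

pos, yaw = follower_pose((0.0, 0.0, 2.0), 0.0)
print(pos, yaw)  # (-6.0, 0.0, 5.0) 0.0
```

An actual autopilot additionally smooths this setpoint over time and trades off view quality against the follower's own safety margins, which is where the optimization in the paper comes in.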

Multilayered Mapping and Navigation for Autonomous Micro Aerial Vehicles

Journal of Field Robotics, 2015

Micro aerial vehicles, such as multirotors, are particularly well suited for the autonomous monitoring, inspection, and surveillance of buildings, e.g., for maintenance or disaster management. Key prerequisites for the fully autonomous operation of micro aerial vehicles are real-time obstacle detection and planning of collision-free trajectories. In this article, we propose a complete system with a multimodal sensor setup for omnidirectional obstacle perception consisting of a 3D laser scanner, two stereo camera pairs, and ultrasonic distance sensors. Detected obstacles are aggregated in egocentric local multiresolution grid maps. Local maps are efficiently merged in order to simultaneously build global maps of the environment and localize in these. For autonomous navigation, we generate trajectories in a multi-layered approach: from mission planning over global and local trajectory planning to reactive obstacle avoidance. We evaluate our approach and the involved components in simulation and with the real autonomous micro aerial vehicle. Finally, we present the results of a complete mission for autonomously mapping a building and its surroundings.
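The egocentric local multiresolution maps mentioned above keep fine detail near the vehicle and coarser cells farther away, bounding memory while covering a wide area. A toy illustration of that resolution schedule (the numbers and doubling rule are illustrative, not taken from the paper):

```python
def cell_size(dist, base=0.25, ring=4.0):
    """Edge length of a grid cell at distance `dist` from the vehicle:
    the base resolution within the innermost ring, doubling every
    `ring` metres outward, so far-away space costs few cells."""
    level = int(dist // ring)        # which resolution ring the cell falls in
    return base * (2 ** level)

for d in (1.0, 5.0, 9.0):
    print(d, cell_size(d))
# 1.0 0.25
# 5.0 0.5
# 9.0 1.0
```

Because the map is egocentric, it translates with the vehicle, and merging these local maps into a consistent global map is what enables simultaneous mapping and localization in the full system.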

SelfieDroneStick: A Natural Interface for Quadcopter Photography

2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2020

A physical selfie stick extends the user’s reach, enabling the acquisition of personal photos that include more of the background scene. Similarly, a quadcopter can capture photos from vantage points unattainable by the user; but teleoperating a quadcopter to good viewpoints is a difficult task. This paper presents a natural interface for quadcopter photography, the SelfieDroneStick that allows the user to guide the quadcopter to the optimal vantage point based on the phone’s sensors. Users specify the composition of their desired long-range selfies using their smartphone, and the quadcopter autonomously flies to a sequence of vantage points from where the desired shots can be taken. The robot controller is trained from a combination of real-world images and simulated flight data. This paper describes two key innovations required to deploy deep reinforcement learning models on a real robot: 1) an abstract state representation for transferring learning from simulation to the hardware...

Application of augmented reality in aviation

Proceedings of the 2nd International Conference on Intelligent and Innovative Computing Applications, 2020
