An augmented reality system for military operations in urban terrain
Many future military operations are expected to occur in urban environments. These complex, three-dimensional battlefields pose significant challenges to the dismounted warfighter, who requires better situational awareness to operate effectively. However, delivering this information to the dismounted warfighter is extremely difficult. For example, maps draw a user's attention away from the environment and cannot directly represent the three-dimensional nature of the terrain. To overcome these difficulties, we are developing the Battlefield Augmented Reality System (BARS). The system consists of a wearable computer, a wireless network, and a tracked see-through head-mounted display (HMD). The computer generates graphics that, from the user's perspective, appear to be aligned with the actual environment. For example, a building could be augmented to show its name, a plan of its interior, icons representing reported sniper locations, and the names of adjacent streets.

This paper surveys the current state of BARS development and describes four major areas of ongoing research. The first is the development of an effective, efficient user interface for displaying data and processing user input. The second is support for collaboration among multiple BARS users and with other systems. The third is the current hardware for both the mobile and indoor prototype systems. Finally, we describe initial efforts to formally evaluate the capabilities of the system from a user's perspective through scenario analysis. We will also discuss the use of BARS in STRICOM's Embedded Training initiative.

ABOUT THE AUTHORS

MARK A. LIVINGSTON is a Research Scientist in the Virtual Reality Laboratory at the Naval Research Laboratory, where he works on the Battlefield Augmented Reality System (BARS). He received his Ph.D. from the University of North Carolina at Chapel Hill, where he helped develop a clinical augmented reality system for both ultrasound-guided and laparoscopic surgical procedures, focusing on tracking subsystems. His current research focuses on vision-based tracking algorithms and on user perception in augmented reality systems. Livingston is a member of IEEE, ACM, and SIGGRAPH, and serves on the VR2003 conference committee.
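As background for the registration task the abstract describes, the sketch below shows one way a geo-referenced annotation anchor (for example, a building corner) might be projected through a tracked head pose onto the see-through display. This is a minimal illustration, not BARS code; the pinhole model, the world-to-head coordinate convention, and all names (project_annotation, the calibration parameters f, cx, cy) are assumptions introduced here for clarity.

```python
import numpy as np

def project_annotation(p_world, R_wh, t_wh, f, cx, cy):
    """Project a world-space anchor point into display coordinates.

    p_world : 3-vector, annotation anchor in world coordinates
    R_wh, t_wh : tracked head pose (rotation and position), world-to-head convention
    f, cx, cy : hypothetical pinhole calibration of the see-through display
    Returns (x, y) in pixels, or None if the point lies behind the viewer.
    """
    # Transform the anchor into head (eye) coordinates.
    p_head = R_wh @ (np.asarray(p_world, dtype=float) - np.asarray(t_wh, dtype=float))
    if p_head[2] <= 0.0:      # behind the viewer in this +z-forward convention: do not draw
        return None
    # Pinhole projection onto the display plane.
    x = f * p_head[0] / p_head[2] + cx
    y = f * p_head[1] / p_head[2] + cy
    return (x, y)

# Example: label a point surveyed 30 m ahead of a viewer standing at the origin.
pose_R = np.eye(3)                       # identity orientation, for illustration only
pose_t = np.array([0.0, 1.7, 0.0])       # eye height of roughly 1.7 m
print(project_annotation([10.0, 5.0, 30.0], pose_R, pose_t, f=800, cx=640, cy=512))
```

In a fielded see-through system, the accuracy of such an overlay is dominated by tracker error, display calibration, and end-to-end latency, which is why tracking and user perception are recurring research themes in the work summarized above.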