Supporting 3D window manipulation with a yawing mouse

The orienting mouse: An input device with attitude

This paper presents a modified computer mouse, the Orienting Mouse, which delivers orientation as an additional dimension of input: when the mouse is moved on a flat surface, it reports, in addition to the conventional x, y translation, the angular rotation (yaw) of the device in the x, y plane.
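A minimal sketch of how such reports could drive planar window manipulation, assuming a hypothetical event structure that carries the x, y translation together with the yaw increment (the device's actual API is not shown in the abstract):

```python
# Sketch only: OrientingMouseEvent and Window are hypothetical structures used
# to illustrate consuming (dx, dy, dtheta) reports; they are not from the paper.
from dataclasses import dataclass
import math

@dataclass
class OrientingMouseEvent:
    dx: float       # conventional x translation
    dy: float       # conventional y translation
    dtheta: float   # rotation of the device in the x, y plane (yaw), radians

@dataclass
class Window:
    x: float = 0.0
    y: float = 0.0
    angle: float = 0.0  # window orientation, radians

    def apply(self, e: OrientingMouseEvent) -> None:
        # Accumulate translation and yaw into the window's 2D transform.
        self.x += e.dx
        self.y += e.dy
        self.angle = (self.angle + e.dtheta) % (2 * math.pi)

w = Window()
w.apply(OrientingMouseEvent(dx=12.0, dy=-3.0, dtheta=math.radians(5)))
print(w)  # Window(x=12.0, y=-3.0, angle=0.0872...)
```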

3D Interaction with the Desktop Bat

Computer Graphics Forum, 1995

Many applications now demand interaction with visualizations of 3D scenes and data sets. Current flat 2D displays are limited in their capacity to provide this, not only by the display technology but also by the interaction metaphors and devices used. The Desktop Bat is a device that has 5 degrees of freedom whilst retaining the simplicity of use of a mouse. To use it for general 3D interaction, several metaphors were created for the tasks of navigation and cursor manipulation, and a set of experiments was conducted to determine which metaphors were the most efficient in use. Of these metaphors, a velocity control metaphor was the best for navigation, and a metaphor that applied rotations and translations relative to the eyepoint coordinate system was best for object control.
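The eyepoint-relative object-control metaphor can be illustrated with a short sketch: a device rotation expressed in eye (camera) axes is converted into world axes by conjugating it with the rotation part of the view matrix before being composed with the object's rotation. The names and example values below are illustrative assumptions, not the paper's implementation:

```python
# Sketch: apply a device rotation given in eye-frame axes to an object's
# world-frame rotation (illustrative, not the Desktop Bat driver code).
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def apply_eye_relative(object_rot, device_rot_eye, view_rot):
    """view_rot is the 3x3 rotation part of the view (world -> eye) matrix."""
    # Re-express the device rotation in world axes, then compose it.
    device_rot_world = view_rot.T @ device_rot_eye @ view_rot
    return device_rot_world @ object_rot

view_rot = rot_z(np.radians(30))   # illustrative camera orientation
obj_rot = apply_eye_relative(np.eye(3), rot_z(np.radians(10)), view_rot)
```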

On 3D Input Devices

IEEE Computer Graphics and Applications, 2006

Although 3D graphics applications are increasingly in regular use, the development of input devices for this particular domain has evolved slowly. The desktop is still dominated by the mouse, and overall, only a small variety of input devices is commercially available. For example, for VR applications, tracked wands are commonly used and, to a lesser degree, gloves.

Towards applicable 3D user interfaces for everyday working environments

2007

Desktop environments have proven to be a powerful user interface and have served as the de facto standard human-computer interaction paradigm for over 40 years. However, there is a rising demand for 3D applications dealing with complex datasets, which exceed the capabilities of traditional devices and two-dimensional displays. These domains require more immersive and intuitive interfaces. But in order to gain users' acceptance, technology-driven solutions that require inconvenient instrumentation, e.g., stereo glasses or tracked gloves, should be avoided. Autostereoscopic display environments equipped with tracking systems enable users to experience 3D virtual environments more naturally without cumbersome devices, for instance via gestures. However, these approaches are currently applied only to specially designed or adapted applications without universal usability. In this paper we introduce new 3D user interface concepts for such setups in which minimal instrumentation of the user is required, so that the strategies can be easily integrated into everyday working environments. To this end, we propose an interaction system and framework that allows users to display and interact with both monoscopic and stereoscopic content simultaneously. The challenges for combined mouse-, keyboard- and gesture-based input paradigms in such an environment are pointed out and novel interaction strategies are introduced.

A Novel Form of Pointing Device

2003

This paper presents a novel approach for man-machine interaction applying real-time computer vision techniques. We use a handheld camera to control the mouse cursor on a computer display. The camera captures an image of the display in its field of view, and this can be used to judge the camera's position and orientation relative to the display. The problem is modelled as a plane-to-plane projection (homography). Once the mapping from the display in the camera view to the real-world display is known, the intersection between the central axis of the camera and the surface of the display can be computed. The mouse pointer is then moved to the corresponding display position. This calculation can be iterated continuously to update the mouse cursor position as the camera position and orientation change. The camera can then be used to control the mouse cursor just as a laser pointer controls a laser dot. A prototype has been developed to demonstrate the approach.
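The plane-to-plane mapping can be sketched with standard OpenCV homography routines. This assumes the four corners of the display have already been detected in the camera image and the display resolution is known; corner detection and the OS call that actually moves the cursor are outside the sketch, and all coordinate values are illustrative:

```python
import numpy as np
import cv2

SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution
CAM_W, CAM_H = 640, 480           # assumed camera resolution

# Detected display corners in the camera image, ordered to match the screen
# corners (top-left, top-right, bottom-right, bottom-left). Illustrative values.
display_corners_img = np.float32([[102, 75], [548, 92], [530, 401], [88, 380]])
screen_corners = np.float32([[0, 0], [SCREEN_W, 0],
                             [SCREEN_W, SCREEN_H], [0, SCREEN_H]])

# Plane-to-plane projection (homography) from camera image to display plane.
H, _ = cv2.findHomography(display_corners_img, screen_corners)

# The camera's central axis passes (approximately) through the image centre,
# so mapping that point through H gives its intersection with the display.
centre = np.float32([[[CAM_W / 2.0, CAM_H / 2.0]]])
cursor_x, cursor_y = cv2.perspectiveTransform(centre, H)[0, 0]
print("move cursor to", cursor_x, cursor_y)  # feed into an OS cursor API
```

Repeating this per frame as the corners are re-detected yields the continuous cursor update described above.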

Visual panel: virtual mouse, keyboard and 3D controller with an ordinary piece of paper

2001

This paper presents a vision-based interface system, VISUAL PANEL, which employs an arbitrary quadrangle-shaped panel (e.g., an ordinary piece of paper) and a tip pointer (e.g., a fingertip) as an intuitive, wireless and mobile input device. The system can accurately and reliably track the panel and the tip pointer. The panel tracking continuously determines the projective mapping between the panel at its current position and the display, which in turn maps the tip position to the corresponding position on the display. By detecting clicking and dragging actions, the system can fulfill many tasks such as controlling a remote large display and simulating a physical keyboard. Users can naturally use their fingers or other tip pointers to issue commands and type text. Furthermore, by tracking the 3D position and orientation of the visual panel, the system can also provide 3D information, serving as a virtual joystick, to control 3D virtual objects.
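The panel-to-display mapping can be sketched in the same homography style, assuming the panel's four corners and the tip position have already been tracked in the camera image; the function and values below are illustrative rather than the paper's implementation:

```python
import numpy as np
import cv2

SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution

def tip_to_display(panel_corners_img, tip_img):
    """Map a tracked tip position (camera pixels) to display coordinates.

    panel_corners_img: four panel corners in the camera image, ordered
                       top-left, top-right, bottom-right, bottom-left.
    tip_img:           (x, y) tip position in the camera image.
    """
    screen_corners = np.float32([[0, 0], [SCREEN_W, 0],
                                 [SCREEN_W, SCREEN_H], [0, SCREEN_H]])
    # Projective mapping between the tracked panel and the display.
    H = cv2.getPerspectiveTransform(np.float32(panel_corners_img), screen_corners)
    tip = np.float32([[tip_img]])                    # shape (1, 1, 2)
    return cv2.perspectiveTransform(tip, H)[0, 0]    # display-space (x, y)

# Illustrative tracked values only:
corners = [[120, 90], [500, 110], [480, 360], [100, 340]]
print(tip_to_display(corners, (300, 220)))
```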