A Haptically Enhanced Broadcasting System

Haptic Interaction in Realistic Multimedia Broadcasting

Lecture Notes in Computer Science, 2004

In this paper, we discuss a haptically enhanced multimedia broadcasting system. The four stages of the proposed system are briefly analyzed: scene capture, haptic editing, data transmission, and display with haptic interaction. To show the usefulness of the proposed system, several potential scenarios with haptic interaction are listed. These scenarios are classified into passive and active haptic interaction scenarios, which can be fully authored by scenario writers or producers. Finally, to show how a haptically enhanced scenario works, a typical example is demonstrated in detail for a home-shopping setting.

Construction of a haptic-enabled broadcasting system based on the MPEG-V standard

Signal Processing: Image Communication, 2013

With rapid developments in communications technology and digital multimedia, there has been increasing demand in recent years for realistic broadcasting technology beyond conventional audiovisual media. In response to this demand, this paper presents an example of the construction of a haptic-enabled broadcasting system based on the recently established MPEG-V standard. The construction processes of the proposed haptic-enabled broadcasting system include acquisition of various types of media, haptic content creation by modeling and authoring, transmission, rendering, and interaction. This paper illustrates the data flow within the system, from the creation of haptic content to its rendering for the end user, and explains a method of building the system with the MPEG-V standard. The constructed haptic-enabled broadcasting system allows users to have more immersive interaction with the synthesized haptic multimedia, which is closely synchronized with the audiovisual data.

A Novel Test-Bed for Immersive and Interactive Broadcasting Production Using Augmented Reality and Haptics

IEICE Transactions on Information and Systems, 2006

In this paper, we demonstrate an immersive and interactive broadcasting production system with a new haptically enhanced multimedia broadcasting chain. The system adopts Augmented Reality (AR) techniques, which merge captured video and virtual 3D media seamlessly through multimedia streaming technology, together with haptic interaction technology, in near real time. In this system, viewers at the haptic multimedia client can interact with the AR broadcasting production transmitted over a communication network. We demonstrate two test applications, which show that adding AR and haptic interaction to conventional audiovisual content can improve the immersiveness and interactivity of viewers with a rich content service.

Haptic Communications

Proceedings of the IEEE, 2012

Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research.

The field of audiovisual communications has witnessed tremendous growth and progress during the last decades. This progress has led to improved productivity and quality of experience in remote interaction scenarios such as video conferencing. With increasing quality, users feel more present, experience an improved feeling of togetherness, and are able to perform more subtle interactions. The resulting level of immersiveness can, for instance, be experienced in the commercial high-end teleconferencing products from CISCO (TelePresence) [1] and Hewlett Packard (Halo) [2], which have managed to partially fulfill the promise of connecting people remotely and giving them a feeling of presence and closeness that we usually can only experience when people are in the same room. Driving factors that made this improved telepresence possible are high-quality audio and video capturing and display devices, highly efficient audio and video coding standards, as well as the ever increasing transmission capacity of our communication networks. It is expected that 3-D video capture and display will further increase the level of immersiveness experienced by users in telepresence systems.

Despite all these advances, however, presence in a remote real or virtual environment and truly immersive communication cannot be complete without the ability to physically interact with distant objects and humans [3]–[17]. To achieve immersion into the task environment, telepresence systems have continually attempted to supply the user with comprehensive sensory feedback: auditory, visual, and haptic. The intricacies of involving haptics originate from the human-in-the-loop nature of haptic interaction. The human can not only perceive the haptic feedback offered, similar to other modalities, but also, in response, physically act upon an environment to alter it. Therefore, a human-centric design and analysis of haptic interaction systems is called for [18]–[20].

In particular, in shared cooperative (virtual) environments, the communication of multimodal sensory information plays a fundamental role as it enables the participants to communicate and interact through their actions [21]. In this context, the importance of the haptic modality and its positive influence on such shared user experience is discussed in [21] and [22]. The results clearly show that shared haptic interaction toward a common objective significantly improves task performance and the sense of togetherness. In addition to interpersonal communication, the integration of haptics significantly improves the perception of important information about the environment, such as surface characteristics and shape of remote (virtual) objects [23]–[25]. The integration of haptic communication enables novel applications in the fields of teaching/training, telerobotics, entertainment, gaming, etc.

In this invited paper, we complement and extend our discussion of haptic communications presented recently in [26]. Our selection of topics is such that overlap with [26] is avoided as much as possible without sacrificing the completeness and readability of this article. Although in the long run a joint treatment of all modalities (audio, video, and haptics) is required, we constrain our discussion in this paper mainly to the haptic modality and its role in telepresence and telemanipulation systems, including virtual reality systems.

Immersive haptic interaction with media

Visual Communications and Image Processing 2010, 2010

New 3D video representations enable new modalities of interaction, such as haptic interaction, with 2D and 3D video for truly immersive media applications. Haptic interaction with video includes haptic structure and haptic motion for new immersive experiences. It is possible to compute haptic structure signals from 3D scene geometry or depth information. This paper introduces the concept of haptic motion, as well as new methods to compute haptic structure and motion signals for 2D video-plus-depth representation. The resulting haptic signals can be rendered using a haptic cursor attached to a 2D or 3D video display. Experimental results and a demo system are available.
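The idea of deriving haptic structure signals from depth information can be illustrated with a minimal sketch: estimate the local depth gradient at the haptic cursor position and turn it into a lateral force. The function name, the gain `k`, and the gradient-based force model are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def haptic_structure_force(depth, x, y, k=1.0):
    """Approximate a haptic 'structure' force from a per-pixel depth map.

    The surface gradient at the cursor position (x, y) is estimated with
    central differences; the returned 2D lateral force pushes the cursor
    away from steep depth changes, conveying a feel of scene geometry.
    This is a hypothetical sketch, not the paper's actual method.
    """
    gy, gx = np.gradient(depth.astype(float))  # gradients along rows, cols
    # Lateral force proportional to the local depth gradient.
    return -k * np.array([gx[y, x], gy[y, x]])

# Toy depth map: a ramp rising along the x axis.
depth = np.tile(np.arange(8, dtype=float), (8, 1))
f = haptic_structure_force(depth, 4, 4)  # pushes back against the ramp
```

In a full system this force would be combined with the haptic motion signal and streamed to the device at the haptic update rate.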

Haptic telexistence

2007

Nowadays, we can interact with humans or objects even if they are located in remote places or in virtual environments. In these interactions, we can watch, listen to, touch, and move objects. However, the haptic properties of an object are not conveyed by conventional systems. When we communicate or perform a task, the lack of haptic sensation reduces realism and interactivity. There is therefore an increasing demand for haptic technology.

Haptic interfaces and devices

Sensor Review, 2004

Haptic interfaces enable person‐machine communication through touch, and most commonly, in response to user movements. We comment on a distinct property of haptic interfaces, that of providing for simultaneous information exchange between a user and a machine. We also comment on the fact that, like other kinds of displays, they can take advantage of both the strengths and the limitations of human perception. The paper then proceeds with a description of the components and the modus operandi of haptic interfaces, followed by a list of current and prospective applications and a discussion of a cross‐section of current device designs.

The hapticon editor: a tool in support of haptic communication research

11th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2003. HAPTICS 2003. Proceedings.

We define haptic icons, or "hapticons", as brief programmed forces applied to a user through a haptic interface, with the role of communicating a simple idea in a manner similar to visual or auditory icons. In this paper we present the design and implementation of an innovative software tool and graphical interface for the creation and editing of hapticons. The tool's features include several methods for creating new icons, among them direct recording of manual trajectories and creation from a choice of basis waveforms; novel direct-manipulation icon-editing mechanisms; integrated playback; and convenient storage of icons to file. We discuss some ways in which the tool has aided our research in the area of haptic iconography, and present an innovative approach for generating and rendering simple textures on a low-degree-of-freedom haptic device using what we call terrain display.
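The "creation from a choice of basis waveforms" mentioned above can be sketched as sampling a short parametric force signal. The function name, parameter names, and the particular bases below are illustrative assumptions, not the Hapticon Editor's actual API.

```python
import numpy as np

def hapticon(shape="sine", freq=20.0, amp=1.0, duration=0.5, rate=1000):
    """Build a brief force waveform ('hapticon') from a basis waveform.

    Hypothetical sketch of composing an icon from a basis waveform:
    sample `duration` seconds at `rate` Hz and scale to amplitude `amp`.
    """
    t = np.arange(int(duration * rate)) / rate
    if shape == "sine":
        w = np.sin(2 * np.pi * freq * t)
    elif shape == "square":
        w = np.sign(np.sin(2 * np.pi * freq * t))
    elif shape == "ramp":
        w = 2.0 * (t * freq % 1.0) - 1.0  # sawtooth in [-1, 1)
    else:
        raise ValueError(f"unknown basis waveform: {shape}")
    return amp * w  # force samples to be streamed to the haptic device

pulse = hapticon("sine", freq=10, amp=0.5, duration=0.2)  # 200 samples
```

An editor like the one described would then let the user record, reshape, and concatenate such waveforms by direct manipulation before saving them to file.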

Dynamic Haptic Interaction with Video

This chapter introduces the notion of passive dynamic haptic interaction with video, and describes the computation of force due to relative motion between an object in a video and the haptic interface point (HIP) of a user, given associated pixel-based depth data. While the concept of haptic video, that is, haptic rendering of forces due to geometry and texture of objects in a video from the associated depth data, has already been proposed, passive dynamic haptic interaction with video has not been studied before. It is proposed that in passive dynamic interaction, a user experiences the motion of a video object and the dynamic forces due to its movement, even though the content of the video is not altered by this interaction. To this effect, the acceleration of a video object is estimated using video motion estimation techniques, while the acceleration of the HIP is estimated from the HIP position acquired by the encoders of the haptic device. Mass values are assigned to the video object and HIP such that, according to the laws of physics, user interaction does not alter the motion of the video object. Then, the dynamic force is computed using Newton's second law. Finally, it is scaled and displayed to the user through the haptic device in addition to the static forces due to the geometry and texture of the object. Experimental results are provided to demonstrate the difference in rendered forces with and without including the dynamics.
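The core computation described above can be sketched in a few lines: estimate the accelerations of the video object and the HIP by finite differences, apply Newton's second law to their relative acceleration, and scale the result for display. The mass value, the scale gain, and the finite-difference scheme are illustrative assumptions; in the chapter, the object acceleration comes from video motion estimation and the HIP acceleration from device encoder readings.

```python
import numpy as np

def dynamic_force(obj_pos, hip_pos, dt, m_obj=1.0, scale=0.1):
    """Hypothetical sketch of the passive dynamic force computation.

    obj_pos, hip_pos: arrays of three consecutive positions (t-1, t, t+1).
    Accelerations are estimated with second-order central differences:
        a ~ (p[t+1] - 2*p[t] + p[t-1]) / dt**2
    """
    a_obj = (obj_pos[2] - 2 * obj_pos[1] + obj_pos[0]) / dt**2
    a_hip = (hip_pos[2] - 2 * hip_pos[1] + hip_pos[0]) / dt**2
    # Newton's second law on the relative acceleration; the force is
    # scaled before display so the interaction remains passive.
    return scale * m_obj * (a_obj - a_hip)

obj = np.array([[0.0], [0.1], [0.3]])  # object positions at t-1, t, t+1
hip = np.array([[0.0], [0.0], [0.0]])  # stationary haptic interface point
f = dynamic_force(obj, hip, dt=1 / 30)
```

In a complete renderer this dynamic force would be added to the static geometry and texture forces before being sent to the haptic device.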