iCompose: context-aware physical user interface for application composition

Supporting Autonomic and User-Controlled Application Composition in Ubiquitous Environments

2011

Networked devices, such as consumer electronics, digital media appliances and mobile devices, are rapidly filling our everyday environments and turning them into ubiquitous spaces. Composing an application from the resources and services available in these environments is a complex task that requires solving a number of equally important engineering challenges as well as addressing issues related to user behavior and acceptance.

Touch & Compose: Physical User Interface for Application Composition in Smart Environments

2009 First International Workshop on Near Field Communication, 2009

In this paper, we sketch a model for interaction between smart environments and their inhabitants. We also introduce Touch & Compose, a concept for composing applications that utilizes this interaction model. The basic idea of Touch & Compose is to assemble applications from resources that the user has selected manually by touching them with her mobile terminal. Resources (devices, services, files, etc.) are represented with icons attached to real environment objects. RFID tags are placed under the icons; they contain data identifying the resources. The mobile terminal is equipped with an RFID reader. The touched icons are collected in the mobile terminal's resource stack; an application is composed from the resources in the stack either automatically or when requested by the user. Some resources collected from the environment can be stored permanently in the mobile terminal. The stack also allows sharing resources with other users in the environment. We also present a software architecture for implementing the Touch & Compose concept and suggest a GUI for the mobile terminal.
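To make the stack mechanism concrete, here is a minimal Python sketch of how scanned tag payloads might accumulate and be composed; the names (Resource, ResourceStack, touch, compose) are illustrative assumptions, not the paper's actual architecture.

```python
# Minimal sketch of a Touch & Compose-style resource stack; all names
# are hypothetical, not drawn from the paper's implementation.
from dataclasses import dataclass, field

@dataclass
class Resource:
    resource_id: str          # identifier read from the RFID tag under the icon
    kind: str                 # e.g. "device", "service", "file"
    persistent: bool = False  # the user may keep some resources permanently

@dataclass
class ResourceStack:
    items: list = field(default_factory=list)

    def touch(self, tag_payload: dict) -> None:
        """Called when the terminal's RFID reader scans an icon's tag."""
        self.items.append(Resource(tag_payload["id"], tag_payload["kind"]))

    def compose(self) -> dict:
        """Assemble an application from the collected resources, either
        automatically or on the user's request."""
        return {"resources": [r.resource_id for r in self.items]}

stack = ResourceStack()
stack.touch({"id": "projector-42", "kind": "device"})
stack.touch({"id": "slides.pdf", "kind": "file"})
print(stack.compose())
```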

A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

Sensors, 2016

This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is complicated by the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.
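As an illustration of the abstraction idea, the following Python sketch shows a single UI description being bound at deploy time to whatever modalities the environment reports; the names and the resolution strategy are assumptions for illustration, not the framework's actual API.

```python
# Sketch: one abstract UI, resolved against the devices actually present.
class AbstractUI:
    def __init__(self, elements):
        # Abstract elements, e.g. ("notify", text) or ("choice", options)
        self.elements = elements

def deploy(ui: AbstractUI, available_modalities: list) -> dict:
    """Bind each abstract element to the first modality that supports it."""
    bindings = {}
    for kind, payload in ui.elements:
        for modality in available_modalities:
            if kind in modality["supports"]:
                bindings[(kind, str(payload))] = modality["name"]
                break
    return bindings

# The same UI deploys to a speech device or a wall display, depending on
# what the current environment offers:
ui = AbstractUI([("notify", "Meeting in 5 min"), ("choice", ["Snooze", "Dismiss"])])
print(deploy(ui, [{"name": "tts-speaker", "supports": {"notify"}},
                  {"name": "wall-display", "supports": {"notify", "choice"}}]))
```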

Designing and Building Context-Aware Applications

2001

User interfaces must adapt to the growing dissemination of computing power in our everyday environment. Computing devices and applications are now used beyond the desktop, in diverse environments, and this trend is accelerating. By taking context into account, context-aware applications promise richer and easier interaction.

iCAP: Interactive prototyping of context-aware applications

Pervasive Computing, 2006

Although numerous context-aware applications have been developed and there have been technological advances for acquiring contextual information, it is still difficult to develop and prototype interesting context-aware applications. This is largely due to the lack of programming support available to both programmers and end-users. This lack of support closes off the context-aware application design space to a larger group of users. We present iCAP, a system that allows end-users to visually design a wide variety of context-aware applications, including those based on if-then rules, temporal and spatial relationships and environment personalization. iCAP allows users to quickly prototype and test their applications without writing any code. We describe the study we conducted to understand end-users' mental models of context-aware applications, how this impacted the design of our system and several applications that demonstrate iCAP's richness and ease of use. We also describe a user study performed with 20 end-users, who were able to use iCAP to specify every application that they envisioned, illustrating iCAP's expressiveness and usability.
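A minimal sketch of the if-then rule style that iCAP lets end-users specify visually might look like the following; the rule representation is a hypothetical Python rendering, not iCAP's own format.

```python
# Sketch: a context-triggered if-then rule of the kind end-users build in
# iCAP's visual editor (representation invented for illustration).
def make_rule(condition, action):
    """condition: context dict -> bool; action: side-effecting callable."""
    def rule(context: dict):
        if condition(context):
            action(context)
    return rule

# "If nobody is in the living room after 23:00, turn the lights off."
lights_off = make_rule(
    condition=lambda ctx: ctx["room_occupancy"]["living_room"] == 0
                          and ctx["hour"] >= 23,
    action=lambda ctx: print("lights: living_room -> off"),
)
lights_off({"room_occupancy": {"living_room": 0}, "hour": 23})
```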

Context-Aware Computing Applications

This paper describes software that examines and reacts to an individual's changing context. Such software can promote and mediate people's interactions with devices, computers, and other people, and it can help navigate unfamiliar places. We believe that a limited amount of information covering a person's proximate environment is most important for this form of computing since the interesting part of the world around us is what we can see, hear, and touch. In this paper we define context-aware computing, and describe four categories of context-aware applications: proximate selection, automatic contextual reconfiguration, contextual information and commands, and context-triggered actions. Instances of these application types have been prototyped on the PARCTAB, a wireless, palm-sized computer.
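As an example of the first category, proximate selection, a minimal Python sketch under the assumption that device positions are known might rank candidates by distance to the user; the device names and coordinates are invented.

```python
# Sketch: proximate selection orders candidate devices so nearby ones
# are easiest to pick (positions assumed known for illustration).
def proximate_selection(devices: dict, user_position: tuple) -> list:
    """Return device names sorted by Euclidean distance to the user."""
    def distance(pos):
        return ((pos[0] - user_position[0]) ** 2 +
                (pos[1] - user_position[1]) ** 2) ** 0.5
    return sorted(devices, key=lambda name: distance(devices[name]))

printers = {"printer-3f": (12.0, 4.0), "printer-lobby": (2.0, 1.5)}
print(proximate_selection(printers, user_position=(1.0, 1.0)))
# -> ['printer-lobby', 'printer-3f']
```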

A conceptual framework and a toolkit for supporting the rapid prototyping of context-aware applications

2001

Computing devices and applications are now used beyond the desktop, in diverse environments, and this trend toward ubiquitous computing is accelerating. One challenge that remains in this emerging research field is the ability to enhance the behavior of any application by informing it of the context of its use. By context, we refer to any information that characterizes a situation related to the interaction between humans, applications, and the surrounding environment.
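In the spirit of the toolkit's approach, a context source can be wrapped so that applications subscribe to changes rather than polling sensors directly; the following Python sketch uses invented names (ContextWidget, sensor_update) and is not the toolkit's actual API.

```python
# Sketch: a context widget hides the sensor behind a uniform interface
# and notifies subscribed applications when its tracked context changes.
class ContextWidget:
    def __init__(self, attribute: str):
        self.attribute = attribute   # e.g. "presence@room-101"
        self.subscribers = []
        self.value = None

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def sensor_update(self, new_value):
        """Called by the sensor driver; fans the change out to applications."""
        if new_value != self.value:
            self.value = new_value
            for cb in self.subscribers:
                cb(self.attribute, new_value)

presence = ContextWidget("presence@room-101")
presence.subscribe(lambda attr, v: print(f"{attr} -> {v}"))
presence.sensor_update("Alice")
```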

Ubiquitous Interaction: Adapting to the "User in Context"

Context-awareness is of increasing importance in achieving effective communication of information and provision of services. Thus, besides taking into account "classical" user-related features, other context-dependent factors (i.e., the user's location, activity, emotional state and the technical characteristics of the device used) have to be considered. In this paper, we present the Personalization component of a multiagent infrastructure that we have developed for supporting interaction between users and environments. This component establishes which information to present, how to organize it and how to set up the output layout according to "user in context" features.
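A minimal sketch of such a selection step, with invented feature names and layout rules, might map "user in context" features to presentation choices as follows; it is an illustration of the idea, not the component's actual logic.

```python
# Sketch: choose what to present and how, from user- and context-dependent
# features (feature names and rules invented for illustration).
def select_presentation(user: dict, context: dict) -> dict:
    modality = "audio" if context["activity"] == "driving" else "visual"
    detail = "brief" if context["device"] == "phone" else "full"
    return {
        "modality": modality,
        "detail": detail,
        "language": user.get("language", "en"),
    }

print(select_presentation({"language": "en"},
                          {"activity": "driving", "device": "phone"}))
# -> {'modality': 'audio', 'detail': 'brief', 'language': 'en'}
```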