Dandelion: A Unified Code Offloading System for Wearable Computing
Related papers
2015 IEEE Global Communications Conference (GLOBECOM), 2015
Wearable devices are becoming increasingly popular and are expected to become essential in our everyday life. Despite continuous improvements in hardware, the lifetime and capabilities of mobile devices remain a concern. The small batteries of smart watches, glasses, helmets, and gloves limit the available computing, storage, and communication resources. Mobile cloud computing can augment the capabilities of wearable devices by executing some computing tasks in the cloud. Such computational offloading preserves battery power at the cost of more intensive communication with the cloud. In this paper, we present a model and a comprehensive analysis of computational offloading between wearable devices and clouds in realistic setups.
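To make the trade-off concrete, the offloading condition underlying such models can be sketched in a few lines of Python; all power, speed, and timing parameters below are illustrative assumptions, not values from the paper:

# Hedged sketch of the classic offloading energy trade-off: offloading pays
# off when the energy to compute locally exceeds the energy spent
# transmitting the task state plus idling while waiting for the result.

def offload_saves_energy(cycles, data_bytes,
                         p_compute=0.9,      # W, local CPU power (assumed)
                         cps=1e9,            # cycles/s, local CPU speed (assumed)
                         p_tx=1.3,           # W, radio transmit power (assumed)
                         bandwidth=1e6,      # bytes/s, uplink (assumed)
                         p_idle=0.3,         # W, idle power while waiting (assumed)
                         t_cloud=0.05):      # s, cloud execution time (assumed)
    e_local = p_compute * (cycles / cps)
    e_offload = p_tx * (data_bytes / bandwidth) + p_idle * t_cloud
    return e_offload < e_local

# Example: a 2e9-cycle task with 100 kB of state offloads profitably
print(offload_saves_energy(2e9, 100_000))   # True under these assumptions

Under these assumed numbers, a compute-heavy task with modest input state benefits from offloading; shrinking the cycle count or growing the payload flips the decision.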
MAUI: Making Smartphones Last Longer with Code Offload
This paper presents MAUI, a system that enables fine-grained energy-aware offload of mobile code to the infrastructure. Previous approaches to these problems either relied heavily on programmer support to partition an application, or they were coarse-grained, requiring full process (or full VM) migration. MAUI uses the benefits of a managed code environment to offer the best of both worlds: it supports fine-grained code offload to maximize energy savings with minimal burden on the programmer. MAUI decides at runtime which methods should be remotely executed, driven by an optimization engine that achieves the best energy savings possible under the mobile device's current connectivity constraints. In our evaluation, we show that MAUI enables: 1) a resource-intensive face recognition application that consumes an order of magnitude less energy, 2) a latency-sensitive arcade game application that doubles its refresh rate, and 3) a voice-based language translation application that bypasses the limitations of the smartphone environment by executing unsupported components remotely.
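MAUI casts this decision as a 0-1 integer linear program over the application's call graph; the greatly simplified per-method sketch below (with a hypothetical MethodProfile record and assumed radio power) only illustrates the underlying energy comparison, not MAUI's actual solver or API:

# Simplified sketch of a MAUI-style per-method offload decision.
# MAUI actually solves a global 0-1 ILP over the call graph; this greedy
# per-method check only illustrates the energy comparison it is built on.

from dataclasses import dataclass

@dataclass
class MethodProfile:           # hypothetical profiling record
    name: str
    local_energy_j: float      # measured energy of local execution
    state_bytes: int           # serialized inputs + outputs

def should_offload(m: MethodProfile, bandwidth_bps: float,
                   rtt_s: float, radio_power_w: float = 1.2) -> bool:
    transfer_s = 8 * m.state_bytes / bandwidth_bps + rtt_s
    transfer_energy = radio_power_w * transfer_s
    return transfer_energy < m.local_energy_j

profile = MethodProfile("DetectFaces", local_energy_j=2.4, state_bytes=50_000)
print(should_offload(profile, bandwidth_bps=5e6, rtt_s=0.05))   # True here

The real system re-evaluates such decisions continuously as connectivity changes, which is what the "current connectivity constraints" in the abstract refers to.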
A Hitchhiker's Guide to Computation Offloading: Opinions from Practitioners
IEEE Communications Magazine, 2017
Due to the increasing usage and capabilities of smart devices, mobile application developers build a large number of resource-intensive applications, such as wearable augmented reality (WAR) applications. Even with the rapid development of hardware technology, the computing capability and battery capacity of wearable devices and smartphones still cannot meet the demands of applications with heavy computation and high battery drain. Pervasive computing addresses this problem by migrating applications to resource providers external to mobile devices. The profitability of this method depends heavily on how it is implemented and when it is used. Although many computation offloading systems have been proposed in the literature, there is no practical manual that addresses all the implementation complexities on the way to building a general offloading system. In this article, we review developments in computation offloading for pervasive computing. We combine this literature review with our own experience to provide designers with detailed guidelines and a deep insight into the implementation challenges of a computation offloading system. The guidelines empower the reader to choose among the variety of solutions in the literature when developing an offloading system, taking into account their own system architecture and available facilities. Finally, we evaluate our general offloading system on Android devices with two real-time applications.
Enhancing Mobile Devices through Code Offload
Advances in mobile hardware and operating systems have made mobile a first-class development platform. Activities such as web browsing, casual game play, media playback, and document reading are now as common on mobile devices as on full-sized desktop systems. However, developers are still constrained by the inherent resource limitations of mobile devices. Unlike desktop systems, mobile devices must sacrifice performance to accommodate smaller form factors and battery-backed operation. Opportunistic offloading of computation from a mobile device to remote server infrastructure (i.e., "code offload") offers a promising way to overcome these constraints and to expand the set of applications (i.e., "apps") that can run on devices. Deciding to offload requires a careful consideration of the costs and benefits of a range of possible program partitions. This cost-benefit analysis depends on external factors, such as network conditions and resource availability, as well as internal app properties, such as component dependencies, data representations, and code complexity. Thus, benefiting from offload requires some assistance from developers, but requiring developers to adopt arcane or unnatural programming models will hinder adoption regardless of the potential benefits. In this dissertation I characterize two frameworks that reduce the amount of developer effort required to improve the performance of mobile apps through code offload. The first, MAUI, is designed for computationally intensive general-purpose apps such as speech and facial recognition. The second, Kahawai, is designed for graphics-intensive apps like fast-action video games.

MAUI continuously monitors the device, network, and app, and uses its measurements to compute an energy-efficient program partition. MAUI reduces the burden on developers by taking advantage of core features of the managed code environments common to mobile platforms: code portability, serialization, reflection, and type safety. These features allow MAUI to automatically instrument and potentially offload methods that the developer has tagged as suitable for offload. MAUI is particularly effective on applications composed of operations whose computational cost is large compared to the transfer cost of their input parameters and their output results.

Kahawai is designed for graphics-intensive apps such as console-style games and takes advantage of two features of today's mobile gaming platforms: capable mobile GPUs and reusable game engines. Even though today's mobile devices cannot duplicate the sophisticated graphical detail provided by gaming consoles and high-end desktop GPUs, devices have seen rapid improvements in their GPU processing capabilities. Kahawai leverages a device's GPU to provide collaborative rendering. Collaborative rendering relies on a mobile GPU to generate low-fidelity output, which when combined with server-side GPU output allows a mobile device to display a high-fidelity result. The benefits of collaborative rendering are substantial: mobile clients can experience high-quality graphical output using relatively little bandwidth. Fortunately, because most modern games are built on top of reusable game engines, developers only have to identify the sources of non-determinism in the game logic to take advantage of collaborative rendering. Together, MAUI and Kahawai demonstrate that code offload can provide substantial benefits for mobile apps without overburdening app developers.
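The delta-encoding flavor of collaborative rendering described above can be sketched on raw pixel arrays; Kahawai's real pipeline operates on compressed video streams, so the snippet below is only a conceptual illustration:

# Sketch of Kahawai-style delta-encoded collaborative rendering: the server
# computes the pixel difference between a high-fidelity and a low-fidelity
# render of the same (deterministic) frame, and the client adds that delta
# to its own low-fidelity output to recover the high-fidelity image.

import numpy as np

def server_delta(high: np.ndarray, low: np.ndarray) -> np.ndarray:
    # int16 avoids wrap-around when subtracting uint8 pixels
    return high.astype(np.int16) - low.astype(np.int16)

def client_reconstruct(low: np.ndarray, delta: np.ndarray) -> np.ndarray:
    return np.clip(low.astype(np.int16) + delta, 0, 255).astype(np.uint8)

high = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)  # server render
low = high // 2                                              # cheaper client render
frame = client_reconstruct(low, server_delta(high, low))
assert (frame == high).all()

The bandwidth saving comes from the delta compressing far better than the full frame; determinism matters because client and server must render the same frame for the delta to apply cleanly.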
2019
The popularity of mobile devices has increased significantly, and nowadays they are used for the most diverse purposes, from accessing the Internet to supporting business tasks. Such popularity emerged as a consequence of the compatibility of these devices with a large variety of applications. However, the complexity of these applications boosted the demand for computational resources on mobile devices. Code offloading is a solution that aims to mitigate this problem by reducing resource and battery usage on mobile devices, sending parts of applications to be processed in the cloud. This paper presents an evaluation of a transparent code offloading technique, in which no modification of the application source code is required for the smartphone to send parts of the application to be processed in the cloud. We used a face detection application for the evaluation. Results showed the technique can improve application performance in some scenarios, achieving s...
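One way such transparency can be achieved is by intercepting calls and shipping serialized arguments to a remote executor, with a local fallback; the decorator sketch below assumes a user-supplied remote_call transport and does not reflect the paper's specific mechanism:

# Sketch of transparent offloading via call interception: arguments are
# pickled and shipped to a remote executor; any failure falls back to
# local execution so the application code never changes.

import pickle

def offloadable(remote_call):
    """remote_call(payload: bytes) -> bytes, e.g. an HTTP POST (assumed)."""
    def wrap(fn):
        def wrapper(*args, **kwargs):
            try:
                payload = pickle.dumps((fn.__name__, args, kwargs))
                return pickle.loads(remote_call(payload))
            except Exception:
                return fn(*args, **kwargs)   # transparent local fallback
        return wrapper
    return wrap

# Hypothetical usage with an assumed transport:
# @offloadable(lambda payload: post_to_server(payload))
# def detect_faces(image_bytes): ...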
2013 Future Network & Mobile Summit, 2013
Mobile cloud computing is a rapidly growing field. In addition to the conventional fashion in which mobile clients access cloud services, as in the well-known client/server model, existing work has proposed to explore cloud functionality from another perspective: offloading part of the mobile code to the cloud for remote execution in order to optimize the application performance and energy efficiency of the mobile device. In this position paper, we investigate the state of the art of code offloading for mobile devices, highlight the significant challenges towards a more efficient cloud-based offloading framework, and point out how existing technologies provide opportunities to facilitate the framework's implementation.
IEEE Access
Wearable devices have become essential in our daily activities. Due to battery constraints, their use of computing, communication, and storage resources is limited. Mobile Cloud Computing (MCC) and the recently emerged Fog Computing (FC) paradigms unleash unprecedented opportunities to augment the capabilities of wearable devices. Partitioning mobile applications and offloading computationally heavy tasks for execution at the cloud or the edge of the network is the key. Offloading prolongs battery lifetime and gives wearable devices access to the rich and powerful set of computing and storage resources of the cloud/edge. In this paper, we experimentally evaluate and discuss the rationale for application partitioning in MCC and FC. To this end, we develop an Android-based application and benchmark the energy and execution-time performance of multiple partitioning scenarios. The results unveil architectural trade-offs between the paradigms and yield guidelines for proper power management of service-centric Internet of Things (IoT) applications.

Index Terms: Mobile cloud computing, fog computing, energy efficiency, IoT, wearable devices.
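A toy latency model makes the MCC/FC trade-off tangible: the fog node wins on round-trip time and uplink bandwidth even when remote execution time is the same. All numbers below are assumptions for illustration, not the paper's measurements:

# Toy model comparing a cloud and a fog offload target for the same task,
# differing only in round-trip time and uplink bandwidth (assumed values).

def offload_time(data_mb: float, rtt_s: float, uplink_mbps: float,
                 remote_exec_s: float) -> float:
    return rtt_s + (data_mb * 8) / uplink_mbps + remote_exec_s

task = dict(data_mb=2.0, remote_exec_s=0.10)
cloud = offload_time(rtt_s=0.120, uplink_mbps=10, **task)  # distant datacenter
fog = offload_time(rtt_s=0.010, uplink_mbps=50, **task)    # nearby edge node
print(f"cloud: {cloud:.2f}s  fog: {fog:.2f}s")             # cloud: 1.82s  fog: 0.43s

For chatty, data-heavy partitions the edge dominates, while compute-bound partitions with small state can still favor the better-provisioned cloud; this is the kind of architectural trade-off the paper benchmarks.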
Elegant Computational Intensive Task Offloading Scenario for Android
2021
Despite continuous technical change and rapid, day-by-day upgrades in the field of mobile computing, smart devices and IoT devices still face technical limitations that narrow the lifespan and survivability of small-scale processing devices. Today, end users are becoming more demanding and expect to run computationally intensive tasks on their smartphones and IoT devices. Therefore, virtual cloud computing (VCC) integrates local device computing and Cloud Computing (CC) in order to extend the computational capabilities of smartphones and IoT devices using cloud offloading techniques. Computation offloading tackles the limitations of smartphones and IoT devices, such as limited battery duration, computational capability, and storage capacity, by offloading execution and workload to the cloud, which has better systems with greater computation and storage capabilities. This paper aims to present techniques to offload comp...
Enabling Automatic Offloading of Resource-Intensive Smartphone Applications
2011
The limited capability and energy constraints of smartphones have posed a significant challenge to running the "newest and hottest" applications, which are becoming increasingly resource demanding, e.g., real-time image recognition. In this paper, we revisit the decade-old general concept of offloading computation to remote servers by focusing on a largely unsolved problem: how to automatically determine whether and when a smartphone application will benefit from offloading? This is an especially relevant and challenging problem today because (1) modern mobile applications tend to have complex interactions with users and advanced capabilities (e.g., GPS and camera) and hence cannot be offloaded as a whole; and (2) whether an application component, e.g., a method call, will benefit from offloading depends on its execution time on the smartphone and the size of the state to be shipped, which in turn depend on the input parameters.
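The decision problem the authors pose can be sketched with simple input-dependent predictors; the linear models and all parameters below are illustrative stand-ins for the learned per-application profiles such a system would need:

# Sketch of input-dependent offload benefit prediction: both the local
# runtime and the state size to ship depend on the input, so the decision
# must be made per call. The linear predictors here are assumed, not learned.

def predict_runtime_s(input_size: int) -> float:
    return 1e-6 * input_size               # assumed: runtime grows with input

def predict_state_bytes(input_size: int) -> int:
    return 200 + input_size // 10          # assumed: state ~ input plus overhead

def benefits_from_offload(input_size: int, bandwidth_bps: float,
                          speedup: float = 8.0, rtt_s: float = 0.05) -> bool:
    t_local = predict_runtime_s(input_size)
    t_remote = (rtt_s + t_local / speedup
                + 8 * predict_state_bytes(input_size) / bandwidth_bps)
    return t_remote < t_local

print(benefits_from_offload(500_000, bandwidth_bps=2e6))  # True: large input
print(benefits_from_offload(2_000, bandwidth_bps=2e6))    # False: RTT dominates

The example captures the paper's point: the same method call can flip between "offload" and "run locally" purely as a function of its input parameters and the current network.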