JointDNN: An Efficient Training and Inference Engine for Intelligent Mobile Cloud Computing Services

A Lightweight Collaborative Deep Neural Network for the Mobile Web in Edge Cloud

Schahram Dustdar

IEEE Transactions on Mobile Computing

A DNN-Based Application in Joint Mobile and Cloud Services Platform

Kazi Shahiduzzaman

IJARCCE, 2020

Energy-Aware Inference Offloading for DNN-Driven Applications in Mobile Edge Clouds

Qiufen Xia

IEEE Transactions on Parallel and Distributed Systems, 2021

Enabling DNN Acceleration with Data and Model Parallelization over Ubiquitous End Devices

Xiuquan Qiao

IEEE Internet of Things Journal, 2021

Cost-effective Machine Learning Inference Offload for Edge Computing

Madhu Athreya

ArXiv, 2020

Evaluating Edge-Cloud Computing Trade-Offs for Mobile Object Detection and Classification with Deep Learning

Leandro Balby Marinho

Journal of Information and Data Management

Deep Learning Neural Networks in the Cloud

Burhan Humayun Awan

IJAEMS Journal

A Survey of Deep Learning at the Edge Computing

Nicanor Mayumu

A Deep Learning Approach for Energy Efficient Computational Offloading in Mobile Edge Computing

Zaiwar Ali

IEEE Access, 2019

AoDNN: An Auto-Offloading Approach to Optimize Deep Inference for Fostering Mobile Web

Schahram Dustdar

IEEE INFOCOM 2022 - IEEE Conference on Computer Communications

DistrEdge: Speeding up Convolutional Neural Network Inference on Distributed Edge Devices

Yongjie Guan

2022 IEEE International Parallel and Distributed Processing Symposium (IPDPS)

Moving Deep Learning to the Edge

Rui Policarpo Duarte

Algorithms

Toward Decentralized and Collaborative Deep Learning Inference for Intelligent IoT Devices

Xiuquan Qiao

2021

Challenges and Obstacles Towards Deploying Deep Learning Models on Mobile Devices

Ajay Balasubramaniam

2021

Complexity of Deep Convolutional Neural Networks in Mobile Computing

Saad Naeem

Complexity, 2020

A Novel Approach to Improving Distributed Deep Neural Networks over Cloud Computing

Karrar Shakir Muttair

International Journal of Interactive Mobile Technologies (iJIM)

Deep learning on mobile devices: a review

Yunbin Deng

Mobile Multimedia/Image Processing, Security, and Applications 2019, 2019

DeepFogSim: A Toolbox for Execution and Performance Evaluation of the Inference Phase of Conditional Deep Neural Networks with Early Exits Atop Distributed Fog Platforms

Enzo Baccarelli

Applied Sciences, 2021

Guidelines and Benchmarks for Deployment of Deep Learning Models on Smartphones as Real-Time Apps

Nasser Kehtarnavaz

Machine Learning and Knowledge Extraction, 2019

ALOHA: an architectural-aware framework for deep learning at the edge

Dolly Sapra

2018

TinyML: Enabling of Inference Deep Learning Models on Ultra-Low-Power IoT Edge Devices for AI Applications

Norah AlAjlan

Micromachines

MCDNN: An Execution Framework for Deep Neural Networks on Resource-Constrained Devices

Haichen Shen

2015

Partitioning Convolutional Neural Networks to Maximize the Inference Rate on Constrained IoT Devices

Edson Borin

Future Internet, 2019

Optimizing Deep Learning Inference on Embedded Systems Through Adaptive Model Selection

Ben Taylor

ACM Transactions on Embedded Computing Systems, 2020

DeepRebirth: Accelerating Deep Neural Network Execution on Mobile Devices

Deguang Kong

ArXiv, 2018

Accelerated Deep Learning for the Edge-to-Cloud Continuum: A Specialized Full Stack Derived from Algorithms

Hardik Sharma

2019

Application Optimizing AI Performance on Edge Devices: A Comprehensive Approach using Model Compression, Federated Learning, and Distributed Inference

International Journal of Automation, Artificial Intelligence and Machine Learning, 2024

Unveiling Energy Efficiency in Deep Learning: Measurement, Prediction, and Scoring across Edge Devices

Anik Mallik, Haoxin Wang

Proceedings of ACM/IEEE Symposium on Edge Computing (SEC), 2023

Benchmarking Convolutional Neural Network Inference on Low-Power Edge Devices

Oscar Ferraz

ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

The Case for Adaptive Deep Neural Networks in Edge Computing

Schahram Dustdar

2021 IEEE 14th International Conference on Cloud Computing (CLOUD), 2021

On-Device Neural Net Inference with Mobile GPUs

Ekaterina Ignasheva

ArXiv, 2019

An Energy-Efficient Fine-Grained Deep Neural Network Partitioning Scheme for Wireless Collaborative Fog Computing

Hamed Mirghasemi

IEEE Access, 2021

Toward Distributed, Global, Deep Learning Using IoT Devices

Schahram Dustdar

IEEE Internet Computing, 2021

DNN partitioning for inference throughput acceleration at the edge

Thomas Clausen

IEEE Access

DeepEdgeBench: Benchmarking Deep Neural Networks on Edge Devices

Anshul J.

2021
