JointDNN: An Efficient Training and Inference Engine for Intelligent Mobile Cloud Computing Services
Related papers
A Lightweight Collaborative Deep Neural Network for the Mobile Web in Edge Cloud
IEEE Transactions on Mobile Computing
A DNN-Based Application in Joint Mobile and Cloud Services Platform
IJARCCE, 2020
Energy-Aware Inference Offloading for DNN-Driven Applications in Mobile Edge Clouds
IEEE Transactions on Parallel and Distributed Systems, 2021
Enabling DNN Acceleration with Data and Model Parallelization over Ubiquitous End Devices
IEEE Internet of Things Journal, 2021
Cost-effective Machine Learning Inference Offload for Edge Computing
ArXiv, 2020
Deep Learning Neural Networks in the Cloud
Burhan Humayun Awan, IJAEMS Journal
A Survey of Deep Learning at the Edge Computing
A Deep Learning Approach for Energy Efficient Computational Offloading in Mobile Edge Computing
IEEE Access, 2019
AoDNN: An Auto-Offloading Approach to Optimize Deep Inference for Fostering Mobile Web
IEEE INFOCOM 2022 - IEEE Conference on Computer Communications
DistrEdge: Speeding up Convolutional Neural Network Inference on Distributed Edge Devices
2022 IEEE International Parallel and Distributed Processing Symposium (IPDPS)
Moving Deep Learning to the Edge
Algorithms
Toward Decentralized and Collaborative Deep Learning Inference for Intelligent IoT Devices
2021
Challenges and Obstacles Towards Deploying Deep Learning Models on Mobile Devices
2021
Complexity of Deep Convolutional Neural Networks in Mobile Computing
Complexity, 2020
A Novel Approach to Improving Distributed Deep Neural Networks over Cloud Computing
International Journal of Interactive Mobile Technologies (iJIM)
Deep learning on mobile devices: a review
Mobile Multimedia/Image Processing, Security, and Applications 2019, 2019
Guidelines and Benchmarks for Deployment of Deep Learning Models on Smartphones as Real-Time Apps
Machine Learning and Knowledge Extraction, 2019
ALOHA: an architectural-aware framework for deep learning at the edge
2018
MCDNN: An Execution Framework for Deep Neural Networks on Resource-Constrained Devices
2015
Partitioning Convolutional Neural Networks to Maximize the Inference Rate on Constrained IoT Devices
Future Internet, 2019
Optimizing Deep Learning Inference on Embedded Systems Through Adaptive Model Selection
ACM Transactions on Embedded Computing Systems, 2020
DeepRebirth: Accelerating Deep Neural Network Execution on Mobile Devices
ArXiv, 2018
Benchmarking Convolutional Neural Network Inference on Low-Power Edge Devices
ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
The Case for Adaptive Deep Neural Networks in Edge Computing
2021 IEEE 14th International Conference on Cloud Computing (CLOUD), 2021
On-Device Neural Net Inference with Mobile GPUs
ArXiv, 2019
Toward Distributed, Global, Deep Learning Using IoT Devices
IEEE Internet Computing, 2021
DNN partitioning for inference throughput acceleration at the edge
IEEE Access
DeepEdgeBench: Benchmarking Deep Neural Networks on Edge Devices
2021