Chanh Nguyen | Umeå University
Papers by Chanh Nguyen
2019 19th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGRID), 2019
Mobile Edge Clouds (MECs) are a promising computing platform for meeting the requirements of bandwidth-hungry, latency-critical applications by distributing computing and storage capacity at the edge of the network, as Edge Data Centers (EDCs) located in close vicinity to end-users. Due to the heterogeneous, distributed resource capacity of EDCs, and the flexibility of application deployment coupled with user mobility, MECs pose significant challenges for controlling resource allocation and provisioning. To develop a self-managing system for MECs that efficiently decides how much and when to scale, and where to place and migrate services, it is crucial to predict workload characteristics, including their variation over time and across locations. To this end, we present a novel location-aware workload predictor for EDCs. Our approach leverages the correlation among workloads of EDCs in close physical proximity and applies a multivariate Long Short-Term Memory (LSTM) network to produce online workload predictions for each EDC. Experiments with two real mobility traces show that our proposed approach achieves better prediction accuracy than a state-of-the-art location-unaware method (by up to 44%) and a location-aware method (by up to 17%). Further, through intensive performance measurements using various input shaking methods, we substantiate that the proposed approach achieves reliable and consistent performance.
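As a rough illustration of the prediction step described in this abstract, the sketch below trains a multivariate LSTM that forecasts the next workload value of a target EDC from the recent workload history of that EDC and its physically nearby EDCs. It assumes a Keras-style setup; the data, look-back window, and network size are illustrative placeholders, not the configuration used in the paper.

```python
# Minimal sketch: multivariate LSTM predicting the next workload value of a
# target EDC from the recent history of that EDC and its neighbouring EDCs.
# Window length, layer size, and the synthetic data are assumptions for
# illustration only.
import numpy as np
import tensorflow as tf

def make_windows(series, lookback):
    """series: (T, n_edcs) workload matrix; the target EDC is column 0."""
    X, y = [], []
    for t in range(lookback, len(series)):
        X.append(series[t - lookback:t])   # past workloads of all nearby EDCs
        y.append(series[t, 0])             # next workload value of the target EDC
    return np.array(X), np.array(y)

# Hypothetical data: per-interval request counts for a target EDC and 3 neighbours.
T, n_edcs, lookback = 2000, 4, 12
workloads = np.random.rand(T, n_edcs).astype("float32")
X, y = make_windows(workloads, lookback)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(lookback, n_edcs)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

next_workload = model.predict(X[-1:])  # one-step-ahead forecast for the target EDC
```

In an online setting, the same model would be periodically re-fitted or incrementally updated as new workload observations arrive at each EDC.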
Proceedings of the 4th ACM/IEEE Symposium on Edge Computing, 2019
Mobile Edge Clouds (MECs) are distributed platforms in which distant datacenters are complemented with computing and storage capacity located at the edge of the network. Their wide resource distribution enables MECs to fulfill the need for low latency and high bandwidth to offer an improved user experience. As modern cloud applications are increasingly architected as collections of small, independently deployable services, they can be flexibly deployed in various configurations that combine resources from both centralized datacenters and edge locations. In principle, such applications should therefore be well placed to exploit the advantages of MECs so as to reduce service response times. In this paper, we quantify the benefits of deploying such cloud micro-service applications on MECs. Using two popular benchmarks, we show that, against conventional wisdom, end-to-end latency does not improve significantly even when most application services are deployed in the edge location. We developed a profiler to better understand this phenomenon, allowing us to develop recommendations for adapting applications to MECs. Further, by quantifying the gains of those recommendations, we show that the performance of an application can be made to reach the ideal scenario, in which the latency between an edge datacenter and a remote datacenter has no impact on the application performance. This work thus presents ways of adapting cloud-native applications to take advantage of MECs and provides guidance for developing MEC-native applications. We believe that both these elements are necessary to drive MEC adoption. CCS CONCEPTS: • Networks → Network measurement; • Computer systems organization → Cloud computing; • Software and its engineering → Software design engineering.
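The finding that edge placement alone does not improve end-to-end latency can be understood from how synchronous microservice call chains accumulate per-hop delays: any remaining call that crosses the edge-to-remote link adds the full WAN round-trip. The sketch below illustrates this intuition with assumed placements and link latencies; the services, call chain, and numbers are hypothetical and not taken from the benchmarks in the paper.

```python
# Minimal sketch: end-to-end latency of a synchronous microservice call chain
# under different edge/remote placements. Link latencies and the chain are
# illustrative assumptions.

ACCESS_RTT_MS = 5    # client <-> edge datacenter (access network)
LOCAL_RTT_MS = 1     # two services inside the same datacenter
WAN_RTT_MS = 50      # edge datacenter <-> remote datacenter

def chain_latency(placements):
    """placements: sites ('edge' or 'remote') of a linear call chain
    entrypoint -> svc1 -> svc2 -> ...; returns end-to-end latency in ms."""
    # Client reaches the entry service; add the WAN if it sits in the remote DC.
    total = ACCESS_RTT_MS + (WAN_RTT_MS if placements[0] == "remote" else 0)
    # Each downstream call is local if caller and callee share a site, WAN otherwise.
    for caller, callee in zip(placements, placements[1:]):
        total += LOCAL_RTT_MS if caller == callee else WAN_RTT_MS
    return total

# Most services at the edge, but one dependency left in the remote datacenter:
print(chain_latency(["edge", "edge", "edge", "remote"]))  # the single WAN hop dominates
# Ideal scenario: no call on the critical path crosses to the remote datacenter:
print(chain_latency(["edge", "edge", "edge", "edge"]))
```

This is why the paper's recommendations focus on removing or hiding the remaining cross-site calls on the critical path, rather than simply moving more services to the edge.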
Over the last decade, cloud computing has realized the long-held dream of computing as a utility, in which computational and storage services are made available via the Internet to anyone at any time and from anywhere. This has transformed Information Technology (IT) and given rise to new ways of designing and purchasing hardware and software. However, the rapid development of the Internet of Things (IoT) and mobile technology has brought a new wave of disruptive applications and services whose performance requirements are stretching the limits of current cloud computing systems and platforms. In particular, novel large-scale mission-critical IoT systems and latency-intolerant applications strictly require very low latency and strong guarantees of privacy, and can generate massive amounts of data that are only of local interest. These requirements are not readily satisfied using modern application deployment strategies that rely on resources from distant large cloud datacenters bec...