Predictive Ultra-Reliable Communication: A Survival Analysis Perspective

Reliability Modeling, Analysis and Prediction of Wireless Mobile Communications

2014 IEEE 79th Vehicular Technology Conference (VTC Spring), 2014

The future Fifth Generation (5G) mobile cellular networks, currently in the research phase, will enable a broad range of services/applications beyond classical mobile communications. One key enabler for integrating Ultra-Reliable services into mobile networks is the reliability of transmission success for a given data packet. This is difficult mainly owing to the time-varying effective link qualities of the communicating devices. However, a reliable indication of the instantaneous link quality (e.g., by the device) would allow opportunistic access to ultra-reliable services/applications when the link conditions are favorable. This paper introduces a framework for modeling, predicting and analyzing the theoretical reliability of the wireless link based on factors such as fading, mobility and interference. The analysis and prediction are based on the part-stress method [1], treating the time-dependent factors as elements/components with their respective Transmission Times To Failure (TTTF). The proposed framework also supports other reliability analysis techniques, such as Fault Tree Analysis [2] and accelerated testing [3], for wireless systems and for improving their components.
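The part-stress idea above can be sketched in a few lines: treat each time-dependent impairment as a series "component" with its own constant failure rate, so the composite link's TTTF is exponential with the summed rate. The factor names and rate values below are illustrative assumptions, not figures from the paper.

```python
import math

def link_reliability(t, failure_rates):
    """Probability the link survives t seconds under a series
    part-stress model with constant per-factor failure rates."""
    total_rate = sum(failure_rates)
    return math.exp(-total_rate * t)

def mean_tttf(failure_rates):
    """Mean Transmission Time To Failure of the composite link:
    the reciprocal of the summed failure rates."""
    return 1.0 / sum(failure_rates)

# Illustrative (made-up) per-factor rates in failures per second.
rates = {"fading": 0.02, "mobility": 0.005, "interference": 0.01}
r_one_second = link_reliability(1.0, rates.values())
```

Adding a factor (e.g., a new interference source) just adds its rate to the sum, which is what makes the series model convenient for the fault-tree and accelerated-testing extensions the abstract mentions.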

Radio Link Failure Prediction in 5G Networks

2021 IEEE Global Communications Conference (GLOBECOM), 2021

Radio Link Failure (RLF) is a challenging problem in 5G networks as it may decrease communication reliability and increase latency. This runs counter to the objectives of 5G, particularly for the Ultra-Reliable Low-Latency Communications (URLLC) traffic class. RLF can be predicted using radio measurements reported by User Equipments (UEs), such as Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ), Channel Quality Indicator (CQI), and Power Headroom (PHR). However, it is very challenging to derive a closed-form model that maps these measurements to RLF. To fill this gap, we propose to use Machine Learning (ML) techniques, specifically a combination of Long Short-Term Memory (LSTM) and Support Vector Machine (SVM), to find the correlation between these measurements and RLF. The RLF prediction model was trained with real data obtained from a 5G testbed. The validation process of the model showed an accuracy of 98% when predicting the connection status ...
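As a much simpler stand-in for the paper's LSTM+SVM pipeline, one can linearly extrapolate a sliding window of RSRP samples and flag RLF risk when the forecast crosses a threshold. The threshold and horizon below are hypothetical placeholders, not values from the paper.

```python
def predict_rlf(rsrp_window, threshold_dbm=-110.0, horizon=1):
    """Naive baseline: fit a least-squares line to recent RSRP samples
    (in dBm) and flag Radio Link Failure risk if the value extrapolated
    `horizon` steps ahead drops below `threshold_dbm`."""
    n = len(rsrp_window)
    x_mean = (n - 1) / 2.0
    y_mean = sum(rsrp_window) / n
    num = sum((x - x_mean) * (y - y_mean)
              for x, y in enumerate(rsrp_window))
    den = sum((x - x_mean) ** 2 for x in range(n))
    slope = num / den if den else 0.0
    forecast = y_mean + slope * ((n - 1) + horizon - x_mean)
    return forecast < threshold_dbm, forecast

# A steadily degrading link trips the predictor one step early.
at_risk, forecast = predict_rlf([-100.0, -103.0, -106.0, -109.0])
```

The point of the LSTM+SVM combination in the paper is precisely that real traces are not this linear; the sketch only illustrates the input/output shape of such a predictor.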

Reliability Prediction Modelling for Wireless Communication Networks

2019 SoutheastCon, 2019

Confidence Level for Finite Blocklength Ultra Reliable Communication over Fading Channels

2019 IEEE 30th Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), 2019

We present the reliability confidence level as a way to quantify reliability for ultra-reliable connections in uncertain environments subject to random block-error rate fluctuations. As ultra reliability is usually linked to a low-latency constraint, we analyze the reliability confidence level using the latest results on block-error rate in the finite blocklength regime. The analysis is carried out for OFDM-based systems over Rayleigh slow frequency block-fading channels. The reliability confidence level is bounded using analytic expressions, which are then applied to solve two optimization problems. We first find the minimal resources (or codeword length) needed to guarantee a target reliability with a given confidence. We then investigate an optimal resource-sharing strategy within the context of 5G New Radio. The solutions to the optimization problems are obtained without resorting to cumbersome Monte-Carlo simulations. Numerical evaluations validate the relevance as well as the tightness of the proposed bounds and approximations.
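The quantity being bounded can be made concrete with the finite-blocklength normal approximation: over Rayleigh fading the SNR, and hence the block-error rate, is random, and the confidence level is the probability that the error rate stays below a target. The paper derives analytic bounds; the sketch below instead Monte-Carlo-estimates the same quantity under assumed parameters (blocklength, payload, average SNR), purely for illustration.

```python
import math
import random

def q_func(x):
    """Gaussian tail function Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def bler_normal_approx(snr, n, k):
    """Normal approximation of the block error rate for blocklength n
    and k information bits at linear SNR `snr` (AWGN dispersion)."""
    c = math.log2(1.0 + snr)
    v = (1.0 - (1.0 + snr) ** -2) * math.log2(math.e) ** 2
    if v <= 0.0:
        return 1.0 if k > n * c else 0.0
    return q_func((n * c - k + 0.5 * math.log2(n)) / math.sqrt(n * v))

def confidence_level(avg_snr, n, k, eps_target, trials=20000, seed=1):
    """Monte-Carlo estimate of P(BLER <= eps_target) over Rayleigh
    fading, where the instantaneous SNR is exponentially distributed."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        snr = rng.expovariate(1.0 / avg_snr)
        if bler_normal_approx(snr, n, k) <= eps_target:
            hits += 1
    return hits / trials
```

Raising the average SNR (or the blocklength) raises the confidence level, which is exactly the resource/reliability trade-off the paper's optimization problems formalize.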

Mission Reliability for URLLC in Wireless Networks

IEEE Communications Letters, 2018

Achieving connectivity reliability for Ultra-Reliable Low-Latency Communication (URLLC) is a major challenge for future wireless communications systems. The current understanding of reliability does not sufficiently cover time-related aspects, e.g., time-varying channels or the duration of a certain condition in a wireless system. Moreover, different applications require diverse execution periods during which no failure is allowed, which we denote as the mission duration. In this paper, we focus on time-related aspects, introduce the performance metric mission reliability for wireless communications systems, and exploit its connection to the mean time to first failure (MTTFF), which reflects the expected time until the first failure occurs. As diversity is accepted to be essential for reliability, we evaluate these metrics for a multi-connectivity system with Rayleigh fading, reflecting frequency diversity. We demonstrate the tradeoff between mission duration, mission reliability, and the number of links used for multi-connectivity, which can stimulate the discussion on URLLC and advance the design of future wireless systems.
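The mission-reliability tradeoff can be illustrated by simulation: a mission of several block-fading slots succeeds only if, in every slot, at least one of the independent Rayleigh links is above an SNR threshold (selection combining). All parameter values below are illustrative assumptions, not the paper's settings.

```python
import random

def mission_reliability(n_links, mission_slots, snr_threshold=1.0,
                        avg_snr=5.0, trials=5000, seed=7):
    """Monte-Carlo sketch: fraction of missions in which every slot has
    at least one of n_links i.i.d. Rayleigh links (exponential SNR)
    above `snr_threshold` -- i.e. no failure for the whole mission."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        survived = True
        for _ in range(mission_slots):
            if not any(rng.expovariate(1.0 / avg_snr) > snr_threshold
                       for _ in range(n_links)):
                survived = False
                break
        successes += survived
    return successes / trials
```

Longer missions lower the reliability while extra links raise it, which is the three-way tradeoff the letter quantifies analytically via the MTTFF.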

Interference Distribution Prediction for Link Adaptation in Ultra-Reliable Low-Latency Communications

2020 IEEE 91st Vehicular Technology Conference (VTC2020-Spring), 2020

The strict latency and reliability requirements of ultra-reliable low-latency communications (URLLC) use cases are among the main drivers in fifth generation (5G) network design. Link adaptation (LA) is considered to be one of the bottlenecks in realizing URLLC. In this paper, we focus on predicting the signal-to-interference-plus-noise ratio (SINR) at the user to enhance the LA. Motivated by the fact that the URLLC use cases with the most extreme latency and reliability requirements are characterized by semi-deterministic traffic, we propose to exploit the time correlation of the interference to compute the statistics needed to predict the interference power in the next transmission. This prediction is exploited in the LA context to maximize the spectral efficiency while guaranteeing reliability at an arbitrary level. Numerical results are compared with state-of-the-art interference prediction techniques for LA. We show that exploiting time correlation of the interference is an impor...
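A minimal way to exploit interference time correlation is to model the interference power as an AR(1) process: estimate the lag-1 correlation from past samples and shrink the one-step forecast toward the mean accordingly. This is a generic sketch of the idea, not the paper's estimator; the AR(1) assumption and all values are ours.

```python
import random

def fit_ar1(samples):
    """Estimate the mean and lag-1 autocorrelation coefficient of an
    assumed AR(1) interference-power process from observed samples."""
    mean = sum(samples) / len(samples)
    num = sum((samples[i] - mean) * (samples[i - 1] - mean)
              for i in range(1, len(samples)))
    den = sum((s - mean) ** 2 for s in samples)
    rho = num / den if den else 0.0
    return mean, rho

def predict_next(samples):
    """One-step prediction exploiting time correlation:
    x_hat = mean + rho * (x_last - mean)."""
    mean, rho = fit_ar1(samples)
    return mean + rho * (samples[-1] - mean)

# Demo: synthesize a correlated trace (true rho = 0.9) and recover
# the correlation from the data alone.
_rng = random.Random(0)
trace = [0.0]
for _ in range(2000):
    trace.append(0.9 * trace[-1] + _rng.gauss(0.0, 1.0))
mean_hat, rho_hat = fit_ar1(trace)
```

With rho near zero this degenerates to the conventional mean-based prediction, which is precisely what the paper argues is insufficient for URLLC.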

A Predictive Interference Management Algorithm for URLLC in Beyond 5G Networks

IEEE Communications Letters, 2021

Interference mitigation is a major design challenge in wireless systems, especially in the context of ultra-reliable low-latency communication (URLLC) services. Conventional average-based interference management schemes are not suitable for URLLC as they do not accurately capture the tail information of the interference distribution. This letter proposes a novel interference prediction algorithm that considers the entire interference distribution instead of only the mean. The key idea is to model the interference variation as a discrete-state-space, discrete-time Markov chain. The state transition probability matrix is then used to estimate the state evolution in time and to allocate radio resources accordingly. The proposed scheme is found to meet the target reliability requirements in a low-latency single-shot transmission system under realistic system assumptions, while requiring only ∼25% more resources than the optimum case with perfect interference knowledge.
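The Markov-chain machinery the letter describes can be sketched directly: estimate the transition matrix from an observed interference-state sequence, read off the predicted next-state distribution, and provision for the tail of that distribution rather than its mean. The tail target and allocation rule below are illustrative assumptions.

```python
def estimate_transition_matrix(states, n_states):
    """Count-based maximum-likelihood estimate of a discrete-time
    Markov chain's transition matrix from a quantized
    interference-state sequence."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    matrix = []
    for row in counts:
        total = sum(row)
        matrix.append([c / total if total else 1.0 / n_states
                       for c in row])
    return matrix

def next_state_distribution(matrix, current):
    """Predicted distribution over interference states one step ahead."""
    return matrix[current]

def conservative_state(matrix, current, tail=0.99999):
    """Illustrative allocation rule: choose the smallest interference
    state whose cumulative probability covers the `tail` reliability
    target, i.e. provision for the tail instead of the mean."""
    cum = 0.0
    for s, p in enumerate(matrix[current]):
        cum += p
        if cum >= tail:
            return s
    return len(matrix[current]) - 1
```

Provisioning for the covering state rather than the most likely one is what costs the extra resources relative to perfect interference knowledge.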

Channel stability prediction to optimize signaling overhead in 5G networks using machine learning

ICC 2020 - 2020 IEEE International Conference on Communications (ICC), 2020

Channel quality feedback is crucial for the operation of 4G and 5G radio networks, as it enables control of User Equipment (UE) connectivity, transmission scheduling, and the modulation and rate of the data transmitted over the wireless link. However, when such feedback is frequent and the number of UEs in a cell is large, the channel may be overloaded by signaling messages, resulting in lower throughput and data loss. Optimizing this signaling process thus represents a key challenge. In this paper, we focus on Channel Quality Indicator (CQI) reports that are periodically sent from a UE to the base station, and propose mechanisms to optimize the reporting process with the aim of reducing signaling overhead and avoiding the associated channel overloads, particularly when channel conditions are stable. To this end, we apply machine learning mechanisms to predict channel stability, which can be used to decide whether a UE's CQI needs to be reported, and in turn to control the rep...
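A crude stand-in for the paper's ML-based stability predictor makes the reporting policy concrete: call the channel stable when recent CQI values show low dispersion, and skip the periodic report in that case. The dispersion threshold is a hypothetical placeholder.

```python
def channel_is_stable(cqi_window, max_std=0.5):
    """Heuristic stand-in for an ML stability predictor: the channel is
    'stable' when the standard deviation of recent CQI values is small."""
    n = len(cqi_window)
    mean = sum(cqi_window) / n
    variance = sum((c - mean) ** 2 for c in cqi_window) / n
    return variance ** 0.5 <= max_std

def should_report(cqi_window):
    """Illustrative policy: suppress the periodic CQI report when the
    channel looks stable, cutting signaling overhead."""
    return not channel_is_stable(cqi_window)
```

The benefit of the learned predictor over this heuristic is that it can anticipate instability before the dispersion actually grows, rather than reacting to it.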

Continuous Time Markov Chain Based Reliability Analysis for Future Cellular Networks

2015 IEEE Global Communications Conference (GLOBECOM), 2015

It is anticipated that future cellular networks will consist of an ultra-dense deployment of complex heterogeneous Base Stations (BSs). Consequently, Self-Organizing Network (SON) features are considered inevitable for efficient and reliable management of such a complex network. Given their unfathomable complexity, cellular networks are inherently prone to partial or complete cell outages due to hardware and/or software failures and parameter misconfiguration caused by human error, multi-vendor incompatibility or operational drift. Forthcoming cellular networks, vis-à-vis 5G, are susceptible to even higher cell outage rates due to their higher parametric complexity and also due to potential conflicts among multiple SON functions. These realities pose a major challenge for reliable operation of future ultra-dense cellular networks in a cost-effective manner. In this paper, we present a stochastic analytical model to analyze the effects of the arrival of faults in a cellular network. We exploit a Continuous Time Markov Chain (CTMC) with exponentially distributed failure and recovery times to model the reliability behavior of a BS. We leverage the developed model and subsequent analysis to propose an adaptive fault-predictive framework. The proposed fault prediction framework can adapt the CTMC model by dynamically learning from a database of past failures, and hence can reduce network recovery time, thereby improving reliability. Numerical results from three case studies, representing different types of network, are evaluated to demonstrate the applicability of the proposed analytical model.
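In the simplest instance of such a CTMC, a base station alternates between an "up" state (exponential time to failure, rate λ) and a "down" state (exponential repair time, rate μ), giving the classic steady-state availability μ/(λ+μ). The sketch below, with illustrative rates, checks the closed form against a simulation of alternating up/down cycles; the paper's model is richer than this two-state chain.

```python
import random

def steady_state_availability(failure_rate, recovery_rate):
    """Two-state (up/down) CTMC with exponential failure and recovery
    times: long-run probability that the base station is operational."""
    return recovery_rate / (failure_rate + recovery_rate)

def simulate_availability(failure_rate, recovery_rate,
                          cycles=20000, seed=3):
    """Alternate exponential up-times and repair-times and measure the
    empirical fraction of time spent up."""
    rng = random.Random(seed)
    up_time = down_time = 0.0
    for _ in range(cycles):
        up_time += rng.expovariate(failure_rate)
        down_time += rng.expovariate(recovery_rate)
    return up_time / (up_time + down_time)
```

Reducing the recovery time (raising μ), which is what the adaptive fault-prediction framework targets, directly raises this availability.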

Ultra-Reliable Low Latency Cellular Networks: Use Cases, Challenges and Approaches

IEEE Communications Magazine, 2018

Fifth-generation cellular mobile networks are expected to support mission-critical ultra-reliable low-latency communication (URLLC) services in addition to enhanced mobile broadband applications. This article first introduces three emerging mission-critical applications of URLLC and identifies their requirements on end-to-end latency and reliability. We then investigate the various sources of end-to-end delay in current wireless networks, taking 4G Long Term Evolution (LTE) as an example. Subsequently, we propose and evaluate several techniques to reduce the end-to-end latency from the perspectives of error control coding, signal processing, and radio resource management. We also briefly discuss other network design approaches with the potential for further latency reduction.