The Complexity of Estimating Systematic Risk in Networks
Related papers
Estimating Systematic Risk in Real-World Networks
2014
Social, technical and business connections can all give rise to security risks. These risks can be substantial when individual compromises occur in combinations, and difficult to predict when some connections are not easily observed. A significant and relevant challenge is to predict these risks using only locally-derivable information. We illustrate by example that this challenge can be met if some general topological features of the connection network are known. By simulating an attack propagation on two large real-world networks, we identify structural regularities in the resulting loss distributions, from which we can relate various measures of a network's risks to its topology. While deriving these formulae requires knowing or approximating the connective structure of the network, applying them requires only locally-derivable information.
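For readers who want to experiment with this idea, the sketch below simulates a simple attack propagation and collects an empirical loss distribution. It assumes an independent-cascade-style spread with a uniform propagation probability `q` and unit loss per compromised node; these parameters, and the BA graph standing in for a real-world network, are illustrative assumptions rather than the authors' actual model.

```python
import random
import networkx as nx

def simulate_attack(G, q=0.1, rng=random):
    """Propagate a compromise from one random seed node: each compromised
    node independently compromises each neighbour with probability q.
    Returns the total number of compromised nodes (the 'loss')."""
    seed = rng.choice(list(G.nodes))
    compromised = {seed}
    frontier = [seed]
    while frontier:
        node = frontier.pop()
        for nbr in G.neighbors(node):
            if nbr not in compromised and rng.random() < q:
                compromised.add(nbr)
                frontier.append(nbr)
    return len(compromised)

# Stand-in for a large real-world network.
G = nx.barabasi_albert_graph(1000, 3)
losses = [simulate_attack(G) for _ in range(10_000)]
# Inspect the empirical loss distribution for the structural regularities
# (e.g. heavy tails) that the paper relates to topology.
```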
Hardness and Approximation of Network Vulnerability
Handbook of Combinatorial Optimization, 2013
Assessing network vulnerability is a central research topic for understanding network structures, and thus provides an efficient way to protect them from attacks and other disruptive events. Existing vulnerability assessments mainly focus on investigating inhomogeneous properties of graph elements, such as node degree; however, these measures and the corresponding heuristic solutions cannot provide an accurate evaluation over general network topologies.
Network Topology Vulnerability/Cost Tradeoff: Model, Application, and Computational Complexity
Internet Mathematics, 2015
Technological networks (e.g. telephone and sensor networks, Internet) have provided modern society with increased efficiency, but have also exposed us to the risks posed by their vulnerability to attacks. Mitigating these risks involves designing robust network topologies in situations where resources are economically constrained. In this paper, we consider the vulnerability of network topologies from an economic viewpoint and propose security metrics, which are necessary for assessing the efficiency of our solutions. We define the vulnerability of a network as the potential loss in connectivity due to the actions of a strategic adversary. To derive vulnerability metrics, we revisit our recently introduced network blocking game models, which provide a framework for quantifying network topology vulnerability in adversarial environments. We assume that the network operator takes both security and economic goals into consideration. To model these goals, we generalize previous models by introducing usage costs and budget constraints for the operator. We study two natural constraint formulations, the maximum and the expected cost constraints, and derive the feasible vulnerability/cost region. Since the proposed metrics are based on game-theoretic models, computing them can be challenging. To elucidate these challenges, we provide a thorough complexity analysis for solving the proposed games.
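The full blocking-game machinery is involved, but the core definition, vulnerability as potential loss in connectivity under a strategic adversary, can be illustrated in its simplest degenerate form: an adversary who removes the single most damaging edge. The sketch below measures damage as the drop in the number of connected node pairs; the game-theoretic model with mixed strategies, usage costs, and budget constraints is much richer than this.

```python
import networkx as nx

def pairwise_connectivity(G):
    # Number of unordered node pairs that remain connected.
    return sum(len(c) * (len(c) - 1) // 2 for c in nx.connected_components(G))

def worst_single_edge_loss(G):
    """Connectivity loss caused by the adversary's best single-edge attack."""
    base = pairwise_connectivity(G)
    losses = {}
    for e in G.edges:
        H = G.copy()
        H.remove_edge(*e)
        losses[e] = base - pairwise_connectivity(H)
    return max(losses.items(), key=lambda kv: kv[1])

G = nx.karate_club_graph()
edge, loss = worst_single_edge_loss(G)
print(f"most damaging edge {edge} disconnects {loss} node pairs")
```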
Assessing attack vulnerability in networks with uncertainty
2015 IEEE Conference on Computer Communications (INFOCOM), 2015
A considerable amount of research effort has focused on developing metrics and approaches to assess network vulnerability. However, most of them neglect network uncertainty, which arises for various reasons such as the mobility and dynamics of the network, or noise introduced in the data collection process. To this end, we introduce a framework to assess the vulnerability of networks under uncertainty, modeling such networks as probabilistic graphs. We adopt expected pairwise connectivity (EPC) as a measure to quantify global connectivity and use it to formulate vulnerability assessment as a stochastic optimization problem. The objective is to identify a small number of critical nodes whose removal minimizes the EPC of the residual network. While solutions for stochastic optimization problems are often limited to small networks, we present a practical solution that works for larger networks. The key advantages of our solution include 1) a weighted averaging technique that avoids considering all, exponentially many, possible realizations of the probabilistic graph, and 2) a Fully Polynomial Time Randomized Approximation Scheme (FPRAS) to efficiently estimate the EPC with any desired accuracy. Extensive experiments demonstrate that our solution significantly outperforms other heuristic approaches.
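As a baseline for intuition, EPC on a probabilistic graph can be estimated with naive Monte Carlo sampling, as sketched below; the paper's weighted averaging and FPRAS are precisely the refinements that make such estimation efficient and accurate at scale. The uniform edge probability of 0.8 is an illustrative assumption.

```python
import random
import networkx as nx

def sample_realization(G, rng=random):
    """Keep each edge independently with its existence probability."""
    H = nx.Graph()
    H.add_nodes_from(G.nodes)
    H.add_edges_from(e for e in G.edges
                     if rng.random() < G.edges[e].get("prob", 1.0))
    return H

def pairwise_connectivity(G):
    # Number of unordered node pairs that remain connected.
    return sum(len(c) * (len(c) - 1) // 2 for c in nx.connected_components(G))

def estimate_epc(G, samples=2000):
    """Naive Monte Carlo estimate of expected pairwise connectivity."""
    return sum(pairwise_connectivity(sample_realization(G))
               for _ in range(samples)) / samples

G = nx.erdos_renyi_graph(100, 0.05)
nx.set_edge_attributes(G, 0.8, "prob")   # assumed uniform edge reliability
print("estimated EPC:", estimate_epc(G))
```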
Vulnerability of complex networks
Communications in Nonlinear Science and Numerical Simulation, 2011
We consider the normalized average edge betweenness of a network as a metric of network vulnerability. We suggest that the normalized average edge betweenness, together with its relative difference when a certain number of nodes and/or edges are removed from the network, is a measure of network vulnerability, called the vulnerability index. The vulnerability index is calculated for four synthetic networks: Erdős-Rényi (ER) random networks, the Barabási-Albert (BA) model of scale-free networks, the Watts-Strogatz (WS) model of small-world networks, and geometric random networks. Real-world networks for which the vulnerability index is calculated include two human brain networks, three urban networks, one collaboration network, and two power grid networks. We find that the WS model of small-world networks and the biological networks (human brain networks) are the most robust networks among all networks studied in the paper. A central issue in the analysis of complex networks is the assessment of their robustness and vulnerability. Different approaches to address network robustness and vulnerability have recently been proposed by the research community. The first approach is related to structural robustness: how different classes of network topologies are affected by the removal of a finite number of links and/or nodes. It was concluded that the more heterogeneous a network is in terms of, e.g., degree distribution, the more robust it is to random failures, while, at the same time, it appears more vulnerable to deliberate attacks on highly connected nodes. The second approach concerns dynamical robustness. For networks supporting the flow of a physical quantity, the removal of a node or link will cause the flow to redistribute, with the risk that some other nodes or links may be overloaded and failure-prone. Hence, a triggering event can cause a whole sequence of failures due to overload, and may even threaten the global stability of the network. Such behavior is termed cascading failure.
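One plausible reading of the metric is sketched below: compute the average normalized edge betweenness before and after removing a handful of nodes and report the relative change. The exact normalization and removal protocol in the paper may differ, so treat this as an assumed reconstruction; removing the five highest-degree nodes is an arbitrary illustrative choice.

```python
import networkx as nx

def avg_edge_betweenness(G):
    b = nx.edge_betweenness_centrality(G, normalized=True)
    return sum(b.values()) / G.number_of_edges()

def vulnerability_index(G, removed_nodes):
    """Relative change in average edge betweenness after node removal."""
    before = avg_edge_betweenness(G)
    H = G.copy()
    H.remove_nodes_from(removed_nodes)
    return (avg_edge_betweenness(H) - before) / before

G = nx.watts_strogatz_graph(200, 6, 0.1)            # WS small-world network
top5 = [n for n, _ in sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:5]]
print("vulnerability index:", vulnerability_index(G, top5))
```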
Rethinking failure and attack tolerance assessment in complex networks
Physica A: Statistical Mechanics and its …, 2011
Studies have revealed that real complex networks are inherently vulnerable to the loss of high-centrality nodes. These nodes are crucial to maintaining network connectivity and are identified by classical measures such as degree and betweenness centralities. Despite its significance, an assessment based solely on this vulnerability premise can be misleading about the real state of the network's connectivity. In fact, some networks may be on the verge of fragmentation before that condition is revealed by an analysis focused solely on centrally positioned nodes. This work aims to show that it is essentially the global network configuration that is responsible for network fragmentation, as it may allow many lower-centrality nodes to seriously damage network connectivity.
Multiscale vulnerability of complex networks
Chaos, 2007
We present a novel approach to quantify the vulnerability of a complex network, i.e., the capacity of a graph to maintain its functional performance under random damages or malicious attacks. The proposed measure represents a multiscale evaluation of vulnerability, and makes use of combined powers of the links' betweenness. We show that the proposed approach is able to properly describe some cases for which earlier measures of vulnerability fail. The relevant applications of our method for technological network design are outlined.
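Following the abstract's description, a multiscale vulnerability of order p can be built from powers of the edge betweenness, roughly as below; the precise normalization in the paper may differ, so this is a hedged reconstruction. Varying p shifts the weight between typical links (small p) and the most heavily loaded links (large p).

```python
import networkx as nx

def multiscale_vulnerability(G, p=2):
    """p-th power mean of the (normalized) edge betweenness values."""
    b = nx.edge_betweenness_centrality(G, normalized=True)
    m = G.number_of_edges()
    return (sum(v ** p for v in b.values()) / m) ** (1.0 / p)

G = nx.barabasi_albert_graph(300, 2)
for p in (1, 2, 4):
    print(f"order {p}: {multiscale_vulnerability(G, p):.5f}")
```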
Quantifying topological robustness of networks under sustained targeted attacks
In this paper, we introduce a measure to analyse the structural robustness of complex networks, which is specifically applicable in scenarios of targeted, sustained attacks. The measure is based on the changing size of the largest component as the network goes through disintegration. We argue that the measure can be used to quantify and compare the effectiveness of various attack strategies. Applying this measure, we confirm the result that scale-free networks are comparatively less vulnerable to random attacks and more vulnerable to targeted attacks. We then analyse the robustness of a range of real-world networks, and show that most real-world networks are least robust to attacks based on the betweenness of nodes. We also show that the robustness values of some networks are more sensitive to the attack strategy than those of others. Furthermore, robustness coefficients computed using two different centrality measures may be similar even when the computational complexities of calculating those measures differ. Given this disparity, the robustness coefficient introduced here can play a key role in choosing attack and defence strategies for real-world networks. While the measure is applicable to all types of complex networks, we clearly demonstrate its relevance to social network analysis.
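A hedged sketch of the underlying recipe: remove nodes one at a time by recomputed betweenness (a sustained targeted attack) and average the giant-component fraction over the removal sequence. The paper's robustness coefficient may be normalized differently, so this is only indicative of the approach.

```python
import networkx as nx

def robustness_under_attack(G):
    """Average largest-component fraction over a betweenness-targeted attack."""
    H = G.copy()
    n = G.number_of_nodes()
    sizes = []
    for _ in range(n - 1):
        # Recompute betweenness at every step: the attack is "sustained".
        target = max(nx.betweenness_centrality(H).items(),
                     key=lambda kv: kv[1])[0]
        H.remove_node(target)
        sizes.append(max(len(c) for c in nx.connected_components(H)) / n)
    return sum(sizes) / n

G = nx.erdos_renyi_graph(100, 0.05)
print("robustness coefficient (sketch):", robustness_under_attack(G))
```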
Measuring topological robustness of networks under sustained targeted attacks
In this paper, we introduce a measure to analyse the structural robustness of complex networks, which is specifically applicable in scenarios of targeted, sustained attacks. The measure is based on the changing size of the largest component as the network goes through disintegration. We argue that the measure can be used to quantify and compare the effectiveness of various attack strategies. Applying this measure, we confirm the result that scale-free networks are comparatively less vulnerable to random attacks and more vulnerable to targeted attacks. We then analyse the robustness of a range of real-world networks, and show that most real-world networks are least robust to attacks based on the betweenness of nodes. We also show that the robustness of some networks is more sensitive to the attack strategy than that of others; given the disparity in the computational complexities of calculating various centrality measures, the robustness coefficient introduced here can play a key role in choosing attack and defence strategies for real-world networks. While the measure is applicable to all types of complex networks, we clearly demonstrate its relevance to social network analysis.
A note on measurement of network vulnerability under random and intentional attacks
2010
In this paper we propose an alternative approach for the assessment of network vulnerability under random and intentional attacks, as compared to the results obtained from the "vulnerability function" given by Criado et al. [Int. J. Comput. Math., 86 (2) (2009), pp. 209-218]. Using spectral and statistical measurements, we assess robustness, as the antonym of vulnerability, of complex networks and suggest a tentative vulnerability ranking based on the interpretation of quantified network characteristics. We conclude that the vulnerability function, derived only from the network's degree distribution and its variations, is not general enough to reflect the lack of robustness due to specific configurations in graphs with hierarchical or centralized structures. The spectral and statistical metrics, on the other hand, capture different aspects of network topology, providing a more thorough assessment of network vulnerability.
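The note does not enumerate its metrics here, but representative spectral measurements of the kind it advocates include the algebraic connectivity (second-smallest Laplacian eigenvalue) and the adjacency spectral gap, both computable as sketched below; treat the specific choice of metrics as an assumption.

```python
import networkx as nx
import numpy as np

def spectral_metrics(G):
    lap = np.sort(nx.laplacian_spectrum(G))           # needs scipy installed
    adj = np.sort(np.real(nx.adjacency_spectrum(G)))[::-1]
    return {"algebraic_connectivity": float(lap[1]),  # zero when G is disconnected
            "spectral_gap": float(adj[0] - adj[1])}

for name, G in {"ER": nx.erdos_renyi_graph(200, 0.05),
                "BA": nx.barabasi_albert_graph(200, 3)}.items():
    print(name, spectral_metrics(G))
```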