Behnam Rezaei | University of California, Los Angeles
Papers by Behnam Rezaei
Susceptibility of scale-free Power Law (PL) networks to attacks has traditionally been studied in the context of what may be termed instantaneous attacks, where a randomly selected set of nodes and edges is deleted while the network is kept static. In this paper, we shift the focus to the study of progressive and instantaneous attacks on reactive grown and random PL networks, which can respond to attacks and take remedial steps. In the process, we present several techniques that managed networks can adopt to minimize the damage during attacks, and also to recover efficiently from the aftermath of successful attacks. For example, we present (i) compensatory dynamics that minimize the damage inflicted by targeted progressive attacks, such as linear-preferential deletions of nodes in grown PL networks; the resulting dynamic naturally leads to the emergence of networks with PL degree distributions with exponential cutoffs; (ii) distributed healing algorithms that can scale the maximum degree of nodes in a PL network using only local decisions; and (iii) efficient means of creating giant connected components in a PL network that has been fragmented by attacks on a large number of high-degree nodes. Such targeted attacks are considered a major vulnerability of PL networks; however, our results show that the introduction of only a small number of random edges, through a reverse percolation process, can restore connectivity, which in turn allows restoration of other topological properties of the original network. Thus, the scale-free nature of the networks can itself be effectively utilized for protection and recovery purposes.
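As a toy illustration of the reverse-percolation idea (a sketch under assumptions, not the paper's actual procedure: the network model, sizes, and edge counts below are all illustrative), the following grows a preferential-attachment network, removes its highest-degree hubs to mimic a targeted attack, and then adds a small number of random edges among the survivors. Since adding edges can only merge components, the largest connected component can only grow.

```python
import random

def preferential_attachment_graph(n, m, rng):
    # Grow a network by preferential attachment: each new node links
    # to m existing nodes chosen roughly proportionally to degree.
    adj = {i: set() for i in range(n)}
    weighted = []  # node ids repeated once per incident edge
    for v in range(m, n):
        chosen = set()
        while len(chosen) < m:
            pool = weighted if weighted else list(range(m))
            chosen.add(rng.choice(pool))
        for u in chosen:
            adj[v].add(u)
            adj[u].add(v)
            weighted.extend([u, v])
    return adj

def largest_component(adj):
    # Size of the largest connected component, via iterative DFS.
    seen, best = set(), 0
    for s in adj:
        if s in seen:
            continue
        stack, size = [s], 0
        seen.add(s)
        while stack:
            u = stack.pop()
            size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        best = max(best, size)
    return best

rng = random.Random(1)
g = preferential_attachment_graph(500, 2, rng)

# Targeted attack: delete the 50 highest-degree nodes (the hubs).
hubs = set(sorted(g, key=lambda v: len(g[v]), reverse=True)[:50])
for h in hubs:
    del g[h]
for v in g:
    g[v] -= hubs
before = largest_component(g)

# "Reverse percolation": add a small number of random edges among the
# survivors; edge additions can only merge fragments, never split them.
survivors = list(g)
for _ in range(100):
    a, b = rng.sample(survivors, 2)
    g[a].add(b)
    g[b].add(a)
after = largest_component(g)
```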
The promise of discovering a functional blueprint of a cellular system from large-scale and high-throughput sequence and experimental data is predicated on the belief that the same top-down investigative approach that proved successful in other biological problems (e.g. DNA sequencing) will be as effective when it comes to inferring more complex intracellular processes. The results in this paper address this fundamental issue in the specific context of transcription regulatory networks. Although simple recurring regulatory motifs have been identified in the past, due to the size and complexity of the connectivity structure, the subdivision of such networks into larger, and possibly inter-connected, regulatory modules is still under investigation. Specifically, it is unclear whether functionally well-characterized transcriptional sub-networks can be identified by solely analyzing the connectivity structure of the overall network topology. In this paper, we show that transcriptional regulatory networks can be systematically partitioned into communities whose members are consistently functionally related. We applied the partitioning method to the transcriptional regulatory networks of the yeast Saccharomyces cerevisiae; the resulting communities of genes and transcriptional regulators can be associated with distinct functional units, such as amino acid metabolism, cell cycle regulation, protein biosynthesis and localization, DNA replication and maintenance, lipid catabolism, and stress response. Moreover, the observation of inter-community connectivity patterns provides a valuable tool for elucidating the inter-dependency between the discovered regulatory modules.
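The abstract does not spell out the partitioning algorithm, so the sketch below uses simple label propagation on a hypothetical two-module toy graph merely to illustrate what "partitioning a network into communities" means; the graph, sizes, and tie-breaking rule are all illustrative, not the paper's method.

```python
def label_propagation(adj, rounds=20):
    # Every node starts in its own community; each node then repeatedly
    # adopts the most frequent label among its neighbours (ties broken
    # toward the larger label) until no label changes.
    labels = {v: v for v in adj}
    for _ in range(rounds):
        changed = False
        for v in sorted(adj):
            if not adj[v]:
                continue
            counts = {}
            for u in adj[v]:
                counts[labels[u]] = counts.get(labels[u], 0) + 1
            best = max(sorted(counts), key=lambda l: (counts[l], l))
            if labels[v] != best:
                labels[v] = best
                changed = True
        if not changed:
            break
    return labels

def clique(nodes):
    # Fully connected subgraph over the given node ids.
    return {v: set(nodes) - {v} for v in nodes}

# Toy "regulatory network": two 5-cliques joined by one bridge edge.
adj = {**clique(range(5)), **clique(range(5, 10))}
adj[4].add(5)
adj[5].add(4)
communities = label_propagation(adj)
```

On this toy graph the procedure recovers the two planted modules: every node in one clique ends up with the same label, and the two cliques receive different labels.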
One natural constraint in the sponsored search advertising framework arises from the fact that there is a limit on the number of available slots, especially for popular keywords, and as a result, a significant pool of advertisers is left out. We study the emergence of diversification in the adword market triggered by such capacity constraints, in the sense that new market mechanisms, as well as new for-profit agents, are likely to emerge to combat, or to profit from, the opportunities created by shortages in ad-space inventory. We propose a model where the additional capacity is provided by for-profit agents (or mediators), who compete for slots in the original auction, draw traffic, and run their own sub-auctions. The quality of the additional capacity provided by a mediator is measured by its fitness factor. We compute revenues and payoffs for all the different parties at a symmetric Nash equilibrium (SNE) when the mediator-based model is operated by a mechanism currently used by Google and Yahoo!, and then compare these numbers with those obtained at a corresponding SNE for the same mechanism, but without any mediators involved in the auctions. Such calculations allow us to determine the value of the additional capacity. Our results show that the revenue of the auctioneer, as well as the social value (i.e. efficiency), always increase when mediators are involved; moreover, even the payoffs of all the bidders will increase if the mediator has a high enough fitness. Thus, our analysis indicates that there are significant opportunities for diversification in the internet economy, and we should expect it to continue to develop richer structure, with room for different types of agents and mechanisms to coexist.
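The mechanism "currently used by Google and Yahoo!" is the Generalized Second Price (GSP) auction. The minimal sketch below computes GSP payments, auctioneer revenue, and bidder payoffs for one illustrative bid profile; treating bids as valuations in the payoff line is a simplification (at an SNE, bids generally differ from valuations), and all numbers are made up.

```python
def gsp_outcome(bids, ctrs):
    # Generalized Second Price auction: bidders are ranked by bid,
    # slot i goes to the i-th highest bidder, and each winner pays
    # the next-highest bid per click. ctrs are the slots'
    # click-through rates, in decreasing order.
    ranked = sorted(bids, reverse=True)
    winners = ranked[:len(ctrs)]
    payments = ranked[1:len(ctrs) + 1]  # next-highest bid per click
    revenue = sum(p * c for p, c in zip(payments, ctrs))
    # Payoff per winner, treating the bid as the bidder's value.
    payoffs = [(b - p) * c for b, p, c in zip(winners, payments, ctrs)]
    return revenue, payoffs

# Three slots with decreasing click-through rates, five bidders:
revenue, payoffs = gsp_outcome([10, 8, 5, 3, 1], ctrs=[0.3, 0.2, 0.1])
```

With these illustrative numbers the auctioneer's revenue is 8·0.3 + 5·0.2 + 3·0.1 = 3.7, and the two excess bidders (bids 3 and 1 beyond the slot for bid 5's payment) are exactly the "left out" pool that the mediator model targets.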
A mediator is a well-known construct in game theory: an entity that plays on behalf of some of the agents who choose to use its services, while the rest of the agents participate in the game directly. We initiate a game theoretic study of sponsored search auctions, such as those used by Google and Yahoo!, involving incentive-driven mediators. We refer to such mediators as for-profit mediators, so as to distinguish them from mediators introduced in prior work, who have no monetary incentives and are driven by the altruistic goal of implementing certain desired outcomes. We show that in our model, (i) players/advertisers can improve their payoffs by choosing to use the services of the mediator, compared to directly participating in the auction; (ii) the mediator can obtain monetary benefit by managing the advertising burden of its group of advertisers; and (iii) the payoffs of the mediator and the advertisers it plays for are compatible with the incentive constraints from the advertisers who do not use its services. A simple intuition behind the above result comes from the observation that the mediator has more information about, and more control over, the bid profile than any individual advertiser, allowing her to reduce the payments made to the auctioneer while still maintaining the incentive constraints. Further, our results indicate that there are significant opportunities for diversification in the internet economy, and we should expect it to continue to develop richer structure, with room for different types of agents to coexist.
We investigate market forces that would lead to the emergence of new classes of players in the sponsored search market. We report a 3-fold diversification triggered by two inherent features of the sponsored search market, namely, capacity constraints and the collusion-vulnerability of current mechanisms. In the first scenario, we present a comparative study of two models motivated by capacity constraints: one where the additional capacity is provided by for-profit agents, who compete for slots in the original auction, draw traffic, and run their own sub-auctions, and the other, where the additional capacity is provided by the auctioneer herself, by essentially acting as a mediator and running a single combined auction. This study was initiated by us in [SRGR07], where the mediator-based model was studied. In the present work, we study the auctioneer-based model and show that this model seems inferior to the mediator-based model in terms of the revenue or efficiency guarantee due to added capacity. In the second scenario, we initiate a game theoretic study of current sponsored search auctions, involving incentive-driven mediators who exploit the fact that these mechanisms are not collusion-resistant. In particular, we show that advertisers can improve their payoffs by using the services of the mediator compared to directly participating in the auction, and that the mediator can also obtain monetary benefit, without violating incentive constraints from the advertisers who do not use its services. We also point out that the auctioneer cannot do very much via mechanism design to avoid such for-profit mediation without losing badly in terms of revenue, and therefore, the mediators are likely to prevail.
Almost all of us have multiple cyberspace identities, and these cyber-alter egos are networked together to form a vast cyberspace social network. This network is distinct from the world-wide-web (WWW), which is queried and mined to the tune of billions of dollars every day, and until recently it has gone largely unexplored. Empirically, cyberspace social networks have been found to possess many of the same complex features that characterize their real counterparts, including scale-free degree distributions, low diameter, and extensive connectivity. We show that these topological features make the latent networks particularly suitable for exploration and management via local-only messaging protocols. Cyber-alter egos can communicate via their direct links (i.e., using only their own address books) and set up a highly decentralized and scalable message-passing network that can allow large-scale sharing of information and data. As one particular example of such collaborative systems, we provide a design of a spam-filtering system, and our large-scale simulations show that the system achieves a spam detection rate close to 100%, while the false positive rate is kept around zero. This system has several advantages over other recent proposals: (i) it uses an already existing network, created by the same social dynamics that govern our daily lives, and no dedicated peer-to-peer (P2P) systems or centralized server-based systems need be constructed; (ii) it utilizes a percolation search algorithm that makes the query-generated traffic scalable; (iii) the network has a built-in trust system (just as in social networks) that can be used to thwart malicious attacks; and (iv) it can be implemented right now as a plugin to popular email programs, such as MS Outlook, Eudora, and Sendmail.
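A minimal sketch of the digest-sharing idea, assuming a hypothetical four-user address-book graph and exact SHA-256 digests (the paper's system uses percolation search over a large social network and fuzzy spam digests; this is only a bounded-hop caricature):

```python
import hashlib

def digest(msg):
    # Fingerprint a message body. Real spam filters use fuzzy digests
    # so that small mutations of a spam map to the same key; here we
    # just normalise case/whitespace and hash.
    return hashlib.sha256(msg.strip().lower().encode()).hexdigest()

# Hypothetical address-book graph: who can message whom directly.
address_book = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["alice"],
    "dave": ["bob"],
}
# Each cyber-alter ego keeps digests of spam it has already seen.
known_spam = {
    "alice": set(),
    "bob": {digest("WIN A FREE PRIZE NOW")},
    "carol": set(),
    "dave": set(),
}

def is_spam(user, msg, ttl=2):
    # Check the local digest cache, then friends, then
    # friends-of-friends (up to ttl hops), using only local
    # address-book links -- no central server.
    d = digest(msg)
    frontier, seen = [user], {user}
    for _ in range(ttl + 1):
        if any(d in known_spam[u] for u in frontier):
            return True
        nxt = [v for u in frontier for v in address_book[u]
               if v not in seen]
        seen.update(nxt)
        frontier = nxt
    return False
```

For example, carol has never seen the prize spam, but her friend-of-friend bob has, so her query succeeds within two hops, while a legitimate message matches no stored digest.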
We introduce BruNet, a general P2P software framework which we use to produce the first implementation of Symphony, a 1-D Kleinberg small-world architecture. Our framework is designed to easily implement and measure different P2P protocols over different transport layers such as TCP or UDP. This paper discusses our implementation of the Symphony network, which allows each node to keep k ≤ log N shortcut connections and to route to any other node with a short average delay of O((1/k) log² N). This provides a continuous trade-off between node degree and routing latency. We present experimental results taken from several PlanetLab deployments of size up to 1060 nodes. These successful deployments represent some of the largest PlanetLab deployments of P2P overlays found in the literature, and show our implementation's robustness to massive node dynamics in a WAN environment.
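A small simulation sketch of the Symphony design described above, with illustrative parameters: each node on a ring of N = 1024 nodes keeps its successor plus k = 4 long-range shortcuts drawn from a 1/d harmonic distribution, and greedy routing forwards to the neighbour closest to the destination. This is not BruNet's actual implementation, only the routing idea.

```python
import math
import random

def harmonic_shortcut(n, rng):
    # Draw a ring distance d in [1, n) with probability ~ 1/d
    # (Kleinberg/Symphony-style), via inverse transform sampling.
    x = rng.random() * math.log(n)
    return max(1, min(n - 1, int(math.exp(x))))

def build_symphony(n, k, rng):
    # Every node keeps its ring successor plus k long-range shortcuts.
    links = {}
    for v in range(n):
        nbrs = {(v + 1) % n}
        while len(nbrs) < k + 1:
            nbrs.add((v + harmonic_shortcut(n, rng)) % n)
        links[v] = nbrs
    return links

def greedy_route(links, src, dst, n):
    # Forward to the neighbour that minimises the remaining clockwise
    # distance; the successor link guarantees strict progress, so the
    # walk always terminates.
    hops, cur = 0, src
    while cur != dst:
        cur = min(links[cur], key=lambda u: (dst - u) % n)
        hops += 1
    return hops

rng = random.Random(7)
n, k = 1024, 4
links = build_symphony(n, k, rng)
avg = sum(greedy_route(links, rng.randrange(n), rng.randrange(n), n)
          for _ in range(200)) / 200
```

On this ring, a plain successor-only overlay would need about n/2 = 512 hops on average, while the shortcut-equipped overlay routes in a few dozen hops at most, consistent with the O((1/k) log² N) scaling.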
Computing Research Repository, 2007
We introduce BruNet, a general P2P software framework which we use to produce the first implementation of Symphony, a 1-D Kleinberg small-world architecture. Our framework is designed to easily implement and measure different P2P protocols over different transport layers such as TCP or UDP. This paper discusses our implementation of the Symphony network, which allows each node to keep k ≤ log N shortcut connections and to route to any other node with a short average delay of O((1/k) log² N). We present experimental results taken from several PlanetLab deployments of size up to 1060 nodes. These successful deployments represent some of the largest PlanetLab deployments of P2P overlays found in the literature, and show our implementation's robustness to massive node dynamics in a WAN environment.
Computing Research Repository, 2005
Susceptibility of scale-free Power Law (PL) networks to attacks has traditionally been studied in the context of what may be termed instantaneous attacks, where a randomly selected set of nodes and edges is deleted while the network is kept static. In this paper, we shift the focus to the study of progressive and instantaneous attacks on reactive grown and random PL networks, which can respond to attacks and take remedial steps. In the process, we present several techniques that managed networks can adopt to minimize the damage during attacks, and also to recover efficiently from the aftermath of successful attacks. For example, we present (i) compensatory dynamics that minimize the damage inflicted by targeted progressive attacks, such as linear-preferential deletions of nodes in grown PL networks; the resulting dynamic naturally leads to the emergence of networks with PL degree distributions with exponential cutoffs; (ii) distributed healing algorithms that can scale the maximum degree of nodes in a PL network using only local decisions; and (iii) efficient means of creating giant connected components in a PL network that has been fragmented by attacks on a large number of high-degree nodes. Such targeted attacks are considered a major vulnerability of PL networks; however, our results show that the introduction of only a small number of random edges, through a reverse percolation process, can restore connectivity, which in turn allows restoration of other topological properties of the original network. Thus, the scale-free nature of the networks can itself be effectively utilized for protection and recovery purposes.
With the economy becoming more global every year, the complex networks that interconnect distributed centers of communication, finance, and manufacturing will only continue to grow rapidly. In such networks, attacks are a fact of life, and simple attacks, such as Denial-of-Service (DoS), can cripple hubs and other nodes, leading to severe disruptions of service. Understanding the effect of attacks, and the mechanisms to respond to them, is thus of great practical importance to many network-based systems.
Conference on Email and Anti-Spam, 2005
We introduce a collaborative anti-spam system that is based on pervasive global social email networks. Essentially, we provide a solution to this open research problem: given a network of N users who are willing to share information collaboratively (e.g. the digests or fingerprints of known spams), how do we search for each user's content efficiently and reliably in a distributed manner with minimal traffic cost on the network? As a solution to this open problem, our proposed system employs the percolation search process, which makes the traffic generated due to queries for spam digests scale sublinearly as a function of N. However, in order to reap the benefits of this novel percolation search algorithm, the node degree distribution of the underlying network must be heavy-tailed. Interestingly, latent global social email networks comprising personal contacts possess a power-law heavy-tailed degree distribution, which renders itself an ideal natural platform to employ the percolation search algorithm. As a result, our proposed distributed spam filter requires no dedicated peer-to-peer (P2P) systems or centralized server-based systems. We have performed large-scale simulations and we find that the system achieves a spam detection rate close to 100%, while the false positive rate is kept around zero. The bandwidth cost per user as well as the system-wide bandwidth cost are shown to be very low.
Management Science Letters, 2013
Lecture Notes in Computer Science, 2007
One natural constraint in the sponsored search advertising framework arises from the fact that there is a limit on the number of available slots, especially for popular keywords, and as a result, a significant pool of advertisers is left out. We study the emergence of diversification in the adword market triggered by such capacity constraints, in the sense that new market mechanisms, as well as new for-profit agents, are likely to emerge to combat, or to profit from, the opportunities created by shortages in ad-space inventory. We propose a model where the additional capacity is provided by for-profit agents (or mediators), who compete for slots in the original auction, draw traffic, and run their own sub-auctions. The quality of the additional capacity provided by a mediator is measured by its fitness factor. We compute revenues and payoffs for all the different parties at a symmetric Nash equilibrium (SNE) when the mediator-based model is operated by a mechanism currently used by Google and Yahoo!, and then compare these numbers with those obtained at a corresponding SNE for the same mechanism, but without any mediators involved in the auctions. Such calculations allow us to determine the value of the additional capacity. Our results show that the revenue of the auctioneer, as well as the social value (i.e. efficiency), always increase when mediators are involved; moreover, even the payoffs of all the bidders will increase if the mediator has a high enough fitness. Thus, our analysis indicates that there are significant opportunities for diversification in the internet economy, and we should expect it to continue to develop richer structure, with room for different types of agents and mechanisms to coexist.
ACM SIGecom Exchanges, 2008
We investigate market forces that would lead to the emergence of new classes of players in the sponsored search market. We report a 3-fold diversification triggered by two inherent features of the sponsored search market, namely, capacity constraints and the collusion-vulnerability of current mechanisms. In the first scenario, we present a comparative study of two models motivated by capacity constraints: one where the additional capacity is provided by for-profit agents (or mediators), who compete for slots in the original auction, draw traffic, and run their own sub-auctions, and the other, where the additional capacity is provided by the auctioneer herself, by essentially acting as a mediator and running a single combined auction. This study was initiated by us in [SRGR07], where the mediator-based model was studied. In the present work, we study the auctioneer-based model and show that this single combined-auction model seems inferior to the mediator-based model in terms of the revenue or efficiency guarantee due to added capacity. In the second scenario, we initiate a game theoretic study of current sponsored search auctions, involving incentive-driven mediators who exploit the fact that these mechanisms are not collusion-resistant. In particular, we show that advertisers can improve their payoffs by using the services of the mediator compared to directly participating in the auction, and that the mediator can also obtain monetary benefit by managing the advertising burden of its advertisers, without violating incentive constraints from the advertisers who do not use its services. We also point out that the auctioneer cannot do very much via mechanism design to avoid such for-profit mediation without losing badly in terms of revenue, and therefore, the mediators are likely to prevail.
Thus, our analysis indicates that there are significant opportunities for diversification in the internet economy and we should expect it to continue to develop richer structure, with room for different types of agents and mechanisms to coexist.
Management Science Letters, 2013
This paper is concerned with the scaling of the number of hops in a large-scale wireless ad-hoc network (WANET), a quantity we call network latency. A large network latency affects all aspects of data communication in a WANET, including an increase in delay, packet loss, and required processing power and memory. We consider network management and data routing challenges in WANETs with scalable network latency. On the physical side, reducing network latency imposes a significantly higher power and bandwidth demand on nodes, as is reflected in a set of new bounds. On the protocol front, designing distributed routing protocols that can guarantee the delivery of data packets within a scalable number of hops is a challenging task. To solve this, we introduce the multi-resolution randomized hierarchy (MRRH), a novel power- and bandwidth-efficient WANET protocol with scalable network latency. MRRH uses a randomized algorithm for building and maintaining a random hierarchical network topology, which, together with the proposed routing algorithm, can guarantee efficient delivery of data packets in the wireless network. For a network of size N, MRRH can provide an average latency of only O(log³ N). The power and bandwidth consumption of MRRH are shown to be nearly optimal for the latency it provides. Therefore, MRRH is a provably efficient candidate for truly large-scale wireless ad-hoc networking.
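MRRH itself is randomized and is not reproduced here; the deterministic sketch below only illustrates why a hierarchy yields polylogarithmic hop counts, using a complete binary hierarchy (heap-style numbering: node i has parent (i-1)//2) in which routing climbs to the lowest common ancestor, so the hop count is bounded by twice the depth, i.e. O(log N).

```python
def route_hops(a, b):
    # Route between two nodes of a complete binary hierarchy by
    # repeatedly moving the deeper endpoint (the larger heap index)
    # one level up until both meet at their lowest common ancestor.
    hops = 0
    while a != b:
        if a > b:
            a = (a - 1) // 2  # climb one level
        else:
            b = (b - 1) // 2
        hops += 1
    return hops

# In a hierarchy of N = 2**15 - 1 nodes (depth 14), any two nodes are
# within 2 * 14 = 28 hops of each other.
n_levels = 15
sample = [0, 2**14, 2**15 - 2]
max_hops = max(route_hops(i, j) for i in sample for j in sample)
```

The contrast with a flat topology is the point: in a flat N-node network a route can take Θ(N) hops, while any tree-like hierarchy caps it at O(log N) levels; MRRH's randomized multi-resolution construction pays extra logarithmic factors to maintain this distributively, giving the O(log³ N) average latency in the abstract.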
Computing Research Repository, 2006
This paper is concerned with the scaling of the number of hops in a large-scale wireless ad-hoc network (WANET), a quantity we call network latency. A large network latency affects all aspects of data communication in a WANET, including an increase in delay, packet loss, and required processing power and memory. We consider network management and data routing challenges in WANETs with scalable network latency. On the physical side, reducing network latency imposes a significantly higher power and bandwidth demand on nodes, as is reflected in a set of new bounds. On the protocol front, designing distributed routing protocols that can guarantee the delivery of data packets within a scalable number of hops is a challenging task. To solve this, we introduce the multi-resolution randomized hierarchy (MRRH), a novel power- and bandwidth-efficient WANET protocol with scalable network latency. MRRH uses a randomized algorithm for building and maintaining a random hierarchical network topology, which, together with the proposed routing algorithm, can guarantee efficient delivery of data packets in the wireless network. For a network of size N, MRRH can provide an average latency of only O(log³ N). The power and bandwidth consumption of MRRH are shown to be nearly optimal for the latency it provides. Therefore, MRRH is a provably efficient candidate for truly large-scale wireless ad-hoc networking.
Computing Research Repository, 2007
A mediator is a well-known construct in game theory: an entity that plays on behalf of some of the agents who choose to use its services, while the rest of the agents participate in the game directly. We initiate a game theoretic study of sponsored search auctions, such as those used by Google and Yahoo!, involving incentive-driven mediators. We refer to such mediators as for-profit mediators, so as to distinguish them from mediators introduced in prior work, who have no monetary incentives and are driven by the altruistic goal of implementing certain desired outcomes. We show that in our model, (i) players/advertisers can improve their payoffs by choosing to use the services of the mediator, compared to directly participating in the auction; (ii) the mediator can obtain monetary benefit by managing the advertising burden of its group of advertisers; and (iii) the payoffs of the mediator and the advertisers it plays for are compatible with the incentive constraints from the advertisers who do not use its services. A simple intuition behind the above result comes from the observation that the mediator has more information about, and more control over, the bid profile than any individual advertiser, allowing her to reduce the payments made to the auctioneer while still maintaining the incentive constraints. Further, our results indicate that there are significant opportunities for diversification in the internet economy, and we should expect it to continue to develop richer structure, with room for different types of agents to coexist.
Sigecom Exchanges, 2008
We investigate market forces that would lead to the emergence of new classes of players in the sponsored search market. We report a 3-fold diversification triggered by two inherent features of the sponsored search market, namely, capacity constraints and the collusion-vulnerability of current mechanisms. In the first scenario, we present a comparative study of two models motivated by capacity constraints: one where the additional capacity is provided by for-profit agents (or mediators), who compete for slots in the original auction, draw traffic, and run their own sub-auctions, and the other, where the additional capacity is provided by the auctioneer herself, by essentially acting as a mediator and running a single combined auction. This study was initiated by us in [SRGR07], where the mediator-based model was studied. In the present work, we study the auctioneer-based model and show that this single combined-auction model seems inferior to the mediator-based model in terms of the revenue or efficiency guarantee due to added capacity. In the second scenario, we initiate a game theoretic study of current sponsored search auctions, involving incentive-driven mediators who exploit the fact that these mechanisms are not collusion-resistant. In particular, we show that advertisers can improve their payoffs by using the services of the mediator compared to directly participating in the auction, and that the mediator can also obtain monetary benefit by managing the advertising burden of its advertisers, without violating incentive constraints from the advertisers who do not use its services. We also point out that the auctioneer cannot do very much via mechanism design to avoid such for-profit mediation without losing badly in terms of revenue, and therefore, the mediators are likely to prevail.
Thus, our analysis indicates that there are significant opportunities for diversification in the internet economy and we should expect it to continue to develop richer structure, with room for different types of agents and mechanisms to coexist.
Susceptibility of scale free Power Law (PL) networks to attacks has been traditionally studied in... more Susceptibility of scale free Power Law (PL) networks to attacks has been traditionally studied in the context of what may be termed as {\em instantaneous attacks}, where a randomly selected set of nodes and edges are deleted while the network is kept {\em static}. In this paper, we shift the focus to the study of {\em progressive} and instantaneous attacks on {\em reactive} grown and random PL networks, which can respond to attacks and take remedial steps. In the process, we present several techniques that managed networks can adopt to minimize the damages during attacks, and also to efficiently recover from the aftermath of successful attacks. For example, we present (i) compensatory dynamics that minimize the damages inflicted by targeted progressive attacks, such as linear-preferential deletions of nodes in grown PL networks; the resulting dynamic naturally leads to the emergence of networks with PL degree distributions with exponential cutoffs; (ii) distributed healing algorithms that can scale the maximum degree of nodes in a PL network using only local decisions, and (iii) efficient means of creating giant connected components in a PL network that has been fragmented by attacks on a large number of high-degree nodes. Such targeted attacks are considered to be a major vulnerability of PL networks; however, our results show that the introduction of only a small number of random edges, through a {\em reverse percolation} process, can restore connectivity, which in turn allows restoration of other topological properties of the original network. Thus, the scale-free nature of the networks can itself be effectively utilized for protection and recovery purposes.
The promise of discovering a functional blueprint of a cellular system from large-scale and high-... more The promise of discovering a functional blueprint of a cellular system from large-scale and high-throughput sequence and experimental data is predicated on the belief that the same top-down investigative approach that proved successful in other biological problems (e.g. DNA sequencing) will be as effective when it comes to inferring more complex intracellular processes. The results in this paper address this fundamental issue in the specific context of transcription regulatory networks. Although simple recurring regulatory motifs have been identified in the past, due to the size and complexity of the connectivity structure, the subdivision of such networks into larger, and possibly inter-connected, regulatory modules is still under investigation. Specifically, it is unclear whether functionally well-characterized transcriptional sub-networks can be identified by solely analyzing the connectivity structure of the overall network topology. In this paper, we show that transcriptional regulatory networks can be systematically partitioned into communities whose members are consistently functionally related. We applied the partitioning method to the transcriptional regulatory networks of the yeast Saccharomyces cerevisiae; the resulting communities of gene and transcriptional regulators can be associated to distinct functional units, such as amino acid metabolism, cell cycle regulation, protein biosynthesis and localization, DNA replication and maintenance, lipid catabolism, stress response and so on. Moreover, the observation of inter-community connectivity patterns provides a valuable tool for elucidating the inter-dependency between the discovered regulatory modules.
One natural constraint in the sponsored search advertising framework arises from the fact that there is a limit on the number of available slots, especially for popular keywords, and as a result, a significant pool of advertisers is left out. We study the emergence of diversification in the adword market triggered by such capacity constraints, in the sense that new market mechanisms, as well as new for-profit agents, are likely to emerge to combat, or to profit from, the opportunities created by shortages in ad-space inventory. We propose a model where the additional capacity is provided by for-profit agents (or mediators), who compete for slots in the original auction, draw traffic, and run their own sub-auctions. The quality of the additional capacity provided by a mediator is measured by its fitness factor. We compute revenues and payoffs for all the different parties at a symmetric Nash equilibrium (SNE) when the mediator-based model is operated by a mechanism currently being used by Google and Yahoo!, and then compare these numbers with those obtained at a corresponding SNE for the same mechanism, but without any mediators involved in the auctions. Such calculations allow us to determine the value of the additional capacity. Our results show that the revenue of the auctioneer, as well as the social value (i.e. efficiency), always increase when mediators are involved; moreover, even the payoffs of all the bidders will increase if the mediator has a high enough fitness. Thus, our analysis indicates that there are significant opportunities for diversification in the internet economy, and we should expect it to continue to develop richer structure, with room for different types of agents and mechanisms to coexist.
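The mechanism "currently being used by Google and Yahoo!" referred to here is the generalized second-price (GSP) auction. As background for the revenue and payoff comparisons, here is a minimal rank-by-bid GSP sketch (ignoring quality scores and mediators; names and numbers are illustrative): each slotted bidder pays the next-highest bid per click, and payoffs weight per-click surplus by the slot's click-through rate.

```python
def gsp_outcome(bids, ctrs, values):
    """Simplified next-price (GSP) auction: slots are ranked by bid; the
    bidder in slot i pays the (i+1)-th highest bid per click.  `values`
    are private per-click valuations, used only to compute payoffs."""
    order = sorted(range(len(bids)), key=lambda b: -bids[b])
    slots = min(len(ctrs), len(bids))
    payments, payoffs = {}, {}
    for i in range(slots):
        b = order[i]
        next_bid = bids[order[i + 1]] if i + 1 < len(order) else 0.0
        payments[b] = next_bid                       # per-click price
        payoffs[b] = ctrs[i] * (values[b] - next_bid)
    revenue = sum(ctrs[i] * payments[order[i]] for i in range(slots))
    return payments, payoffs, revenue
```

With bids (10, 6, 3), click-through rates (0.2, 0.1), and truthful values, the top bidder pays 6 per click, the second pays 3, and the third is shut out — the capacity constraint that motivates the mediator model above.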
A mediator is a well-known construct in game theory: an entity that plays on behalf of the agents who choose to use its services, while the rest of the agents participate in the game directly. We initiate a game theoretic study of sponsored search auctions, such as those used by Google and Yahoo!, involving incentive-driven mediators. We refer to such mediators as for-profit mediators, so as to distinguish them from mediators introduced in prior work, who have no monetary incentives and are driven by the altruistic goal of implementing certain desired outcomes. We show that in our model, (i) players/advertisers can improve their payoffs by choosing to use the services of the mediator, compared to directly participating in the auction; (ii) the mediator can obtain monetary benefit by managing the advertising burden of its group of advertisers; and (iii) the payoffs of the mediator and the advertisers it plays for are compatible with the incentive constraints from the advertisers who do not use its services. A simple intuition behind the above result comes from the observation that the mediator has more information about, and more control over, the bid profile than any individual advertiser, allowing her to reduce the payments made to the auctioneer while still maintaining incentive constraints. Further, our results indicate that there are significant opportunities for diversification in the internet economy, and we should expect it to continue to develop richer structure, with room for different types of agents to coexist.
We investigate market forces that would lead to the emergence of new classes of players in the sponsored search market. We report a 3-fold diversification triggered by two inherent features of the sponsored search market, namely, capacity constraints and the collusion-vulnerability of current mechanisms. In the first scenario, we present a comparative study of two models motivated by capacity constraints: one where the additional capacity is provided by for-profit agents, who compete for slots in the original auction, draw traffic, and run their own sub-auctions, and the other, where the additional capacity is provided by the auctioneer herself, by essentially acting as a mediator and running a single combined auction. This study was initiated by us in \cite{SRGR07}, where the mediator-based model was studied. In the present work, we study the auctioneer-based model and show that this model seems inferior to the mediator-based model in terms of the revenue or efficiency guarantee due to added capacity. In the second scenario, we initiate a game theoretic study of current sponsored search auctions, involving incentive-driven mediators who exploit the fact that these mechanisms are not collusion-resistant. In particular, we show that advertisers can improve their payoffs by using the services of the mediator compared to directly participating in the auction, and that the mediator can also obtain monetary benefit, without violating incentive constraints from the advertisers who do not use its services. We also point out that the auctioneer cannot do very much via mechanism design to avoid such for-profit mediation without losing badly in terms of revenue, and therefore, the mediators are likely to prevail.
Almost all of us have multiple cyberspace identities, and these cyber-alter egos are networked together to form a vast cyberspace social network. This network is distinct from the world-wide-web (WWW), which is being queried and mined to the tune of billions of dollars every day, and until recently it has gone largely unexplored. Empirically, cyberspace social networks have been found to possess many of the same complex features that characterize their real counterparts, including scale-free degree distributions, low diameter, and extensive connectivity. We show that these topological features make the latent networks particularly suitable for exploration and management via local-only messaging protocols. Cyber-alter egos can communicate via their direct links (i.e., using only their own address books) and set up a highly decentralized and scalable message passing network that can allow large-scale sharing of information and data. As one particular example of such collaborative systems, we provide a design of a spam filtering system, and our large-scale simulations show that the system achieves a spam detection rate close to 100%, while the false positive rate is kept around zero. This system has several advantages over other recent proposals: (i) it uses an already existing network, created by the same social dynamics that govern our daily lives, and no dedicated peer-to-peer (P2P) systems or centralized server-based systems need be constructed; (ii) it utilizes a percolation search algorithm that makes the query-generated traffic scalable; (iii) the network has a built-in trust system (just as in social networks) that can be used to thwart malicious attacks; and (iv) it can be implemented right now as a plugin to popular email programs, such as MS Outlook, Eudora, and Sendmail.
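The percolation search mentioned in (ii) can be sketched compactly: a query is forwarded across each edge independently with some probability q (bond percolation), so traffic is governed by q rather than by full flooding. The snippet below is our own simplification — the full protocol also implants content digests on high-degree nodes — showing just the probabilistic broadcast step.

```python
import random

def percolation_query(adj, start, q, rng):
    """Forward a query across each edge independently with probability q.
    Returns the set of nodes reached and the number of messages sent.
    Sketch of the bond-percolation broadcast only, not the full protocol."""
    reached = {start}
    frontier = [start]
    messages = 0
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if rng.random() < q:       # edge "percolates" with probability q
                    messages += 1
                    if v not in reached:
                        reached.add(v)
                        nxt.append(v)
        frontier = nxt
    return reached, messages
```

With q = 1 this degenerates to flooding (every node in the component is reached); the point of the algorithm is that on heavy-tailed networks a q well below 1 still reaches the high-degree nodes that hold most cached digests, keeping per-query traffic sublinear in the network size.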
We introduce BruNet, a general P2P software framework which we use to produce the first implementation of Symphony, a 1-D Kleinberg small-world architecture. Our framework is designed to easily implement and measure different P2P protocols over different transport layers such as TCP or UDP. This paper discusses our implementation of the Symphony network, which allows each node to keep k ≤ log N shortcut connections and to route to any other node with a short average delay of O((1/k) log² N). This provides a continuous trade-off between node degree and routing latency. We present experimental results taken from several PlanetLab deployments of size up to 1060 nodes. These successful deployments represent some of the largest PlanetLab deployments of P2P overlays found in the literature, and show our implementation's robustness to massive node dynamics in a WAN environment.
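The degree/latency trade-off above comes from greedy routing on a ring augmented with harmonic-distributed shortcuts, as in the 1-D Kleinberg/Symphony construction. A toy sketch of that idea (our own construction and names; real Symphony also manages joins, leaves, and bidirectional links):

```python
import math
import random

def build_symphony(n, k, rng):
    """Node i sits at position i on a ring of size n and keeps ring links
    to i±1 plus k clockwise shortcuts whose lengths d are drawn with
    density ~ 1/d (harmonic distribution), as in Symphony/Kleinberg."""
    links = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
    for i in range(n):
        while len(links[i]) < 2 + k:
            # inverse-CDF sample: d in [1, n/2) with density proportional to 1/d
            d = int(math.exp(rng.random() * math.log(n / 2)))
            links[i].add((i + d) % n)
    return links

def greedy_route(links, n, src, dst):
    """Greedily hop to the neighbour closest to dst on the ring; the ring
    links guarantee strict progress, so this always terminates."""
    def ring_dist(a, b):
        return min((a - b) % n, (b - a) % n)
    hops, cur = 0, src
    while cur != dst:
        cur = min(links[cur], key=lambda v: ring_dist(v, dst))
        hops += 1
    return hops
```

With only ring links the route would take up to n/2 hops; the k shortcuts per node are what bring the expected path length down to O((1/k) log² N).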
Computing Research Repository, 2007
We introduce BruNet, a general P2P software framework which we use to produce the first implementation of Symphony, a 1-D Kleinberg small-world architecture. Our framework is designed to easily implement and measure different P2P protocols over different transport layers such as TCP or UDP. This paper discusses our implementation of the Symphony network, which allows each node to keep k ≤ log N shortcut connections and to route to any other node with a short average delay of O((1/k) log² N). We present experimental results taken from several PlanetLab deployments of size up to 1060 nodes. These successful deployments represent some of the largest PlanetLab deployments of P2P overlays found in the literature, and show our implementation's robustness to massive node dynamics in a WAN environment.
Computing Research Repository, 2005
Susceptibility of scale-free Power Law (PL) networks to attacks has traditionally been studied in the context of what may be termed instantaneous attacks, where a randomly selected set of nodes and edges are deleted while the network is kept static. In this paper, we shift the focus to the study of progressive and instantaneous attacks on reactive grown and random PL networks, which can respond to attacks and take remedial steps. In the process, we present several techniques that managed networks can adopt to minimize the damage during attacks, and also to efficiently recover from the aftermath of successful attacks. For example, we present (i) compensatory dynamics that minimize the damage inflicted by targeted progressive attacks, such as linear-preferential deletions of nodes in grown PL networks; the resulting dynamic naturally leads to the emergence of networks with PL degree distributions with exponential cutoffs; (ii) distributed healing algorithms that can scale the maximum degree of nodes in a PL network using only local decisions; and (iii) efficient means of creating giant connected components in a PL network that has been fragmented by attacks on a large number of high-degree nodes. Such targeted attacks are considered to be a major vulnerability of PL networks; however, our results show that the introduction of only a small number of random edges, through a reverse percolation process, can restore connectivity, which in turn allows restoration of other topological properties of the original network. Thus, the scale-free nature of the networks can itself be effectively utilized for protection and recovery purposes.
With the economy becoming more global every year, the emergence of complex networks that interconnect distributed centers of communication, finance, and manufacturing will only accelerate. In such networks, attacks are a fact of life, and simple attacks, such as Denial-of-Service (DoS), can cripple hubs and other nodes, leading to severe disruptions of services. Understanding the effects of attacks, and the mechanisms for responding to them, is thus of great practical importance to many network-based systems.
Conference on Email and Anti-Spam, 2005
We introduce a collaborative anti-spam system that is based on pervasive global social email networks. Essentially, we provide a solution to this open research problem: given a network of N users who are willing to share information collaboratively (e.g. the digests or fingerprints of known spams), how do we search for each user's content efficiently and reliably in a distributed manner with minimal traffic cost on the network? As a solution to this open problem, our proposed system employs the percolation search process, which makes the traffic generated due to queries for spam digests scale sublinearly as a function of N. However, in order to reap the benefits of this novel percolation search algorithm, the node degree distribution of the underlying network must be heavy-tailed. Interestingly, latent global social email networks comprising personal contacts possess a power-law heavy-tailed degree distribution, which renders them an ideal natural platform on which to employ the percolation search algorithm. As a result, our proposed distributed spam filter requires no dedicated peer-to-peer (P2P) systems or centralized server-based systems. We have performed large-scale simulations and find that the system achieves a spam detection rate close to 100%, while the false positive rate is kept around zero. The bandwidth cost per user, as well as the system-wide bandwidth cost, are shown to be very low.
Management Science Letters, 2013
Management Science Letters, 2013
Lecture Notes in Computer Science, 2007
One natural constraint in the sponsored search advertising framework arises from the fact that there is a limit on the number of available slots, especially for popular keywords, and as a result, a significant pool of advertisers is left out. We study the emergence of diversification in the adword market triggered by such capacity constraints, in the sense that new market mechanisms, as well as new for-profit agents, are likely to emerge to combat, or to profit from, the opportunities created by shortages in ad-space inventory. We propose a model where the additional capacity is provided by for-profit agents (or mediators), who compete for slots in the original auction, draw traffic, and run their own sub-auctions. The quality of the additional capacity provided by a mediator is measured by its fitness factor. We compute revenues and payoffs for all the different parties at a symmetric Nash equilibrium (SNE) when the mediator-based model is operated by a mechanism currently being used by Google and Yahoo!, and then compare these numbers with those obtained at a corresponding SNE for the same mechanism, but without any mediators involved in the auctions. Such calculations allow us to determine the value of the additional capacity. Our results show that the revenue of the auctioneer, as well as the social value (i.e. efficiency), always increase when mediators are involved; moreover, even the payoffs of all the bidders will increase if the mediator has a high enough fitness. Thus, our analysis indicates that there are significant opportunities for diversification in the internet economy, and we should expect it to continue to develop richer structure, with room for different types of agents and mechanisms to coexist.
ACM SIGecom Exchanges, 2008
We investigate market forces that would lead to the emergence of new classes of players in the sponsored search market. We report a 3-fold diversification triggered by two inherent features of the sponsored search market, namely, capacity constraints and the collusion-vulnerability of current mechanisms. In the first scenario, we present a comparative study of two models motivated by capacity constraints: one where the additional capacity is provided by for-profit agents (or mediators), who compete for slots in the original auction, draw traffic, and run their own sub-auctions, and the other, where the additional capacity is provided by the auctioneer herself, by essentially acting as a mediator and running a single combined auction. This study was initiated in our prior work, where the mediator-based model was studied. In the present work, we study the auctioneer-based model and show that this single combined-auction model seems inferior to the mediator-based model in terms of the revenue or efficiency guarantee due to added capacity. In the second scenario, we initiate a game theoretic study of current sponsored search auctions, involving incentive-driven mediators who exploit the fact that these mechanisms are not collusion-resistant. In particular, we show that advertisers can improve their payoffs by using the services of the mediator compared to directly participating in the auction, and that the mediator can also obtain monetary benefit by managing the advertising burden of its advertisers, without violating incentive constraints from the advertisers who do not use its services. We also point out that the auctioneer cannot do very much via mechanism design to avoid such for-profit mediation without losing badly in terms of revenue, and therefore, the mediators are likely to prevail.
Thus, our analysis indicates that there are significant opportunities for diversification in the internet economy and we should expect it to continue to develop richer structure, with room for different types of agents and mechanisms to coexist.
Management Science Letters, 2013
This paper is concerned with the scaling of the number of hops in a large scale wireless ad-hoc network (WANET), a quantity we call network latency. A large network latency affects all aspects of data communication in a WANET, including an increase in delay, packet loss, and required processing power and memory. We consider network management and data routing challenges in WANETs with scalable network latency. On the physical side, reducing network latency imposes a significantly higher power and bandwidth demand on nodes, as is reflected in a set of new bounds. On the protocol front, designing distributed routing protocols that can guarantee the delivery of data packets within a scalable number of hops is a challenging task. To solve this, we introduce multi-resolution randomized hierarchy (MRRH), a novel power- and bandwidth-efficient WANET protocol with scalable network latency. MRRH uses a randomized algorithm for building and maintaining a random hierarchical network topology, which together with the proposed routing algorithm can guarantee efficient delivery of data packets in the wireless network. For a network of size N, MRRH can provide an average latency of only O(log³ N). The power and bandwidth consumption of MRRH are shown to be nearly optimal for the latency it provides. Therefore, MRRH is a provably efficient candidate for truly large scale wireless ad-hoc networking.
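The polylogarithmic latency claim rests on hierarchical routing: a packet climbs the cluster hierarchy to the lowest level that contains both endpoints, then descends. The toy below is a deliberately simplified illustration — a fixed balanced binary hierarchy, not MRRH's randomized one, whose maintenance costs account for the extra logarithmic factors in its O(log³ N) bound.

```python
def toy_hier_hops(n, src, dst):
    """Hop count for up-then-down routing on a balanced binary cluster
    hierarchy over n = 2**L leaf nodes, addressed by their L-bit labels:
    climb to the lowest common ancestor cluster, then descend, so the
    hop count is bounded by 2 * log2(n)."""
    assert n & (n - 1) == 0 and 0 <= src < n and 0 <= dst < n
    if src == dst:
        return 0
    lca_height = (src ^ dst).bit_length()   # position of the first differing address bit
    return 2 * lca_height                   # climb to the LCA, then descend
```

Nearby nodes (addresses differing only in low bits) meet at a low cluster and pay few hops, while even the worst pair pays only 2·log₂(n) — the logarithmic scaling that MRRH's randomized hierarchy approximates in a distributed, wireless setting.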
Computing Research Repository, 2006
This paper is concerned with the scaling of the number of hops in a large scale wireless ad-hoc network (WANET), a quantity we call network latency. A large network latency affects all aspects of data communication in a WANET, including an increase in delay, packet loss, and required processing power and memory. We consider network management and data routing challenges in WANETs with scalable network latency. On the physical side, reducing network latency imposes a significantly higher power and bandwidth demand on nodes, as is reflected in a set of new bounds. On the protocol front, designing distributed routing protocols that can guarantee the delivery of data packets within a scalable number of hops is a challenging task. To solve this, we introduce multi-resolution randomized hierarchy (MRRH), a novel power- and bandwidth-efficient WANET protocol with scalable network latency. MRRH uses a randomized algorithm for building and maintaining a random hierarchical network topology, which together with the proposed routing algorithm can guarantee efficient delivery of data packets in the wireless network. For a network of size N, MRRH can provide an average latency of only O(log³ N). The power and bandwidth consumption of MRRH are shown to be nearly optimal for the latency it provides. Therefore, MRRH is a provably efficient candidate for truly large scale wireless ad-hoc networking.
Computing Research Repository, 2007
A mediator is a well-known construct in game theory: an entity that plays on behalf of the agents who choose to use its services, while the rest of the agents participate in the game directly. We initiate a game theoretic study of sponsored search auctions, such as those used by Google and Yahoo!, involving incentive-driven mediators. We refer to such mediators as for-profit mediators, so as to distinguish them from mediators introduced in prior work, who have no monetary incentives and are driven by the altruistic goal of implementing certain desired outcomes. We show that in our model, (i) players/advertisers can improve their payoffs by choosing to use the services of the mediator, compared to directly participating in the auction; (ii) the mediator can obtain monetary benefit by managing the advertising burden of its group of advertisers; and (iii) the payoffs of the mediator and the advertisers it plays for are compatible with the incentive constraints from the advertisers who do not use its services. A simple intuition behind the above result comes from the observation that the mediator has more information about, and more control over, the bid profile than any individual advertiser, allowing her to reduce the payments made to the auctioneer while still maintaining incentive constraints. Further, our results indicate that there are significant opportunities for diversification in the internet economy, and we should expect it to continue to develop richer structure, with room for different types of agents to coexist.
ACM SIGecom Exchanges, 2008
We investigate market forces that would lead to the emergence of new classes of players in the sponsored search market. We report a 3-fold diversification triggered by two inherent features of the sponsored search market, namely, capacity constraints and the collusion-vulnerability of current mechanisms. In the first scenario, we present a comparative study of two models motivated by capacity constraints: one where the additional capacity is provided by for-profit agents (or mediators), who compete for slots in the original auction, draw traffic, and run their own sub-auctions, and the other, where the additional capacity is provided by the auctioneer herself, by essentially acting as a mediator and running a single combined auction. This study was initiated in our prior work, where the mediator-based model was studied. In the present work, we study the auctioneer-based model and show that this single combined-auction model seems inferior to the mediator-based model in terms of the revenue or efficiency guarantee due to added capacity. In the second scenario, we initiate a game theoretic study of current sponsored search auctions, involving incentive-driven mediators who exploit the fact that these mechanisms are not collusion-resistant. In particular, we show that advertisers can improve their payoffs by using the services of the mediator compared to directly participating in the auction, and that the mediator can also obtain monetary benefit by managing the advertising burden of its advertisers, without violating incentive constraints from the advertisers who do not use its services. We also point out that the auctioneer cannot do very much via mechanism design to avoid such for-profit mediation without losing badly in terms of revenue, and therefore, the mediators are likely to prevail.
Thus, our analysis indicates that there are significant opportunities for diversification in the internet economy and we should expect it to continue to develop richer structure, with room for different types of agents and mechanisms to coexist.