Bernardo Huberman - Profile on Academia.edu

Papers by Bernardo Huberman

Breakdown of absolute rate theory and prefactor anomalies in superionic conductors

Solid State Communications, Mar 1, 1978

We suggest that the anomalously low prefactors observed for the ionic hopping rate in defect-structure superionic conductors are due to a breakdown of absolute rate theory. A treatment that takes dissipative processes into account is introduced and a new expression for the prefactor is obtained. We present NMR data on the superionic conductor Li2Ti3O7 that supports these ideas and extract the value of the elemental energy transfer per collision. Recent Nuclear Magnetic Resonance measurements on the class of superionic conductors that possess defect structures have revealed a puzzle in the values of the ionic hopping rates. Conventional wisdom gives the hopping rate, ν, for a classical particle in a potential well in terms of absolute rate theory [1] … that the actual rate is proportional to the frictional forces acting on the particle. In this case the prefactor of ν can then be considerably smaller than typical vibrational frequencies. These results have been lately utilized by Suhl and coworkers [12] in the analysis of certain catalytic reactions on metals that show low prefactors in Eq. ( ). In our treatment, however, we will follow a formulation due to Iche and Nozieres [13] who, by exploiting the analogy of …
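
To make the contrast concrete, here is a sketch in standard transition-state and Kramers-type notation; the generic symbols (attempt frequency ν0, friction γ, barrier E_a) are ours, not necessarily the paper's own.

```latex
% Absolute rate theory: the prefactor is the vibrational attempt frequency
\nu_{\mathrm{ART}} = \nu_0 \, e^{-E_a / k_B T}
% Dissipative treatment (underdamped, Kramers-type): the prefactor scales
% with the friction \gamma -- equivalently, with the energy transferred to
% the bath per collision -- and can be far smaller than \nu_0
\nu \propto \gamma \, e^{-E_a / k_B T}, \qquad \gamma \ll \nu_0
```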

Collective attention and the dynamics of group deals

We present a study of the group purchasing behavior of daily deals in Groupon and LivingSocial and introduce a predictive dynamic model of collective attention for group buying behavior. In our model, the aggregate number of purchases at a given time comprises two types of processes: random discovery and social propagation. These processes are very clearly separated by an inflection point. Using large data sets from both Groupon and LivingSocial we show how the model is able to predict the success of group deals as a function of time. We find that Groupon deals are easier to predict accurately earlier in the deal lifecycle than LivingSocial deals because the final number of deal purchases saturates more quickly. One possible explanation for this is that the incentive to socially propagate a deal is based on an individual threshold in LivingSocial, whereas in Groupon it is based on a collective threshold, which is reached very early. Furthermore, the personal benefit of propagating a deal is also greater in LivingSocial.
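
As an illustration of the two-process dynamic described above, here is a minimal simulation sketch; the parameters, the switch time standing in for the inflection point, and the logistic form of the social term are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

def simulate_deal(a=5.0, b=0.08, capacity=2000, t_switch=40, t_max=200, seed=0):
    """Toy purchase trajectory: random discovery, then social propagation."""
    rng = np.random.default_rng(seed)
    n = 0.0
    trajectory = []
    for t in range(t_max):
        rate = a  # random discovery arrives at a roughly constant rate
        if t >= t_switch:  # social propagation kicks in at the inflection point
            rate += b * n * (1 - n / capacity)  # growth capped by audience size
        n += rng.poisson(max(rate, 0.0))
        trajectory.append(n)
    return np.array(trajectory)

purchases = simulate_deal()
print(f"final purchases: {purchases[-1]:.0f}")
```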

Long trend dynamics in social media

EPJ Data Science, May 18, 2012

A main characteristic of social media is that its diverse content, copiously generated by both standard outlets and general users, constantly competes for the scarce attention of large audiences. Out of this flood of information some topics manage to get enough attention to become the most popular ones and thus to be prominently displayed as trends. Equally important, some of these trends persist long enough so as to shape part of the social agenda. How this happens is the focus of this paper. By introducing a stochastic dynamical model that takes into account the user's repeated involvement with given topics, we can predict the distribution of trend durations as well as the thresholds in popularity that lead to their emergence within social media. Detailed measurements of datasets from Twitter confirm the validity of the model and its predictions.
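
A toy version of such a stochastic trend model can be sketched as follows; the decay-plus-burst mechanism and all parameters are illustrative assumptions rather than the paper's equations.

```python
import numpy as np

def trend_duration(threshold=50.0, decay=0.97, burst_mean=2.0,
                   p_reengage=0.6, start=100.0, t_max=10_000, seed=1):
    """Steps until popularity first falls below the display threshold."""
    rng = np.random.default_rng(seed)
    popularity = start
    for t in range(1, t_max + 1):
        popularity *= decay                     # attention fades
        if rng.random() < p_reengage:           # users' repeated involvement
            popularity += rng.exponential(burst_mean)
        if popularity < threshold:
            return t                            # trend drops off the list
    return t_max

durations = [trend_duration(seed=s) for s in range(500)]
print(f"mean duration: {np.mean(durations):.1f} steps")
```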

Electronic states of superionic conductors

Physical Review, Apr 15, 1976

Short-range ionic disorder effects on the electronic states of superionic conductors were studied by wavelength-modulated absorption and reflectance measurements of RbAg4I5 and β-AgI at 4.2 K through 297 K. The normalized temperature-independent band-gap increase of ~0.3 eV for RbAg4I5 over β-AgI is understood in terms of an increased ionicity for a homogeneous solid solution of RbI(AgI)4. The band-edge exciton of RbAg4I5 greatly broadens with increasing temperature compared to β-AgI. We attribute this to a large smearing of the valence-band density of states caused by either near-neighbor wave-function hybridization or random-site scattering. A band structure for RbAg4I5 is proposed.

Complexity and Adaptation

Physica D: Nonlinear Phenomena, Oct 1, 1986

We introduce a physical measure of the complexity of a system based on its diversity, while ignoring its detailed specification. It applies to discrete hierarchical structures made up of elementary parts and provides a precise, readily computable quantitative measure. This measure of complexity is maximal for systems which are intermediate between perfect order and complete disorder, and which are also characterized by the existence of many relevant length and time scales. We also discuss the relationship between complexity and adaptation and present experimental results on adaptive parallel computing arrays which compute with attractors.
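
One plausible way to operationalize diversity for a discrete hierarchy, purely as an illustration and not the paper's exact definition, is to count structurally non-isomorphic subtrees of a rooted tree:

```python
def canonical(tree):
    """Canonical form of a rooted tree given as nested tuples of children."""
    return tuple(sorted(canonical(child) for child in tree))

def diversity(tree):
    """Number of non-isomorphic subtrees rooted anywhere in the tree."""
    seen = set()
    def walk(t):
        seen.add(canonical(t))
        for child in t:
            walk(child)
    walk(tree)
    return len(seen)

uniform = ((), ()), ((), ())       # perfectly regular hierarchy: low diversity
mixed = ((), ((),)), (((), ()),)   # intermediate structure: higher diversity
print(diversity(uniform), diversity(mixed))
```

On this reading, a perfectly ordered tree and a single flat level both yield few distinct subtree shapes, while intermediate structures yield many, matching the abstract's claim that complexity peaks between order and disorder.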

Why Neural Networks Work

arXiv (Cornell University), Nov 26, 2022

We argue that many properties of fully-connected feedforward neural networks (FCNNs), also called multi-layer perceptrons (MLPs), are explainable from the analysis of a single pair of operations, namely a random projection into a higher-dimensional space than the input, followed by a sparsification operation. For convenience, we call this pair of successive operations expand-and-sparsify, following the terminology of Dasgupta. We show how expand-and-sparsify can explain the observed phenomena that have been discussed in the literature, such as the so-called Lottery Ticket Hypothesis, the surprisingly good performance of randomly-initialized untrained neural networks, the efficacy of Dropout in training and, most importantly, the mysterious generalization ability of overparameterized models, first highlighted by Zhang et al. and subsequently identified even in non-neural-network models by Belkin et al.
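
The expand-and-sparsify pair itself is easy to state in code. The following minimal sketch uses a fixed random projection followed by a top-k winner-take-all step; the dimensions and k are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden, k = 32, 512, 16     # expand 32 -> 512, keep 16 active units

# random, untrained projection matrix
W = rng.standard_normal((d_hidden, d_in)) / np.sqrt(d_in)

def expand_and_sparsify(x):
    h = W @ x                           # expand: random projection
    out = np.zeros_like(h)
    top = np.argpartition(h, -k)[-k:]   # sparsify: keep only top-k activations
    out[top] = h[top]
    return out

x = rng.standard_normal(d_in)
code = expand_and_sparsify(x)
print(f"nonzero units: {np.count_nonzero(code)} of {d_hidden}")
```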

Boiling the frog optimally

Humboldt-Universität zu Berlin, Wirtschaftswissenschaftliche Fakultät eBooks, Oct 12, 2014

When should a necessary inconvenience be introduced gradually, and when should it be imposed all at once? The question is crucial to web content providers, who in order to generate revenue must sooner or later introduce advertisements, subscription fees, or other inconveniences. Assuming that eventually people fully adapt to changes, the answer depends only on the shape of the survivor curve S(x), which represents the fraction of a user population willing to tolerate inconveniences of size x (Aperjis and Huberman 2011). We report a new laboratory experiment that, for the first time, estimates the shape of survivor curves in several different settings. We engage laboratory subjects in a series of six desirable activities, e.g., playing a video game, viewing a chosen video clip, or earning money by answering questions. For each activity we introduce a chosen level x ∈ [x_min, x_max] of a particular inconvenience, and each subject chooses whether to tolerate the inconvenience or to switch to a bland activity for the remaining time. Our key finding is that, in general, the survivor curve is log-convex. Theory suggests therefore that introducing inconveniences all at once will generally be more profitable for web content providers.
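
The connection between log-convexity and the all-at-once recommendation follows from a short calculation, sketched here under the stated assumption of full adaptation between steps and the natural normalization S(0) = 1:

```latex
% Let f(x) = \log S(x) be convex with f(0) = 0 (no one leaves when nothing
% changes). For a, b > 0, write a = \lambda (a+b) + (1-\lambda) \cdot 0 with
% \lambda = a/(a+b); convexity of f gives
f(a) \le \tfrac{a}{a+b}\, f(a+b), \qquad f(b) \le \tfrac{b}{a+b}\, f(a+b).
% Adding the two inequalities:
f(a) + f(b) \le f(a+b) \;\Longleftrightarrow\; S(a)\, S(b) \le S(a+b).
% Gradual introduction (a, then b, with full adaptation in between) retains
% a fraction S(a) S(b) of users, while a single step retains S(a+b), so
% log-convexity favors imposing the whole inconvenience at once.
```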

Partitioning uncertain workloads

Netnomics, Oct 25, 2016

We present a method for determining the ratio in which to break any complex workload into tasks such that, once the outputs from all tasks are joined, their full completion takes less time and exhibits smaller variance than running the undivided workload. To do that, we have to infer the capabilities of the processing units executing the divided workloads or tasks. We propose a Bayesian inference algorithm to infer the amount of time each task takes in a way that does not require prior knowledge of the processing units' capabilities. We demonstrate the effectiveness of this method in two different scenarios: the optimization of a convex function and the transmission of a large computer file over the Internet. We then show that the Bayesian inference algorithm correctly estimates the amount of time each task takes when executed in one of the processing units.
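
The flavor of the approach can be sketched as follows: estimate each unit's speed by Bayesian updating on observed probe-task times, then split the workload in proportion to the posterior means. The grid posterior, the exponential noise model, and all parameters are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
speeds = np.linspace(0.1, 10.0, 400)          # candidate speeds (work/sec)

def posterior_mean_speed(times, work=1.0):
    """Posterior mean speed given observed times for tasks of fixed size."""
    log_post = np.zeros_like(speeds)          # flat prior over the grid
    for t in times:
        # exponential noise around the nominal time work/speed
        log_post += np.log(speeds) - speeds * t * (1.0 / work)
    log_post -= log_post.max()
    post = np.exp(log_post)
    post /= post.sum()
    return float(np.sum(post * speeds))

# probe tasks: unit A is ~3x faster than unit B
times_a = rng.exponential(1 / 3.0, size=20)
times_b = rng.exponential(1 / 1.0, size=20)
s_a, s_b = posterior_mean_speed(times_a), posterior_mean_speed(times_b)
ratio = s_a / (s_a + s_b)                     # share of work sent to unit A
print(f"estimated split: {ratio:.2f} / {1 - ratio:.2f}")
```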

The Pulse of News in Social Media: Forecasting Popularity

arXiv (Cornell University), Feb 1, 2012

There is intense competition among news items to propagate as widely as possible. Hence, the task of predicting the popularity of news items on the social web is both interesting and challenging. Prior research has dealt with predicting eventual online popularity based on early popularity. It is most desirable, however, to predict the popularity of items prior to their release, fostering the possibility of appropriate decision making to modify an article and the manner of its publication. In this paper, we construct a multi-dimensional feature space derived from properties of an article and evaluate the efficacy of these features to serve as predictors of online popularity. We examine both regression and classification algorithms and demonstrate that despite randomness in human behavior, it is possible to predict ranges of popularity on Twitter with an overall 84% accuracy. Our study also serves to illustrate the differences between traditionally prominent sources and those immensely popular on the social web.
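
A minimal sketch of the classification side of such a pipeline, on synthetic stand-in data with illustrative feature names, might look like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 2000
X = np.column_stack([
    rng.integers(0, 5, n),          # e.g. article category
    rng.normal(0, 1, n),            # e.g. source prominence score
    rng.normal(0, 1, n),            # e.g. headline subjectivity
])
# synthetic popularity range (low / medium / high), loosely feature-driven
y = np.digitize(0.8 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 1, n),
                bins=[-0.5, 0.5])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```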

Quantum dynamics

Springer eBooks, Apr 6, 2008

Adaptive Portfolio Selection by Investment Groups

IFAC Proceedings Volumes, Jun 1, 1998

We present a new investment group model for the portfolio selection problem. Members of the group adjust their portfolios as they observe movements of the market over time and communicate to each other their current portfolio and its recent performance. Investors can choose to switch to any portfolio performing better than their own. We show that a group of adaptive investors will outperform a single adaptive investor for a simple market model. Furthermore, a group of investors can improve their performance through communication. Finally we show that communication is redundant in an extended market model that includes efficiency constraints on correlations between stock price dynamics. Copyright © 1998 IFAC
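
A toy rendering of the group-adaptation rule might look like the following; the i.i.d. market model, the copy-the-best switching rule, and all parameters are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(4)
n_investors, n_assets, n_steps = 10, 5, 200

weights = rng.dirichlet(np.ones(n_assets), size=n_investors)  # portfolios
wealth = np.ones(n_investors)

for t in range(n_steps):
    returns = rng.normal(0.001, 0.02, n_assets)    # simple i.i.d. market
    perf = weights @ (1 + returns)                 # each investor's growth
    wealth *= perf
    best = np.argmax(perf)                         # communicated performance
    for i in range(n_investors):
        if perf[i] < perf[best]:                   # switch to a better portfolio
            weights[i] = weights[best].copy()
    weights += rng.normal(0, 0.01, weights.shape)  # small individual adjustment
    weights = np.clip(weights, 0, None)
    weights /= weights.sum(axis=1, keepdims=True)  # renormalize portfolios

print(f"group mean final wealth: {wealth.mean():.3f}")
```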
