Peter Eckersley - Academia.edu
Papers by Peter Eckersley
Bepress Legal Series, 2004
The Internet and Copyright Law are particularly ill-suited to each other. One is designed to give as much information as possible to everyone who wants it; the other allows authors, artists and publishers to earn money by restricting the distribution of works made out of information. The beneficiaries of copyright law are lobbying for the re-design of computers and the Internet to instate "content control" and "digital rights management" (DRM). These technologies are intended to make copyright workable again by re-imposing limits on access to information goods, but they carry high direct and indirect social costs.
Abstract: A certain air of controversy has arisen around copyright law, as a result of its interactions with digital technology. The body of literature claiming that existing copyright laws are economically sub-optimal is growing rapidly. Some authors are even claiming ...
New Scientist, 2009
Essay. The shape of e-capitalism. Peter Eckersley. Available online 26 June 2009. When technology makes knowledge globally available, how we change the economics of buying and selling it is crucial.
Communications of the ACM
Artificial intelligence and machine learning capabilities are growing at an unprecedented rate. These technologies have many widely beneficial applications, ranging from machine translation to medical image analysis. Countless more such applications are being developed and can be expected over the long term. Less attention has historically been paid to the ways in which artificial intelligence can be used maliciously. This report surveys the landscape of potential security threats from malicious uses of artificial intelligence technologies, and proposes ways to better forecast, prevent, and mitigate these threats. We analyze, but do not conclusively resolve, the question of what the long-term equilibrium between attackers and defenders will be. We focus instead on what sorts of attacks we are likely to see soon if adequate defenses are not developed.
Journal of Artificial General Intelligence, 2013
Brain emulation is a hypothetical but extremely transformative technology which has a non-zero chance of appearing during the next century. This paper investigates whether such a technology would also have any predictable characteristics that give it a chance of being catastrophically dangerous, and whether there are any policy levers which might be used to make it safer. We conclude that the riskiness of brain emulation probably depends on the order of the preceding research trajectory. Broadly speaking, it appears safer for brain emulation to happen sooner, because slower CPUs would make the technology's impact more gradual. It may also be safer if brains are scanned before they are fully understood from a neuroscience perspective, thereby increasing the initial population of emulations, although this prediction is weaker and more scenario-dependent. The risks posed by brain emulation also seem strongly connected to questions about the balance of power between attackers and defenders ...
Abstract: In this report we consider a number of approaches to improving the efficacy of search algorithms for the World Wide Web and similar hyperlinked document collections. We observe that the exploitation of link topology has resulted in high precision search engines ...
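For readers unfamiliar with link-topology ranking, here is a minimal Python sketch of the general idea (a basic PageRank-style power iteration); the toy graph, damping factor, and iteration count are illustrative assumptions, not the algorithms evaluated in the report.

```python
# Minimal sketch of link-topology ranking via power iteration (PageRank-style).
# Illustrative only; the graph and constants are made up for this example.
import numpy as np

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = sorted(links)
    index = {p: i for i, p in enumerate(pages)}
    n = len(pages)
    # Column-stochastic transition matrix: M[j, i] is the probability of
    # moving from page i to page j by following an outgoing link.
    M = np.zeros((n, n))
    for src, outs in links.items():
        if outs:
            for dst in outs:
                M[index[dst], index[src]] = 1.0 / len(outs)
        else:
            M[:, index[src]] = 1.0 / n  # dangling page: jump anywhere
    rank = np.full(n, 1.0 / n)
    for _ in range(iterations):
        rank = (1 - damping) / n + damping * M.dot(rank)
    return dict(zip(pages, rank))

print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]}))
```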
Lecture Notes in Computer Science, 2002
This is a brief description of the WilmaScope interactive 3D graph visualisation system. Wilma features clustering of related groups of nodes, a GUI for editing graphs and adjusting the force layout parameters, and a CORBA interface for creating and interacting with graphs remotely from other programs. It has also been used to construct 3D UML Class and Object models as part of a usability study. Wilma is freely available under the terms of the GNU Lesser General Public License.
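As a rough illustration of the kind of force layout WilmaScope automates, a minimal 2D spring-embedder sketch in Python follows; the node names, force constants, and step size are made up for illustration and are not taken from the WilmaScope code base.

```python
# Minimal 2D force-directed layout sketch: pairwise repulsion plus spring
# attraction along edges. Illustrative only; constants are arbitrary.
import math
import random

def force_layout(nodes, edges, steps=200, spring_len=1.0, k_rep=0.5, k_att=0.1):
    pos = {n: [random.random(), random.random()] for n in nodes}
    for _ in range(steps):
        force = {n: [0.0, 0.0] for n in nodes}
        # Repulsion between every pair of nodes.
        for a in nodes:
            for b in nodes:
                if a == b:
                    continue
                dx, dy = pos[a][0] - pos[b][0], pos[a][1] - pos[b][1]
                d = math.hypot(dx, dy) or 1e-9
                force[a][0] += k_rep * dx / (d * d)
                force[a][1] += k_rep * dy / (d * d)
        # Spring attraction along edges.
        for a, b in edges:
            dx, dy = pos[b][0] - pos[a][0], pos[b][1] - pos[a][1]
            d = math.hypot(dx, dy) or 1e-9
            f = k_att * (d - spring_len)
            force[a][0] += f * dx / d; force[a][1] += f * dy / d
            force[b][0] -= f * dx / d; force[b][1] -= f * dy / d
        for n in nodes:
            pos[n][0] += 0.05 * force[n][0]
            pos[n][1] += 0.05 * force[n][1]
    return pos

print(force_layout(["a", "b", "c"], [("a", "b"), ("b", "c")]))
```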
Mathematics and Visualization, 2004
NeuroImage, 2002
We report on differences in sensitivity and false-positive rate across five methods of global normalization using resting-state fMRI data embedded with simulated activation. These methods were grand mean session scaling, proportional scaling, ANCOVA, a masking method, and an orthogonalization method. We found that global normalization by proportional scaling and ANCOVA decreased the sensitivity of the statistical analysis and induced artifactual deactivation even when the correlation between the global signal and the experimental paradigm was relatively low. The masking method and the orthogonalization method performed better from this perspective but are both restricted to certain experimental conditions. Based on the results of these simulations, we offer practical guidelines for the choice of global normalization method least likely to bias the experimental results.
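To make the terminology concrete, here is a hedged Python sketch of two of the compared approaches (grand mean session scaling and proportional scaling) on a toy data matrix; the array shapes and the target value of 100 are illustrative assumptions, not the paper's exact implementation.

```python
# Toy illustration of grand mean session scaling vs. proportional scaling.
# Shapes and constants are arbitrary; this is not the paper's pipeline.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=500.0, scale=20.0, size=(120, 1000))  # timepoints x voxels

# Grand mean session scaling: one scale factor for the whole session, so the
# session's grand mean is brought to a common target value (here 100).
grand_scaled = data * (100.0 / data.mean())

# Proportional scaling: each scan is divided by its own global mean, so every
# timepoint ends up with the same global value; this is the step that can
# couple the global signal to the task and induce artifactual deactivation.
global_signal = data.mean(axis=1, keepdims=True)  # per-scan global mean
prop_scaled = data * (100.0 / global_signal)

print(grand_scaled.mean(), prop_scaled.mean(axis=1)[:3])
```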
Neuroinformatics, 2003
The requirements for neuroinformatics to make a significant impact on neuroscience are not simply technical - the hardware, software, and protocols for collaborative research - they also include the legal and policy frameworks within which projects operate. This is not least because the creation of large collaborative scientific databases amplifies the complicated interactions between proprietary, for-profit R&D and public "open science." In this paper, we draw on experiences from the field of genomics to examine some of the likely consequences of these interactions in neuroscience.
Electronic Frontier Foundation, 2007
Comcast is the second largest Internet Service Provider (ISP) in the United States. They run the cable TV and cable Internet networks in many parts of the United States, and many consumers know them as their duopoly or monopoly provider of residential broadband ...
The requirements for neuroinformatics to make a significant impact on the field of neuroscience as a whole are not simply technical - the hardware, software, and protocols for collaborative research - they also include the legal and policy frameworks within which research is conducted. This is not least because the creation of large collaborative scientific databases amplifies the complicated interactions between proprietary, for-profit R&D and public "open science."
We investigate the degree to which modern web browsers are subject to "device fingerprinting" via the version and configuration information that they will transmit to websites upon request. We implemented one possible fingerprinting algorithm, and collected these fingerprints from a large sample of browsers that visited our test site, panopticlick.eff.org. We observe that the distribution of our fingerprint contains at least 18.1 bits of entropy, meaning that if we pick a browser at random, at best we expect that only one in 286,777 other browsers will share its fingerprint. Among browsers that support Flash or Java, the situation is worse, with the average browser carrying at least 18.8 bits of identifying information. 94.2% of browsers with Flash or Java were unique in our sample. By observing returning visitors, we estimate how rapidly browser fingerprints might change over time. In our sample, fingerprints changed quite rapidly, but even a simple heuristic was usually able to guess when a fingerprint was an "upgraded" version of a previously observed browser's fingerprint, with 99.1% of guesses correct and a false positive rate of only 0.86%. We discuss what privacy threat browser fingerprinting poses in practice, and what countermeasures may be appropriate to prevent it. There is a tradeoff between protection against fingerprintability and certain kinds of debuggability, which in current browsers is weighted heavily against privacy. Paradoxically, anti-fingerprinting privacy technologies can be self-defeating if they are not used by a sufficient number of people; we show that some privacy measures currently fall victim to this paradox, but others do not.
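As a rough illustration of how entropy figures like "18.1 bits" are computed and interpreted, here is a minimal Python sketch; the fingerprint strings below are made up for illustration and are not Panopticlick data.

```python
# Shannon entropy of an observed fingerprint distribution, in bits.
# A distribution with N bits of entropy corresponds, roughly, to one chance
# in 2**N that a randomly chosen browser shares a given fingerprint.
from collections import Counter
from math import log2

def fingerprint_entropy(fingerprints):
    counts = Counter(fingerprints)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Made-up sample for illustration only.
sample = ["ff-linux-1080p", "chrome-win-1080p", "chrome-win-1080p",
          "safari-mac-retina", "ff-linux-1080p", "edge-win-4k"]
bits = fingerprint_entropy(sample)
print(f"{bits:.2f} bits, roughly 1 in {2 ** bits:.0f}")
```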