James Crutchfield - Academia.edu

Papers by James Crutchfield

Noise phenomena in Josephson junctions

Applied Physics Letters, 1980

Anatomy of a Spin: The Information-Theoretic Structure of Classical Spin Systems

Entropy, 2017

Collective organization in matter plays a significant role in its expressed physical properties. Typically, it is detected via an order parameter, appropriately defined for each given system’s observed emergent patterns. Recent developments in information theory, however, suggest quantifying collective organization in a system- and phenomenon-agnostic way: decomposing the system’s thermodynamic entropy density into a localized entropy, that is solely contained in the dynamics at a single location, and a bound entropy, that is stored in space as domains, clusters, excitations, or other emergent structures. As a concrete demonstration, we compute this decomposition and related quantities explicitly for the nearest-neighbor Ising model on the 1D chain, on the Bethe lattice with coordination number k = 3, and on the 2D square lattice, illustrating its generality and the functional insights it gives near and away from phase transitions. In particular, we consider the roles that differen...
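To make the decomposition concrete: in the notation of the related information-anatomy literature (the symbols below are an assumption here, since the abstract is truncated), the entropy density splits into a localized piece and a bound piece roughly as follows, with X_0 the spin at one site and X_{:0}, X_{1:} the spins on either side along the scan direction.

```latex
% Sketch of the entropy-density decomposition described above, using
% information-anatomy notation (assumed, not quoted from the paper).
\begin{align}
  h_\mu &= H[X_0 \mid X_{:0}] = r_\mu + b_\mu, \\
  r_\mu &= H[X_0 \mid X_{:0},\, X_{1:}]      && \text{(localized entropy)}, \\
  b_\mu &= I[X_0 : X_{1:} \mid X_{:0}]       && \text{(bound entropy)}.
\end{align}
```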

Fluctuation spectroscopy

Chaos, Solitons & Fractals, 1994

Workshop on Information Engines at the Frontiers of Nanoscale Thermodynamics

Unique Information and Secret Key Agreement

Entropy

The partial information decomposition (PID) is a promising framework for decomposing a joint random variable into the amount of influence each source variable X_i has on a target variable Y, relative to the other sources. For two sources, influence breaks down into the information that both X_0 and X_1 redundantly share with Y, what X_0 uniquely shares with Y, what X_1 uniquely shares with Y, and finally what X_0 and X_1 synergistically share with Y. Unfortunately, considerable disagreement has arisen as to how these four components should be quantified. Drawing from cryptography, we consider the secret key agreement rate as an operational method of quantifying unique information. Secret key agreement rate comes in several forms, depending upon which parties are permitted to communicate. We demonstrate that three of these four forms are inconsistent with the PID. The remaining form implies certain interpretations as to the PID’s meaning—interpretations not present in PID’s definitio...
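For reference, the two-source bookkeeping the abstract refers to is the standard PID consistency relation, written here in common notation rather than quoted from the paper:

```latex
% Standard two-source partial information decomposition:
% redundancy R, unique informations U_0 and U_1, synergy S.
\begin{align}
  I[(X_0, X_1) : Y] &= R + U_0 + U_1 + S, \\
  I[X_0 : Y]        &= R + U_0, \\
  I[X_1 : Y]        &= R + U_1.
\end{align}
```

Quantifying any one of R, U_0, U_1, or S fixes the other three, which is why the choice of a unique-information measure (here, a secret key agreement rate) is decisive.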

Multivariate Dependence beyond Shannon Information

Unique Information via Dependency Constraints

Journal of Physics A: Mathematical and Theoretical

Prediction and generation of binary Markov processes: Can a finite-state fox catch a Markov mouse?

Chaos (Woodbury, N.Y.), 2018

Understanding the generative mechanism of a natural system is a vital component of the scientific method. Here, we investigate one of the fundamental steps toward this goal by presenting the minimal generator of an arbitrary binary Markov process. This is a class of processes whose predictive model is well known. Surprisingly, the generative model requires three distinct topologies for different regions of parameter space. We show that a previously proposed generator for a particular set of binary Markov processes is, in fact, not minimal. Our results shed the first quantitative light on the relative (minimal) costs of prediction and generation. We find, for instance, that the difference between prediction and generation is maximized when the process is approximately independent and identically distributed.
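The process class in question is simply a two-state Markov chain over {0, 1}. A minimal sampling sketch follows; the parameter names p01 and p10 are illustrative assumptions, and this is not the paper's minimal-generator construction.

```python
import random

def sample_binary_markov(p01, p10, n, seed=0):
    """Sample a length-n realization of a binary Markov process.

    p01: probability of emitting 1 given the previous symbol was 0
    p10: probability of emitting 0 given the previous symbol was 1
    (Parameter names are illustrative, not taken from the paper.)
    """
    rng = random.Random(seed)
    x = rng.randint(0, 1)  # arbitrary initial symbol
    out = []
    for _ in range(n):
        if x == 0:
            x = 1 if rng.random() < p01 else 0
        else:
            x = 0 if rng.random() < p10 else 1
        out.append(x)
    return out

# Near p01 = 1 - p10 the next symbol barely depends on the last one,
# i.e., the process is approximately independent and identically distributed.
print(sample_binary_markov(0.5, 0.5, 20))
```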

Spatio-temporal complexity in nonlinear image processing

IEEE Transactions on Circuits and Systems

Geometry from a Time Series

Phenomenology of Spatio-Temporal Chaos

Series on Directions in Condensed Matter Physics

Complex dynamical networks constructed with fully controllable nonlinear nanomechanical oscillators

Nano Letters, Jan 8, 2017

Control of the global parameters of complex networks has been explored experimentally in a variety of contexts. Yet, the more difficult prospect of realizing arbitrary network architectures, especially analog physical networks, that provide dynamical control of individual nodes and edges has remained elusive. It also proves challenging to measure a complex network's full internal dynamics given the vast hierarchy of timescales involved. These span from the fastest nodal dynamics to very slow epochs over which emergent global phenomena, including network synchronization and the manifestation of exotic steady states, eventually emerge. Here, we demonstrate an experimental system that satisfies these requirements. It is based upon modular, fully controllable, nonlinear radio-frequency nanomechanical oscillators, designed to form the nodes of complex dynamical networks with edges configured with arbitrary topology. The dynamics of these oscillators and their surrounding network are ...

Minimum memory for generating rare events

Physical Review E, 2017

We classify the rare events of structured, memoryful stochastic processes and use this to analyze sequential and parallel generators for these events. Given a stochastic process, we introduce a method to construct a process whose typical realizations are a given process' rare events. This leads to an expression for the minimum memory required to generate rare events. We then show that the recently discovered classical-quantum ambiguity of simplicity also occurs when comparing the structure of process fluctuations.

Nearly maximally predictive features and their dimensions

Physical Review E, 2017

Scientific explanation often requires inferring maximally predictive features from a given data set. Unfortunately, the collection of minimal maximally predictive features for most stochastic processes is uncountably infinite. In such cases, one compromises and instead seeks nearly maximally predictive features. Here, we derive upper bounds on the rates at which the number and the coding cost of nearly maximally predictive features scale with desired predictive power. The rates are determined by the fractal dimensions of a process' mixed-state distribution. These results, in turn, show how widely used finite-order Markov models can fail as predictors and that mixed-state predictive features can offer a substantial improvement.

Thermodynamics of random number generation

Physical Review E, 2017

We analyze the thermodynamic costs of the three main approaches to generating random numbers via the recently introduced Information Processing Second Law. Given access to a specified source of randomness, a random number generator (RNG) produces samples from a desired target probability distribution. This differs from pseudorandom number generators (PRNGs) that use wholly deterministic algorithms and from true random number generators (TRNGs) in which the randomness source is a physical system. For each class, we analyze the thermodynamics of generators based on algorithms implemented as finite-state machines, as these allow for direct bounds on the required physical resources. This establishes bounds on heat dissipation and work consumption during the operation of three main classes of RNG algorithms, including those of von Neumann, Knuth and Yao, and Roche and Hoshi, and for PRNG methods. We introduce a general TRNG and determine its thermodynamic costs exactly for arbitrary target...
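Of the named algorithms, von Neumann's is the simplest to illustrate: it turns i.i.d. flips of a coin with unknown bias into unbiased bits by examining non-overlapping pairs. The sketch below shows only that classical procedure, not the paper's thermodynamic accounting.

```python
def von_neumann_extract(bits):
    """von Neumann unbiasing: scan non-overlapping pairs of input bits,
    map 01 -> 0 and 10 -> 1, and discard 00 and 11.  If the inputs are
    i.i.d. flips of a coin with a fixed (unknown) bias, the outputs are
    unbiased, at the cost of discarding part of the input."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)  # first bit of the unequal pair
    return out

# Example: a biased input stream still yields fair output bits.
print(von_neumann_extract([1, 1, 0, 1, 1, 0, 0, 1, 1, 1]))  # -> [0, 1, 0]
```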

Patterns of patterns of synchronization: Noise induced attractor switching in rings of coupled nonlinear oscillators

Chaos (Woodbury, N.Y.), 2016

Following the long-lived qualitative-dynamics tradition of explaining behavior in complex systems via the architecture of their attractors and basins, we investigate the patterns of switching between distinct trajectories in a network of synchronized oscillators. Our system, consisting of nonlinear amplitude-phase oscillators arranged in a ring topology with reactive nearest-neighbor coupling, is simple and connects directly to experimental realizations. We seek to understand how the multiple stable synchronized states connect to each other in state space by applying Gaussian white noise to each of the oscillators' phases. To do this, we first analytically identify a set of locally stable limit cycles at any given coupling strength. For each of these attracting states, we analyze the effect of weak noise via the covariance matrix of deviations around those attractors. We then explore the noise-induced attractor switching behavior via numerical investigations. For a ring of three...
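As a rough illustration of the setup, one can integrate a ring of phase oscillators with nearest-neighbor coupling and Gaussian white noise on each phase using the Euler-Maruyama scheme. This is a simplified, phase-only stand-in (closer to a noisy Kuramoto ring) rather than the amplitude-phase model studied in the paper, and every parameter value is illustrative.

```python
import math
import random

def simulate_ring(n=3, coupling=0.5, noise=0.05, dt=1e-3, steps=10_000, seed=1):
    """Euler-Maruyama integration of n phase oscillators on a ring with
    sinusoidal nearest-neighbor coupling and Gaussian white noise on each
    phase.  A phase-only stand-in for the amplitude-phase oscillators in
    the paper; all parameter values here are illustrative."""
    rng = random.Random(seed)
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    for _ in range(steps):
        nxt = []
        for i in range(n):
            left = theta[(i - 1) % n]
            right = theta[(i + 1) % n]
            drift = coupling * (math.sin(left - theta[i]) + math.sin(right - theta[i]))
            kick = noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            nxt.append(theta[i] + drift * dt + kick)
        theta = nxt
    return [t % (2.0 * math.pi) for t in theta]

# With positive coupling and weak noise the phases relax toward a
# phase-locked pattern; the in-phase state is one locally stable option.
print(simulate_ring())
```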

Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states

Physical Review E, 2017

One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. We call this procedure information trimming. As an important corollary, we consider the case where one variable is a sto...
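Written out, the claim in the abstract is the chain of equalities below, where f(X) denotes the minimal sufficient statistic of X about Y and g(Y) that of Y about X (the notation is assumed here, not taken from the paper).

```latex
% Information trimming, as described in the abstract (notation assumed):
% replacing either variable by its minimal sufficient statistic preserves
% the mutual information, and both replacements can be made at once.
\begin{equation}
  I[X : Y] \;=\; I[f(X) : Y] \;=\; I[X : g(Y)] \;=\; I[f(X) : g(Y)].
\end{equation}
```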

Transient Dissipation and Structural Costs of Physical Information Transduction

Physical Review Letters, Jan 2, 2017

A central result that arose in applying information theory to the stochastic thermodynamics of nonlinear dynamical systems is the information-processing second law (IPSL): the physical entropy of the Universe can decrease if compensated by the Shannon-Kolmogorov-Sinai entropy change of appropriate information-carrying degrees of freedom. In particular, the asymptotic-rate IPSL precisely delineates the thermodynamic functioning of autonomous Maxwellian demons and information engines. How do these systems begin to function as engines, Landauer erasers, and error correctors? We identify a minimal, and thus inescapable, transient dissipation of physical information processing, which is not captured by asymptotic rates, but is critical to adaptive thermodynamic processes such as those found in biological systems. As a component of transient dissipation, we also identify an implementation-dependent cost that varies from one physical substrate to another for the same information processing ta...

Correlation-powered information engines and the thermodynamics of self-correction

Physical Review E, 2017

Information engines can use structured environments as a resource to generate work by randomizing ordered inputs and leveraging the increased Shannon entropy to transfer energy from a thermal reservoir to a work reservoir. We give a broadly applicable expression for the work production of an information engine, generally modeled as a memoryful channel that communicates inputs to outputs as it interacts with an evolving environment. The expression establishes that an information engine must have more than one memory state in order to leverage input-environment correlations. To emphasize this functioning, we designed an information engine powered solely by temporal correlations and not by statistical biases, as employed by previous engines. Key to this is the engine's ability to synchronize: the engine automatically returns to a desired dynamical phase when thrown into an unwanted, dissipative phase by corruptions in the input, that is, by unanticipated environmental fluctuations. T...

Information Accessibility and Cryptic Processes: Linear Combinations of Causal States

We show in detail how to determine the time-reversed representation of a stationary hidden stochastic process from linear combinations of its forward-time ε-machine causal states. This also gives a check for the k-cryptic expansion recently introduced to explore the temporal range over which internal state information is spread.
