Louis Narens - Academia.edu

Papers by Louis Narens

Research paper thumbnail of Modern Measurement

The Pursuit of Happiness, 2020

Extensive measurement from Helmholtz and Hölder is the standard physical account. It gives a ratio scale. Stevens introduces interval and ordinal scales. Scott and Suppes give a representational foundation to Stevens' scales. Suppes, Zinnes, Luce, and Narens analyse meaningfulness.

Research paper thumbnail of The Pursuit of Happiness

The Pursuit of Happiness, 2020

This is a preview and roadmap of the entire book.

Research paper thumbnail of Why Investigate Metacognition?

Research paper thumbnail of Intrinsic Archimedeanness and the Continuum

Research paper thumbnail of Measurement, Theory of

Research paper thumbnail of The Irony of Measurement by Subjective Estimations (doi:10.1006/jmps.2002.1429)

…measurement that radically differed from the dominant theory of the time. The dominant theory held that all strong forms of scientific measurement (for example, those that yielded ratio scales) had to be based on an observable ordering and an observable commutative and associative operation. Stevens proposed different criteria and introduced his method of magnitude estimation. Stevens, as well as measurement theorists, considered his method to be radically different from those based on commutative and associative operations. Although his method was controversial, it became a standard tool in the behavioral sciences. This article argues that Stevens' method, together with implicit assumptions he made about the scales of measurement it generated, is from a mathematical perspective the same as the measurement process based on commutative and associative operations. The article also provides a theory of qualitative numbers and shows an interesting relationship…

Research paper thumbnail of MEASUREMENT WITHOUT ARCHIMEDEAN AXIOMS

Axiomatizations of measurement systems usually require an axiom, called an Archimedean axiom, that allows quantities to be compared. This type of axiom has a different form from the other measurement axioms and cannot, except in the most trivial cases, be empirically verified. In this paper, representation theorems for extensive measurement structures without Archimedean axioms are given. Such structures are represented in measurement spaces that are generalizations of the real number system. Furthermore, a precise description of "Archimedean axioms" is given, and it is shown that in all interesting cases "Archimedean axioms" are independent of the other measurement axioms. 1. Preliminaries. Notation. Throughout this paper the following convention will be observed: Re will stand for the real numbers; Re+ for the positive real numbers; I for the set of integers; I+ for the set of positive integers; and (x_1, ..., x_n) for ordered n-tuples…
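For concreteness, the kind of axiom this abstract says is being dispensed with can be stated in its standard textbook form for a positive extensive structure (a sketch of the usual formulation, not necessarily the paper's exact wording):

```latex
% Standard Archimedean axiom for a positive extensive structure
% <A, \succsim, \circ>, where na is defined inductively by
% 1a = a and (n+1)a = na \circ a:
\text{For all } a, b \in A \text{ there exists } n \in \mathbb{N}
\text{ such that } na \succsim b.
```

Informally: no element is infinitesimally small relative to another, since finitely many copies of any element eventually exceed any other. Because the quantifier ranges over all natural numbers, the axiom cannot be checked by finitely many observations, which is the unverifiability the abstract refers to.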

Research paper thumbnail of PSYCHOLOGICAL SCIENCE Research Article: UTILIZATION OF METACOGNITIVE JUDGMENTS IN THE ALLOCATION OF STUDY DURING MULTITRIAL LEARNING

Abstract—We contrasted several ways that an individual's judgments of learning (JOLs) can be utilized when allocating additional study ("restudy") during the learning of Swahili-English translation equivalents. The findings demonstrate that metacognitive monitoring can be utilized to benefit multitrial learning. Computer-controlled allocation of restudy based on people's JOLs was equivalent to most people's own allocation of restudy (indicating that the computer algorithm can provide a sufficient account of people's allocation of restudy) and was more effective than a computer-controlled allocation based on normative performance (indicating that people's metacognitive monitoring of idiosyncratic knowledge has functional utility in causal chains for learning). Self-monitoring and control are fundamental categories of metacognition and consciousness (Kihlstrom, 1984). Few people nowadays would doubt the importance of self-monitoring as a construct in theories of meta…
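The computer-controlled allocation policy described above (give restudy to the items a learner judges least well learned) can be sketched minimally. The function name, the 0-100 JOL scale, and the example items are illustrative assumptions, not details taken from the paper:

```python
def allocate_restudy(jols, n_restudy):
    """Select the n_restudy items with the lowest judgments of learning.

    jols: dict mapping item -> JOL rating (assumed 0-100 scale).
    Returns the items chosen for restudy, lowest JOLs first.
    """
    ranked = sorted(jols, key=jols.get)  # ascending by JOL rating
    return ranked[:n_restudy]

# Hypothetical Swahili-English items with illustrative JOL ratings.
jols = {"rafiki": 20, "maji": 85, "ndege": 40, "kitabu": 10}
print(allocate_restudy(jols, 2))  # -> ['kitabu', 'rafiki']
```

The contrast the abstract reports is between this learner-specific ranking and a normative policy that ranks items by average difficulty across learners; the JOL-based policy wins because it tracks idiosyncratic knowledge.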

Research paper thumbnail of A theory of belief (Journal of Mathematical Psychology 47 (2003) 1–31)

A theory of belief is presented in which uncertainty has two dimensions. The two dimensions have a variety of interpretations. The article focusses on two of these interpretations. The first is that one dimension corresponds to probability and the other to "definiteness," which itself has a variety of interpretations. One interpretation of definiteness is as the ordinal inverse of an aspect of uncertainty called "ambiguity" that is often considered important in the decision theory literature. (Greater ambiguity produces less definiteness and vice versa.) Another interpretation of definiteness is as a factor that measures the distortion of an individual's probability judgments that is due to specific factors involved in the cognitive processing leading to judgments. This interpretation is used to provide a new foundation for support theories of probability judgments and a new formulation of the "Unpacking Principle" of Tversky and Koehler. The second interpretation of the tw…

Research paper thumbnail of Pseudo Complemented Distributive Lattices

Advanced Series on Mathematical Psychology, 2015

Research paper thumbnail of Metacognitive Aspects of Implicit/Explicit Memory

Implicit Memory and Metacognition, 2014

Research paper thumbnail of Orthomodular Modeling of Psychological Paradigms

Research paper thumbnail of Torgerson’s Conjecture

Research paper thumbnail of Modeling Decisions Involving Ambiguous, Vague, or Rare Events

Almost all models of decision making assume an underlying boolean space of events. This gives a logical structure to events that matches the structure of propositions of classical logic. This chapter takes a different approach, employing events that form a topology instead of a boolean algebra. This allows for new modeling concepts for judgmental heuristics, rare events, and the influence of context on decisions.
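The contrast drawn above can be made concrete: a family of events can satisfy the open-set axioms of a topology while failing closure under complementation, so it is not a boolean algebra. A minimal sketch, using an illustrative three-point space rather than anything from the chapter:

```python
from itertools import combinations

# Sample space and a family of "events" on X.
X = frozenset({1, 2, 3})
family = {frozenset(), frozenset({1}), frozenset({1, 2}), X}

def is_topology(X, fam):
    """Check the open-set axioms: contains the empty set and X,
    and is closed under union and (finite) intersection."""
    if frozenset() not in fam or X not in fam:
        return False
    return all(a | b in fam and a & b in fam
               for a, b in combinations(fam, 2))

def closed_under_complement(X, fam):
    """Boolean-algebra requirement that a topology need not meet."""
    return all(X - a in fam for a in fam)

assert is_topology(X, family)
# {1} is an event, but its complement {2, 3} is not: the family
# is a topology yet not a boolean algebra of events.
assert not closed_under_complement(X, family)
```

Dropping complementation is what lets such models treat an event and its negation asymmetrically, which classical (boolean) event spaces cannot do.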

Research paper thumbnail of Basic Lattice Theory

Research paper thumbnail of Probability and Coherence

Research paper thumbnail of Axioms for Choice Proportions

Research paper thumbnail of An Application of Quantum Logic to Experimental Behavioral Science

Quantum Reports, 2021

In 1933, Kolmogorov synthesized the basic concepts of probability that were in general use at the time into concepts and deductions from a simple set of axioms that said probability was a σ-additive function from a boolean algebra of events into [0, 1]. In 1932, von Neumann realized that the use of probability in quantum mechanics required a different concept, which he formulated as a σ-additive function from the closed subspaces of a Hilbert space onto [0, 1]. In 1935, Birkhoff and von Neumann replaced Hilbert space with an algebraic generalization. Today, a slight modification of the Birkhoff-von Neumann generalization is called "quantum logic". A central problem in the philosophy of probability is the justification of the definition of probability used in a given application. This is usually done by arguing for the rationality of that approach to the situation under consideration. A version of the Dutch book argument given by de Finetti in 1972 is often used to justify the Kolmogorov…
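The Kolmogorov picture summarized above can be checked mechanically in the finite case, where the boolean algebra of events is a power set and σ-additivity reduces to finite additivity. A minimal sketch with an illustrative three-outcome space and weights of my own choosing:

```python
from itertools import chain, combinations
from fractions import Fraction

# Illustrative sample space and outcome weights (exact arithmetic).
omega = frozenset({1, 2, 3})
weights = {1: Fraction(1, 2), 2: Fraction(1, 3), 3: Fraction(1, 6)}

def powerset(s):
    """All subsets of s: the boolean algebra of events."""
    s = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

def P(event):
    """Probability of an event as the sum of its outcome weights."""
    return sum(weights[x] for x in event)

events = powerset(omega)
# Kolmogorov's axioms (finite case): P maps events into [0, 1],
# P(omega) = 1, and P is additive over disjoint events.
assert all(0 <= P(a) <= 1 for a in events)
assert P(omega) == 1
assert all(P(a | b) == P(a) + P(b)
           for a in events for b in events if not a & b)
```

The quantum-logic setting the abstract goes on to describe replaces this power set with the closed subspaces of a Hilbert space, where distributivity (and hence this simple additivity picture) fails.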

Research paper thumbnail of Surmising Cognitive Universals for Extraterrestrial Intelligences

International Astronomical Union Colloquium, 1997

Cognitive universals are concepts that our civilization and technologically advanced extraterrestrial civilizations can easily interpret. The universality of certain mathematically and perceptually based concepts is discussed. It is argued that continuously based concepts are more fertile ground for surmising cognitive universals than discretely based ones, and in particular, one should be suspicious of the use of inductively based numerical concepts, including the totality of natural numbers. Ideas about intuitive evolutionary theory, physical and perceptual invariance, and the efficient processing of information are linked to provide a framework for searching for cognitive universals.

Research paper thumbnail of Subthreshold Priming and Memory Monitoring
