10031 Abstracts Collection – Quantitative Models: Expressiveness and Analysis

Quantitative Models: Expressiveness, Analysis, and New Applications (Dagstuhl Seminar 14041)

2014

From Jan. 19 to Jan. 24, 2014, Dagstuhl Seminar 14041 "Quantitative Models: Expressiveness, Analysis, and New Applications" was held at Schloss Dagstuhl-Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are compiled in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided where available.

Report from Dagstuhl Seminar 14041

From Jan. 19 to Jan. 24, 2014, Dagstuhl Seminar 14041 "Quantitative Models: Expressiveness, Analysis, and New Applications" was held at Schloss Dagstuhl-Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar, as well as abstracts of seminar results and ideas, are compiled in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided where available.

Quantitative models and implicit complexity

FSTTCS 2005: Foundations of Software …, 2005

We give new proofs of soundness (all representable functions on base types lie in certain complexity classes) for Elementary Affine Logic, LFPL (a language for polytime computation close to realistic functional programming introduced by one of us), Light Affine Logic and Soft Affine Logic. The proofs are based on a common semantical framework which is merely instantiated in four different ways. The framework consists of an innovative modification of realizability which allows us to use resource-bounded computations as realisers, as opposed to including all Turing computable functions as is usually the case in realizability constructions. For example, all realisers in the model for LFPL are polynomially bounded computations, whence soundness holds by construction of the model. The work then lies in being able to interpret all the required constructs in the model. While being the first entirely semantical proof of polytime soundness for light logics, our proof also provides a notable simplification of the original, already semantical, proof of polytime soundness for LFPL. A new result made possible by the semantic framework is the addition of polymorphism and a modality to LFPL, thus allowing for an internal definition of inductive datatypes. (The paper's encoding of binary strings as lambda terms is given by Φ(ε) = λx.λy.λz. z, Φ(0s) = λx.λy.λz. x Φ(s), Φ(1s) = λx.λy.λz. y Φ(s).)
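The Φ equations above define a lambda-term encoding of binary strings: a string is consumed by supplying one continuation for a leading 0, one for a leading 1, and one for the empty string. Below is a minimal sketch of this style of encoding in Python, using ordinary closures; the helper names phi and decode are illustrative and not taken from the paper:

```python
# Illustrative sketch of the binary-string encoding Phi from the abstract:
#   Phi(eps) = \x.\y.\z. z
#   Phi(0s)  = \x.\y.\z. x Phi(s)
#   Phi(1s)  = \x.\y.\z. y Phi(s)
# A string is represented as a curried function taking three continuations:
# one for "starts with 0", one for "starts with 1", one for "empty".

def phi(bits: str):
    """Encode a binary string as a lambda term (curried closure)."""
    if bits == "":
        return lambda x: lambda y: lambda z: z
    head, tail = bits[0], bits[1:]
    if head == "0":
        return lambda x: lambda y: lambda z: x(phi(tail))
    return lambda x: lambda y: lambda z: y(phi(tail))

def decode(term) -> str:
    """Recover the string by supplying the three continuations."""
    return term(lambda s: "0" + decode(s))(lambda s: "1" + decode(s))("")

assert decode(phi("0110")) == "0110"
```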

From approximative to descriptive models

Ninth IEEE International Conference on Fuzzy Systems. FUZZ- IEEE 2000 (Cat. No.00CH37063), 2000

The University of Edinburgh, UK. {javierg, qiangs}@dai.ed.ac.uk. Abstract: This paper presents an effective and efficient technique for translating rules that use approximative sets to rules that use descriptive sets and linguistic hedges of predefined meaning. The translated descriptive rules will be functionally equivalent to the original approximative ones, or the closest equivalence possible, while reflecting their underlying semantics. Thus, descriptive models can take advantage of any existing approach to approximative modelling, which is generally efficient and accurate, whilst employing rules that are comprehensible to human users.
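As a rough illustration of the general idea, and not the paper's actual translation algorithm: an approximative fuzzy set can be mapped to the closest member of a predefined descriptive vocabulary by comparing membership functions over a sampled universe. The vocabulary, the triangular set shapes, and the similarity measure below are all assumptions made for this sketch:

```python
import numpy as np

def triangle(a, b, c):
    """Return a triangular membership function with feet a, c and peak b."""
    def mu(x):
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)
    return mu

# Hypothetical predefined descriptive vocabulary over the universe [0, 100].
VOCAB = {
    "low":    triangle(0, 25, 50),
    "medium": triangle(25, 50, 75),
    "high":   triangle(50, 75, 100),
}

def closest_label(approx_mu, universe=np.linspace(0, 100, 201)):
    """Pick the descriptive label whose membership function is closest
    (smallest mean absolute difference) to the given approximative set."""
    target = approx_mu(universe)
    return min(VOCAB, key=lambda name: np.abs(VOCAB[name](universe) - target).mean())

# An approximative set produced by some data-driven method (invented here):
learned_set = triangle(40, 55, 70)
print(closest_label(learned_set))  # -> "medium"
```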

Computational model theory: An overview

1998

The computational complexity of a problem is the amount of resources, such as time or space, required by a machine that solves the problem. The descriptive complexity of a problem is the complexity of describing it in some logical formalism over finite structures. One of the exciting developments in complexity theory is the discovery of a very intimate connection between computational and descriptive complexity.
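As a textbook-style illustration of this connection (not an example from the paper): a fixed first-order formula such as ∃z (E(x, z) ∧ E(z, y)) describes the "two-step reachability" query, and evaluating it over any finite graph by brute force takes polynomial time, reflecting the fact that first-order-definable queries are polynomial-time computable. A small sketch, with an invented graph:

```python
from itertools import product

# A finite structure: a directed graph given by its vertex set and edge relation.
V = {0, 1, 2, 3}
E = {(0, 1), (1, 2), (2, 3)}

def two_step(x, y) -> bool:
    """Evaluate the first-order formula  exists z ( E(x,z) and E(z,y) )."""
    return any((x, z) in E and (z, y) in E for z in V)

# Brute-force evaluation of the defined query over all pairs: O(|V|^3) time,
# illustrating that a fixed first-order description yields a polynomial-time check.
answers = {(x, y) for x, y in product(V, V) if two_step(x, y)}
print(answers)  # {(0, 2), (1, 3)}
```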

UNIVERSITATIS APULENSIS No 10 / 2005 Proceedings of the International Conference on Theory and Application of Mathematics and Informatics

2006

Data consolidation is the process of synthesizing pieces of information into a single block of essential knowledge. The highest level in the data consolidation process is captured by the concept of data dimension. There are a number of different dimensions from which a given pool of data can be analyzed. A multidimensional conceptual view is the way most business people organize their global enterprise; the user-analyst's view of the enterprise's universe is a multidimensional one. Accordingly, the user-analyst's conceptual view underlying OnLine Analytical Processing (OLAP) models, introduced in 1993, is a multidimensional one. A multidimensional data type, implemented through a topological model, provides an important theoretical tool for OLAP [9]. The main target of data management is to retrieve data following a query pattern. This query pattern can be provided as a restriction of the class of simplicial complexes given by spatial data objects. Following this strategy ...
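As a generic illustration of the multidimensional view, and not of the topological/simplicial model developed in the paper: a small data cube can be represented as a mapping from dimension coordinates to measures, with roll-up as aggregation over one dimension and slicing as fixing a coordinate. The dimensions and figures below are invented:

```python
from collections import defaultdict

# Hypothetical three-dimensional cube: (product, region, quarter) -> sales.
cube = {
    ("widget", "EU", "Q1"): 120,
    ("widget", "EU", "Q2"): 150,
    ("widget", "US", "Q1"):  90,
    ("gadget", "EU", "Q1"):  70,
    ("gadget", "US", "Q2"): 110,
}

def roll_up(cube, drop_axis):
    """Aggregate the cube by summing out one dimension (0=product, 1=region, 2=quarter)."""
    out = defaultdict(int)
    for coords, value in cube.items():
        reduced = tuple(c for i, c in enumerate(coords) if i != drop_axis)
        out[reduced] += value
    return dict(out)

def slice_(cube, axis, value):
    """Fix one dimension to a single value, keeping the remaining coordinates."""
    return {k: v for k, v in cube.items() if k[axis] == value}

print(roll_up(cube, drop_axis=2))       # totals per (product, region)
print(slice_(cube, axis=1, value="EU")) # the EU slice of the cube
```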

On Properties of Modeling Approaches

2013

This paper argues for three fundamental properties that every modeling approach should possess. It presents a simple classification schema for models and proposes some comparison criteria. Finally, it provides some reflection on properties currently in the CMA "parking ...

Special Issue on the Ninth European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty (ECSQARU 2007)

International Journal of Approximate Reasoning, 2009

This special issue of the International Journal of Approximate Reasoning is devoted to some of the best papers presented at the Ninth European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty (ECSQARU'07), which took place in Hammamet, Tunisia, from October 31 to November 2, 2007. For this edition, we have selected a collection of 11 papers (from 75 accepted at ECSQARU 2007), chosen following the recommendations of reviewers, chairmen and the editorial committee of this edition. Extended versions of these papers were submitted for this special edition and, for each of them, at least three reviewers were assigned to report on the extended versions. Finally, seven papers have been accepted for publication in this edition.

The first two papers deal with logics under uncertainty. The first one, by Codara et al., addresses the problem of describing, in the language of Gödel logic, the notion of "Ruspini partition" for a (finite) set P of fuzzy sets on [0, 1]. The authors provide an in-depth analysis of the best approximation to such a notion expressible in Gödel logic; for that purpose, they resort to the combinatorial representation of free Gödel algebras in terms of forests, obtaining a formula which axiomatizes the related notion of "weak Ruspini partition". The second paper, by Lukasiewicz and Straccia, presents a semantic-web oriented language mixing fuzzy and probabilistic aspects. Its main contribution is the management of fuzzy vagueness and probabilistic uncertainty in a single framework for the semantic web.

The third accepted paper in this collection has argumentation as its main topic. In this paper, Baroni and Giacomin introduce a set of skepticism relations, providing a formal counterpart to several alternative notions of skepticism at an intuitive level. A systematic comparison of a significant set of semantics on the basis of the proposed skepticism relations is also performed. The fourth accepted paper, by Holeña, proposes quality measures of sets of rules extracted from data. The author introduces three approaches to extend quality measures from classification rulesets to general rulesets and discusses one of these approaches in detail. The paper also proposes a generalization of ROC curves to general rulesets.

The fifth paper in this collection, by Dubois and Fargier, deals with qualitative evaluation processes where the worth of items is computed by means of the Sugeno integral. It presents an in-depth comparison of Sugeno-integral-based decision making with respect to the classical axioms of Savage and shows several deficiencies of qualitative decision making. The authors develop several approaches based on the Choquet integral to overcome the weak discriminating power of the Sugeno integral. The sixth paper, by Bonzon et al., explores (n-player) Boolean games defined over compact propositional languages, considering the two cases where the players have dichotomous and non-dichotomous preferences. The paper defines graphical dependencies between the players, as well as Nash equilibria as solutions to Boolean games. It turns out that Nash equilibria have simple characterizations in certain graphical special cases.

Finally, the last accepted paper, by Ogryczak and Sliwinski, addresses the problem of averaging outcomes under several scenarios to form overall objective functions, a problem of considerable importance in decision support under uncertainty. The so-called Weighted OWA (WOWA) aggregation offers a well-suited approach to this problem. The authors propose a subtle approach to reduce the WOWA optimization problem to a linear program. They also study a generalized WOWA objective function, related to a popular measure in financial applications (CVaR), and show that a very similar approach can still be applied.

These accepted papers constitute only a small subset of the papers presented at ECSQARU 2007 but, as detailed above, they belong to different fields of symbolic and quantitative approaches to reasoning with uncertainty. We hope that the reader of this special issue will find them equally interesting.
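For readers unfamiliar with OWA-style aggregation (background only, not material from the special issue): the Ordered Weighted Averaging operator applies its weights to the sorted outcomes rather than to fixed scenarios, and WOWA additionally folds in per-scenario importance weights. A minimal sketch of plain OWA, with made-up outcomes and weights:

```python
def owa(outcomes, order_weights):
    """Ordered Weighted Averaging: apply the i-th weight to the i-th largest outcome."""
    assert len(outcomes) == len(order_weights)
    assert abs(sum(order_weights) - 1.0) < 1e-9
    ranked = sorted(outcomes, reverse=True)
    return sum(w * x for w, x in zip(order_weights, ranked))

# Outcomes of one decision under four scenarios (illustrative numbers).
outcomes = [40.0, 10.0, 25.0, 55.0]

# Emphasising the worst outcomes yields a risk-averse evaluation ...
print(owa(outcomes, [0.1, 0.2, 0.3, 0.4]))  # 25.0
# ... while uniform weights recover the plain average.
print(owa(outcomes, [0.25, 0.25, 0.25, 0.25]))  # 32.5
```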