An Intensional Functional Model of Trust

Towards a formal notion of trust

Proceedings of the 5th ACM SIGPLAN International Conference on Principles and Practice of Declarative Programming (PPDP '03), 2003

Trust management systems have been proposed as an alternative to traditional security mechanisms in Global Computing. We present some challenges in establishing a formal foundation for the notion of trust, and some preliminary ideas towards a category of trust models.

Towards a dynamic and composable model of trust

Symposium on Access Control Models and Technologies, 2009

During their everyday decision making, humans consider the interplay between two types of trust: vertical trust and horizontal trust. Vertical trust captures the trust relationships that exist between individuals and institutions, while horizontal trust represents the trust that can be inferred from the observations and opinions of others. Although researchers are actively exploring both vertical and horizontal trust within the context of distributed computing (e.g., credential-based trust and reputation-based trust, respectively), the specification and enforcement of composite trust management policies involving the flexible composition of both types of trust metrics is currently an unexplored area.
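
A minimal sketch of what such a composition might look like, assuming a simple weighted-average policy with an access threshold; the class name, weights and threshold are illustrative assumptions, not taken from the paper:

    # Hypothetical composition of a credential-based (vertical) score with a
    # reputation-based (horizontal) score under a single policy threshold.
    from dataclasses import dataclass

    @dataclass
    class CompositePolicy:
        weight_vertical: float = 0.6    # assumed emphasis on institutional credentials
        weight_horizontal: float = 0.4  # assumed emphasis on peer reputation
        threshold: float = 0.7          # assumed minimum composite score to grant access

        def composite_trust(self, credential_score: float, reputation_score: float) -> float:
            # Weighted combination of the two metrics, each expected in [0, 1].
            return (self.weight_vertical * credential_score
                    + self.weight_horizontal * reputation_score)

        def decide(self, credential_score: float, reputation_score: float) -> bool:
            return self.composite_trust(credential_score, reputation_score) >= self.threshold

    policy = CompositePolicy()
    print(policy.decide(credential_score=0.9, reputation_score=0.5))  # True: 0.74 >= 0.7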

Towards trust management standardization

Computer Standards & Interfaces, 2004

Recent security research has increasingly turned to questions of trust, which have become a research topic in their own right. Most current approaches propose techniques that support users' trust processes, while only a few address the essence of trust; the latter form the basis for the work presented in this paper. Outer manifestations of the trust phenomenon are formalized in order to facilitate development in this field. The main goal is to provide means for computable trust that can be used in a standardized way in contemporary internet-based applications, independently of trust's underlying cognitive principles.

A Formal Model for Trust Lifecycle Management

2000

The rapid development of collaborative environments over the internet has highlighted new concerns over security and trust in such global computing systems. The global computing infrastructure introduces uncertainty about potential collaborators, and reaching a trusting decision in such environments requires both risk and trust assessments. While much work has been done in terms of modelling trust, the

Designing for Trust

2011

Information systems exist in every aspect of our lives, so we need to ensure that users trust these systems and continue to use them. However, developing a trustworthy system is a challenging task, both for the correct operation of the system and for its acceptance by users. Ensuring well-placed trust always depends on the trustworthiness of the trustee, i.e. the system; properties that contribute to trustworthiness therefore strongly influence the development of trust. Nevertheless, trust has to be considered from a holistic perspective, since it covers a wide range of issues belonging to different disciplines. In this paper, we present the relevant properties of trust and trustworthiness from a holistic perspective and define trustworthiness as the competence and ability of a system.

An Explicit Trust Model Towards Better System Security

Trust is an absolute necessity for digital communications, but it is often viewed as an implicit, singular entity. The use of the internet as the primary vehicle for information exchange has made accountability and verifiability of system code almost obsolete. This paper proposes a novel approach to enforcing system security by requiring an explicit definition of trust for all operating code. By identifying the various classes and levels of trust required within a computing system, trust is defined as a combination of individual characteristics and represented as a calculable metric obtained through the collective enforcement of each of these characteristics to varying degrees. System security is achieved by letting trust evolve continually for each operating code segment, growing stronger or weaker over time.
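
As an illustration of trust as a calculable metric over individually enforced characteristics, the sketch below uses assumed characteristic names, weights and a simple per-observation update rule; none of these details come from the paper:

    # Illustrative only: trust for a piece of operating code as a weighted
    # combination of individually enforced characteristics, nudged up or down
    # as the code behaves well or badly over time.
    class CodeTrust:
        def __init__(self, characteristics):
            # characteristics: {name: (score in [0, 1], weight)} -- the names and
            # weights used below are assumptions, not the paper's definitions.
            self.characteristics = dict(characteristics)

        def metric(self) -> float:
            total_weight = sum(w for _, w in self.characteristics.values())
            return sum(s * w for s, w in self.characteristics.values()) / total_weight

        def observe(self, name: str, delta: float) -> None:
            # Strengthen or weaken one characteristic after an observation.
            score, weight = self.characteristics[name]
            self.characteristics[name] = (min(1.0, max(0.0, score + delta)), weight)

    segment = CodeTrust({"provenance": (0.8, 2.0), "integrity": (0.9, 3.0), "behaviour": (0.5, 1.0)})
    print(round(segment.metric(), 3))   # 0.8
    segment.observe("behaviour", -0.2)  # misbehaviour weakens trust
    print(round(segment.metric(), 3))   # 0.767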

Modeling Trust Relationships for Developing Trustworthy Information Systems

International Journal of Information System Modeling and Design, 2014

Developing a trustworthy information system is a challenging task. The overall trustworthiness of an information system depends on trust relationships that are generally assumed without adequate justification. However, lack of appropriate analysis of such relationships and of appropriate justification of relevant trust assumptions might lead to systems that fail to fully achieve their functionalities. Existing literature does not provide adequate guidelines for a systematic process or an appropriate modeling language to support such trust-focused analysis. This paper fills this gap by introducing a process that allows developers to capture possible trust relationships and to reason about them. The process is supported by a modeling language based on a set of concepts relating to trust and control and a CASE tool. An illustrative example from the UK health care domain is used to demonstrate the applicability and usefulness of the approach.

A Vector Model of Trust for Developing Trustworthy Systems

2004

All security services rely to a great extent on some notion of trust. However, even today, there is no accepted formalism or technique for the specification of trust and for reasoning about trust. Secure systems have been developed under the premise that concepts like "trusted" or "trustworthy" are well understood, unfortunately without agreement on what "trust" means, how to measure it, how to compare two trust values, or how to combine them. In this work we propose a new vector model of trust. Our model introduces the notion of different degrees of trust, differentiates between trust and distrust, and formalizes the dependence of trust on time. We believe that our model will help answer some of the questions posed earlier.
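
A minimal sketch of two ingredients the abstract lists, signed degrees of trust and distrust and the dependence of trust on time, assuming a value in [-1, 1] and an exponential decay with a hypothetical half-life; the authors' actual vector formalism is richer than this:

    # Minimal sketch: a signed trust value in [-1, 1] (negative = distrust,
    # 0 = no opinion) that drifts back toward 0 as its supporting evidence ages.
    import math

    def decayed_trust(value: float, age: float, half_life: float = 30.0) -> float:
        # age and half_life share a unit (e.g. days); the half-life is an
        # illustrative assumption, not part of the authors' model.
        if not -1.0 <= value <= 1.0:
            raise ValueError("trust value must lie in [-1, 1]")
        return value * math.exp(-math.log(2) * age / half_life)

    print(decayed_trust(0.8, age=0.0))    # 0.8    -- fresh evidence, full weight
    print(decayed_trust(0.8, age=30.0))   # ~0.4   -- one half-life later
    print(decayed_trust(-0.6, age=60.0))  # ~-0.15 -- old distrust also fades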

Trust and Technology

In this book our effort is to model and rationalize the notion of trust, trying to capture the many varying aspects of this broad concept. Today there are many studies, models, simulations and experiments trying to integrate trust into technological infrastructures: the most advanced disciplines, from Human-Computer Interaction to Multi-Agent Systems (HCI, MAS, DAI, NCS), are forced to cope with trust. But why does trust seem to be so important in advanced technological contexts? Is it necessary to involve such a complex, fuzzy and human-related concept? Is it not sufficient to consider more technical and readily applicable notions like security? To answer these questions we have to consider which network infrastructures underlie the new communication and interaction scenarios, which features the artificial agents we must deal with should have, and which kind of computing is going to pervade future physical environments.

Trust becomes fundamental in open multi-agent systems, where the agents (which may be human beings or artificial agents owned by other human stakeholders) can enter and leave the system more or less freely. The evolution of interaction and communication paradigms toward a human style is, on the one hand, a genuinely difficult task to realize; on the other hand, it potentially increases the number of people who can access (and fruitfully use) the new technologies. Over the course of their evolution, humans have learned to cooperate in many ways and environments, on different tasks, and to achieve different goals, intentionally building (or spontaneously evolving) diverse cooperative constructs (purely interactional, technical-legal, organizational, socio-cognitive, etc.) for establishing trust among themselves. It is now necessary to remodel the trust concept for current and future scenarios (new channels and infrastructures of communication, new artificial entities, new environments), and efforts in the previously cited scientific fields (HCI, MAS, DAI, NCS) are trying to meet these requirements.

From Trust Theory: A Socio-Cognitive and Computational Model, Cristiano Castelfranchi and Rino Falcone.

An interoperable context sensitive model of trust

Journal of Intelligent Information Systems, 2009

Although the notion of trust is widely used in secure information systems, very few works attempt to formally define it or reason about it. Moreover, in most works trust is defined as a binary concept: either an entity is completely trusted or not at all. Absolute trust in an entity requires one to have complete knowledge about the entity, which is rarely the case in real-world applications. Not trusting an entity, on the other hand, prohibits all communication with the entity, rendering it useless. In short, treating trust as a binary concept is not acceptable in practice; a model is needed that incorporates the notion of different degrees of trust. We propose a model that allows us to formalize trust relationships. The trust relationship between a truster and a trustee is associated with a context and depends on the experience, knowledge, and recommendations that the truster has with respect to the trustee in the given context. We show how our model can measure trust and compare two trust relationships in a given context. Sometimes not enough information is available about a given context to evaluate trust. Towards this end we show how the relationships between different contexts can be captured using a context graph. Formalizing the relationships between contexts allows us to extrapolate values from related contexts to approximate the trust of an entity even when all the information needed to calculate that trust is not available. Finally, we show how the semantic mismatch that arises when different sources use different context graphs can be resolved, and how the trust of information obtained from these different sources can be compared.
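
The sketch below illustrates these ingredients under stated assumptions: trust in a context as a weighted combination of experience, knowledge and recommendation, plus a toy context graph used to approximate trust for a context with no direct evidence. The weights, the averaging rule and all names are illustrative, not the paper's operators:

    # Hedged sketch with assumed weights and an assumed averaging rule for
    # related contexts; the paper's own operators may differ.
    def trust(experience: float, knowledge: float, recommendation: float,
              weights=(0.5, 0.3, 0.2)) -> float:
        # Trust of a truster in a trustee for one context; each input in [0, 1].
        we, wk, wr = weights
        return we * experience + wk * knowledge + wr * recommendation

    # Toy context graph: when no direct evidence exists for a context, approximate
    # its trust from the contexts it is related to.
    context_graph = {"online_payment": ["e_commerce", "banking"]}
    known_trust = {"e_commerce": 0.7, "banking": 0.9}

    def extrapolate(context: str) -> float:
        if context in known_trust:
            return known_trust[context]
        related = [known_trust[c] for c in context_graph.get(context, []) if c in known_trust]
        return sum(related) / len(related) if related else 0.0  # 0.0 = no opinion

    print(round(trust(0.8, 0.6, 0.9), 2))  # 0.76
    print(extrapolate("online_payment"))   # 0.8, averaged from related contexts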