eCite Digital Repository

On Trust and Privacy in Context-Aware Systems

Citation

Wagealla, W and Terzis, S and English, C and Nixon, P, On Trust and Privacy in Context-Aware Systems, Second iTrust Workshop, 15-17 September 2003, Imperial College London (2003) [Conference Extract]

PDF: Not available (166Kb)

Abstract

Recent advances in networking, handheld computing and sensor technologies have led to the emergence of context-aware systems. The vast amounts of personal information collected by such systems have led to growing concerns about the privacy of their users. Users concerned about their private information are likely to refuse participation in such systems. It is therefore clear that for any context-aware system to be acceptable to its users, mechanisms for controlling access to personal information are a necessity. According to Alan Westin, "privacy is the claim of individuals, groups, or institutions to determine for themselves when, how and to what extent information is communicated to others." Within this context we can classify users as either information owners or information receivers. It is also acknowledged that information owners are willing to disclose personal information if this disclosure is potentially beneficial. The acceptance of any context-aware system therefore depends on the provision of mechanisms for fine-grained control over the disclosure of personal information, incorporating an explicit notion of benefit.

In the SECURE project, we envisage that trust can be exploited to protect users' privacy, in the sense that reasoning about the trustworthiness of information receivers allows us to decide the amount of information that can be disclosed to them. Our approach also incorporates an explicit notion of risk: reasoning about the risk involved in interactions allows us to adjust the amount of disclosed information according to the expected benefit. For example, information might only be revealed to trustworthy users, i.e. users that are expected to provide significant benefits to the information owner. Moreover, each information owner can specify his/her own privacy policy, articulating his/her preferences by adjusting his/her risk aversion (acceptable costs) for the various outcomes of an interaction. Allowing information owners to specify their own privacy policies is very important because users have significantly different attitudes towards privacy. Furthermore, our model of trust and risk supports learning from past interactions: we observe the outcome of each interaction and adjust the information receiver's trustworthiness to reflect our observations.

We have applied the approach outlined above in a smart space scenario. The scenario looks at a university department equipped with a context information server, which tracks the location of users and can provide location information on demand. Access to the location information is controlled by the privacy policy of the tracked user (the information owner), which is expressed in terms of the trustworthiness of the requesting users (the information receivers). The application of our approach to this scenario has provided useful insight into the engineering of trust-based privacy solutions. We are currently evaluating the performance of our approach in the context of this scenario.
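To make the disclosure mechanism concrete, the following is a minimal Python sketch of the kind of trust-based access control the abstract describes, applied to the location-tracking scenario. All names, the trust thresholds, and the linear trust-update rule are illustrative assumptions, not the SECURE project's actual model.

    # Hypothetical sketch of trust-based disclosure for location information.
    # Thresholds, granularities and the update rule are assumptions for
    # illustration only; they are not taken from the SECURE project.

    from dataclasses import dataclass, field

    # Disclosure granularities for location information, coarsest to finest.
    GRANULARITIES = ["none", "building", "floor", "room"]

    @dataclass
    class PrivacyPolicy:
        """An information owner's policy: the minimum trust required for
        each disclosure level, reflecting that owner's risk aversion."""
        thresholds: dict  # granularity -> minimum trust in [0, 1]

        def disclosure_level(self, trust: float) -> str:
            # Reveal the finest granularity whose trust threshold is met.
            level = "none"
            for g in GRANULARITIES[1:]:
                if trust >= self.thresholds[g]:
                    level = g
            return level

    @dataclass
    class TrustModel:
        """Per-receiver trust values, updated from observed outcomes."""
        trust: dict = field(default_factory=dict)

        def get(self, receiver: str) -> float:
            return self.trust.get(receiver, 0.5)  # neutral prior for strangers

        def observe(self, receiver: str, beneficial: bool, rate: float = 0.1) -> None:
            # Move trust toward 1 after a beneficial outcome, toward 0 otherwise.
            t = self.get(receiver)
            target = 1.0 if beneficial else 0.0
            self.trust[receiver] = t + rate * (target - t)

    # Usage: a risk-averse owner demands high trust for fine-grained disclosure.
    policy = PrivacyPolicy(thresholds={"building": 0.3, "floor": 0.6, "room": 0.85})
    model = TrustModel()

    print(policy.disclosure_level(model.get("bob")))  # "building" at neutral trust 0.5
    for _ in range(10):
        model.observe("bob", beneficial=True)         # repeated beneficial interactions
    print(round(model.get("bob"), 2))                 # ~0.83: trust has risen
    print(policy.disclosure_level(model.get("bob")))  # "floor": still below 0.85 for "room"

Note how the per-owner thresholds play the role of the risk-aversion settings mentioned in the abstract: a more risk-averse owner simply raises the trust required for each granularity.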

Item Details

Item Type: Conference Extract
Research Division: Information and Computing Sciences
Research Group: Information Systems
Research Field: Computer-Human Interaction
Objective Division: Information and Communication Services
Objective Group: Computer Software and Services
Objective Field: Computer Software and Services not elsewhere classified
Author: Nixon, P (Professor Paddy Nixon)
ID Code: 69433
Year Published: 2003
Deposited By: Research Division
Deposited On: 2011-04-20
Last Modified: 2011-05-26
Downloads: 0
