Data Privacy

The problem of data privacy is to verify that confidential information stored in an information system is not disclosed to unauthorized users, so that personal and other sensitive data remain private. The main challenge in this context is to share some data while protecting other, personally identifiable information. The aim of our project is to develop formal methods and corresponding algorithms that enable automated reasoning about data privacy. Data stored in a relational database or knowledge base system is usually protected from unauthorized access. Users of such a system are then only allowed to access a limited portion of the stored information. In this situation the following important questions arise:
  1. What can a user infer from the information to which he has access?
  2. Can we guarantee that a user cannot obtain knowledge about certain sensitive information?
  3. Is it possible to grant a user access to information in such a way that she can fulfil her duties without learning any secret information?
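To illustrate the first two questions, consider a minimal sketch of inference from accessible data. All facts, rules, and names below are invented for illustration; the point is only that a user can derive information that is never explicitly stored in the portion of the data she is allowed to see.

```python
# Hypothetical example: facts the user is authorized to access.
visible_facts = {
    ("works_in", "alice", "oncology"),
    ("treats", "oncology", "cancer"),
}

def infer(facts):
    """Compute the closure of the facts under one background rule:
    if X works in department D and D treats disease Z, then X's
    patients may have disease Z. (Rule invented for illustration.)"""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for (_, person, dept) in [f for f in derived if f[0] == "works_in"]:
            for (_, d, disease) in [f for f in derived if f[0] == "treats"]:
                if d == dept:
                    new = ("patients_may_have", person, disease)
                    if new not in derived:
                        derived.add(new)
                        changed = True
    return derived

closure = infer(visible_facts)
# The sensitive conclusion is nowhere in the visible data, yet it is
# derivable from what the user can see together with background knowledge:
print(("patients_may_have", "alice", "cancer") in closure)  # True
```

This is why restricting raw access alone does not guarantee privacy: one must also reason about what follows from the accessible data.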
We will address these questions not in their full generality; rather, we intend to study the following concrete issue. Controlled query evaluation is an approach to privacy-preserving query answering in which the answer to a query is distorted whenever it would leak sensitive information to the user. We plan to develop a formal framework that enables controlled query evaluation for ontological knowledge base systems. To achieve this aim we will mainly use tools and techniques from modal logic in general and description logic in particular.
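The distortion mechanism can be sketched in a toy propositional setting. The censor policy below (refusal) and all names are illustrative assumptions, not the framework the project will develop; real controlled query evaluation must also account for logical inference across sequences of queries.

```python
# Toy knowledge base of ground facts and a privacy policy (illustrative).
facts = {"alice_is_employee", "alice_has_cancer", "bob_is_employee"}
secrets = {"alice_has_cancer"}  # policy: must never be revealed

def cqe_answer(query: str) -> str:
    """Answer a ground query under controlled query evaluation,
    refusing when the truthful answer concerns a secret."""
    # The decision to refuse must not depend on the actual truth value
    # of the query: refusing only when the answer would be "yes" would
    # itself reveal the secret, since a refusal then implies "yes".
    if query in secrets:
        return "refused"
    return "yes" if query in facts else "no"

print(cqe_answer("bob_is_employee"))   # yes
print(cqe_answer("alice_has_cancer"))  # refused
```

Extending this idea from finite fact sets to ontological knowledge bases, where answers follow by description logic reasoning rather than membership tests, is precisely where the formal framework is needed.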