Anomaly detection systems are usually employed to monitor database activities
in order to detect security incidents. These systems raise alerts when anomalous
activities are detected. The alerts raised have to be analyzed to timely respond
to the security incidents. Their analysis, however, is time-consuming and costly.
This problem increases with the large number of alerts often raised by anomaly detection
systems. To timely and effectively handle security incidents, alerts should
be accompanied by information which allows the understanding of incidents and
their context (e.g., root causes, attack type) and their prioritization (e.g., criticality
level). Unfortunately, the current state of affairs regarding the information about
alerts provided by existing anomaly detection systems is not very satisfactory. This
work presents an anomaly analysis framework that facilitates the analysis of alerts
raised by an anomaly detection system monitoring a database system. The framework
provides an approach to assess the criticality of alerts with respect to the
disclosure of sensitive information and a feature-based approach for the classification
of alerts with respect to database attacks. The framework has been deployed as
a web-based alert audit tool that provides alert classification and risk-based ranking
capabilities, significantly easing the analysis of alerts. We validate the classification
and ranking approaches using synthetic data generated through an existing
healthcare management system.
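The risk-based ranking idea described above can be illustrated with a minimal sketch. This is not the framework's actual method; the sensitivity weights, column names, and scoring formula below are all hypothetical, chosen only to show how alerts might be ordered by the sensitivity of the data an anomalous query could disclose.

```python
import math

# Hypothetical sensitivity weights per database column (illustrative only).
SENSITIVITY = {
    "patient.ssn": 1.0,
    "patient.diagnosis": 0.8,
    "patient.name": 0.5,
    "visit.date": 0.2,
}

def alert_risk(columns_touched, rows_returned):
    """Score an alert by the sensitivity of the columns the anomalous
    query touched, scaled (log-dampened) by how many rows it returned."""
    sensitivity = sum(SENSITIVITY.get(c, 0.1) for c in columns_touched)
    return sensitivity * math.log1p(rows_returned)

def rank_alerts(alerts):
    """Return alerts ordered from most to least critical."""
    return sorted(alerts, key=lambda a: alert_risk(a["columns"], a["rows"]),
                  reverse=True)

alerts = [
    {"id": 1, "columns": ["visit.date"], "rows": 3},
    {"id": 2, "columns": ["patient.ssn", "patient.diagnosis"], "rows": 500},
    {"id": 3, "columns": ["patient.name"], "rows": 10},
]
print([a["id"] for a in rank_alerts(alerts)])  # most critical first: [2, 3, 1]
```

A real deployment would derive the sensitivity weights from a data classification policy rather than hard-coding them.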
The detection and handling of data leakages is becoming a critical issue
for organizations. To this end, data leakage solutions are usually employed by
organizations to monitor network traffic and the use of portable storage devices.
These solutions often produce a large number of alerts, whose analysis is time-consuming
and costly for organizations. To effectively handle leakage incidents,
organizations should be able to focus on the most severe incidents. Therefore,
alerts need to be prioritized with respect to their severity. This work presents
a novel approach for the quantification of data leakages based on their severity.
The approach quantifies leakages with respect to the amount and sensitivity of the
leaked information as well as the ability to identify the data subjects of the leaked
information. To specify and reason on data sensitivity in an application domain,
we propose a data model representing the knowledge in the domain. We validate
our approach by analyzing data leakages within a healthcare environment.
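As a rough illustration of severity quantification (not the metric proposed in the work above), a leakage score can combine the three factors mentioned: the amount of leaked data, the sensitivity of the leaked attributes, and whether the data subjects are identifiable. All weights and attribute names below are hypothetical.

```python
import math

# Hypothetical domain sensitivity weights per attribute (illustrative only).
ATTRIBUTE_SENSITIVITY = {
    "diagnosis": 0.9,
    "medication": 0.7,
    "zip_code": 0.2,
}

def leakage_severity(n_records, attributes, identifiable):
    """Higher is worse. Identifiable leaks are weighted more heavily,
    since the leaked records can be linked back to data subjects."""
    sensitivity = sum(ATTRIBUTE_SENSITIVITY.get(a, 0.1) for a in attributes)
    id_factor = 1.0 if identifiable else 0.3
    return sensitivity * id_factor * math.log1p(n_records)

# A small identifiable leak of diagnoses can outrank a larger anonymous one.
small_identified = leakage_severity(50, ["diagnosis"], identifiable=True)
large_anonymous = leakage_severity(5000, ["diagnosis"], identifiable=False)
print(small_identified > large_anonymous)
```

The log-damping of record count is one of many possible design choices; a linear count would instead let volume dominate sensitivity.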
Recent advances in ICT have led to the rapid development of e-services and technology. Trust is a fundamental aspect for the acceptance and adoption of these new services. Reputation is commonly employed as the measure of the trustworthiness of users in on-line communities. However, to facilitate their acceptance, reputation systems should be able to deal with the trust challenges and needs of those services.
The aim of this survey is to propose a framework for the analysis of reputation systems. We elicit the requirements for reputation metrics along with the features necessary to achieve such requirements. The identified requirements and features form a reference framework which allows an objective evaluation and comparison of reputation systems. We demonstrate its applicability by analyzing and classifying a number of existing reputation systems. Our framework can serve as a reference model for the analysis of reputation systems. It is also helpful for the design of new reputation systems as it provides an analysis of the implications of design choices.
Home healthcare services are emerging as a new frontier
in healthcare practices. Data reliability, however, is
crucial for the acceptance of these new services. This work
presents a semi-automated system to evaluate the quality
of medical measurements taken by patients. The system relies
on data qualifiers to evaluate various quality aspects
of measurements. The overall quality of measurements is
determined on the basis of these qualifiers, enhanced with
a troubleshooting mechanism that guides healthcare professionals
in the investigation of the root causes of low quality values.
Innovation in information and communication technology has a great potential to create a large impact on modern healthcare. However, for the new technologies to be adopted, the innovations have to be meaningful and timely, taking into account user needs and addressing societal and ethical concerns. In this paper, we focus on ICT innovations in the home healthcare domain, in which patient safety and security, but also trust and privacy, are of utmost importance. To ensure the adoption of new healthcare services, innovative technologies need to be complemented with new methods that help patients establish trust in healthcare service providers in terms of privacy, reliability, and integrity of the data chain, as well as techniques that help service providers assess the reliability of information and data contributed by patients. This paper sketches various lines of research for the development of trusted healthcare services, namely patient compliance, reliability of information in healthcare, and user-friendly access control.
The disclosure of sensitive data to unauthorized entities is a critical issue for organizations. Timely detection
of data leakage is crucial to reduce possible damages. Therefore, breaches should be detected as early as
possible, e.g., when data are leaving the database. In this paper, we focus on data leakage detection by
monitoring database activities. We present a framework that automatically learns normal user behavior, in
terms of database activities, and detects anomalies as deviation from such behavior. In addition, our approach
explicitly indicates the root cause of an anomaly. Finally, the framework assesses the severity of data leakages
based on the sensitivity of the disclosed data.
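A toy illustration of the core idea (far simpler than the actual framework): learn a per-user profile of normal database activity as the set of (table, command) pairs seen during training, then flag queries outside the profile and report which feature deviated as a rough indication of the root cause. The user and table names are made up for the example.

```python
from collections import defaultdict

class ProfileDetector:
    """Learns normal per-user database behavior and flags deviations."""

    def __init__(self):
        self.profiles = defaultdict(lambda: {"tables": set(), "commands": set()})

    def train(self, user, table, command):
        # Record one observed normal activity for this user.
        p = self.profiles[user]
        p["tables"].add(table)
        p["commands"].add(command)

    def check(self, user, table, command):
        """Return None if the activity matches the profile; otherwise a
        list of deviating features (a rough root-cause indication)."""
        p = self.profiles[user]
        causes = []
        if table not in p["tables"]:
            causes.append(f"unseen table: {table}")
        if command not in p["commands"]:
            causes.append(f"unseen command: {command}")
        return causes or None

det = ProfileDetector()
det.train("alice", "visits", "SELECT")
det.train("alice", "patients", "SELECT")

print(det.check("alice", "visits", "SELECT"))    # None: matches profile
print(det.check("alice", "patients", "DELETE"))  # ['unseen command: DELETE']
```

Reporting *which* feature deviated, rather than just a binary alarm, is what makes the anomaly explainable to an auditor.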
The last years have seen a growing interest in collaborative systems like electronic marketplaces and P2P file sharing systems where people are intended to interact with other people. Those systems, however, are subject to security and operational risks because of their open and distributed nature. Reputation systems provide a mechanism to reduce such risks by building trust relationships among entities and identifying malicious entities. A popular reputation model is the so-called flow-based model. Most existing reputation systems based on such a model provide only a ranking, without absolute reputation values; this makes it difficult to determine whether entities are actually trustworthy or untrustworthy. In addition, those systems ignore a significant part of the available information; as a consequence, reputation values may not be accurate. In this paper, we present a flow-based reputation metric that gives absolute values instead of merely a ranking. Our metric makes use of all the available information. We study, both analytically and numerically, the properties of the proposed metric and the effect of attacks on reputation values.
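To make the flow-based idea concrete, here is a minimal sketch (not the metric proposed in the paper): reputation flows along trust ratings, and each entity's value is a damped combination of a base value and the reputation-weighted average of the ratings it receives, iterated to a fixed point. The damping factor and the rating values are illustrative assumptions.

```python
def flow_reputation(ratings, n, d=0.85, iters=100):
    """ratings: dict (i, j) -> weight in [0, 1], meaning entity i rates j.
    Returns absolute reputation values in [0, 1] for entities 0..n-1."""
    rep = [1.0] * n
    for _ in range(iters):
        new = []
        for j in range(n):
            # Reputation-weighted average of the ratings j receives.
            inflow = sum(rep[i] * w for (i, k), w in ratings.items() if k == j)
            norm = sum(rep[i] for (i, k) in ratings if k == j) or 1.0
            new.append((1 - d) + d * inflow / norm)
        rep = new
    return rep

# Entity 2 receives only low ratings, so it converges to a low value.
ratings = {(0, 1): 0.9, (1, 0): 0.8, (0, 2): 0.1, (1, 2): 0.2}
rep = flow_reputation(ratings, 3)
print(rep[2] < rep[0] and rep[2] < rep[1])
```

The key property the abstract highlights is visible here: the fixed point yields *absolute* values on a common scale, so one can set a trust threshold, rather than only compare entities by rank.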
With the growing amount of personal information exchanged over the Internet, privacy is becoming more and more of a concern for users. One of the key principles in protecting privacy is data minimisation. This principle requires that only the minimum amount of information necessary to accomplish a certain goal is collected and processed. "Privacy-enhancing" communication protocols have been proposed to guarantee data minimisation in a wide range of applications. However, currently there is no satisfactory way to assess and compare the privacy they offer in a precise way: existing analyses are either too informal and high-level, or specific to one particular system. In this work, we propose a general formal framework to analyse and compare communication protocols with respect to privacy by data minimisation. Privacy requirements are formalised independently of a particular protocol in terms of the knowledge of (coalitions of) actors in a three-layer model of personal information. These requirements are then verified automatically for particular protocols by computing this knowledge from a description of their communication. We validate our framework in an identity management (IdM) case study. As IdM systems are used more and more to satisfy the increasing need for reliable on-line identification and authentication, privacy is becoming an increasingly critical issue. We use our framework to analyse and compare four identity management systems. Finally, we discuss the completeness and (re)usability of the proposed framework.
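A very small sketch of the underlying idea, far simpler than the formal framework above: compute each actor's knowledge from the messages exchanged, then check that no actor learns more than a data-minimisation requirement allows. The actors, items, and requirement format are invented for illustration.

```python
def actor_knowledge(messages):
    """messages: list of (sender, receiver, frozenset_of_items).
    Each actor knows every item it sent or received."""
    knows = {}
    for sender, receiver, items in messages:
        knows.setdefault(receiver, set()).update(items)
        knows.setdefault(sender, set()).update(items)
    return knows

def satisfies(messages, allowed):
    """allowed: actor -> set of items the actor is permitted to learn.
    True iff no actor's derived knowledge exceeds its allowance."""
    knows = actor_knowledge(messages)
    return all(items <= allowed.get(actor, set())
               for actor, items in knows.items())

# The service provider may learn the user's age but not her identity.
messages = [("user", "provider", frozenset({"age"}))]
allowed = {"user": {"age", "name"}, "provider": {"age"}}
print(satisfies(messages, allowed))  # True: knowledge within the requirement
```

The real framework reasons about derived and inferred knowledge (and coalitions of actors), not just items literally transmitted, which is what makes the verification non-trivial.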
Trust management is an approach to access control in distributed systems where access decisions are based on policy statements issued by multiple principals and stored in a distributed manner. In trust management, the policy statements of a principal can refer to other principals' statements; thus, the process of evaluating an access request (i.e., a goal) consists of finding a "chain" of policy statements that allows access to the requested resource. Most existing goal evaluation algorithms for trust management either rely on a centralized evaluation strategy, which consists of collecting all the relevant policy statements in a single location (and therefore does not guarantee the confidentiality of intensional policies), or do not detect the termination of the computation (i.e., when all the answers to a goal have been computed). In this paper we present GEM, a distributed goal evaluation algorithm for trust management systems that relies on function-free logic programming for the specification of policy statements. GEM detects termination in a completely distributed way without disclosing intensional policies, thereby preserving their confidentiality. We demonstrate that the algorithm terminates and is sound and complete with respect to the standard semantics for logic programs.
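The notion of a "chain" of policy statements can be illustrated with a toy, fully centralized evaluator (much simpler than GEM, which evaluates goals in a distributed way): statements either assert a role membership directly or delegate a role to another principal's role, and a goal holds if some chain of statements grants it. The principals and roles below are hypothetical.

```python
def holds(statements, goal, seen=None):
    """goal: (principal, role, subject). Returns True if a chain of
    policy statements grants `subject` the role `principal.role`."""
    seen = seen or set()
    if goal in seen:            # avoid cycles in delegation chains
        return False
    seen = seen | {goal}
    principal, role, subject = goal
    for st in statements:
        # Direct membership assertion: principal.role <- subject
        if st[0] == "fact" and st[1:] == (principal, role, subject):
            return True
        # Delegation: principal.role <- delegatee.drole
        if st[0] == "delegate" and st[1] == principal and st[2] == role:
            _, _, _, delegatee, drole = st
            if holds(statements, (delegatee, drole, subject), seen):
                return True
    return False

statements = [
    ("delegate", "hospital", "doctor", "medboard", "licensed"),
    ("fact", "medboard", "licensed", "alice"),
]
print(holds(statements, ("hospital", "doctor", "alice")))  # True via the chain
```

Note what this sketch deliberately omits: GEM's contribution is precisely that such chains are found *without* shipping all statements to one evaluator, and that termination is detected in a distributed way.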
International Journal of Business Intelligence and Data Mining, 2008
Evaluating business solutions before they are deployed is essential for any organization. Risk is emerging as one of the most preeminent and accepted metrics for the evaluation of business solutions. In this paper, we present a comprehensive case study where the Tropos Goal-Risk framework is used to assess and treat risk on the basis of the likelihood and severity of failures within organizational settings. We present an analysis and an evaluation of business solutions within manufacturing enterprises.
Many security breaches occur because of the exploitation of vulnerabilities within the system. Vulnerabilities are weaknesses in the requirements, design, and implementation, which attackers exploit to compromise the system. This paper proposes a methodological framework for security requirements elicitation and analysis centered on vulnerabilities. The framework offers modeling and analysis facilities to assist system designers in analyzing vulnerabilities and their effects on the system; identifying potential attackers and analyzing their behavior in compromising the system; and identifying and analyzing the countermeasures to protect the system. The framework proposes a qualitative goal model evaluation analysis for assessing the risks of vulnerability exploitation and analyzing the impact of countermeasures on such risks.
In this paper we identify the requirements for the definition of a security framework for distributed access control in dynamic coalitions of heterogeneous systems. Based on the elicited requirements, we introduce the POLIPO framework, which combines distributed access control with ontologies to give a globally understandable semantics to policies, enabling interoperability among heterogeneous systems.
A translation of the Business Process Modeling Notation into the process calculus COWS is presented. The stochastic extension of COWS is then exploited to address quantitative reasoning about the behaviour of business processes. An example of such reasoning is shown by running the PRISM probabilistic model checker on a case study.
Privacy and data protection are pivotal issues in today's society. They concern the right of individuals to prevent the dissemination of their sensitive or confidential information. Many studies have addressed this topic from various perspectives, namely sociological, economic, legal, and technological. We recognize the legal perspective as the basis of all the others: data protection regulations set the legal principles and requirements that must be met by organizations when processing personal data. The objective of this work is to provide a reference base for the development of methodologies tailored to the design of privacy-aware systems compliant with data protection regulations.
The analysis of business solutions is one of the critical issues in industry. Risk is one of the most preeminent and accepted metrics for the evaluation of business solutions. Not surprisingly, many research efforts have been devoted to developing risk management frameworks. Among them, Tropos Goal-Risk offers a formal framework for assessing and treating risks on the basis of the likelihood and severity of failures. In this paper, we extend Tropos Goal-Risk to assess and treat risks by considering the interdependencies among actors within an organization. To make the discussion more concrete, we apply the proposed framework to the analysis of risks within manufacturing organizations. * This work was done when the author was at the University of Trento.
The last years have seen a peak in privacy-related research. The focus has been mostly on how to protect the individual from being tracked, with plenty of anonymizing solutions. We advocate another model that is closer to the "physical" world: we consider our privacy respected when our personal data is used for the purpose for which we gave it in the first place. Essentially, in any distributed authorization protocol, credentials should mention their purpose besides their powers. For this information to be meaningful, we should link it to the functional requirements of the original application. We sketch how one can modify a requirements engineering methodology to incorporate security concerns so that we explicitly trace back the high-level goals for which a functionality has been delegated by a (human or software) agent to another one. One can then directly derive purpose-based trust management solutions from the requirements. This work has been partially funded by the IST programme of the EU Commission, FET under the IST-2001-37004 WASP project and by the FIRB programme of MIUR under the RBNE0195K5 ASTRO Project. We would like to thank P. Giorgini, M. Pistore, and J. Mylopoulos for many useful discussions on Tropos. 1 Privacy is the right of individuals to determine for themselves when, how, and to what extent information about them is communicated to others (Alan Westin).
Although the concepts of security and trust play an important role in the development of information systems, they have been mainly neglected by software engineering methodologies. In this chapter we present an approach that considers security and trust throughout the software development process. Our approach integrates two prominent software engineering approaches, one that provides a security-oriented process and one that provides a trust management process.
The increasing complexity of IT systems and the growing demand for regulatory compliance are main issues in the design of IT systems. Addressing these issues requires the development of effective methods to support the analysis of regulations and the elicitation of organizational and system requirements from them. This work investigates the problem of designing regulation-compliant systems and, in particular, the challenges in eliciting and managing legal requirements.
Anomaly detection systems are usually employed to monitor database activities
in order to detect... more Anomaly detection systems are usually employed to monitor database activities
in order to detect security incidents. These systems raise alerts when anomalous
activities are detected. The alerts raised have to be analyzed to timely respond
to the security incidents. Their analysis, however, is time-consuming and costly.
This problem increases with the large number of alerts often raised by anomaly detection
systems. To timely and effectively handle security incidents, alerts should
be accompanied by information which allows the understanding of incidents and
their context (e.g., root causes, attack type) and their prioritization (e.g., criticality
level). Unfortunately, the current state of affairs regarding the information about
alerts provided by existing anomaly detection systems is not very satisfactory. This
work presents an anomaly analysis framework that facilitates the analysis of alerts
raised by an anomaly detection system monitoring a database system. The framework
provides an approach to assess the criticality of alerts with respect to the
disclosure of sensitive information and a feature-based approach for the classification
of alerts with respect to database attacks. The framework has been deployed as
a web-based alert audit tool that provides alert classification and risk-based ranking
capabilities, significantly easing the analysis of alerts. We validate the classification
and ranking approaches using synthetic data generated through an existing
healthcare management system.
""The detection and handling of data leakages is becoming a critical issue
for organizations. To... more ""The detection and handling of data leakages is becoming a critical issue
for organizations. To this end, data leakage solutions are usually employed by
organizations to monitor network traffic and the use of portable storage devices.
These solutions often produce a large number of alerts, whose analysis is timeconsuming
and costly for organizations. To effectively handle leakage incidents,
organizations should be able to focus on the most severe incidents. Therefore,
alerts need to be prioritized with respect to their severity. This work presents
a novel approach for the quantification of data leakages based on their severity.
The approach quantifies leakages with respect to the amount and sensitivity of the
leaked information as well as the ability to identify the data subjects of the leaked
information. To specify and reason on data sensitivity in an application domain,
we propose a data model representing the knowledge in the domain. We validate
our approach by analyzing data leakages within a healthcare environment.""
Recent advances in ICT have led to a vast and expeditious development of e-services and technolog... more Recent advances in ICT have led to a vast and expeditious development of e-services and technology. Trust is a fundamental aspect for the acceptance and adoption of these new services. Reputation is commonly employed as the measure of the trustworthiness of users in on-line communities. However, to facilitate their acceptance, reputation systems should be able to deal with the trust challenges and needs of those services.
The aim of this survey is to propose a framework for the analysis of reputation systems. We elicit the requirements for reputations metrics along with the features necessary to achieve such requirements. The identified requirements and features form a reference framework which allows an objective evaluation and comparison of reputation systems. We demonstrate its applicability by analyzing and classifying a number of existing reputation systems. Our framework can serve as a reference model for the analysis of reputation systems. It is also helpful for the design of new reputation systems as it provides an analysis of the implications of design choices.
"Home healthcare services are emerging as a new frontier
in healthcare practices. Data reliabili... more "Home healthcare services are emerging as a new frontier
in healthcare practices. Data reliability, however, is
crucial for the acceptance of these new services. This work
presents a semi-automated system to evaluate the quality
of medical measurements taken by patients. The system relies
on data qualifiers to evaluate various quality aspects
of measurements. The overall quality of measurements is
determined on the basis of these qualifiers enhanced with
a troubleshooting mechanism. Namely, the troubleshooting
mechanism guides healthcare professionals in the investigation
of the root causes of low quality values."
Innovation in information and communication technology has a great potential to create large impa... more Innovation in information and communication technology has a great potential to create large impact on modern healthcare. However, for the new technologies to be adopted, the innovations have to be meaningful and timely, taking into account user needs and addressing societal and ethical concerns. In this paper, we focus on ICT innovations related to home healthcare domain, in which patient safety and security, but also trust and privacy are of utmost importance. To ensure the adoption of new healthcare services, the new innovative technologies need to be complemented with new methods that can help patients to establish trust in healthcare service providers in terms of privacy, reliability, integrity of the data chain and techniques that help service providers to assess the reliability of information and data contributed by patients. This paper sketches various lines of research for the development of trusted healthcare services namely, patient compliance, reliability of information in healthcare, and user-friendly access control.
The disclosure of sensitive data to unauthorized entities is a critical issue for organizations. ... more The disclosure of sensitive data to unauthorized entities is a critical issue for organizations. Timely detection
of data leakage is crucial to reduce possible damages. Therefore, breaches should be detected as early as
possible, e.g., when data are leaving the database. In this paper, we focus on data leakage detection by
monitoring database activities. We present a framework that automatically learns normal user behavior, in
terms of database activities, and detects anomalies as deviation from such behavior. In addition, our approach
explicitly indicates the root cause of an anomaly. Finally, the framework assesses the severity of data leakages
based on the sensitivity of the disclosed data.
Innovation in information and communication technology has a great potential to create large impa... more Innovation in information and communication technology has a great potential to create large impact on modern healthcare. However, for the new technologies to be adopted, the innovations have to be meaningful and timely, taking into account user needs and addressing societal and ethical concerns. In this paper, we focus on ICT innovations related to home healthcare domain, in which patient safety and security, but also trust and privacy are of utmost im-portance. To ensure the adoption of new healthcare services, the new innovative technologies need to be complemented with new methods that can help patients to establish trust in healthcare service providers in terms of privacy, reliability, integrity of the data chain and techniques that help service providers to assess the reliability of information and data contributed by patients. This paper sketches various lines of research for the development of trusted healthcare ser-vices namely, patient compliance, reliability of information in healthcare, and user-friendly access control.
The last years have seen a growing interest in collaborative systems like electronic marketplaces... more The last years have seen a growing interest in collaborative systems like electronic marketplaces and P2P file sharing systems where people are intended to interact with other people. Those systems, however, are subject to security and operational risks because of their open and distributed nature. Reputation systems provide a mechanism to reduce such risks by building trust relationships among entities and identifying malicious entities. A popular reputation model is the so called flow-based model. Most existing reputation systems based on such a model provide only a ranking, without absolute reputation values; this makes it difficult to determine whether entities are actually trustworthy or untrustworthy. In addition, those systems ignore a significant part of the available information; as a consequence, reputation values may not be accurate. In this paper, we present a flow-based reputation metric that gives absolute values instead of merely a ranking. Our metric makes use of all the available information. We study, both analytically and numerically, the properties of the proposed metric and the effect of attacks on reputation values.
With the growing amount of personal information exchanged over the Internet, privacy is becoming ... more With the growing amount of personal information exchanged over the Internet, privacy is becoming more and more a concern for users. One of the key principles in protecting privacy is data minimisation. This principle requires that only the minimum amount of information necessary to accomplish a certain goal is collected and processed. "Privacy-enhancing" communication protocols have been proposed to guarantee data minimisation in a wide range of applications. However, currently there is no satisfactory way to assess and compare the privacy they offer in a precise way: existing analyses are either too informal and high-level, or specific for one particular system. In this work, we propose a general formal framework to analyse and compare communication protocols with respect to privacy by data minimisation. Privacy requirements are formalised independent of a particular protocol in terms of the knowledge of (coalitions of) actors in a three-layer model of personal information. These requirements are then verified automatically for particular protocols by computing this knowledge from a description of their communication. We validate our framework in an identity management (IdM) case study. As IdM systems are used more and more to satisfy the increasing need for reliable on-line identification and authentication, privacy is becoming an increasingly critical issue. We use our framework to analyse and compare four identity management systems. Finally, we discuss the completeness and (re)usability of the proposed framework.
Trust management is an approach to access control in distributed systems where access decisions a... more Trust management is an approach to access control in distributed systems where access decisions are based on policy statements issued by multiple principals and stored in a distributed manner. In trust management, the policy statements of a principal can refer to other principals' statements; thus, the process of evaluating an access request (i.e., a goal) consists of finding a "chain" of policy statements that allows the access to the requested resource. Most existing goal evaluation algorithms for trust management either rely on a centralized evaluation strategy, which consists of collecting all the relevant policy statements in a single location (and therefore they do not guarantee the confidentiality of intensional policies), or do not detect the termination of the computation (i.e., when all the answers of a goal are computed). In this paper we present GEM, a distributed goal evaluation algorithm for trust management systems that relies on function-free logic programming for the specification of policy statements. GEM detects termination in a completely distributed way without disclosing intensional policies, thereby preserving their confidentiality. We demonstrate that the algorithm terminates and is sound and complete with respect to the standard semantics for logic programs.
International Journal of Business Intelligence and Data Mining, 2008
Evaluating business solutions before being deployed is essential for any organization. Risk is em... more Evaluating business solutions before being deployed is essential for any organization. Risk is emerging as one of the most preeminent and accepted metrics for the evaluations of business solutions. In this paper, we present a comprehensive case study where the Tropos Goal-Risk framework is used to assess and treat risk on the basis of the likelihood and severity of failures within organizational settings. We present an analysis and an evaluation of business solutions within manufacturing enterprises.
Many security breaches occur because of exploitation of vulnerabilities within the system. Vulner... more Many security breaches occur because of exploitation of vulnerabilities within the system. Vulnerabilities are weaknesses in the requirements, design, and implementation, which attackers exploit to compromise the system. This paper proposes a methodological framework for security requirements elicitation and analysis centered on vulnerabilities. The framework offers modeling and analysis facilities to assist system designers in analyzing vulnerabilities and their effects on the system; identifying potential attackers and analyzing their behavior for compromising the system; and identifying and analyzing the countermeasures to protect the system. The framework proposes a qualitative goal model evaluation analysis for assessing the risks of vulnerabilities exploitation and analyzing the impact of countermeasures on such risks.
In this paper we identify the requirements for the definition of a security framework for distributed access control in dynamic coalitions of heterogeneous systems. Based on the elicited requirements, we introduce the POLIPO framework that combines distributed access control with ontologies to give a globally understandable semantics to policies, enabling interoperability among heterogeneous systems.
A translation of the Business Process Modeling Notation into the process calculus COWS is presented. The stochastic extension of COWS is then exploited to address quantitative reasoning about the behaviour of business processes. An example of such reasoning is shown by running the PRISM probabilistic model checker on a case study.
Privacy and data protection are pivotal issues in today's society. They concern the right to prevent the dissemination of sensitive or confidential information about individuals. Many studies have addressed this topic from various perspectives, namely sociological, economic, legal, and technological. We recognize the legal perspective as the basis of all the other perspectives: data protection regulations set the legal principles and requirements that must be met by organizations when processing personal data. The objective of this work is to provide a reference base for the development of methodologies tailored to the design of privacy-aware systems that comply with data protection regulations.
The analysis of business solutions is one of the critical issues in industry. Risk is one of the most preeminent and accepted metrics for the evaluation of business solutions. Not surprisingly, many research efforts have been devoted to developing risk management frameworks. Among them, Tropos Goal-Risk offers a formal framework for assessing and treating risks on the basis of the likelihood and severity of failures. In this paper, we extend Tropos Goal-Risk to assess and treat risks by considering the interdependency among actors within an organization. To make the discussion more concrete, we apply the proposed framework to the analysis of risks within manufacturing organizations. * This work was done when the author was at the University of Trento.
The last years have seen a peak in privacy-related research. The focus has been mostly on how to protect the individual from being tracked, with plenty of anonymizing solutions. We advocate another model that is closer to the "physical" world: we consider our privacy respected when our personal data is used for the purpose for which we gave it in the first place. Essentially, in any distributed authorization protocol, credentials should mention their purpose besides their powers. For this information to be meaningful, we should link it to the functional requirements of the original application. We sketch how one can modify a requirements engineering methodology to incorporate security concerns so that we explicitly trace back the high-level goals for which a functionality has been delegated by a (human or software) agent to another one. Then one could directly derive purpose-based trust management solutions from the requirements. This work has been partially funded by the IST programme of the EU Commission, FET under the IST-2001-37004 WASP project and by the FIRB programme of MIUR under the RBNE0195K5 ASTRO Project. We would like to thank P. Giorgini, M. Pistore, and J. Mylopoulos for many useful discussions on Tropos. 1 Privacy is the right of individuals to determine for themselves when, how, and to what extent information about them is communicated to others (Alan Westin).
Although the concepts of security and trust play an important role in the development of information systems, they have been mainly neglected by software engineering methodologies. In this chapter we present an approach that considers security and trust throughout the software development process. Our approach integrates two prominent software engineering approaches, one that provides a security-oriented process and one that provides a trust management process.
The increasing complexity of IT systems and the growing demand for regulatory compliance are main concerns in the design of IT systems. Addressing these issues requires the development of effective methods to support the analysis of regulations and the elicitation of organizational and system requirements from them. This work investigates the problem of designing regulation-compliant systems and, in particular, the challenges in eliciting and managing legal requirements.
Papers by Nicola Zannone
Anomaly detection systems are usually employed to monitor database activities
in order to detect security incidents. These systems raise alerts when anomalous
activities are detected. The alerts raised have to be analyzed to timely respond
to the security incidents. Their analysis, however, is time-consuming and costly.
This problem increases with the large number of alerts often raised by anomaly detection
systems. To timely and effectively handle security incidents, alerts should
be accompanied by information which allows the understanding of incidents and
their context (e.g., root causes, attack type) and their prioritization (e.g., criticality
level). Unfortunately, the current state of affairs regarding the information about
alerts provided by existing anomaly detection systems is not very satisfactory. This
work presents an anomaly analysis framework that facilitates the analysis of alerts
raised by an anomaly detection system monitoring a database system. The framework
provides an approach to assess the criticality of alerts with respect to the
disclosure of sensitive information and a feature-based approach for the classification
of alerts with respect to database attacks. The framework has been deployed as
a web-based alert audit tool that provides alert classification and risk-based ranking
capabilities, significantly easing the analysis of alerts. We validate the classification
and ranking approaches using synthetic data generated through an existing
healthcare management system.
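The risk-based ranking idea can be sketched as follows. This is an illustrative assumption, not the published framework: the attribute names, sensitivity weights, and the maximum-based criticality score are all hypothetical.

```python
# Hypothetical sensitivity weights for database attributes an alert's
# query may disclose (values are illustrative assumptions).
SENSITIVITY = {"diagnosis": 0.9, "medication": 0.7, "zip_code": 0.3}

def criticality(alert: dict) -> float:
    """Criticality of an alert: maximum sensitivity over the attributes
    touched by the anomalous query (default weight for unknown ones)."""
    return max((SENSITIVITY.get(a, 0.1) for a in alert["attributes"]),
               default=0.0)

def rank_alerts(alerts: list[dict]) -> list[dict]:
    """Risk-based ranking: most critical alerts first, so analysts can
    focus on incidents with the highest disclosure risk."""
    return sorted(alerts, key=criticality, reverse=True)
```

With such a ranking, an alert touching `diagnosis` would be audited before one touching only `zip_code`, which is the prioritization effect the abstract describes.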
The detection and handling of data leakages is becoming a critical issue
for organizations. To this end, data leakage solutions are usually employed by
organizations to monitor network traffic and the use of portable storage devices.
These solutions often produce a large number of alerts, whose analysis is time-consuming
and costly for organizations. To effectively handle leakage incidents,
organizations should be able to focus on the most severe incidents. Therefore,
alerts need to be prioritized with respect to their severity. This work presents
a novel approach for the quantification of data leakages based on their severity.
The approach quantifies leakages with respect to the amount and sensitivity of the
leaked information as well as the ability to identify the data subjects of the leaked
information. To specify and reason on data sensitivity in an application domain,
we propose a data model representing the knowledge in the domain. We validate
our approach by analyzing data leakages within a healthcare environment.
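A severity score built from the three factors the abstract names (amount, sensitivity, identifiability) could look like the sketch below. The concrete formula, the saturation constant, and the [0, 1] scales are assumptions for illustration; the paper's actual quantification may differ.

```python
def leakage_severity(n_records: int, sensitivity: float,
                     identifiability: float, saturation: int = 1000) -> float:
    """Severity in [0, 1] of a data leakage (illustrative formula).

    n_records       -- amount of leaked records
    sensitivity     -- sensitivity of the leaked information, in [0, 1]
    identifiability -- how easily the leaked records can be linked to
                       data subjects, in [0, 1]
    """
    # Volume factor grows with the leak size and saturates, so that a
    # leak of 10,000 records is not rated 10x worse than one of 1,000.
    volume = min(1.0, n_records / saturation)
    return volume * sensitivity * identifiability
```

Under this sketch, leaking many highly sensitive, easily re-identifiable records scores near 1.0, while a small leak of anonymized low-sensitivity data scores near 0, matching the prioritization goal stated above.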
The aim of this survey is to propose a framework for the analysis of reputation systems. We elicit the requirements for reputations metrics along with the features necessary to achieve such requirements. The identified requirements and features form a reference framework which allows an objective evaluation and comparison of reputation systems. We demonstrate its applicability by analyzing and classifying a number of existing reputation systems. Our framework can serve as a reference model for the analysis of reputation systems. It is also helpful for the design of new reputation systems as it provides an analysis of the implications of design choices.
in healthcare practices. Data reliability, however, is
crucial for the acceptance of these new services. This work
presents a semi-automated system to evaluate the quality
of medical measurements taken by patients. The system relies
on data qualifiers to evaluate various quality aspects
of measurements. The overall quality of measurements is
determined on the basis of these qualifiers enhanced with
a troubleshooting mechanism, which guides healthcare professionals in
investigating the root causes of low-quality values.
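The combination of qualifiers into an overall quality score might be sketched as a weakest-link rule. This is a hypothetical illustration: the qualifier names, the minimum-based aggregation, and the threshold are assumptions, not the system's actual design.

```python
def overall_quality(qualifiers: dict[str, float]) -> float:
    """Overall measurement quality as the weakest qualifier, each in [0, 1].
    (Weakest-link aggregation is an illustrative assumption.)"""
    return min(qualifiers.values(), default=0.0)

def needs_troubleshooting(qualifiers: dict[str, float],
                          threshold: float = 0.5) -> list[str]:
    """Return the qualifiers responsible for a low overall score, to
    point the professional at candidate root causes."""
    if overall_quality(qualifiers) >= threshold:
        return []
    return [name for name, value in qualifiers.items() if value < threshold]
```

For a blood-pressure reading with high plausibility but a stale timestamp, such a rule would flag only the timeliness qualifier, focusing the root-cause investigation.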
of data leakage is crucial to reduce possible damages. Therefore, breaches should be detected as early as
possible, e.g., when data are leaving the database. In this paper, we focus on data leakage detection by
monitoring database activities. We present a framework that automatically learns normal user behavior, in
terms of database activities, and detects anomalies as deviation from such behavior. In addition, our approach
explicitly indicates the root cause of an anomaly. Finally, the framework assesses the severity of data leakages
based on the sensitivity of the disclosed data.
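The core loop (learn normal per-user behavior, flag deviations, report the root cause) can be sketched minimally. The class, its fields, and the "unseen (user, table) pair" anomaly criterion are illustrative assumptions; the paper's profiles are richer than raw access counts.

```python
from collections import Counter

class ProfileDetector:
    """Toy sketch: learn each user's normal database activity as
    table-access counts, then flag accesses that deviate from the
    learned profile and report the deviation as the root cause."""

    def __init__(self) -> None:
        self.profiles: dict[str, Counter] = {}

    def train(self, events) -> None:
        """events: iterable of (user, table) pairs observed during
        normal operation."""
        for user, table in events:
            self.profiles.setdefault(user, Counter())[table] += 1

    def check(self, user: str, table: str) -> dict:
        """Flag an access as anomalous if this user never touched this
        table during training, and explain why."""
        seen = self.profiles.get(user, Counter())
        if seen[table] == 0:
            return {"anomalous": True,
                    "root_cause": f"{user} never accessed {table}"}
        return {"anomalous": False, "root_cause": None}
```

A real deployment would profile more features (query type, time of day, result size) and use statistical deviation rather than a hard never-seen test, but the structure — train on normal activity, explain each flagged deviation — mirrors the framework described above.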