As technology is used to support team-based activities, one important factor affecting the performance of teams is the kind of mental model shared between team members. This paper describes a novel conceptual graph-based methodology to study these mental models to better understand how shared mental models affect performance and other factors of a team's behavior.
Relational databases are in widespread use, yet they suffer from serious limitations when one uses them for reasoning about real-world enterprises. This is due to the fact that database relations possess no inherent semantics. This paper describes an approach called microanalysis that we have used to effectively capture database semantics represented by conceptual graphs. The technique prescribes a manual knowledge acquisition process whereby each relation schema is captured in a single conceptual graph. The schema's graph can then easily be instantiated for each tuple in the database, forming a set of graphs representing the entire database's semantics. Although our technique was originally developed to capture semantics in a restricted domain of interest, namely database inference detection, we believe that domain-directed microanalysis is a general approach that can be of significant value for databases in many domains. We describe the approach and give a brief example.
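The per-tuple instantiation step described above can be illustrated with a minimal Python sketch. This is not the paper's implementation; the triple-based graph encoding, the `*column` marker convention, and all names are illustrative assumptions.

```python
def instantiate(schema_graph, row):
    """Bind a relation schema's conceptual graph template to one tuple.

    The graph is represented as triples (concept, relation, concept),
    where a concept is a (type, referent) pair and a referent written
    '*column' is a generic marker to be filled from the tuple's column.
    """
    def bind(concept):
        ctype, referent = concept
        if isinstance(referent, str) and referent.startswith("*"):
            return (ctype, row[referent[1:]])  # fill referent from the tuple
        return concept

    return [(bind(c1), rel, bind(c2)) for (c1, rel, c2) in schema_graph]

# Hypothetical schema graph for a relation EMP(name, dept):
# [Employee: *name] -> (works_in) -> [Dept: *dept]
schema = [(("Employee", "*name"), "works_in", ("Dept", "*dept"))]
print(instantiate(schema, {"name": "Lee", "dept": "Sales"}))
# -> [(('Employee', 'Lee'), 'works_in', ('Dept', 'Sales'))]
```

Applying `instantiate` once per tuple yields the set of graphs that together represent the database's semantics.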
This paper addresses operational aspects of conceptual graph systems. It attempts to formalize operations within a conceptual graph system by using conceptual graphs themselves to describe the mechanism. We outline a unifying approach that integrates the notions of a fact base, type definitions, actor definitions, messages, and the assertion and retraction of graphs. Our approach formalizes the notions of type expansion and actor definitions, and in the process also formalizes any sort of formal assertion in a conceptual graph system. We introduce definitions as concept types called assertional types, which are animated through a concept type called an assertional event. We illustrate the assertion of a type definition, a nested definition, and an actor definition using one extended example. We believe this mechanism has immediate and far-reaching value in offering a self-contained yet animate conceptual graph system architecture. This has been achieved by introducing very few conceptual graph concept types and relations and attaching an operational semantics to them.
International Journal of Computers and Applications, Dec 12, 2017
Multiple-viewed requirements modeling allows modelers to focus on different aspects of a system's requirements and express them in appropriate modeling notations. As a result, the requirements are scattered across a set of different models. The semantic overlap among them makes model transformation possible, and such a transformation can be used for acquiring requirements knowledge. In this paper, we demonstrate the process of deriving a set of sequence diagrams with requirements knowledge acquisition opportunities from a state diagram. This set can be used as a requirements elicitation medium for sequence diagram modelers. The transformation is based on the rich semantic relationship between the two diagrams and on proven graph-theory algorithms for finding sequences. The set of derived sequence diagrams is consistent with the state model and achieves minimum state-transition path coverage in the state diagram. Such a set of sequence diagrams with knowledge acquisition opportunities can serve as a modest spur to induce human modelers to provide valuable requirements knowledge.
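The idea of covering every state transition with a set of paths can be sketched in Python. This greedy sketch only guarantees that every transition is covered by at least one path from the initial state; unlike the paper's transformation, it makes no claim of minimality, and all names are illustrative.

```python
from collections import deque

def shortest_path(transitions, start, goal_edge):
    """BFS for a shortest transition sequence from `start` that ends
    by traversing `goal_edge` (a (src, dst) pair), or None."""
    src, _ = goal_edge
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        if state == src:
            return path + [goal_edge]  # reached the edge's source; take it
        for (s, t) in transitions:
            if s == state and t not in seen:
                seen.add(t)
                queue.append((t, path + [(s, t)]))
    return None

def covering_paths(transitions, start):
    """Greedily build one path per still-uncovered transition; together
    the paths cover every reachable transition at least once."""
    covered, paths = set(), []
    for edge in transitions:
        if edge not in covered:
            p = shortest_path(transitions, start, edge)
            if p:
                covered.update(p)
                paths.append(p)
    return paths

# Tiny state diagram: S0 -> S1 -> S2, plus a shortcut S0 -> S2.
print(covering_paths([("S0", "S1"), ("S1", "S2"), ("S0", "S2")], "S0"))
```

Each returned path corresponds to one candidate sequence diagram scenario through the state model.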
IFIP Advances in Information and Communication Technology, 1997
This paper presents a second-path inference-detection approach based on association cardinalities. It is applicable to the detection of second paths that do not involve functional dependencies or foreign keys. It provides an analysis sieve that begins with the analysis of an object model of the database. The goal of the analysis is to detect cases in which a small number of values in the target entity can be associated with a single value in the anchor entity. The number of such values is called the association cardinality from anchor to target. Inference vulnerabilities occur in cases of small association cardinalities. The analysis sieve processes the data model of the database to detect cases of small association cardinality. For cases with high-cardinality associations, the sieve mines the database to detect cases of small instance-level association cardinalities.
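The instance-level mining step above can be sketched in a few lines of Python: for each anchor value, count the distinct associated target values and flag the small counts. This is a minimal illustration under assumed names (`rows` as a list of dicts, a hypothetical threshold), not the paper's sieve.

```python
from collections import defaultdict

def association_cardinalities(rows, anchor, target):
    """For each anchor value, count the distinct target values
    associated with it (the anchor-to-target association cardinality)."""
    assoc = defaultdict(set)
    for row in rows:
        assoc[row[anchor]].add(row[target])
    return {a: len(ts) for a, ts in assoc.items()}

def small_cardinality_cases(rows, anchor, target, threshold=2):
    """Flag anchor values whose association cardinality is at or below
    the threshold: candidate inference vulnerabilities."""
    cards = association_cardinalities(rows, anchor, target)
    return {a: n for a, n in cards.items() if n <= threshold}

rows = [
    {"dept": "HR", "salary": 50}, {"dept": "HR", "salary": 50},
    {"dept": "IT", "salary": 60}, {"dept": "IT", "salary": 70},
    {"dept": "IT", "salary": 80},
]
# HR associates with a single salary value, so knowing someone's
# department ("HR") effectively reveals their salary.
print(small_cardinality_cases(rows, "dept", "salary", threshold=1))
# -> {'HR': 1}
```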
The notion of semantic distance or similarity measurement is closely related to the process of categorization. Any theory of measurement must take into account some real-world results from cognitive science: it is not possible to make a priori judgments about how similar two arbitrary concepts are.
A problem in current requirements development techniques is that the viewpoints (including implicit pre-existing assumptions) of multiple participants must be satisfied by the resulting requirements. Choosing a single language for all participants imposes the additional burden of learning the new language and risks overlooking some requirements for which the language was not intended. This dissertation proposes a framework and methodology whereby multiple requirements specifications can be produced in several languages of the participants' own choosing, then translated into a metalanguage (that is, a language capable of capturing features of several different requirements languages) in order to analyze their common features. The metalanguage of conceptual graphs is used to capture requirements expressed in various conventional requirements development notations. Once translated into conceptual graphs, the set of requirements is analyzed in order to find the most likely counterpart concepts between the participants' requirements specifications. These counterparts represent the overlap between views and form the basis for joining the multiple graphs into one requirements graph. Ambiguous counterparts (i.e., concepts that cannot be distinguished from other concepts) are also identified, guiding the requirements analyst in asking participants further questions about the assumptions that underlie the concepts.
This paper presents an approach for detecting potential second-path inference problems significantly faster than previous approaches. The algorithm uses a relational database schema and functional dependencies to detect the potential for second-path inferences. The second-path inference problem involves the ability to infer higher-classified data from lower-classified data within a relational database system using joins. In previous research, this type of inference vulnerability was detected by actually finding a path. The approach presented in this paper does not find the path, but detects its existence by adapting a well-known algorithm used in database design to test a relational decomposition for the lossless join property. The original lossless join algorithm has been extended to include subtypes. The paper compares the performance of the new algorithm with that of a conventional path-finding algorithm and shows that the new algorithm is 10 to 14 times faster than the path-finding approach on schemas that range from 33 to 48 relations. The final contribution of the paper is an algorithm for automatically classifying the discovered paths into groups based on their potential for indicating a significant security vulnerability.
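The textbook lossless-join test that the paper adapts can be sketched as the classic tableau ("chase") procedure. This sketch shows the standard algorithm only, without the paper's subtype extension; the symbol encoding is an illustrative choice.

```python
def lossless_join(attributes, relations, fds):
    """Tableau test: the decomposition `relations` of `attributes` has
    the lossless join property iff chasing the tableau with the
    functional dependencies `fds` yields an all-distinguished row.

    Row i holds the distinguished symbol 'a' in column A when relation
    i contains A, and the subscripted symbol ('b', i) otherwise.
    """
    table = [{A: 'a' if A in R else ('b', i) for A in attributes}
             for i, R in enumerate(relations)]
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            for r1 in table:
                for r2 in table:
                    if r1 is not r2 and all(r1[A] == r2[A] for A in lhs):
                        for B in rhs:
                            if r1[B] != r2[B]:
                                # Equate the two symbols, preferring
                                # the distinguished one.
                                new = 'a' if 'a' in (r1[B], r2[B]) else r1[B]
                                old = {r1[B], r2[B]} - {new}
                                for row in table:
                                    if row[B] in old:
                                        row[B] = new
                                changed = True
    return any(all(row[A] == 'a' for A in attributes) for row in table)

# R(A, B, C) decomposed into (A, B) and (A, C): lossless given A -> C.
print(lossless_join(["A", "B", "C"],
                    [{"A", "B"}, {"A", "C"}],
                    [(["A"], ["C"])]))
# -> True
```

The paper's key observation is that running this kind of test over the schema detects whether a join path exists without enumerating the path itself.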
In recent years much work has been performed in developing suites of metrics that are targeted for object-oriented software, rather than functionally-oriented software. This is necessary since good object-oriented software has several characteristics, such as inheritance and ...
Automated tools are often used to support software development workflows. Many of these tools are aimed toward a development workflow that relies implicitly on particular supported roles and activities. Developers may already understand how a tool operates; however, they do not always understand or adhere to a development process supported (or implied) by the tools, nor adhere to prescribed processes when these are explicit. This chapter aims to help both developers and their managers understand and manage workflows by describing a preliminary formal model of roles and activities in software development. Using this purely descriptive model as a starting point, the authors evaluate some existing tools with respect to their description of roles in their processes, and finally show one application where process modeling was helpful to managers. We also introduce an extended model of problem status, based on the analysis of process roles, as an example of how formal models can enrich understanding of the software development process.
Current research suggests that many students do not know how to program at the conclusion of their introductory course, which has been taught predominantly with textual and auditory lecturing. By appealing primarily to programming novices who prefer to understand visually (an understanding style not currently accommodated by the standard lecture format used in most classes), we develop a method that encourages communication of programming solutions. This method builds upon previous research suggesting that most engineering students are visual learners, and we contribute the finding that a flow-model visual programming language can address topics that are important and difficult for programming novices. We performed a pilot study using a knowledge modeling tool rather than an existing visual programming tool to test this method, and we share the program understanding results.
For a set of UML models built for a system, ensuring that each model is relatively complete with respect to the rest of the set is critical to further analysis and design. In this paper, we present a novel idea to identify requirements gaps in a model by synthesizing requirements from other types of models. This is accomplished by a bidirectional transformation between a set of partially complete UML models and conceptual graphs. UML models are first transformed into conceptual graphs based on a well-defined set of primitives and canonical graphs to form a centralized requirements knowledge reservoir; inference rules are then applied to unveil possibly missing requirements. These identified missing requirements, when transformed back to UML, can point software modelers to requirements gaps and stimulate them to provide further requirements to resolve them, thus making the UML models more complete.
Software product metrics have been studied as a means to measure the complexity of software artifacts. Most software product metrics focus on capturing the complexity exhibited by the program. This paper suggests that there are two types of complexity in software artifacts: structural complexity and application-domain complexity. The structural complexity is expressed in the program structure. The application-domain complexity is inherited from an application domain. Traditional software metrics are effective in measuring the structural complexity of software artifacts. However, these metrics are not effective in measuring application-domain complexity.
A software product-line is a set of products built from a core set of software components. Although software engineers develop software product-lines for various application types, they are most commonly used for embedded systems development, where the variability of hardware features requires variability in the supporting firmware. Feature models are used to represent the variability in these software product-lines. Various feature modeling approaches have been proposed, including feature diagrams, domain-specific languages, constraint languages, and the semantic web language OWL. This paper explores a conceptual graph approach to feature modeling in an effort to produce feature models that have a more natural and more easily expressed mapping to the problem domain. It demonstrates the approach using the standard Graph Product Line problem that has been discussed in various software product-line papers. A conceptual graph feature model is developed for the graph product-line and compared to other feature models for this product-line.
Papers by Harry Delugach