Papers by Frederick Sheldon
Annals of Software Engineering, Oct 3, 1999
This paper describes a methodology for the specification and analysis of distributed real-time systems using the toolset called PARAGON. PARAGON is based on the Communicating Shared Resources paradigm, which allows a real-time system to be modeled as a set of communicating processes that compete for shared resources. PARAGON supports both visual and textual languages for describing real-time systems. It offers automatic analysis based on state space exploration as well as user-directed ...
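As a rough illustration of the kind of automatic analysis mentioned above, the following sketch enumerates the reachable global states of a toy model in which two processes compete for a single shared resource. It is only a minimal breadth-first state-space exploration in Python; the process model, the state names, and the mutual-exclusion check are illustrative assumptions, not the PARAGON toolset or its languages.

```python
# Minimal sketch: exhaustive state-space exploration (breadth-first) over a toy
# model of processes competing for one shared resource. Illustrative only;
# this is not the PARAGON toolset.
from collections import deque

def successors(state):
    """Yield successor global states; at most one process may hold the resource."""
    procs, holder = state
    for i, p in enumerate(procs):
        if p == "idle":
            yield (tuple("waiting" if j == i else q for j, q in enumerate(procs)), holder)
        elif p == "waiting" and holder is None:
            yield (tuple("using" if j == i else q for j, q in enumerate(procs)), i)
        elif p == "using" and holder == i:
            yield (tuple("idle" if j == i else q for j, q in enumerate(procs)), None)

def explore(n_procs=2):
    """Enumerate all reachable global states and check mutual exclusion."""
    initial = (("idle",) * n_procs, None)
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        assert sum(p == "using" for p in state[0]) <= 1, "mutual exclusion violated"
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

if __name__ == "__main__":
    print(f"{len(explore())} reachable states, mutual exclusion holds")
```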
International Journal of Critical Infrastructures
International Journal of Computers and Applications
Quantum computing is an emerging technology. The clock frequency of current computer processor systems may reach about 40 GHz within the next 10 years. By then, one atom may represent one bit. Electrons under such conditions are no longer described by classical physics, and a new model of the computer may be necessary by that time. The quantum computer is one proposal that may have merit in dealing with the problems presented. Currently, there exist some algorithms that exploit the advantages of quantum computers. For example, Shor's algorithm factors a large integer in polynomial time, whereas the best known classical factoring algorithms require super-polynomial time. In this paper we briefly survey the current status of quantum computers, quantum computer systems, and quantum simulators.
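For context on the factoring example, the sketch below shows the classical number-theoretic reduction on which Shor's algorithm rests. The order-finding step is done here by brute force, so it only works for tiny integers; on a quantum computer that step is the part performed efficiently. The function names and parameters are illustrative only.

```python
# Classical sketch of the reduction behind Shor's algorithm. On a quantum
# computer the order-finding step is done efficiently; here it is brute force,
# so this only works for tiny N and serves purely as an illustration.
from math import gcd
from random import randrange

def find_order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n) -- the step a quantum computer speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, attempts=50):
    """Return a non-trivial factor of composite n via the order-finding reduction."""
    for _ in range(attempts):
        a = randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g                  # lucky guess already shares a factor
        r = find_order(a, n)
        if r % 2:                     # need an even order
            continue
        y = pow(a, r // 2, n)
        if y == n - 1:                # trivial square root of 1, try again
            continue
        return gcd(y - 1, n)
    raise RuntimeError("no factor found; increase attempts")

if __name__ == "__main__":
    print(shor_factor(15))            # prints 3 or 5
```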
2016 IEEE World Congress on Services (SERVICES), 2016
The emerging paradigm of cloud computing (CC) presents many security risks that can potentially and adversely impact any one of the plethora of stakeholders. The widespread deployment and service models of CC, in addition to the wide variety of stakeholders, make it difficult to comprehend security and privacy (S&P). In this paper, we present CSSR, a Cloud Services Security Recommender tool. CSSR codifies a stakeholder-oriented taxonomy. The goal for CSSR is to identify the various S&P risks for the kaleidoscope of different CC models from the stakeholder's perspective. CSSR recommends a comprehensive list of S&P attributes that must be considered as controls necessary to minimize the CC attack surface. By identifying the S&P concerns that are unique to particular usage scenarios (again from a stakeholder perspective), CSSR provides a comprehensive basis from which to choose alternative security solutions. This model then provides a structured and well-informed process for mitigating risk as envisioned by each and every stakeholder based on their needs.
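A hypothetical sketch of the kind of stakeholder-oriented lookup such a recommender might perform is shown below. The stakeholder roles, cloud models, and control lists are invented placeholders, not the actual CSSR taxonomy or its recommendations.

```python
# Hypothetical sketch of a stakeholder-oriented recommender lookup. The
# taxonomy entries below are invented for illustration and are not the
# actual CSSR control catalogue.
CONTROLS = {
    ("consumer", "SaaS", "public"): ["data encryption at rest", "identity federation", "SLA review"],
    ("provider", "IaaS", "public"): ["hypervisor hardening", "tenant isolation", "audit logging"],
    ("broker",   "PaaS", "hybrid"): ["API gateway security", "key management", "compliance mapping"],
}

def recommend(stakeholder, service_model, deployment_model):
    """Return the security & privacy controls recorded for one usage scenario."""
    return CONTROLS.get((stakeholder, service_model, deployment_model),
                        ["scenario not yet catalogued"])

if __name__ == "__main__":
    for control in recommend("consumer", "SaaS", "public"):
        print("-", control)
```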
Cyber-physical systems (CPS) are characterized by the close linkage of computational resources and physical devices. These systems can be deployed in a number of critical infrastructure settings. As a result, the security requirements of CPS differ from those of traditional computing architectures. For example, critical functions must be identified and isolated from interference by other functions.
SAC, 2004
A model is always a compromise between faithfulness and simplicity: candidate models range from very faithful but intractable, through faithful but relatively intractable and relatively faithful and tractable, to unfaithful but very tractable representations of the system under study (proposed or existing). Challenges include (1) the large state space, (2) model stiffness (the relative order of magnitude among parameters), and (3) the memoryless property, which assumes that events are independent and identically distributed. Results should be precise and correct; that is, model predictions should be realistic and accurate, based on a well-founded mathematical theory such as reliability theory. (Fig. 1: Precise realistic models.)
2011 44th Hawaii International Conference on System Sciences, 2011
Vulnerabilities in a system may have widely varying impacts on system security. In practice, security should not be defined as the absence of vulnerabilities, nor quantified by the number of vulnerabilities; it should be managed by pursuing a policy that leads us first to the highest-impact vulnerabilities. In light of these observations, we argue in favor of shifting our focus from vulnerability avoidance/removal to measurable security attributes. To this effect, we recommend a logic be used for system security that captures/represents security properties in quantifiable, verifiable, measurable terms, so that it is possible to reason about security in terms of its observable/perceptible effects rather than its hypothesized causes. This approach is orthogonal to existing techniques for vulnerability avoidance, removal, detection, and recovery, in the sense that it provides a means to assess, quantify, and combine these techniques.
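As one illustration of expressing security in quantifiable terms, the sketch below computes an expected loss rate per stakeholder by weighting each stakeholder's stake in a security requirement by an estimated probability of violation. This is a generic expected-value calculation with invented numbers, not necessarily the specific logic the paper proposes.

```python
# Generic expected-value sketch of a measurable security attribute.
# All stakes and probabilities below are invented for illustration.
stakes = {                     # cost (k$/h) to each stakeholder if a requirement is violated
    "operator": {"confidentiality": 5.0, "integrity": 12.0, "availability": 30.0},
    "customer": {"confidentiality": 8.0, "integrity":  3.0, "availability":  6.0},
}
violation_prob = {             # assumed probability of violation per hour
    "confidentiality": 1e-4, "integrity": 5e-5, "availability": 2e-4,
}

def expected_loss(stakeholder):
    """Expected security-related loss rate for one stakeholder (k$/h)."""
    return sum(stake * violation_prob[req]
               for req, stake in stakes[stakeholder].items())

for who in stakes:
    print(f"{who}: {expected_loss(who):.5f} k$/h")
```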
The workshop theme is Cyber Security: Beyond the Maginot Line. Recently the FBI reported that computer crime has skyrocketed, costing over $67 billion in 2005 alone and affecting more than 2.8 million businesses and organizations. Attack sophistication is unprecedented, along with the availability of concomitant open-source tools. Private, academic, and public sectors invest significant resources in cyber security. Industry primarily performs cyber security research as an investment in future products and services. While the public sector also funds cyber security R&D, the majority of this activity focuses on the specific mission(s) of the funding agency. Thus, broad areas of cyber security remain neglected or underdeveloped. Consequently, this workshop endeavors to explore issues involving cyber security and related technologies toward strengthening such areas and enabling the development of new tools and methods for securing our information infrastructure's critical assets. We aim to assemble new ideas and proposals about robust models on which we can build the architecture of a secure cyberspace, including but not limited to:
* Knowledge discovery and management
* Critical infrastructure protection
* De-obfuscating tools for the validation and verification of tamper-proofed software
* Computer network defense technologies
* Scalable information assurance strategies
* Assessment-driven design for trust
* Security metrics and testing methodologies
* Validation of security and survivability properties
* Threat assessment and risk analysis
* Early, accurate detection of the insider threat
* Security-hardened sensor networks and ubiquitous computing environments
* Mobile software authentication protocols
* A new "model" of the threat to replace the "Maginot Line" model, and more . . .
Developing a knowledge-sharing capability across distributed heterogeneous data sources remains a significant challenge. Ontology-based approaches to this problem show promise by resolving heterogeneity, if the participating data owners agree to use a common ontology (i.e., a set of common attributes). Such common ontologies offer the capability to work with distributed data as if it were located in a central repository. This knowledge sharing may be achieved by determining the intersection of similar concepts from across various heterogeneous systems. However, if information is sought from a subset of the participating data sources, there may be concepts common to the subset that are not included in the full common ontology, and therefore are unavailable for knowledge sharing. One way to solve this problem is to construct a series of ontologies, one for each possible combination of data sources. In this way, no concepts are lost, but the number of possible subsets is prohibitively large. We offer a novel software agent approach as an alternative that provides a flexible and dynamic fusion of data across any combination of the participating heterogeneous data sources to maximize knowledge sharing. The software agents generate the largest intersection of shared data across any selected subset of data sources. This ontology-based agent approach maximizes knowledge sharing by dynamically generating common ontologies over the data sources of interest. The approach was validated using data provided by five (disparate) national laboratories by defining a local ontology for each laboratory (i.e., data source). In this experiment, the ontologies are used to specify how to format the data using XML to make it suitable for query. Consequently, software agents are empowered to dynamically form local ontologies from the data sources. In this way, the cost of developing these ontologies is reduced while providing the broadest possible access to available data sources.
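The core intersection idea can be illustrated with a small sketch: given a local ontology (attribute set) for each data source, the common ontology for any selected subset is their intersection, so a smaller subset of sources can share more concepts than the full set. The laboratory names and attributes below are placeholders, not the actual ontologies from the experiment described above.

```python
# Minimal sketch: dynamically generate a "common ontology" as the largest
# intersection of shared attributes over any selected subset of data sources.
# Names and attribute sets are placeholders, not the actual laboratory data.
LOCAL_ONTOLOGIES = {
    "lab_A": {"sample_id", "isotope", "activity", "location", "timestamp"},
    "lab_B": {"sample_id", "isotope", "activity", "operator"},
    "lab_C": {"sample_id", "isotope", "location", "timestamp", "container"},
}

def common_ontology(selected):
    """Intersect the local ontologies of the selected data sources."""
    sets = [LOCAL_ONTOLOGIES[name] for name in selected]
    return set.intersection(*sets) if sets else set()

if __name__ == "__main__":
    # The full common ontology over all sources loses 'location' and 'timestamp'...
    print(sorted(common_ontology(LOCAL_ONTOLOGIES)))
    # ...but a subset of interest can still share them.
    print(sorted(common_ontology(["lab_A", "lab_C"])))
```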
As the critical infrastructures of the United States have become more and more dependent on public and private networks, the potential for widespread national impact resulting from disruption or failure of these networks has also increased. Securing the nation's critical infrastructures requires protecting not only their physical systems but, just as important, the cyber portions of the systems on which they rely. Failures include random events, design flaws, and instabilities caused by cyber (and/or physical) attack. One such domain is failure of critical equipment; a second is aging bridges. We discuss the workings of such a system in the context of the necessary sensors, command and control, and data collection, as well as the cyber security efforts that would support this system. Their application and the implications of this computing architecture are also discussed with respect to our nation's aging infrastructure.
One of the main components of the Environmental Protection Agency's (EPA) Clean Materials Program is to prevent the loss of radioactive materials through the use of tracking technologies. If a source is inadvertently lost or purposely abandoned or stolen, it is critical that the source be recovered before harm to the public or the environment occurs. Radio frequency identification (RFID) tagging of radioactive sources is a technology that can be operated in the active or passive mode, has a variety of frequencies available allowing for flexibility in use, is able to transmit detailed data, and is discreet. The purpose of the joint DOE and EPA Radiological Source Tracking and Monitoring (RadSTraM) project is to evaluate the viability, effectiveness, and scalability of RFID technology under a variety of transportation scenarios. The goal of Phase II was to continue testing integrated RFID tag systems from various vendors for feasibility in tracking radioactive sealed sources, which included the following performance objectives: (1) validate the performance of RFID intelligent systems to monitor express air shipments of medical radioisotopes in the nationwide supply chain. Phase II of the RadSTraM project verified that RFID tagging can be applied to the tracking and monitoring of medical radioisotope air express shipments. This study demonstrated that active RFID tagging systems can be feasibly integrated and scaled into the nationwide supply chain to track and monitor medical radioisotopes.
Proceedings of the 4th Annual Workshop on Cyber Security and Information Intelligence Research: Developing Strategies to Meet the Cyber Security and Information Intelligence Challenges Ahead, 2008
Information systems now form the backbone of nearly every government and private system, and Web services currently play, or will play, a major role in supporting access to distributed resources and the command and control needed to exploit that backbone. Increasingly these systems are networked together, allowing for distributed operations, sharing of databases, and redundant capability. Ensuring these networks are secure, robust, and reliable is critical for the strategic and economic well-being of the Nation. This paper argues in favor of a biologically inspired approach to creating survivable cyber-secure infrastructures (SCI). Our discussion employs the power transmission grid.
We present breakthrough findings via significant modifications to the Weigh-in-Motion (WIM) Gen II approach, the so-called modified Gen II. The revisions enable slow-speed weight measurements at least as precise as in-ground static scales, which are certified to 0.1% error. Concomitant software and hardware revisions reflect a philosophical and practical change that enables an order of magnitude improvement to ...
Formal methods such as CSP (Communicating Sequential Processes), CCS (Calculus of Communicating Systems), and dataflow-based process models are widely used for formal reasoning in the areas of concurrency, communication, and distributed systems. Research in formal specification and verification of complex systems has often ignored the specification of stochastic properties of the system. We are exploring new methodologies and tools to permit stochastic analysis of CSP-based system specifications. In doing so, we have investigated the relationship between specification models and stochastic models by translating the specification into another form that is amenable to such analyses (e.g., from CSP to stochastic Petri nets). This process can give insight for further refinements of the original specification (i.e., identify potential failure processes and recovery actions). It does this by relating the parameters needed for reliability analysis to user-level specifications, which is essential for realizing systems that meet users' needs in terms of cost, functionality, performance, and reliability.
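To suggest how specification-level parameters feed a reliability analysis, the sketch below takes assumed failure and repair rates for a single repairable component and derives its steady-state availability and mission reliability. It is a deliberately minimal two-state Markov example, not the CSP-to-stochastic-Petri-net translation itself, and the rates are invented.

```python
# Minimal sketch: once failure and repair processes identified in a
# specification are given rates, even a two-state Markov model yields
# user-level reliability figures. The rates below are assumptions.
import math

failure_rate = 1e-4   # lambda: failures per hour (assumed)
repair_rate  = 1e-1   # mu: repairs per hour (assumed)

# Steady-state availability of a single repairable component.
availability = repair_rate / (failure_rate + repair_rate)

# Probability of surviving a 1000-hour mission with no failure.
mission_time = 1000.0
reliability = math.exp(-failure_rate * mission_time)

print(f"steady-state availability: {availability:.6f}")
print(f"1000-hour mission reliability: {reliability:.4f}")
```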
ISCA PDCS, 2003
The U.S. Army is faced with the challenge of dramatically improving its war fighting capability through advanced technologies. Any new technology must provide significant improvement over existing technologies, yet be reliable enough to provide a fielded system. The focus of this paper is to assess the novelty and maturity of agent technology for use in the Future Combat System. The Future Combat System (FCS) concept represents the U.S. Army's "mounted" form of the Objective Force. This concept of vehicles, communications, and weaponry is viewed as a "system of systems" which includes net-centric command and control (C2) capabilities. This networked C2 is an important transformation from the historically centralized, or platform-based, C2 function, since a centralized command architecture may become a decision-making and execution bottleneck, particularly as the pace of war accelerates. A mechanism to ensure an effective network-centric C2 capacity, combining intelligence-gathering and analysis available at lower levels in the military hierarchy, is needed. Achieving a networked C2 capability will require breakthroughs in current software technology. Many have proposed the use of agent technology as a potential solution. Agents are an emerging technology, and it is not yet clear whether they are suitable for addressing the networked C2 challenge, particularly in satisfying battlespace scalability, mobility, and security expectations. We have developed a set of software requirements for FCS based on military requirements for this system. We have then evaluated these software requirements against current computer science technology. This analysis provides a set of limitations in the current technology when applied to the FCS challenge. Agent technology is compared against this set of limitations to provide a means of assessing the novelty of agent technology in an FCS environment. From this analysis we find that existing technologies will not likely be sufficient to meet the networked C2 requirements of FCS due to limitations in scalability, mobility, and security. Agent technology provides a number of advantages in these areas, mainly through much stronger messaging and coordination models. These models theoretically allow for significant improvements in many areas, including scalability, mobility, and security. However, the demonstration of such capabilities in an FCS environment does not currently exist, although a number of strong agent-based systems have been deployed in related areas. Additionally, there are challenges in FCS that neither current technology nor agent technology is particularly well suited for, such as information fusion and decision support. In summary, we believe that agent technology has the capability to support most of the networked C2 requirements of FCS. However, we would recommend proof-of-principle experiments to verify the theoretical advantages of this technology in an FCS environment.