Interoperability is the ability of a programming language to work together with systems based on different languages and paradigms. Presently, many widely used high-level language implementations provide access to external functionalities. Since we are researching a new concurrent programming language design, our hope would be to see its widespread adoption in the near future. For this reason ...
Proceedings of the 9th International Conference on Software Paradigm Trends, 2014
We describe a business workflow case study with abnormal behavior management (i.e. recovery) and demonstrate how temporal logics and model checking can provide a methodology to iteratively revise the design and obtain a correct-by-construction system. To do so, we define a formal semantics by giving a compilation of generic workflow patterns into LTL, and we use the bounded model checker Zot to prove the validity of specific properties and requirements. The working assumption is that such a lightweight approach would easily fit into processes that are already in place without the need for a radical change of procedures, tools and people's attitudes. The complexity of formalisms and the invasiveness of methods have proven to be among the major drawbacks and obstacles to the deployment of formal engineering techniques in mundane projects.
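To give a flavor of the kind of property the abstract refers to, here is a minimal sketch of checking the classic LTL "response" pattern G(p -> F q) over a single finite trace. This is a toy bounded-semantics illustration, not the paper's actual compilation of workflow patterns or the Zot toolchain; the predicate names ("order", "invoice") are hypothetical.

```python
# Toy bounded check of the LTL response pattern G(p -> F q):
# every occurrence of p must eventually be followed by q,
# evaluated within the finite trace (bounded semantics).

def holds_response(trace, p, q):
    """True iff every step containing p is followed (at or after
    that step) by some step containing q."""
    for i, step in enumerate(trace):
        if p in step and not any(q in later for later in trace[i:]):
            return False
    return True

# A workflow run: every 'order' must eventually be answered by an 'invoice'.
good = [{"order"}, {"check"}, {"invoice"}, {"order"}, {"invoice"}]
bad  = [{"order"}, {"check"}, {"invoice"}, {"order"}]  # last order unanswered

print(holds_response(good, "order", "invoice"))  # True
print(holds_response(bad, "order", "invoice"))   # False
```

A real bounded model checker explores all traces of a model up to a bound rather than testing one trace, but the pass/fail criterion per trace is the same idea.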
International Journal of Systems and Service-Oriented Engineering, 2014
Web Services provide interoperable mechanisms for describing, locating and invoking services over the Internet; composition further enables the building of complex services out of simpler ones for complex B2B applications. While current studies on these topics are mostly focused, from the technical viewpoint, on standards and protocols, this paper investigates the adoption of formal methods, especially for composition. We logically classify and analyze three different (but interconnected) kinds of important issues towards this goal, namely foundations, verification and extensions. The aim of this work is to identify the proper questions on the adoption of formal methods for dependable composition of Web Services, not necessarily to find the optimal answers. Nevertheless, we still try to propose some tentative answers based on our proposal for a composition calculus, which we hope can stimulate a proper discussion.
This paper reviews the major lessons learnt during two significant pilot projects by Bosch Research during the DEPLOY project. Principally, the use of a single formalism, even when it comes together with a rigorous refinement methodology like Event-B, cannot offer a complete solution. Unfortunately (but not unexpectedly), we cannot offer a panacea to cover every phase from requirements to code; in fact any specific formalism or language (or tool) should be used only where and when it is really suitable and not necessarily (and ...
Road traffic in big cities is becoming increasingly chaotic. The use of public transportation is an alternative that can improve this scenario, since it diminishes the number of private vehicles on the roads. However, to improve its quality it is important to understand the problems people face when using this kind of transportation. This paper presents the results of a survey conducted with public transportation users that investigates how collaborative systems based on social networks and collective intelligence can support sharing information with passengers. The results show that there is a scarcity of ways to obtain real-time information related to public transportation, and that the use of social network applications and collective intelligence is an interesting way to share and obtain this kind of information.
Three formalisms of different kinds (VDM, Maude, and basic CCSdp) are evaluated for their suitability for the modelling and verification of dynamic software reconfiguration, using as a case study the dynamic reconfiguration of a simple office workflow for order processing. The research is ongoing, and initial results are reported.
Access to global information is of primary importance in a global world. The Internet contains a huge amount of documents and has great potential as a news medium, but the key lies in the mechanism by which information is accessed. This paper describes a novel idea: combining the potential of both social networks and search engines. The project is still at a preliminary stage, and this paper should be regarded as work in progress. We describe here the basic ideas behind a trust ranking algorithm based on the activities and networking performed by users on a social network. We motivate the need for Polidoxa, the combined social network and search engine, and we finally describe its advantages over traditional media, traditional search engines like Google, and social networks such as Facebook.
Nowadays, the acquisition of trustable information is increasingly important in both professional and private contexts. However, establishing what information is trustable and what is not is a very challenging task. For example, how can information quality be reliably assessed? How can sources' credibility be fairly assessed? How can gatekeeping processes be found trustworthy when filtering out news and deciding the ranking and priorities of traditional media? An Internet-based solution to this ancient human problem is being studied: it is called Polidoxa, from the Greek "poly", meaning "many" or "several", and "doxa", meaning "common belief" or "popular opinion". The old problem is tackled by combining ancient philosophies and processes with truly modern tools and technologies, which is why this work required a collaborative and interdisciplinary joint effort from researchers with very different backgrounds and institutes with significantly different agendas. Polidoxa aims at offering: 1) a trust-based search engine algorithm, which exploits the stigmergic behaviours of the users' network; 2) a trust-based social network, where the notion of trust derives from network activity; and 3) a holonic system for bottom-up self-protection and social privacy. By presenting the Polidoxa solution, this work also describes the current state of traditional media as well as newer ones, providing an accurate analysis of major search engines such as Google and social networks (e.g., Facebook). The advantages that Polidoxa offers, compared to these, are also clearly detailed and motivated. Finally, a Twitter application (Polidoxa@twitter) that enables experimentation with basic Polidoxa principles is presented.
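To illustrate what an activity-based trust ranking could look like, here is a minimal sketch in the spirit of point 1) above: a user's trust score is repeatedly redistributed along weighted interaction edges, damped toward a uniform baseline. The names, weights, and damping scheme are illustrative assumptions, not the actual Polidoxa algorithm.

```python
# Hypothetical trust-propagation sketch: trust flows along interaction
# edges in proportion to interaction frequency, with damping so that
# every user retains a small baseline score. Not the Polidoxa algorithm.

def trust_rank(interactions, iterations=20, damping=0.85):
    """interactions[u] = {v: weight} means user u interacts with v."""
    users = set(interactions) | {v for nb in interactions.values() for v in nb}
    score = {u: 1.0 / len(users) for u in users}
    for _ in range(iterations):
        new = {u: (1 - damping) / len(users) for u in users}
        for u, nbrs in interactions.items():
            total = sum(nbrs.values())
            for v, w in nbrs.items():
                # u passes on its damped score, split by interaction weight
                new[v] += damping * score[u] * w / total
        score = new
    return score

network = {
    "alice": {"bob": 3, "carol": 1},
    "bob":   {"carol": 2},
    "carol": {"alice": 1},
}
ranks = trust_rank(network)
print(max(ranks, key=ranks.get))  # the most-trusted user in this toy network
```

Because every user here has outgoing interactions, the total score is conserved across iterations, so the scores behave like a probability distribution over users.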
This paper introduces different views for understanding problems and faults, with the goal of defining a method for the formal specification of systems. The idea of Layered Fault Tolerant Specification (LFTS) is proposed to make the method extensible to fault tolerant systems. The principle is to layer the specification in different levels: the first for the normal behavior, and the others for the abnormal. The abnormal behavior is described in terms of an Error Injector (EI), which represents a model of the erroneous interference coming from the environment. This structure is inspired by the notion of an idealized fault tolerant component, but the combination of LFTS and EI, using Rely/Guarantee reasoning to describe their interaction, can be considered a novel contribution. Progress toward this method and this way of organizing fault tolerant specifications has been made by experimenting on case studies, and an example is presented.
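The layering idea can be pictured with a minimal executable sketch, assuming a hypothetical sensor-processing component: layer 0 gives the normal behavior, layer 1 adds recovery for abnormal inputs, and an error injector models omission faults from the environment. The rely condition (faults are omissions only) and guarantee (an output for every input) are stated as comments; all names are illustrative, not the paper's notation.

```python
# LFTS-style sketch (hypothetical names): normal layer + recovery layer,
# with an Error Injector modelling environmental interference.
import random

def normal_layer(reading):
    """Layer 0: normal behaviour, assumes a valid reading."""
    return reading * 2

def recovery_layer(reading, last_good):
    """Layer 1: abnormal behaviour -- substitute the last good value
    when the environment has dropped the reading."""
    return normal_layer(last_good if reading is None else reading)

def error_injector(reading, fault_rate, rng):
    """EI: drops a reading with probability fault_rate.
    Rely condition: faults are omissions only (no corrupted values)."""
    return None if rng.random() < fault_rate else reading

rng = random.Random(42)  # seeded so the fault pattern is reproducible
last_good, outputs = 0, []
for reading in [1, 2, 3, 4, 5]:
    observed = error_injector(reading, fault_rate=0.3, rng=rng)
    outputs.append(recovery_layer(observed, last_good))
    if observed is not None:
        last_good = observed

# Guarantee: an output is produced for every input, faulty or not.
print(len(outputs) == 5)
```

In Rely/Guarantee terms, the recovery layer's guarantee holds only under the injector's rely condition; a corrupting (rather than omitting) injector would require a further layer.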
Papers by Manuel Mazzara