Automakers are continuously interested in noise, vibration and harshness (NVH) solutions for high-quality vehicles in general and quieter passenger cabins in particular. In this paper the modal parameters of a car windshield are determined using experimental modal analysis. The frequency response functions have been measured using the impact technique: an impact hammer has been used to generate the impulsive force applied at various locations evenly spread over the windshield surface, and a fixed mini-accelerometer has been used to measure the structural response. Starting from the set of FRFs measured in free-free conditions, the Least Squares Complex Exponential parameter estimation method implemented in the LMS Test.Lab software has been used to extract the modal parameters. The validation phase confirmed the correctness of the identified modal parameters.
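For context, the central quantity in such measurements is the frequency response function (FRF), the ratio of the measured response spectrum to the applied force spectrum; the modal parameters are obtained by fitting the measured FRFs with a pole-residue model. The formulation below is the standard textbook one, with symbols chosen here rather than taken from the paper:

H_{jk}(\omega) = \frac{X_j(\omega)}{F_k(\omega)} \approx \sum_{r=1}^{N} \left( \frac{A^{(r)}_{jk}}{i\omega - \lambda_r} + \frac{\overline{A}^{(r)}_{jk}}{i\omega - \overline{\lambda}_r} \right), \qquad \lambda_r = -\zeta_r\,\omega_r + i\,\omega_r\sqrt{1-\zeta_r^{2}},

where X_j is the response measured at point j, F_k the force applied at point k, and each pole \lambda_r yields a natural frequency \omega_r and damping ratio \zeta_r; the Least Squares Complex Exponential method estimates these poles from the measured FRF set.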
This paper presents a study of the nonmonotonic consequence relation which models the skeptical reasoning formalised by constrained default logic. The nonmonotonic skeptical consequence relation is defined using the sequent calculus axiomatic system. We study the formal properties desirable for a “good” nonmonotonic relation: supraclassicality, cut, cautious monotony, cumulativity, absorption, distribution.
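As a reminder of what some of these properties state (standard formulations from the nonmonotonic-reasoning literature, written with \vdash for classical consequence and |\!\sim for the skeptical nonmonotonic relation; the paper's exact statements may differ):

\begin{align*}
\text{Supraclassicality:} &\quad \Gamma \vdash \varphi \;\Rightarrow\; \Gamma \mathrel{|\!\sim} \varphi \\
\text{Cut:} &\quad \Gamma \mathrel{|\!\sim} \varphi,\; \Gamma \cup \{\varphi\} \mathrel{|\!\sim} \psi \;\Rightarrow\; \Gamma \mathrel{|\!\sim} \psi \\
\text{Cautious monotony:} &\quad \Gamma \mathrel{|\!\sim} \varphi,\; \Gamma \mathrel{|\!\sim} \psi \;\Rightarrow\; \Gamma \cup \{\varphi\} \mathrel{|\!\sim} \psi
\end{align*}

Cumulativity is the combination of cut and cautious monotony: adding a nonmonotonic consequence to the premises changes nothing.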
Default logics represent a simple but powerful class of nonmonotonic formalisms. The main computational problem specific to these logical systems is a search problem: finding all the extensions (sets of nonmonotonic theorems, or beliefs) of a default theory. GADEL is an automated system based on a heuristic approach to the classical problem of computing default extensions, applying the principles of genetic algorithms to solve it. The purpose of this paper is to extend this heuristic approach to computing all types of default extensions: classical, justified, constrained and rational.
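A generic genetic-algorithm skeleton of the kind such a heuristic search builds on is sketched below; the bitmask encoding (one bit per default: applied / not applied) and the placeholder fitness function are illustrative assumptions only, not GADEL's actual encoding or scoring.

# Sketch of a generic genetic algorithm over candidate sets of applied defaults.
# Encoding and fitness are placeholders, not GADEL's actual design.
import random

N_DEFAULTS = 8
POP_SIZE = 20
GENERATIONS = 50

def fitness(candidate):
    # Placeholder: a real fitness would measure how close the selected defaults
    # come to generating a valid extension (consistency, applicability, maximality).
    return sum(candidate)

def crossover(a, b):
    cut = random.randint(1, N_DEFAULTS - 1)
    return a[:cut] + b[cut:]

def mutate(c, rate=0.1):
    return [1 - bit if random.random() < rate else bit for bit in c]

population = [[random.randint(0, 1) for _ in range(N_DEFAULTS)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children
print(max(population, key=fitness))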
Nonmonotonic reasoning is successfully formalized by the class of default logics. In this paper we introduce an axiomatic system for credulous reasoning in rational default logic. Based on classical sequent calculus and anti-sequent calculus, an abstract characterization of credulous nonmonotonic default inference in this variant of default logic is presented.
2008 10th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing, 2008
The family of default logics formalizes default reasoning using nonmonotonic inference rules called defaults. In this paper we propose a uniform abstract characterization of credulous default inference associated with all versions (classical, justified, constrained, rational) of propositional default logic, using credulous default sequent calculi. These axiomatic systems combine sequent and anti-sequent calculus rules for propositional logic with reduction rules specific to the application of the defaults.
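For readers less familiar with the terminology, a default is commonly written as an inference rule with a prerequisite, a justification and a consequent; the "birds fly" rule below is the standard textbook example, not one taken from this paper:

\delta = \frac{\alpha : \beta}{\gamma} \qquad \text{e.g.} \qquad \frac{\mathit{bird}(x) : \mathit{flies}(x)}{\mathit{flies}(x)}

read as: if \alpha is derivable and \beta is consistent with the current beliefs, then conclude \gamma. The reduction rules mentioned above govern when such a rule may be applied during a sequent derivation.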
Emotions play a central role in both writing and understanding literary works, and poetry is a genre rich in emotional content, vivid imagery and abstract language. This paper proposes a clustering-based approach to mining emotional patterns in Mihai Eminescu's poetry in an unsupervised manner. Lexicon-based emotion features are used as input to the clustering algorithm. The resulting clusters are assessed against manually assigned characteristics of the poems in the form of literary themes. There is a partial overlap between affective and thematic content, consistent with literary evaluations of the same works. Computational approaches have the advantage of being objective and replicable, with unsupervised techniques such as clustering representing a valuable tool for the exploration of literary works. Nonetheless, no specific emotional patterns, as determined by the proposed method, can be fully associated with particular literary themes.
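A minimal sketch of the kind of pipeline described above, assuming a lexicon mapping words to emotion labels and using k-means from scikit-learn; the tiny lexicon, the normalization and the number of clusters are illustrative assumptions, not the paper's actual setup.

# Illustrative sketch: lexicon-based emotion features + k-means clustering of poems.
# The lexicon entries and the choice of 3 clusters are hypothetical.
import numpy as np
from sklearn.cluster import KMeans

EMOTIONS = ["joy", "sadness", "fear", "anger"]
lexicon = {"dor": "sadness", "lacrima": "sadness", "teama": "fear", "zambet": "joy"}  # hypothetical entries

def emotion_features(poem_tokens):
    # Count lexicon hits per emotion, normalized by poem length.
    counts = np.zeros(len(EMOTIONS))
    for tok in poem_tokens:
        emo = lexicon.get(tok.lower())
        if emo is not None:
            counts[EMOTIONS.index(emo)] += 1
    return counts / max(len(poem_tokens), 1)

poems = [["dor", "si", "lacrima"], ["zambet", "de", "copil"], ["teama", "noptii"]]
X = np.vstack([emotion_features(p) for p in poems])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)  # cluster assignment per poem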
Software authorship attribution, defined as the problem of software authentication and resolution of source code ownership, is of major relevance in the software engineering field. Authorship analysis of source code is more difficult than the classic task on literary texts, but it would be of great use in various software development activities such as software maintenance, software quality analysis or project management. This paper addresses the problem of code authorship attribution and introduces, as a proof of concept, a new supervised classification model, AutoSoft, for identifying the developer of a given piece of code. The proposed model is composed of an ensemble of autoencoders that are trained to encode and recognize the programming style of software developers. An extension of the AutoSoft classifier, able to recognize an unknown developer (a developer not seen during training), is also discussed and evaluated. Experiments conducted on software programs collected...
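A rough sketch of the ensemble-of-autoencoders idea: one autoencoder per developer is trained to reconstruct feature vectors of that developer's code, and a new sample is attributed to the developer whose autoencoder reconstructs it with the lowest error. Here a scikit-learn MLPRegressor stands in for the autoencoder and the feature vectors are random placeholders; the real AutoSoft model and its code representation may differ substantially.

# Sketch only: per-developer "autoencoders" (MLPs trained to reproduce their input),
# attribution by minimum reconstruction error. Data below is random placeholder data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
developers = {"dev_a": rng.normal(0.0, 1.0, (50, 16)),
              "dev_b": rng.normal(0.5, 1.0, (50, 16))}  # hypothetical style-feature vectors

autoencoders = {}
for dev, X in developers.items():
    ae = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    ae.fit(X, X)  # learn to reconstruct the developer's own feature vectors
    autoencoders[dev] = ae

def attribute(sample):
    # Pick the developer whose autoencoder reconstructs the sample best.
    errors = {dev: float(np.mean((ae.predict(sample[None, :]) - sample) ** 2))
              for dev, ae in autoencoders.items()}
    return min(errors, key=errors.get)

print(attribute(rng.normal(0.5, 1.0, 16)))  # likely attributed to "dev_b"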
In this study it is proven that the hrebs used in the denotation analysis of texts and Cohesion Chains (defined as a fusion of Lexical Chains and Coreference Chains) represent similar linguistic tools. This result makes it possible to extend to Cohesion Chains (CCs) some important indicators, such as the kernel of CCs, the topicality of a CC, text concentration, CC-diffuseness and mean diffuseness of the text. To our knowledge, such indicators have not previously been introduced or used in the Lexical Chains or Coreference Chains literature. Similarly, some applications of CCs in the study of a text (for example segmentation or summarization) could be realized starting from hrebs. As an illustration of the similarity between hrebs and CCs, a detailed analysis of the poem "Lacul" by Mihai Eminescu is given.
2017 13th IEEE International Conference on Intelligent Computer Communication and Processing (ICCP), 2017
Semantic lexicons are widely used resources in the implementation of solutions for a number of Natural Language Processing tasks. However, the availability of such lexicons is limited for most languages other than English, including Romanian. The aim of this paper is to bridge this gap by processing and improving a translated lexicon of Romanian words tagged with Plutchik's eight emotions and polarity data. We present the development of RoEmoLex (Romanian Emotion Lexicon) from its early stages to an in-depth analysis of its content using Formal Concept Analysis. The results show that the lexicon could represent a stepping stone in the creation of efficient emotion analysis systems for the Romanian language.
This paper compares two kinds of features for clustering literary poems by the same author, the Romanian poet Mihai Eminescu. Using the Precision, Recall, Rand Index, Relative Precision and Purity measures, we conclude that the topics of the poems are better characterized by phonemes as features than by geometric properties of the rank-frequency sequence of word forms (described by six indicators: V/N, A, Λ, Var(Λ), Gini, Var(Gini)).
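For reference, two of the measures named above have the standard definitions below (textbook formulations; the Relative Precision variant used in the paper is not reproduced here):

\text{Purity}(\Omega, C) = \frac{1}{N}\sum_{k}\max_{j}\,|\omega_k \cap c_j|, \qquad \text{RI} = \frac{TP + TN}{\binom{N}{2}},

where \Omega = \{\omega_k\} are the clusters, C = \{c_j\} the reference classes (here, the poem topics), N the number of poems, TP the number of pairs of poems grouped together in both partitions and TN the number of pairs separated in both.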
2018 IEEE 14th International Conference on Intelligent Computer Communication and Processing (ICCP), 2018
Mental health represents an important topic in the discussion about public health, as the prevalence of disorders increases and their consequences become harder to manage. One way to address this problem is the use of computational methods to process social media data, which provides valuable insight into individuals' everyday activity. In particular, there has been a focus on studying the language of mental health in textual content, as word choice, discourse structure and style have significant psychological implications. This paper aims to quantify the language of mental disorders in a set of posts and comments collected from a Romanian medical forum and to explore illness-specific characteristics through classification tasks. Results suggest that both vocabulary use and form of textual expression are largely discriminative for individual mental disorders.
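A minimal sketch of the kind of classification task the abstract refers to, using a TF-IDF representation and a linear classifier from scikit-learn; the texts and labels are invented placeholders, and the paper's actual features (vocabulary plus stylistic measures) are richer than this.

# Illustrative sketch: TF-IDF features + logistic regression to separate posts by disorder label.
# Texts and labels below are invented placeholders, not data from the Romanian forum.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["nu pot dormi si ma simt mereu obosit",
         "am atacuri de panica atunci cand ies din casa",
         "nu ma mai bucura nimic in ultima vreme",
         "inima imi bate tare si imi este frica fara motiv"]
labels = ["depression", "anxiety", "depression", "anxiety"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)
print(clf.predict(["ma simt trist si obosit tot timpul"]))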