Papers by Santwana Sagnika
Advances in Intelligent Systems and Computing
Machine Learning and Information Processing
Journal of Engineering Science and Technology Review
International Journal of Computer Applications
India's heritage texts have had a long history of being mined for knowledge of language and culture by Christian missionaries to India, colonial officers of the East India Company and the British Raj, German and other European and American Indologists, and later by native scholars driven by nationalist sentiments. It was during their investigative exercises that a vast body of India's heritage texts was recovered and made the subject of rigorous study. A large number of editions in English translation, as well as in modern Indian vernacular languages, started appearing on the scene. The focus then was primarily on patthoddhar [retrieval of the 'ur'-text] or making a shuddhasanskarana [correct edition]. The exercise was purely manual and time-consuming, and concentrated on a limited number of texts. But a vast treasure of ancient knowledge still lies in India's palm-leaf manuscripts, waiting to be discovered, deciphered and interpreted for contemporary readers and scholars. It is impossible to ignore the ubiquity of Information Technology-based tools and the scope that they offer for large-scale data mining. Of late, a large body of historical texts is being made available digitally by repositories and institutions worldwide. The time is ripe for digitally inspired editions, beginning with studies in corpus linguistics. This paper throws light on the challenges to be addressed in preparing a digital historical corpus edition of the Sarala Mahabharata, a local version of Vyasa's famous Sanskrit Mahabharata, from Odisha in eastern India.
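As a toy illustration of the corpus-linguistics starting point the abstract points to (not taken from the paper), a first pass over a digitized text is typically tokenization and word-frequency profiling; the sample line below is invented, and a real Odia corpus would need script-aware tokenization:

```python
# A first corpus-linguistics step on a digitized text: tokenize and
# count word frequencies. Purely illustrative; the sample string is a
# placeholder, not material from the Sarala Mahabharata corpus.
from collections import Counter
import re

digitized_text = "sample line from a digitized manuscript page"  # placeholder
tokens = re.findall(r"\w+", digitized_text.lower())
frequencies = Counter(tokens)
print(frequencies.most_common(5))
```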
Periodica Polytechnica Electrical Engineering and Computer Science
Cognitive radio systems have taken a leading position in wireless communication technology. With most communication taking place through multi-carrier systems, the allocation of available spectrum to the various carriers is a prominent issue. Since cognitive systems provide an environment of dynamic spectrum allocation, it becomes necessary to perform allocation swiftly, with due consideration of parameters like power consumption, fair distribution and minimal error. This paper considers a Particle Swarm Optimization (PSO)-based approach, popularly used for solving large problems with complex solution spaces to reach an optimal solution within feasible time. The spectrum allocation problem has been solved using PSO with a view to maximizing the total transfer rate of the system, within specified constraints of maximum error rate, maximum power consumption and minimum transfer rate per user. The results have been compared with the existing Genetic ...
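A minimal sketch of how such a PSO formulation might look, assuming a toy log-rate model and penalty-based handling of the power budget and per-user rate floor (the paper's actual fitness function, constraints and encoding are not reproduced here):

```python
# Minimal PSO sketch for spectrum/power allocation (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
N_USERS, N_PARTICLES, N_ITER = 4, 20, 100
MAX_POWER, MIN_RATE = 10.0, 0.5           # hypothetical constraint bounds

def fitness(p):
    rate = np.log2(1.0 + p)               # toy per-user rate model
    penalty = 100.0 * max(p.sum() - MAX_POWER, 0.0)        # power budget
    penalty += 100.0 * np.maximum(MIN_RATE - rate, 0).sum()  # rate floor
    return rate.sum() - penalty           # maximize total transfer rate

pos = rng.uniform(0.1, 3.0, (N_PARTICLES, N_USERS))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(N_ITER):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    # Standard velocity update: inertia + cognitive + social terms.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, MAX_POWER)
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print("best allocation:", gbest.round(2), "fitness:", round(fitness(gbest), 3))
```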
Journal of Engineering Science and Technology Review
In the last few years, the size and functionality of software have experienced massive growth. Cost estimation plays a major role in the whole software development cycle; it is a necessary task that should be done before the development cycle begins and may run throughout the software life cycle. It helps in making accurate estimates for a project so that appropriate charges and a delivery date can be obtained. It also helps in identifying the effort required to develop the application, which informs whether the project is accepted or declined. Since the late 1990s, Agile Software Development (ASD) methodologies have shown high success rates for projects due to their capability of coping with changing customer requirements. Commencing product development using agile methods is a challenging task due to the live and dynamic nature of ASD, so accurate cost estimation is a must for such development models in order to fine-tune the delivery date and estimate, while keeping software quality as the highest priority. This paper presents a systematic survey of cost estimation in ASD, which will be useful for agile practitioners to understand current trends in cost estimation in ASD.
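As a back-of-the-envelope illustration of the kind of estimate such methods formalize (the figures and formula below are invented, not drawn from the survey), a simple velocity-based forecast looks like this:

```python
# Hypothetical velocity-based forecast, a common baseline in agile cost
# estimation; all numbers here are illustrative assumptions.
backlog_points = 120             # remaining story points
past_velocities = [18, 22, 20]   # points completed in recent sprints
cost_per_sprint = 15_000         # assumed team cost per sprint

velocity = sum(past_velocities) / len(past_velocities)
sprints_left = -(-backlog_points // int(velocity))  # ceiling division
print(f"~{sprints_left} sprints, ~${sprints_left * cost_per_sprint:,} cost")
```

Real agile estimators re-run this kind of forecast every sprint, which is how such models cope with the changing requirements the survey emphasizes.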
International Journal of Computer Applications, 2017
Sentiment Analysis (SA) is one of the most broadly studied applications of Natural Language Processing (NLP) and Machine Learning (ML). This field has grown enormously with the advent of Web 2.0. The Internet has long provided a platform for people to express their opinions, emotions and feelings towards products, persons, and life in general. Accordingly, the Internet is nowadays a massive resource of opinion-rich written data. A vital task of sentiment analysis is sentiment classification, which aims to automatically classify opinionated text as negative, positive, or neutral. This paper provides a comparative study of sentiment analysis and its applications, mostly for recommendation systems. Recommender systems have grown into a serious research area since the first paper on collaborative filtering emerged in the 1990s.
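A minimal sketch of the sentiment-classification task described above, using a bag-of-words Naive Bayes pipeline on invented toy data (not the paper's method or dataset):

```python
# Toy sentiment classifier: bag-of-words features + Naive Bayes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = ["great product, loved it", "terrible product, hated it",
               "fantastic service overall", "awful service overall"]
train_labels = ["positive", "negative", "positive", "negative"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(train_texts, train_labels)

print(clf.predict(["loved the fantastic service",
                   "hated the awful product"]))
# -> ['positive' 'negative']
```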
Studies in Big Data, 2016
Studies in Big Data, 2016
Journal of Theoretical and Applied Information Technology
Change detection refers to recognizing dissimilarities arising in the characteristics of an object over a period of time. Widespread application of change detection in areas like remote sensing, machine vision, video compression, military reconnaissance, etc. has made it a demanding area of research. In image processing, detecting changes is an essential and crucial component. Several techniques like image differencing, principal component analysis, object-based methods, visual analysis, etc. have been applied successfully, and some new techniques like clustering, probabilistic change detection, hyperspectral detection, etc. are currently under active research. This paper analyses various traditional and emerging techniques that efficiently detect changes in different kinds of images and explores the suitability of each method for its respective areas.
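As an illustration of the simplest technique the paper surveys, image differencing, a thresholded pixel-wise difference can be sketched as follows (the threshold value and toy images are assumptions):

```python
# Simple image-differencing change detector (illustrative sketch).
import numpy as np

def detect_changes(img_t1, img_t2, threshold=30):
    """Return a boolean change mask for two same-size grayscale images."""
    diff = np.abs(img_t1.astype(np.int16) - img_t2.astype(np.int16))
    return diff > threshold

# Toy 8x8 "images": the second one has a changed 3x3 block.
before = np.full((8, 8), 100, dtype=np.uint8)
after = before.copy()
after[2:5, 2:5] = 200

mask = detect_changes(before, after)
print("changed pixels:", int(mask.sum()))  # -> 9
```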
A transparency is some aspect of a distributed database system that is hidden from the user (programmer, system developer, or application program). A transparency is provided by including some set of mechanisms in the distributed system at the layer interface where the transparency is required. A number of basic transparencies have been defined for distributed systems. This paper presents an overview of different types of transparency, such as access transparency, and three levels of transparency (fragmentation, location, and local mapping). It is important to realize that not all of these are appropriate for every system, or are available at the same level of interface. Since every transparency has an associated cost, how the costs of implementing multiple transparencies interact, and how to reduce operating system and communication stack overhead, remain matters for much research. It considered that if using such type of...
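To make the idea of location transparency concrete (a toy sketch, not from the paper; the catalog, names and sites below are invented), a resolver can hide fragment locations behind logical table names so that application code never references a physical site directly:

```python
# Toy location-transparency layer: a catalog maps logical table names
# to hypothetical (site, fragment) pairs; applications query by name only.
FRAGMENT_MAP = {  # assumed catalog, not a real system's
    "employees": [("site_a", "emp_frag_1"), ("site_b", "emp_frag_2")],
    "projects": [("site_c", "proj_frag_1")],
}

def query(table, predicate):
    """Build the per-fragment queries for a logical table."""
    plans = []
    for site, fragment in FRAGMENT_MAP[table]:
        # A real system would dispatch these over the network;
        # here we just record where the work would go.
        plans.append(f"SELECT * FROM {fragment}@{site} WHERE {predicate}")
    return plans

print(query("employees", "dept = 'R&D'"))
```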
International Journal of Modern Education and Computer Science, 2015
Cloud computing is a popular computing concept that performs processing of huge volumes of data using highly accessible, geographically distributed resources that users can access on a Pay-As-Per-Use basis. The requirements of different users may change, so the amount of processing involved in such a paradigm also changes; sometimes users need very large-scale data processing. Such high-volume processing results in higher computing time and cost, which is undesirable in a good computing model. So there must be some intelligent distribution of users' work over the available resources, resulting in an optimized computing environment. This paper gives a comprehensive survey of such problems and provides a detailed analysis of some of the best scheduling techniques from the domain of soft computing, with their performance in cloud computing.
2015 International Conference on Computational Intelligence and Networks, 2015
International Journal of Computer Applications, 2014
Cloud computing is a popular computing paradigm that performs processing of huge volumes of data using highly available, geographically distributed resources that users can access on a Pay-As-Per-Use basis. In the modern computing environment, where the amount of data to be processed is increasing day by day, the costs involved in the transmission and execution of such data are mounting significantly. Appropriate task scheduling is therefore required to manage the escalating costs of data-intensive applications. This paper analyzes various evolutionary and swarm-based task scheduling algorithms that address the above-mentioned problem, with an objective of the kind sketched below.
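The objective such algorithms optimize can be expressed as a total-cost function over a task-to-resource mapping; the cost matrices, data flows and encoding below are illustrative assumptions, not figures from the paper:

```python
# Illustrative total-cost objective for task-to-resource scheduling.
EXEC_COST = {  # exec cost of each task on each resource (assumed values)
    "t1": {"r1": 4.0, "r2": 6.0},
    "t2": {"r1": 5.0, "r2": 3.0},
    "t3": {"r1": 2.0, "r2": 7.0},
}
TRANSFER_COST = 1.5  # cost per unit of data moved between resources
DATA = {("t1", "t2"): 10.0, ("t2", "t3"): 4.0}  # dependent-task data flows

def total_cost(mapping):
    """mapping: dict task -> resource. Returns execution + transfer cost."""
    cost = sum(EXEC_COST[t][r] for t, r in mapping.items())
    for (src, dst), volume in DATA.items():
        if mapping[src] != mapping[dst]:  # transfer only across resources
            cost += TRANSFER_COST * volume
    return cost

print(total_cost({"t1": "r1", "t2": "r1", "t3": "r1"}))  # co-located -> 11.0
print(total_cost({"t1": "r1", "t2": "r2", "t3": "r1"}))  # split -> 30.0
```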
Advances in Intelligent Systems and Computing, 2014
2014 IEEE International Advance Computing Conference (IACC), 2014
Cloud computing is a new era of network-based computing, where resources are distributed over the network and shared among its users. Any user can use these resources through the Internet on a Pay-As-Per-Use basis. A service used by any user can produce a very large amount of data, in which case the data transfer cost between two dependent resources will be very high. In addition, a complex application can have a large number of tasks, which may increase the total cost of executing that application if it is not scheduled in an optimized way. To overcome these problems, the authors present a Cat Swarm Optimization (CSO)-based heuristic scheduling algorithm to schedule the tasks of an application onto available resources. The CSO heuristic considers both the data transmission cost between two dependent resources and the execution cost of tasks on different resources. The authors experiment with the proposed CSO algorithm using a hypothetical workflow and compare the workflow scheduling results with the existing Particle Swarm Optimization (PSO) algorithm. The experimental results show that (1) CSO gives an optimal task-to-resource (TOR) scheduling scheme that minimizes the total cost, (2) CSO improves over the existing PSO in terms of number of iterations, and (3) CSO ensures fair load distribution over the available resources.
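A minimal sketch of how CSO's two behaviours, seeking (local mutation around a cat's position) and tracing (movement toward the global best), might drive such a discrete task-to-resource search; the parameters, random cost matrix and encoding are assumptions, not the paper's exact formulation:

```python
# Minimal Cat Swarm Optimization sketch for task-to-resource scheduling.
import random

random.seed(0)
N_TASKS, RESOURCES = 5, ["r1", "r2", "r3"]
# Assumed per-(task, resource) cost; a real run would add transfer costs.
COST = {(t, r): random.uniform(1, 10) for t in range(N_TASKS) for r in RESOURCES}

def cost(cat):  # a "cat" encodes a schedule: task index -> resource
    return sum(COST[(t, r)] for t, r in enumerate(cat))

def seek(cat):  # seeking mode: mutate a few copies, keep the best
    candidates = [cat[:] for _ in range(5)]
    for c in candidates:
        c[random.randrange(N_TASKS)] = random.choice(RESOURCES)
    return min(candidates + [cat], key=cost)

def trace(cat, best):  # tracing mode: pull components toward the global best
    return [b if random.random() < 0.5 else c for c, b in zip(cat, best)]

cats = [[random.choice(RESOURCES) for _ in range(N_TASKS)] for _ in range(10)]
best = min(cats, key=cost)
for _ in range(50):
    # Mixture ratio: most cats seek locally, a few trace the best.
    cats = [trace(c, best) if random.random() < 0.2 else seek(c) for c in cats]
    best = min(cats + [best], key=cost)

print("best schedule:", best, "cost:", round(cost(best), 2))
```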
Advances in Intelligent Systems and Computing, 2014
Multimedia Tools and Applications