International Journal of Computer Applications, Dec 17, 2015
A large number of product reviews are available on the internet, and classifying these reviews is a difficult task. Sentiment classification is one of the ongoing research areas in the text mining field and is used to classify the polarity of reviews. In this paper, we survey different techniques for sentiment classification.
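The survey covers many techniques; as one concrete illustration of polarity classification, here is a minimal sketch using a Naive Bayes classifier over TF-IDF features (scikit-learn). The toy reviews and labels are invented for the example.

```python
# Minimal polarity-classification sketch: Naive Bayes over TF-IDF features.
# The tiny review set below is invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = ["great product, works well", "terrible, broke in a week",
           "excellent value for money", "very poor quality, do not buy"]
labels = ["positive", "negative", "positive", "negative"]

# Vectorize the text and train a Naive Bayes polarity classifier.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(reviews, labels)

print(model.predict(["works great, excellent quality"]))
```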
This is an exclusive approach to steganography using reversible texture synthesis. The texture synthesis process resamples a smaller texture image to synthesize a new texture image with a similar local appearance and an arbitrary size. The idea weaves the texture synthesis process into steganography to conceal secret messages. Instead of using an existing cover image to hide messages, the scheme conceals the source texture image and embeds secret messages by means of texture synthesis, allowing us to extract both the secret messages and the source texture. The method has several benefits: it offers an embedding capacity that is directly proportional to the size of the stego texture image; the steganographic approach is not likely to be defeated by steganalytic algorithms; and the reversible capability allows recovery of the source texture. It provides these embedding capabilities, produces visually plausible texture images, and recovers the source texture.

Keywords: Steganography, Data embedding, Texture synthesis.

I. INTRODUCTION
In most of the previous methods, an existing image is used as the cover image, and it is distorted when the secret message is embedded. This introduces two drawbacks [1]. First, it limits the message embedding capacity, because more embedding produces more image distortion, so either the embedding capacity or the quality of the cover image is reduced. Second, the hidden message in the stego image may be exposed by an image steganalytic algorithm. This survey paper covers a texture synthesis process that resamples a small texture image, which may be captured or drawn, to generate a new texture image of arbitrary size and similar appearance. To hide the source texture and the secret messages, it introduces a new steganographic method based on this texture synthesis process: instead of using a cover image to hide the secret message, the algorithm uses texture synthesis to embed the message and conceal the source texture. It first creates a stego image from the source texture, which gives the benefit of reversibility. This approach has three advantages. First, since texture synthesis can produce texture images of arbitrary size, the embedding capacity becomes proportional to the size of the stego texture image. Second, the steganographic approach is unlikely to be defeated by steganalytic algorithms, because instead of changing the contents of an existing image, it composes the stego texture image from the source texture. Third, the scheme offers reversibility, the ability to recover the original source texture. The reversible process generates a source texture image that is exactly the same as the original and visually plausible, giving the chance to apply a second round of steganography for greater secrecy of the message.

II. LITERATURE SURVEY
A. Line-Based Cubism-like Approach [2]
Nowadays, the theme of automatic generation of art images using computers has raised the interest of many researchers. In that paper the authors design new algorithms for producing art images by the method of stroke-based rendering, an automatic method for producing non...
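To make the embedding-by-synthesis idea concrete, here is a toy sketch in which message bits select which candidate patch fills each output location. This is a loose illustration under stated assumptions, not the paper's algorithm: the source texture is a random stand-in, and recovering the bits would require regenerating the same candidate lists from the shared source and seed, loosely mirroring the reversibility described above.

```python
# Toy sketch: during patch-based synthesis, each output location is filled by
# choosing among several candidate patches, and the *choice* encodes bits.
# Illustrative simplification only, not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)
source = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in source texture
P = 8                                                         # patch size

def candidate_patches(img, n=16):
    """Sample n candidate PxP patches from the source texture."""
    ys = rng.integers(0, img.shape[0] - P, n)
    xs = rng.integers(0, img.shape[1] - P, n)
    return [img[y:y+P, x:x+P] for y, x in zip(ys, xs)]

def embed_bits(bits, n_patches):
    """Synthesize a strip of patches; each choice among 16 candidates hides 4 bits."""
    out = []
    for i in range(n_patches):
        cands = candidate_patches(source)
        idx = int("".join(str(b) for b in bits[4*i:4*i+4]), 2)  # 4 bits -> index 0..15
        out.append(cands[idx])
    return np.hstack(out)

message = [1, 0, 1, 1, 0, 0, 1, 0]       # 8 bits -> 2 patches
stego_strip = embed_bits(message, n_patches=2)
print(stego_strip.shape)                  # (8, 16)
```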
Advances in Intelligent Systems and Computing, Sep 30, 2022
Tumor classification is a challenging research problem in the domain of medical science. A brain tumor is critically serious and poses a threat to life. We propose a novel, fully automated brain tumor classification CAD system using deep learning algorithms. The implemented model involves steps such as pre-processing, segmentation, feature engineering, and classification. In segmentation, we focus on finding the ROI of infected regions using dynamic thresholding. We feed the pre-processed MR image to a robust CNN model for automatic feature extraction using the pre-trained ResNet50 model. The extracted features are further reduced using Principal Component Analysis (PCA). We design a Long Short-Term Memory (LSTM) classifier to overcome the vanishing gradient problem, since the vanishing gradient problem of neural network classifiers leads to classification errors.
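A hedged sketch of the described pipeline follows: pre-trained ResNet50 feature extraction, PCA reduction, then an LSTM classifier (TensorFlow/Keras and scikit-learn). Input shapes, layer sizes, and the toy labels are assumptions; the abstract does not give the paper's exact configuration.

```python
# Sketch: ResNet50 features -> PCA -> LSTM classifier. Shapes and sizes are
# illustrative assumptions, and the training data is random stand-in data.
import numpy as np
from sklearn.decomposition import PCA
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.applications.resnet50 import preprocess_input
from tensorflow.keras import layers, models

# Stand-in batch of pre-processed MR slices (RGB, 224x224).
images = np.random.rand(16, 224, 224, 3).astype("float32") * 255
labels = np.random.randint(0, 2, size=16)  # toy binary labels

# 1) Automatic feature extraction with the pre-trained ResNet50 backbone.
backbone = ResNet50(weights="imagenet", include_top=False, pooling="avg")
features = backbone.predict(preprocess_input(images))      # (16, 2048)

# 2) Dimensionality reduction with PCA.
reduced = PCA(n_components=8).fit_transform(features)      # (16, 8)

# 3) LSTM classifier; each reduced vector is treated as a short sequence.
seq = reduced.reshape(16, 8, 1)
clf = models.Sequential([
    layers.Input(shape=(8, 1)),
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),
])
clf.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
clf.fit(seq, labels, epochs=2, verbose=0)
```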
International Journal of Innovative Research in Computer and Communication Engineering, Jul 30, 2016
In today's era of technology and advancement, the central issue in Data Mining is Information Retrieval (IR), or rather Relevant Information Retrieval. Information Retrieval has become a challenging research area because of the rapid growth of computer and Internet technologies. This rapid growth generates a huge amount of text-based data, for example web pages, e-mails, and news articles. Manually handling such a large corpus is complicated, expensive, and infeasible. Many techniques are available to handle and organize this data, such as text classifiers and text categorization; the main issue is information retrieval from the categorized or classified text. There are many approaches and algorithms for IR, such as clustering, fusion, tokenization, virtual-center-based matching functions, K-Nearest Neighbor, the Naive Bayes algorithm, Genetic Algorithms, and Decision Trees. The main objective is to extract relevant information from a large amount of data in order to increase the effectiveness, scalability, reliability, and efficiency of the Information Retrieval System. The preliminary focus is to load the corpus onto HDFS using the MapReduce technique. After MapReduce, preprocessing of the documents is done, which includes tokenization, stop-word removal, and stemming. The performance of the IRS is measured by the fitness values of a Genetic Algorithm and an Adaptive Genetic Algorithm.
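The preprocessing stage lends itself to a short sketch: tokenization, stop-word removal, and stemming. The tiny stop-word list below is a stand-in for a full one, and whitespace splitting stands in for a proper tokenizer; in the described system this step runs after the MapReduce load onto HDFS.

```python
# Document preprocessing sketch: tokenize, drop stop words, stem (NLTK Porter).
from nltk.stem import PorterStemmer

STOP_WORDS = {"the", "is", "of", "and", "a", "to", "in", "for"}  # tiny stand-in list
stemmer = PorterStemmer()

def preprocess(document):
    tokens = document.lower().split()                    # tokenization
    tokens = [t for t in tokens if t not in STOP_WORDS]  # stop-word removal
    return [stemmer.stem(t) for t in tokens]             # stemming

print(preprocess("Retrieval of relevant information in the growing web"))
# -> ['retriev', 'relev', 'inform', 'grow', 'web']
```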
This paper presents a technique to discover plant diseases; we concentrate on agricultural plants only. The method relies on edge detection, color, and histogram matching, which is essentially content-based image retrieval. In India, farmers suffer from issues arising from several types of plant illnesses. Sometimes even plant experts are unable to identify a disease, which creates a need for recognition of the exact type of disease and can lead to crop spoilage if it is not treated at the appropriate time. Hence, taking advantage of available technology for automatic detection and categorization of agricultural plant diseases has become crucial. This paper uses the Canny edge detection procedure, compares its accuracy on diseased plants with the Sobel edge detection procedure, and obtains histograms for both healthy and unhealthy plant leaves. After histogram construction, we compare all the stor...
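A sketch of the comparison described above: Canny and Sobel edge maps plus hue-histogram matching between a healthy reference leaf and a test leaf (OpenCV). The file paths are placeholders, and correlation as the histogram similarity metric is an assumption.

```python
# Leaf-comparison sketch: Canny vs. Sobel edges plus hue-histogram matching.
import cv2
import numpy as np

healthy = cv2.imread("healthy_leaf.jpg")   # placeholder paths
test = cv2.imread("test_leaf.jpg")

gray = cv2.cvtColor(test, cv2.COLOR_BGR2GRAY)
canny_edges = cv2.Canny(gray, 100, 200)                      # Canny edge map
sobel_edges = cv2.Sobel(gray, cv2.CV_64F, 1, 1, ksize=3)     # Sobel edge map

def hue_hist(img):
    """Normalized hue histogram of a BGR image."""
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    h = cv2.calcHist([hsv], [0], None, [50], [0, 180])
    return cv2.normalize(h, h).flatten()

# Higher correlation -> the test leaf resembles the healthy reference.
score = cv2.compareHist(hue_hist(healthy), hue_hist(test), cv2.HISTCMP_CORREL)
print("similarity to healthy leaf:", score)
```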
2014 Fourth International Conference on Communication Systems and Network Technologies, 2014
Cloud computing is an emerging, on-demand, Internet-based technology. It provides many services over the Internet, such as software, hardware, networking, data storage, and computing and memory management. Despite this, companies are withholding their business from cloud computing because of the fear of data leakage: a lack of security control mechanisms and weaknesses in safeguards lead to many security issues in cloud computing. In the Internet world, Confidentiality, Integrity, and Availability (CIA) form a model designed to guide policies for information security within a business organization. The CIA factors [1] are used for data classification, which is done by the client before storing the data on the server; these factors are the major part of the CIA algorithm. A client who wants to upload data for storage needs to give values for the factors C (Confidentiality), I (Integrity), and A (Availability). The value of C depends on the level of secrecy at each level of data access and prevents unauthorized access to files; the value of I is based on how much accuracy is provided, whether unauthorized modification must be prevented, and the reliability of the information; and the value of A depends on the accessibility of the file. With the help of the proposed method, a priority rating is calculated. Data with a higher priority rating is considered critical, and 3D security is recommended for that data or file. After this phase, the data received from the cloud user for storage uses a 3-dimensional system for accessibility.
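A hedged sketch of the client-side classification step: the client supplies C, I, and A values, a priority rating is computed, and a high rating triggers the 3D security recommendation. The weights and threshold below are illustrative assumptions, since the abstract does not give the exact rating formula.

```python
# CIA priority-rating sketch; weighting and threshold are assumed values.
def priority_rating(c, i, a, weights=(0.4, 0.35, 0.25)):
    """Combine the client-supplied C, I, A values (each rated 1-5)."""
    return weights[0] * c + weights[1] * i + weights[2] * a

CRITICAL_THRESHOLD = 3.5  # assumed cut-off for "critical" data

def storage_policy(c, i, a):
    rating = priority_rating(c, i, a)
    return "3D security recommended" if rating >= CRITICAL_THRESHOLD else "standard storage"

print(storage_policy(c=5, i=4, a=3))  # highly confidential file -> 3D security
print(storage_policy(c=1, i=2, a=2))  # low-sensitivity file -> standard storage
```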
Web content mining is the process of extracting knowledge from the content of documents or their descriptions; it is one of the types of Web mining. Data mining is the computational process of discovering patterns in large data sets using methods at the intersection of artificial intelligence (AI), machine learning, statistics, and database systems. Web mining is one such data mining method: the extraction of interesting and potentially useful patterns and implicit information from artifacts or activity related to the World Wide Web (WWW). This theory can be applied in various domains, such as the agriculture sector. Keywords: Web mining, Support Vector Machine, AI, SGDM, Hue, SIFT, neural network classifier, DWT.
2020 International Conference for Emerging Technology (INCET), 2020
One of the life-threatening diseases affecting the brain is brain cancer, and detection of the tumor at an early stage is essential in order to save lives. One of the techniques for detecting brain tumors is based on medical images, and deep learning is being used for this purpose; deep learning techniques have been shown to reduce error compared with early human diagnosis of the disease. Brain tumor diagnosis in particular requires high accuracy, since minute errors in diagnosis may lead to complications. In medical image processing, brain tumor detection remains a demanding job: the brain image is complicated, making the tumor hard to detect, and several kinds of noise further affect image accuracy. Image segmentation and MRI (magnetic resonance imaging) techniques have become helpful medical diagnostic tools for the examination of the brain and other medical images. Image segmentation is an influential area of medical image processing. It is applied to bring out t...
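As a minimal illustration of the segmentation step such systems build on, here is an Otsu-threshold sketch on a brain MR slice (OpenCV). The file name is a placeholder, and Otsu's method stands in for whatever segmentation scheme the paper actually uses.

```python
# Threshold-based segmentation sketch for a brain MR slice (Otsu stand-in).
import cv2

slice_gray = cv2.imread("brain_mri_slice.png", cv2.IMREAD_GRAYSCALE)  # placeholder

# Denoise, then separate bright (candidate tumor) tissue from background.
blurred = cv2.GaussianBlur(slice_gray, (5, 5), 0)
_, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Keep only reasonably large connected components as candidate regions.
n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
candidates = [i for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] > 100]
print(f"{len(candidates)} candidate region(s) found")
```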
Cloud computing is an emerging, on-demand, internet-based technology. It provides a variety of services over the internet, such as software, hardware, data storage, and infrastructure. To let authorized customers utilize these services and to secure data on the cloud, a strict security checking system is necessary. The 3D security checking system, using a multi-level authentication technique, generates the password in multiple levels to access the cloud services. With strong graphical-password techniques, this system is able to thwart shoulder-surfing attacks, tempest attacks, brute-force attacks, dictionary attacks, and many more client-side attacks.
This is an exclusive method for steganography, used for communicating or sharing private data with the help of multimedia carriers. Many steganographic approaches are available; the main goal of using steganography is to hide the data, or secret data, from an attacker. This paper explains steganography in texture images. Texture synthesis is a process in which we first generate a large texture image from a smaller texture image; we then merge the synthesized texture with the steganographic data to embed the secret message and generate an arbitrary-size image. Using image quilting we can enhance the quality of the image. Image quilting is a simple image-based method of generating novel visual appearance in which a new image is synthesized by stitching together small patches of existing images. First, we use quilting as a fast and very simple texture synthesis algorithm which produces surprisingly good results for a wide range of te...
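A toy sketch of the quilting idea: build an output strip by repeatedly choosing the source patch whose left overlap best matches (minimum sum of squared differences) the right edge of what has been placed so far. Real image quilting also cuts a minimum-error boundary through the overlap; simple averaging stands in for that step here, and the source texture is a random stand-in.

```python
# Image-quilting sketch: pick patches by minimum SSD over the overlap region.
import numpy as np

rng = np.random.default_rng(1)
source = rng.random((64, 64))         # stand-in source texture
P, OV = 16, 4                         # patch size and overlap width

def random_patch():
    y, x = rng.integers(0, 64 - P, 2)
    return source[y:y+P, x:x+P]

def best_patch(prev, candidates=50):
    """Pick the candidate whose left overlap best matches prev's right edge."""
    target = prev[:, -OV:]
    cands = [random_patch() for _ in range(candidates)]
    errors = [np.sum((c[:, :OV] - target) ** 2) for c in cands]  # SSD
    return cands[int(np.argmin(errors))]

row = random_patch()
for _ in range(3):                    # stitch 3 more patches onto the row
    nxt = best_patch(row)
    # Average the overlap (a real quilter would cut a minimum-error seam).
    row = np.hstack([row[:, :-OV], (row[:, -OV:] + nxt[:, :OV]) / 2, nxt[:, OV:]])
print(row.shape)                      # (16, 52)
```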
Journal of Emerging Technologies and Innovative Research, 2020
Social network sites involve billions of users worldwide. User interactions with these social sites, such as Twitter, have tremendous and occasionally undesirable implications for daily life. The major social networking sites have become target platforms for spammers to disperse a large amount of irrelevant and harmful information. Twitter, one of the most popular microblogging services of all time, is commonly used to share an unreasonable amount of spam. Fake users send unwanted tweets to promote services or websites, which not only affects legitimate users but also wastes resources. Furthermore, the possibility of spreading invalid information to users through false identities has increased, resulting in malicious content. Recently, the detection of spammers and the identification of fake users and fake tweets on Twitter has become an important area of research in online social networks (OSN)...
Wireless Sensor Networks are used in many applications. A WSN consists of sensor nodes, which are battery powered; since it is not possible to replenish a sensor node's battery, it is important to conserve node energy. While plenty of work in this field has been done for homogeneous WSNs, research on heterogeneous WSNs remains largely unexplored. The proposed approach uses the ACO methodology to maximize the lifetime of a heterogeneous WSN. The proposed algorithm uses two types of pheromones to find the shortest path from source to destination; based on pheromone and heuristic information, the ants seek an optimal path through the network. Further, if the data packet to be sent is very small, the non-participating nodes are put into a low-power state rather than a sleep state, which helps reduce the power consumed during transitions between the active and sleep states. The results show that the approach is effective and efficient in finding high-quality solutions for maximizing the lifetime...
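The ant's next-hop choice can be sketched with the standard ACO transition rule: probability proportional to pheromone^alpha times heuristic^beta, with heuristic = 1/distance. The two-pheromone scheme and energy terms of the proposed algorithm are not reproduced here; the graph and constants are toy values.

```python
# Standard ACO transition-rule sketch: roulette-wheel next-hop selection.
import random

# Toy graph: neighbors of the current node with (distance, pheromone) per edge.
edges = {"a": (2.0, 0.8), "b": (5.0, 1.5), "c": (1.0, 0.3)}
ALPHA, BETA = 1.0, 2.0

def choose_next_hop(edges):
    """Pick a neighbor with probability proportional to tau^alpha * (1/d)^beta."""
    weights = {n: (tau ** ALPHA) * ((1.0 / d) ** BETA)
               for n, (d, tau) in edges.items()}
    total = sum(weights.values())
    r, acc = random.uniform(0, total), 0.0
    for node, w in weights.items():           # roulette-wheel selection
        acc += w
        if r <= acc:
            return node
    return node

print(choose_next_hop(edges))  # short, well-reinforced edges are chosen more often
```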
A Wireless Sensor Network is made up of numerous tiny sensor nodes, which are battery powered, so maximizing the lifetime of the network is very important. Inspired by the novel approach of the Ant Colony Optimization algorithm for solving combinatorial problems, this paper proposes an ACO-based approach to the energy-efficient coverage problem in wireless sensor networks. The algorithm finds the maximum number of disjoint sensor-node sets that satisfy sensing coverage and network connectivity. A probabilistic sensor detection model is used to obtain the intensity of sensed data. The approach is effective and efficient in increasing the lifetime of the wireless sensor network.
There are a large number of websites and web portals in the marketplace for buying or selling objects, that is, any kind of product; examples include Flipkart, eBay, Amazon, etc. Due to the increasing number of these web portals and their applications, text mining and integration have become important areas of data mining that deal with classification. The problem with these portals is data integration: each product and its description must be placed and identified in a record. The straightforward method to automate this process is to learn a classifier for the text of each document, then classify the texts and predict their categories. In this paper we use a powerful method based on parallel text classification. To attack the above problem, the availability of source data or documents can help produce better predictions. We formulated the above problem to the best of our knowledge and, in our study, applied a classifier with a parallel approach. Our analysis and empirical evaluation show a substantial improvement in results and output wi...
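A hedged sketch of the parallel classification idea: a text classifier is trained once, then batches of product records are classified across worker processes with joblib. The classifier, categories, and toy records are stand-ins; the excerpt does not specify the paper's exact parallelization scheme.

```python
# Parallel text-classification sketch: distribute prediction over batches.
from joblib import Parallel, delayed
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = ["android phone 64gb", "cotton t-shirt large",
               "usb-c charging cable", "denim jeans slim fit"]
train_labels = ["electronics", "clothing", "electronics", "clothing"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# Classify batches of product records in parallel worker processes.
batches = [["bluetooth speaker"], ["wool sweater"], ["hdmi cable 2m"]]
results = Parallel(n_jobs=2)(delayed(model.predict)(b) for b in batches)
print([r[0] for r in results])
```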
Association rule mining (ARM) is one of the most important aspects of data mining. It aims at searching for interesting relationships among items in a large data set or database and discovers association rules among large itemsets. The importance of ARM is increasing with the demand for finding frequent patterns in large data sources. Researchers have developed many algorithms and techniques for generating association rules; among the existing techniques, Apriori and frequent-pattern methods are the most efficient and scalable options. The Apriori algorithm mines frequent itemsets by generating candidate itemsets, but this takes a lot of time and space. To overcome this drawback, the FP-growth algorithm was introduced, which mines frequent itemsets without generating candidate sets; the obstacle is that it generates a massive number of conditional FP-trees. An alternative approach is the systolic tree, due to its faster execution and high throughput. In this paper the comparativ...
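A minimal sketch contrasting the two miners on a toy transaction set, using the mlxtend implementations: both return the same frequent itemsets and differ only in whether candidate sets are generated during the search.

```python
# Apriori vs. FP-growth on a toy transaction set (mlxtend implementations).
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, fpgrowth

transactions = [["bread", "milk"], ["bread", "butter"],
                ["milk", "butter"], ["bread", "milk", "butter"]]

# One-hot encode the transactions.
te = TransactionEncoder()
df = pd.DataFrame(te.fit(transactions).transform(transactions), columns=te.columns_)

# Apriori: generates candidate itemsets level by level.
print(apriori(df, min_support=0.5, use_colnames=True))
# FP-growth: mines the same itemsets without candidate generation.
print(fpgrowth(df, min_support=0.5, use_colnames=True))
```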
Nowadays the internet has become essential, as it provides many facilities to its users. There are many social networking sites that let users share their views; people share their thoughts about politics and social issues as well as about different products. It is common practice today for a user to check online reviews of a product before purchasing it. Multiple sites deal with these reviews: they provide ratings for products and show comparisons between different products. Some enterprises attempt to create fake reviews to influence customer behaviour and increase their sales, but identifying those fake reviews is a difficult task for customers. In today's competitive world it is necessary for any enterprise to maintain its reputation in the market, so it is necessary for both the enterprise and the customer to identify manipulated reviews. This paper studies different approaches for identifying manipulated reviews and proposes...
The M-band wavelet decomposition, a direct generalization of the standard 2-band wavelet decomposition, is applied to the problem of unsupervised segmentation of different texture images. An orthogonal, linear-phase M-band wavelet transform is used to decompose the image into M×M channels. These bandpass sections are combined to obtain different scales and orientations in the frequency plane. Texture features are extracted by applying a nonlinear transformation to each bandpass section and computing a measure of energy in a window around each pixel of the filtered texture images. The window size is then selected adaptively depending on the frequency content of the images. Unsupervised texture segmentations derived from combinations of different clustering and feature extraction techniques are compared. Keywords: Discrete Wavelet Transform, M-band WT, Discrete Wavelet Packet, K-Means, FarthestFirst.
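A sketch of the described feature pipeline, with the standard 2-band DWT (PyWavelets) standing in for the M-band transform: filter the image into subbands, apply a nonlinearity (absolute value), compute windowed energy around each pixel, and cluster the per-pixel features with K-means. The fixed window size replaces the paper's adaptive selection.

```python
# Wavelet-energy texture segmentation sketch (2-band DWT as a stand-in).
import numpy as np
import pywt
from scipy.ndimage import uniform_filter
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
image = rng.random((128, 128))                 # stand-in texture mosaic

# 1) Subband decomposition (LL plus the three detail bands).
LL, (LH, HL, HH) = pywt.dwt2(image, "db4")

# 2) Nonlinearity + windowed energy measure around each pixel.
W = 9                                          # assumed fixed window size
feats = [uniform_filter(np.abs(band), size=W) for band in (LL, LH, HL, HH)]
X = np.stack([f.ravel() for f in feats], axis=1)

# 3) Unsupervised segmentation by clustering the per-pixel feature vectors.
labels = KMeans(n_clusters=2, n_init=10).fit_predict(X)
segmentation = labels.reshape(LL.shape)
print(segmentation.shape)                      # (67, 67) for a 128x128 input with db4
```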