One of the major problems in Wireless Sensor Networks (WSNs) is localization, which concerns how an area is covered by the sensor nodes. In this study, the problem is formulated as a decision problem that selects the best location for every sensor in the field. The Butterfly Optimization Algorithm (BOA) is proposed to estimate the locations of all sensors. The BOA is simulated with between 25 and 150 sensors and a varying number of anchor nodes. The distance between sensors and anchors is measured by Received Signal Strength (RSS), so this strategy is known as RSS-BOA. The obtained results show that the proposed algorithm is more accurate than the baseline BOA approach in terms of estimated sensor location and average error.
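The RSS-to-distance step can be sketched with the log-distance path-loss model, a common choice that we assume here since the abstract does not state the propagation model; `tx_power_dbm` (the reference power at 1 m) and `path_loss_exp` are hypothetical parameters, not values from the paper:

```python
def rss_to_distance(rss_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
    """Invert the log-distance path-loss model
    RSS(d) = P0 - 10 * n * log10(d / d0), with d0 = 1 m,
    to turn a received power (dBm) into a distance estimate (m)."""
    return 10 ** ((tx_power_dbm - rss_dbm) / (10 * path_loss_exp))

# A node hearing an anchor at exactly the 1 m reference power is ~1 m away.
d = rss_to_distance(-40.0)
```

These anchor-to-node distance estimates are what the BOA search then fits candidate positions against.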
Software-Defined Networking (SDN) is an evolving architecture that provides scalability, flexibility, and efficient network management. However, optimal controller placement faces many problems that affect the performance of the overall network. To resolve the Multi-controller SDN (MC-SDN) placement problem, we propose an approach that uses a hybrid metaheuristic algorithm to improve network performance. Initially, the proposed SDN network is constructed based on graph theory, which improves the connectivity and flexibility between switches and controllers. After that, controller selection is performed by choosing an optimal controller from multiple controllers based on controller features using the firefly optimization algorithm (FA), which improves network performance. Finally, multi-controller placement is performed to reduce the communication latency between switches and controllers. Here, multiple controllers are placed by considering locat...
International Journal of Image and Graphics, Jul 21, 2023
Image editing technologies have advanced to the point where they can significantly enhance an image, but they can also be used maliciously. Colorization is a recent image editing technique that applies realistic colors to grayscale photos. However, it can also be applied to natural color images for malicious purposes (e.g., to confuse object recognition systems that rely on object colors for recognition). Image forensics is a well-developed field that examines photos under specified conditions to establish confidence and authenticity. This work proposes a new fake colorized image detection approach based on a special Residual Network (ResNet) architecture. ResNets are a kind of Convolutional Neural Network (CNN) architecture that has been widely adopted and applied to various tasks. First, the input image is reconstructed via a special image representation that combines color information from three separate color spaces (HSV, Lab, and YCbCr); the reconstructed images are then used to train the proposed ResNet model. Experimental results demonstrate that the proposed method generalizes well and is significantly robust in revealing fake colorized images generated by various colorization methods.
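The multi-space reconstruction can be illustrated per pixel. The sketch below mixes HSV and YCbCr channels only (the Lab conversion is longer and is omitted), using the standard BT.601 YCbCr formulas; the abstract does not say which channels the paper actually selects, so the choice here is purely illustrative:

```python
import colorsys

def ycbcr(r, g, b):
    """BT.601 full-range RGB -> (Y, Cb, Cr), inputs and outputs in [0, 1]."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.5 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def mixed_representation(pixel):
    """Map an RGB pixel to channels drawn from two colour spaces
    (hue and saturation from HSV, chroma Cb/Cr from YCbCr), as a
    stand-in for the paper's three-space reconstruction."""
    r, g, b = pixel
    h, s, _v = colorsys.rgb_to_hsv(r, g, b)
    _y, cb, cr = ycbcr(r, g, b)
    return (h, s, cb, cr)
```

Stacking such channels over the whole image yields the recombined input that the ResNet is trained on.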
International Journal of Computational Intelligence and Applications
Nowadays, images have become one of the most popular forms of communication as image editing tools have evolved. Image manipulation, particularly image colorization, has become easier, making it harder to differentiate fake colorized images from genuine ones. Furthermore, the RGB space is no longer considered the best option for color-based detection techniques due to the high correlation between its channels and its blending of luminance and chrominance information. This paper proposes a new approach for fake colorized image detection based on a novel image representation created by combining color information from three separate color spaces (HSV, Lab, and YCbCr) and selecting the most distinct channels from each color space to reconstruct the image. Features are extracted from the proposed image representation via transfer learning using the pre-trained ResNet50 CNN model. A Support Vector Machine (SVM) is used for classification due to ...
Internet-based platforms such as social media hold a great deal of big data in the form of text, audio, video, and images. Sentiment Analysis (SA) of this big data has become a field of computational study. SA examines texts such as messages or posts to determine whether a sentiment is negative or positive, and it is also crucial for the development of opinion mining systems. SA combines Natural Language Processing (NLP) techniques with data mining approaches to develop intelligent systems. We therefore propose an approach that classifies sentiments into two classes, positive and negative. A Multilayer Perceptron (MLP) classifier is used in this document classification system. The present research aims to provide an effective approach to improving the accuracy of SA systems. The proposed approach is applied to and tested on two datasets, a Twitter dataset and a movie review dataset; the accuracies achieved reach 85% and 99%, respectively.
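Before an MLP can classify posts, each text must become a fixed-length numeric vector. A minimal bag-of-words featurizer (our sketch; the abstract does not detail the paper's feature extraction) looks like this:

```python
from collections import Counter

def build_vocab(docs):
    """Map every word seen in the corpus to a fixed column index."""
    vocab = sorted({w for doc in docs for w in doc.lower().split()})
    return {w: i for i, w in enumerate(vocab)}

def bow_vector(doc, vocab):
    """Count-based bag-of-words vector; out-of-vocabulary words are dropped."""
    counts = Counter(doc.lower().split())
    return [counts.get(w, 0) for w in vocab]

docs = ["great movie loved it", "terrible movie hated it"]
vocab = build_vocab(docs)
x = bow_vector("loved this great movie", vocab)
```

Each resulting vector has one column per vocabulary word, so every post maps to the same input width for the MLP.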
The multi-controller placement problem (MCPP) represents one of the most challenging issues in software-defined networks (SDNs). High-efficiency, scalable, optimized solutions can be achieved for a given position in such networks, thereby enhancing various aspects of programmability, configuration, and construction. In this paper, we propose a model called simulated annealing for multi-controllers in SDN (SA-MCSDN) to solve the problem of placing multiple controllers in appropriate locations by considering estimated distances and distribution times among the controllers, as well as between controllers and switches (C2S). We simulated the proposed mathematical model using Network Simulator 3 (NS-3) in a Linux Ubuntu environment to extract the performance results. We then compared the results of this single-solution algorithm with those obtained by our previously proposed multi-solution harmony search particle swarm optimization (HS-PSO) algorithm. The results reveal interesting aspect...
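The core of a simulated-annealing placement search of this kind can be sketched generically. The cost function, neighbour move, and toy one-controller-on-a-line instance below are illustrative assumptions, not the paper's actual model:

```python
import math
import random

def anneal(cost, neighbor, init, t0=10.0, cooling=0.95, steps=2000, seed=1):
    """Generic simulated-annealing loop: always accept improvements,
    accept a worse candidate with probability exp(-delta / T)."""
    rng = random.Random(seed)
    cur, cur_cost = init, cost(init)
    best, best_cost = cur, cur_cost
    temp = t0
    for _ in range(steps):
        cand = neighbor(cur, rng)
        cand_cost = cost(cand)
        delta = cand_cost - cur_cost
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            cur, cur_cost = cand, cand_cost
            if cur_cost < best_cost:
                best, best_cost = cur, cur_cost
        temp *= cooling
    return best, best_cost

# Toy instance: place one controller on a line of ten switches so the
# total controller-to-switch distance (a stand-in for latency) is minimal.
switches = list(range(10))
cost = lambda p: sum(abs(p - s) for s in switches)
neighbor = lambda p, rng: min(9, max(0, p + rng.choice((-1, 1))))
best, best_cost = anneal(cost, neighbor, init=0)
```

For this convex toy instance the search settles at the median switch position, the latency-minimizing placement.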
2021 14th International Conference on Developments in eSystems Engineering (DeSE), 2021
Wireless sensor networks have recently emerged as an effective way of monitoring inhospitable or remote physical environments. One of the main challenges in these networks lies in the constrained energy and computational resources available to sensor nodes. Deployment of the sensor nodes is the initial step in establishing a sensor network. In this paper, the goal of the location estimation method is to estimate the locations of the sensor nodes with respect to a set of anchor nodes whose locations are known. The network consists of battery-powered sensors that gather data and route it to a central Base Station (BS). Cluster-based routing uses the limited energy of the sensor devices efficiently by selecting a Cluster Head (CH) in every round based on the number, energy, and priority of the nodes. The Sensor Mode Setup Phase (SMSF) also helps increase the network lifetime, and we observe that the lifetime increases as the number of sensor nodes increases.
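Energy-aware CH selection reduces to a one-liner if, as a simplifying assumption, residual energy alone decides the round's cluster head (the paper also weighs node count and priority, which this sketch omits):

```python
def select_cluster_head(residual_energy):
    """Choose this round's cluster head as the node with the highest
    residual energy; `residual_energy` maps node id -> joules left."""
    return max(residual_energy, key=residual_energy.get)
```

Rotating the CH role toward energy-rich nodes each round is what spreads the transmission burden and extends network lifetime.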
Software-defined networking (SDN) has emerged in response to increasing requirements for new networks and the expansion of Internet coverage. Modern needs exceed the limitations of traditional networks, so to simplify management, SDN is proposed as a promising paradigm that separates the control and data planes, allowing network configuration to be programmed. SDN deployment and applications are directly affected by the controller position. Single or multiple controllers are used in SDN architecture to enable programmable, flexible, and scalable configurations. Multiple controllers are essential in current SDN, and various solutions have recently been developed to improve scalability and placement selection. In this study, the Controller Placement Problem (CPP) is explored using objective optimisation with the proposed algorithms. An overview of SDN issues and the controller's role is provided through the three-plane architecture, with a focus on scalability and reliability. ...
Metaheuristic techniques have become considerably popular for solving feature selection (FS) problems due to their flexibility and their ability to avoid the local optimum problem. Feature selection is an important and essential means of tackling classification problems by choosing an optimal feature subset according to a certain criterion. FS is used to reduce dimensionality and remove noise from data, which improves learning speed, rule simplicity, data visualization, and predictive accuracy. The Salp Swarm Algorithm (SSA) is a recent metaheuristic that emulates the innate behaviour of the salp chain. In this study, a new FS approach applies the native SSA in the machine learning domain to select the optimal feature group in wrapper mode. Subsequently, SSA is hybridised with a mutation operator. Mutation is embedded as an internal operator to maintain diversity and improve the exploration ability of the SSA. The performance of SSA with...
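A wrapper-mode FS search needs a fitness that trades classification quality against subset size. The weighted form below is a common formulation that we assume here; `alpha` is a hypothetical weight, not a value taken from the paper:

```python
def fs_fitness(accuracy, n_selected, n_total, alpha=0.99):
    """Wrapper FS objective: weigh classification error against the
    fraction of features kept; lower values are better."""
    return alpha * (1.0 - accuracy) + (1.0 - alpha) * (n_selected / n_total)
```

With this objective, a candidate subset is rewarded for high accuracy first and for small size second, which is exactly the two-goal trade-off wrapper methods search over.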
2021 International Conference on Communication & Information Technology (ICICT), 2021
Early diagnosis of diseases is necessary and important to help doctors treat and control diseases such as breast cancer. High-accuracy machine learning techniques should be available to assist clinicians in making accurate diagnoses. In this work we tackle this problem using the Coronavirus algorithm, which was used for feature optimization, while J48 and PART were used for classification. The proposed approach was tested on the Wisconsin Diagnosis Breast Cancer (WDBC) dataset from the UCI Machine Learning Repository. Experimental results show that our approach generates competitive results compared with previously available approaches, obtaining an accuracy of 94.01% with the J48 algorithm and 94.18% with the PART algorithm.
Communications in Computer and Information Science, 2018
Metaheuristics are becoming increasingly popular for solving real-world problems, leading to a new branch of optimization called metaheuristic optimization. They have been applied to all areas of data mining, planning and scheduling, design, machine intelligence, and feature selection (FS). FS is used to remove noise from data and reduce dimensionality; these properties can give rise to simpler rules, faster learning, better predictive accuracy, and easier data visualization. The Salp Swarm Algorithm (SSA) is a recent metaheuristic optimization method that mimics the innate behaviour of the salp swarm chain. In this study, SSA is hybridised with simulated annealing (SA). SA is employed as an internal function to improve exploitation by occasionally accepting a solution of worse quality than the current one. The performance of the suggested approach is evaluated on 16 datasets from the UCI repository, including two high-dimensional ones, and compared with the native SSA and other FS approaches, including ALO and PSO. The experimental results clearly prove the adequacy of the proposed approach for searching the feature space for optimal features. SSA-SA gave excellent performance as a multi-objective optimisation, achieving two contradictory goals: maximal classification accuracy with a minimal number of features on all used datasets.
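The SA ingredient of this hybrid is the Metropolis acceptance rule: improvements are always kept, while a worse solution survives with a temperature-dependent probability. A minimal sketch:

```python
import math
import random

def accept(delta, temp, rng=random):
    """Metropolis rule: always accept an improvement (delta <= 0),
    otherwise accept with probability exp(-delta / temp)."""
    return delta <= 0 or rng.random() < math.exp(-delta / temp)
```

As the temperature falls, the probability of keeping a worse solution shrinks, shifting the search from exploration toward pure exploitation.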
Bulletin of Electrical Engineering and Informatics, 2020
Advanced technology in the internet and social media, communication companies, health care records, and cloud computing applications has made the data around us grow dramatically and continuously. This ever-renewing big data contains sensitive information such as passwords, PINs, credential numbers, and secret identifiers, which must be protected by strong secrecy procedures. The present paper proposes a secret multi-dimensional symmetric cipher with six dimensions as a cubic algorithm. The proposed algorithm uses the substitution-permutation network (SPN) structure and supports a high data processing rate in six directions. The introduced algorithm includes six symmetric round transformations for encrypting the plaintext, where each dimension represents an independent algorithm for big data manipulation. The proposed cipher uses parallel encryption of 128-bit data blocks for each dimension in order to handle large vo...
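A full six-dimensional 128-bit cipher is beyond a sketch, but one SPN round (key mixing, S-box substitution, bit permutation) on a toy 16-bit state shows the structure. The S-box is borrowed from the PRESENT cipher and the permutation is invented for illustration; neither is from the paper:

```python
# 4-bit S-box from the PRESENT cipher; the bit permutation is invented here.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]
PERM = [0, 4, 8, 12, 1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15]

def spn_round(state, round_key):
    """One SPN round on a 16-bit state: XOR key mixing, 4-bit S-box
    substitution on each nibble, then a fixed bit permutation."""
    state ^= round_key
    nibbles = [(state >> (4 * i)) & 0xF for i in range(4)]
    state = sum(SBOX[n] << (4 * i) for i, n in enumerate(nibbles))
    out = 0
    for i in range(16):
        out |= ((state >> i) & 1) << PERM[i]
    return out
```

The proposed cipher iterates round transformations of this general shape, independently per dimension, over 128-bit blocks.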
International Journal of Electrical and Computer Engineering (IJECE), 2020
In this study, we investigate the capability of the big bang-big crunch (BBBC) metaheuristic for managing operational problems, including combinatorial optimization problems. The BBBC is inspired by the evolution theory of the universe in physics and astronomy. Its two main phases are the big bang and the big crunch. The big bang phase creates a population of random initial solutions, while the big crunch phase shrinks these solutions into one elite solution represented by a mass center. This study examines the BBBC's effectiveness on assignment and scheduling problems, where it was enhanced by incorporating an elite pool of diverse, high-quality solutions; a simple descent heuristic as a local search method; implicit recombination; Euclidean distance; a dynamic population size; and elitism strategies. Together, these strategies provide a balanced search over a diverse, good-quality population. The investigation is conducted by comparing the ...
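The two BBBC phases can be sketched directly from their description above. The 1/cost weighting for the mass centre is a standard choice that we assume here (it requires strictly positive costs), and the uniform scatter radius is illustrative:

```python
import random

def big_crunch(population, costs):
    """Big-crunch phase: contract the population to its cost-weighted
    centre of mass (weights 1/cost; costs assumed strictly positive)."""
    weights = [1.0 / c for c in costs]
    total = sum(weights)
    dim = len(population[0])
    return [sum(w * p[d] for w, p in zip(weights, population)) / total
            for d in range(dim)]

def big_bang(center, radius, n, rng):
    """Big-bang phase: scatter n fresh solutions uniformly around the centre."""
    return [[x + rng.uniform(-radius, radius) for x in center] for _ in range(n)]

points = big_bang([0.0, 0.0], 1.0, 5, random.Random(0))
```

Alternating the two phases, with the scatter radius typically shrinking over iterations, pulls the population toward ever better mass centres.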
Medical data classification is an important factor in improving diagnosis and treatment and can assist physicians in making decisions about serious diseases by collecting symptoms and medical analyses. In this work, hybrid classification optimization methods, namely the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and the Fireworks Algorithm (FWA), are proposed for enhancing the classification accuracy of an Artificial Neural Network (ANN). The enhancement process is tested through two experiments. First, the proposed algorithms are applied to five benchmark medical datasets from the University of California, Irvine (UCI) repository. The model with the best results is then used in the second experiment, which focuses on tuning the parameters of the selected algorithm by choosing different numbers of iterations in ANNs with different numbers of hidden layers. The enhanced ANN with the three optimization algorithms is then tested on a biological gene sequence big dataset obtained from The Cancer Genome Atlas (TCGA) repository. GA and FWA are statistically significant while PSO is not, and GA outperforms both PSO and FWA. The methodology is successful and registers improvements at every step, as significant results are obtained.
International Journal of Distributed Sensor Networks, 2012
Localization is one of the key techniques in wireless sensor networks. Location estimation methods can be classified into target/source localization and node self-localization. For target localization, we mainly introduce the energy-based method; we then investigate node self-localization methods. Since the widespread adoption of wireless sensor networks, localization methods differ across applications, and several challenges arise in special scenarios. In this paper, we present a comprehensive survey of these challenges: localization in non-line-of-sight conditions, node selection criteria for localization in energy-constrained networks, scheduling sensor nodes to optimize the trade-off between localization performance and energy consumption, cooperative node localization, and localization algorithms in heterogeneous networks. Finally, we introduce the evaluation criteria for localization in wireless sensor networks.
In general, course timetabling refers to the assignment process that assigns events (courses) to given rooms and timeslots subject to a list of hard and soft constraints. It is a challenging task for educational institutions. In this study, we employ a great deluge algorithm with a Kempe chain neighbourhood structure as an improvement algorithm. A Round Robin (RR) algorithm is used to control the selection of neighbourhood structures within the great deluge algorithm. The performance of our approach is tested on eleven benchmark datasets (one large, five medium, and five small problems). Experimental results show that our approach generates competitive results compared with previously available approaches. Possible extensions of this simple approach are also discussed.
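The great deluge acceptance rule admits any candidate that either improves on the current solution or stays below a steadily falling water level. A minimal sketch on a toy integer problem; the decay rate and instance are illustrative, and the Kempe chain and Round Robin components are omitted:

```python
import random

def great_deluge(cost, neighbor, init, decay=0.999, steps=5000, seed=0):
    """Accept a candidate if it beats the current solution or stays
    below a water level that falls geometrically each step."""
    rng = random.Random(seed)
    cur, cur_cost = init, cost(init)
    level = cur_cost
    best, best_cost = cur, cur_cost
    for _ in range(steps):
        cand = neighbor(cur, rng)
        cand_cost = cost(cand)
        if cand_cost <= cur_cost or cand_cost <= level:
            cur, cur_cost = cand, cand_cost
            if cand_cost < best_cost:
                best, best_cost = cand, cand_cost
        level *= decay
    return best, best_cost

# Toy instance: minimise (x - 3)^2 over the integers with a +/-1 move.
best, best_cost = great_deluge(lambda x: (x - 3) ** 2,
                               lambda x, rng: x + rng.choice((-1, 1)), 10)
```

The falling level plays the role temperature plays in simulated annealing: early on, worsening moves such as a Kempe chain swap that raises soft-constraint penalties are still tolerated, and later only improvements pass.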
Constructing a university course timetable is a very difficult task in which a set of events has to be scheduled into timeslots and located in suitable rooms. The objective of the course timetabling problem is to satisfy the hard constraints and minimize the violation of soft constraints. In ...
The major problems in a Wireless Sensor Networks (WSNs) is the localization problem, that relates... more The major problems in a Wireless Sensor Networks (WSNs) is the localization problem, that relates to how an area covers by the sensor nodes. In this study, the problem formulates as the decision problem, that takes the best location for all sensors in the sensor field. Butterfly Optimization Algorithm (BOA), proposes to calculate the estimate locations for all sensors. Simulating the BOA with using number of sensors from 25 to 150 sensors and number of the anchor nodes. The distance between sensors and anchors measures by Received Signal Strength (RSS) so, this strategy is known as RSS-BOA. The obtained results shed that, the performance of the proposed algorithm is more accurate in comparing with BOA approach in the term sensor's location and the average error.
Software-Defined Networking (SDN) is a developing architecture that provides scalability, flexibi... more Software-Defined Networking (SDN) is a developing architecture that provides scalability, flexibility, and efficient network management. However, optimal controller placement faces many problems, which affect the performance of the overall network. To resolve the Multi-controller SDN (MC-SDN) that is deployed in the SDN environment, we propose an approach that uses a hybrid metaheuristic algorithm that improves network performance. Initially, the proposed SDN network is constructed based on graph theory, which improves the connectivity and flexibility between switches and controllers. After that, the controller selection is performed by selecting an optimal controller from multiple controllers based on controller features using the firefly optimization algorithm (FA), which improves the network performance. Finally, multi-controller placement is performed to reduce the communication latency between the switch to controllers. Here, multiple controllers are placed by considering locat...
International Journal of Image and Graphics, Jul 21, 2023
Image editing technologies have been advanced that can significantly enhance the image, but can a... more Image editing technologies have been advanced that can significantly enhance the image, but can also be used maliciously. Colorization is a new image editing technology that uses realistic colors to colorize grayscale photos. However, this strategy can be used on natural color images for a malicious purpose (e.g. to confuse object recognition systems that depend on the colors of objects for recognition). Image forensics is a well-developed field that examines photos of specified conditions to build confidence and authenticity. This work proposes a new fake colorized image detection approach based on the special Residual Network (ResNet) architecture. ResNets are a kind of Convolutional Neural Networks (CNNs) architecture that has been widely adopted and applied for various tasks. At first, the input image is reconstructed via a special image representation that combines color information from three separate color spaces (HSV, Lab, and Ycbcr); then, the new reconstructed images have been used for training the proposed ResNet model. Experimental results have demonstrated that our proposed method is highly generalized and significantly robust for revealing fake colorized images generated by various colorization methods.
International Journal of Computational Intelligence and Applications
Nowadays, images have become one of the most popular forms of communication as image editing tool... more Nowadays, images have become one of the most popular forms of communication as image editing tools have evolved. Image manipulation, particularly image colorization, has become easier, making it harder to differentiate between fake colorized images and actual images. Furthermore, the RGB space is no longer considered to be the best option for color-based detection techniques due to the high correlation between channels and its blending of luminance and chrominance information. This paper proposes a new approach for fake colorized image detection based on a novel image representation created by combining color information from three separate color spaces (HSV, Lab, and Ycbcr) and selecting the most different channels from each color space to reconstruct the image. Features from the proposed image representation are extracted based on transfer learning using the pre-trained CNNs ResNet50 model. The Support Vector Machine (SVM) approach has been used for classification purposes due to ...
Internet-based platforms such as social media have a great deal of big data that is available in ... more Internet-based platforms such as social media have a great deal of big data that is available in the shape of text, audio, video, and image. Sentiment Analysis (SA) of this big data has become a field of computational studies. Therefore, SA is necessary in texts in the form of messages or posts to determine whether a sentiment is negative or positive. SA is also crucial for the development of opinion mining systems. SA combines techniques of Natural Language Processing (NLP) with data mining approaches for developing inelegant systems. Therefore, an approach that can classify sentiments into two classes, namely, positive sentiment and negative sentiment is proposed. A Multilayer Perceptron (MLP) classifier has been used in this document classification system. The present research aims to provide an effective approach to improving the accuracy of SA systems. The proposed approach is applied to and tested on two datasets, namely, a Twitter dataset and a movie review dataset; the accuracies achieved reach 85% and 99% respectively.
The multi-controller placement problem (MCPP) represents one of the most challenging issues in so... more The multi-controller placement problem (MCPP) represents one of the most challenging issues in software-defined networks (SDNs). High-efficiency and scalable optimized solutions can be achieved for a given position in such networks, thereby enhancing various aspects of programmability, configuration, and construction. In this paper, we propose a model called simulated annealing for multi-controllers in SDN (SA-MCSDN) to solve the problem of placing multiple controllers in appropriate locations by considering estimated distances and distribution times among the controllers, as well as between controllers and switches (C2S). We simulated the proposed mathematical model using Network Simulator NS3 in the Linux Ubuntu environment to extract the performance results. We then compared the results of this single-solution algorithm with those obtained by our previously proposed multi-solution harmony search particle swarm optimization (HS-PSO) algorithm. The results reveal interesting aspect...
2021 14th International Conference on Developments in eSystems Engineering (DeSE), 2021
Wireless sensor network has emerged recently as an effective way of monitoring inhospitable or re... more Wireless sensor network has emerged recently as an effective way of monitoring inhospitable or remote physical environments. One of the main challenges in the networks lies in the constrained energy and computational resources available to sensor nodes. The deployment of the sensor nodes is the initial step to establishing a sensor network. In this paper the objective of the method of the location estimation is to estimate locations of the sensor nodes with respect of the set of the anchor nodes that their locations' information is known. This network consists of sensors that are battery-powered, and that gather the data and route it to the central Base Station(BS). Cluster-based routing efficiently utilizes with a limited energy of the sensor devices by selecting the Cluster Head (CH) at every round depending in the number, energy, and prior of the nodes. Also, Sensor Mode Setup Phase(SMSF) help increase the wireless sensor networks lifetime, and also see that the lifetime increases when the sensor nodes number is increased.
Software-defined networking (SDN) has emerged in response to increasing requirements for new netw... more Software-defined networking (SDN) has emerged in response to increasing requirements for new networks and expansion of Internet coverage. Modern needs exceed the limitations of traditional networks, for which, to simplify management, SDN is proposed as a promising paradigm that separates the control and data planes, allowing for the programming of network configuration. SDN deployment and applications are directly affected by the controller position. Single or multiple controllers are used in SDN architecture to enable programmable, flexible, and scalable configurations. Multiple controllers are essential in the current SDN, and various solutions have been recently developed to improve scalability and placement selection. In this study, the Controller Placement Problem (CPP) is explored using objective optimisation with proposed algorithms. An overview of SDN issues and the controller role is provided through its three-plane architecture with a focus on scalability and reliability. ...
Metaheuristic techniques become considerably popular in solving feature selection (FS) problems d... more Metaheuristic techniques become considerably popular in solving feature selection (FS) problems due to their flexibility and ability to avoid the local optimum problem. Features selection is important and essential mean to tackle the classification problems through choosing an optimal features subset according to a certain criterion. FS is used to reduce dimensionality and remove noise from data, these are given rise to speed of learning, simplicity of rules, visualizes the data and predictive accuracy. Salp Swarm Algorithm (SSA) is a new metaheuristic algorithm that emulates the inbred behaviour of the Salp chain. In this study, a new FS approach applies the native SSA in machine learning domain to select the optimal feature group on the basis of wrapper mode. Subsequently, SSA is hybridised with a mutation operator. Mutation is embedded to act as an internal operator and consequently maintain diversity and improve the exploration ability within the SSA. The performance of SSA with...
2021 International Conference on Communication & Information Technology (ICICT), 2021
Early diagnosis of diseases is necessary and important to assist doctors in order to treat and co... more Early diagnosis of diseases is necessary and important to assist doctors in order to treat and control diseases such as breast cancer early. A high accuracy machine learning techniques should be available to assist clinician to come out with accurate diagnosis. In this work we tackle this problem using the coronavirus algorithm. The Coronavirus algorithm was used as a feature optimization method, while J48 and PART were used for classification. The proposed approach has been tested on the Wisconsin Diagnosis Breast Cancer (WDBC) dataset located in the UCI Machine Learning Database repository. Experimental results show that our approach is able to generate competitive results when compared with previous available approaches. The proposed approach could obtain an accuracy of 94.01 % for the j48 algorithm and an accuracy of 94.18% for the PART algorithm.
Communications in Computer and Information Science, 2018
Met-heuristics are becoming increasingly popular in solving real world problems. Modern meta-heur... more Met-heuristics are becoming increasingly popular in solving real world problems. Modern meta-heuristics leading to a new branch of optimization, called meta-heuristic optimization. These applied to all areas of data mining, planning and scheduling, design, machine intelligence, and features selection (FS). FS is used to remove noise from data and dimensionality reduction; these properties could give rise to simplicity of rules, speed of learning, predictive accuracy and visualizes the data. Salp Swarm (SSA) is a recent meta-heuristic optimization method that mimics the innate behaviour of the Salp swarm chain. In this study, SSA is hybridised with a simulated annealing (SA). SA is employed as internal functions to improve the exploitation ability that utilizes to accept a worse quality solution than the current one. The performance of suggested approach is evaluated on 16 datasets including two high dimensional from UCI repository and compared with the native (SSA) and other (FS) approaches include ALO and PSO, the experimental results clearly proved the adequacy of the proposed approach to search the features space for optimal features. SSA-SA gave excellent performance as a multi objective optimisation where achieved two contradictory goals, maximal accuracy of a classification with minimal size of features on all used datasets.
Bulletin of Electrical Engineering and Informatics, 2020
The advanced technology in the internet and social media, communication companies, health care re... more The advanced technology in the internet and social media, communication companies, health care records and cloud computing applications made the data around us increase dramatically every minute and continuously. These renewals big data involve sensitive information such as password, PIN number, credential numbers, secret identifications and etc. which require maintaining with some high secret procedures. The present paper involves proposing a secret multi-dimensional symmetric cipher with six dimensions as a cubic algorithm. The proposed algorithm works with the substitution permutation network (SPN) structure and supports a high processing data rate in six directions. The introduced algorithm includes six symmetry rounds transformations for encryption the plaintext, where each dimension represents an independent algorithm for big data manipulation. The proposed cipher deals with parallel encryption structures of the 128-bit data block for each dimension in order to handle large vo...
International Journal of Electrical and Computer Engineering (IJECE), 2020
In this study, we present an investigation of the capability of the big bang-big crunch metaheuristic (BBBC) for managing operational problems, including combinatorial optimization problems. The BBBC is derived from the evolution theory of the universe in physics and astronomy. Its two main phases are the big bang and the big crunch: the big bang phase creates a population of random initial solutions, while the big crunch phase shrinks these solutions into one elite solution represented by a mass center. This study examines BBBC's effectiveness on assignment and scheduling problems. It was enhanced by incorporating an elite pool of diverse, high-quality solutions; a simple descent heuristic as a local search method; implicit recombination; Euclidean distance; a dynamic population size; and elitism strategies. Together, these strategies provide a balanced search over a diverse, good-quality population. The investigation is conducted by comparing the ...
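The two phases can be sketched for continuous solution vectors. The fitness weighting and contraction below follow the common BBBC formulation (minimisation, strictly positive fitness) and are not specific to this paper's enhancements:

```python
import random

def big_crunch(population, fitnesses):
    """Fitness-weighted centre of mass: lower fitness gets higher
    weight 1/f (minimisation; assumes strictly positive fitnesses)."""
    weights = [1.0 / f for f in fitnesses]
    total = sum(weights)
    dim = len(population[0])
    return [sum(w * ind[d] for w, ind in zip(weights, population)) / total
            for d in range(dim)]

def big_bang(center, spread, n, rng=random):
    """Scatter a new population around the centre; `spread` is shrunk
    over iterations so the search contracts toward the mass centre."""
    return [[c + spread * rng.gauss(0, 1) for c in center] for _ in range(n)]
```

Iterating crunch and bang with a decaying `spread` reproduces the contraction toward one elite solution described above.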
Medical data classification is an important factor in improving diagnosis and treatment, and it can assist physicians in making decisions about serious diseases by collecting symptoms and medical analyses. In this work, hybrid classification optimization methods, namely the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and the Fireworks Algorithm (FWA), are proposed for enhancing the classification accuracy of an Artificial Neural Network (ANN). The enhancement process is tested in two experiments. First, the proposed algorithms are applied to five benchmark medical data sets from the repository of the University of California, Irvine (UCI). The model with the best results is then used in the second experiment, which focuses on tuning the parameters of the selected algorithm by varying the number of iterations in ANNs with different numbers of hidden layers. The ANN enhanced with the three optimization algorithms is tested on a big biological gene-sequence dataset obtained from The Cancer Genome Atlas (TCGA) repository. GA and FWA are statistically significant while PSO is not, and GA outperforms both PSO and FWA. The methodology is successful and registers improvements at every step, as significant results are obtained.
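As one illustration of the optimisers involved, a single generation of a minimal GA over binary-encoded ANN settings might look as follows. The selection scheme, crossover, and 10% mutation rate are illustrative defaults, not the paper's tuned values:

```python
import random

def ga_step(population, fitness_fn, rng=random):
    """One GA generation: keep the top half as parents, then build
    children by one-point crossover and occasional bit-flip mutation."""
    scored = sorted(population, key=fitness_fn, reverse=True)
    parents = scored[: len(scored) // 2]
    children = []
    while len(children) < len(population):
        a, b = rng.sample(parents, 2)
        cut = rng.randrange(1, len(a))          # one-point crossover
        child = a[:cut] + b[cut:]
        if rng.random() < 0.1:                  # bit-flip mutation
            i = rng.randrange(len(child))
            child = child[:i] + [1 - child[i]] + child[i + 1:]
        children.append(child)
    return children
```

Here `fitness_fn` would be the validation accuracy of an ANN decoded from the genome; iterating `ga_step` evolves the settings toward higher accuracy.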
International Journal of Distributed Sensor Networks, 2012
Localization is one of the key techniques in wireless sensor networks. Location estimation methods can be classified into target/source localization and node self-localization. For target localization, we mainly introduce the energy-based method; we then investigate node self-localization methods. Since the widespread adoption of wireless sensor networks, localization methods have differed across applications, and several challenges arise in special scenarios. In this paper, we present a comprehensive survey of these challenges: localization in non-line-of-sight conditions, node selection criteria for localization in energy-constrained networks, scheduling sensor nodes to optimize the tradeoff between localization performance and energy consumption, cooperative node localization, and localization algorithms in heterogeneous networks. Finally, we introduce the evaluation criteria for localization in wireless sensor networks.
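A building block common to the range-based methods surveyed is recovering a node's position from (possibly noisy) distances to anchors. A least-squares trilateration sketch, given purely for illustration and not as a method proposed in the survey:

```python
def trilaterate(anchors, distances):
    """Estimate (x, y) from >= 3 anchors by subtracting the first
    circle equation from the others (linearisation), then solving the
    2x2 normal equations (A^T A) p = A^T b by Cramer's rule."""
    (x0, y0), d0 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    s11 = sum(r[0] * r[0] for r in A)
    s12 = sum(r[0] * r[1] for r in A)
    s22 = sum(r[1] * r[1] for r in A)
    t1 = sum(r[0] * v for r, v in zip(A, b))
    t2 = sum(r[1] * v for r, v in zip(A, b))
    det = s11 * s22 - s12 * s12
    return ((s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)
```

With exact distances the estimate is exact; with noisy ranges the least-squares fit spreads the error across anchors, which is why anchor selection and scheduling (discussed above) matter for accuracy.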
In general, course timetabling refers to the assignment process that allocates events (courses) to given rooms and timeslots subject to a list of hard and soft constraints; it is a challenging task for educational institutions. In this study we employ a great deluge algorithm with the Kempe chain neighbourhood structure as an improvement algorithm. A Round Robin (RR) algorithm is used to control the selection of neighbourhood structures within the great deluge algorithm. The performance of our approach is tested on eleven benchmark datasets (one large, five medium, and five small problems). Experimental results show that our approach generates competitive results when compared with previously available approaches. Possible extensions of this simple approach are also discussed.
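The acceptance scheme can be sketched as follows, with Round Robin cycling over the neighbourhood structures. The linear water-level decay and all names are illustrative, not the paper's exact parameters:

```python
from itertools import cycle

def great_deluge(initial, cost, neighbourhoods, steps, decay, rng):
    """Great deluge (minimisation): accept a move if it improves the
    current solution or stays under a falling 'water level'. Round
    Robin cycles through the neighbourhood structures in fixed order."""
    current, best = initial, initial
    level = cost(initial)              # initial water level
    rr = cycle(neighbourhoods)         # Round Robin over move types
    for _ in range(steps):
        move = next(rr)
        candidate = move(current, rng)
        if cost(candidate) <= cost(current) or cost(candidate) <= level:
            current = candidate
            if cost(current) < cost(best):
                best = current
        level -= decay                 # lower the water level
    return best
```

Early on the high water level admits worsening moves (exploration); as it falls, only improving moves survive, so the search converges while each neighbourhood (e.g. a Kempe chain move) still gets a regular turn.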
Abstract: Constructing a university course timetable is a very difficult task in which a set of events has to be scheduled into timeslots and located in suitable rooms. The objective of the course timetabling problem is to satisfy the hard constraints and minimize the violation of soft constraints. In ...
Papers by Khalid Shaker