IJCI. International Journal of Computers and Information, 2021
These days, e-learning has become indispensable, as it facilitates the learning process and enables students to obtain educational resources faster. With the increase in the number of learners and of requests on e-learning frameworks, these frameworks have begun to suffer from shortcomings, which prompted a search for a model that could ease students' access to educational resources. Therefore, in this research a model based on fog computing is proposed, in which e-learning resources are placed closer to the end-users. The proposed model was tested on a sample of students to measure the response time; the resulting data were collected and analyzed, and the response time of the proposed model was compared with that of the current cloud-based model. It was found that the proposed model has advantages, as the students are divided among the fog computing nodes, unlike what happens in the cloud-based model in w...
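As a back-of-the-envelope sketch (Python, not the paper's experiment) of why splitting students across fog nodes can cut response time: the latency figures, per-request service time, and linear load model below are all illustrative assumptions.

# Toy model: N students share one cloud node versus being split across k fog nodes.
def mean_response_time(students, nodes, net_latency_ms, service_ms_per_request):
    """Each node serves its share of students; delay grows linearly with load."""
    per_node = students / nodes
    return net_latency_ms + per_node * service_ms_per_request

N = 300
cloud = mean_response_time(N, nodes=1, net_latency_ms=80, service_ms_per_request=2)
fog   = mean_response_time(N, nodes=5, net_latency_ms=10, service_ms_per_request=2)
print(f"cloud: {cloud:.0f} ms, fog: {fog:.0f} ms")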
IJCI. International Journal of Computers and Information
Unauthorized modification of 3D models is an attractive research problem nowadays due to the widespread availability of technologies and designs on the internet. In this paper, we propose a blind fragile watermarking scheme in the spatial domain for 3D model authentication, based on the curvature features of the 3D model and a chaos sequence. First, we compute the curvature feature for every vertex of the 3D model; after that, the vertices are classified into three classes (flat, peak, and in-between) using the K-means clustering algorithm. We suggest two methods for embedding the watermark based on the least significant bit (LSB) substitution technique: Cluster Type-based Embedding (CTE) and Cluster Size-based Embedding (CSE). The proposed methods employ a chaos sequence generator to produce the embedded watermark, so that the tampered region can be verified and located by a chaos-sequence-based watermark check. Several assessment methods are employed to evaluate the proposed scheme against various unauthorized attacks such as rotation, translation, scaling, cropping, and noise addition. The experimental results show an improvement in embedding imperceptibility as well as in tampered-region detection compared with existing works in the literature.
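The abstract names two reusable building blocks: a chaos sequence generator and LSB substitution on vertex data. The hedged Python sketch below shows one plausible form of each; the logistic-map parameters and the quantization step are assumptions for illustration, not the paper's values.

# Sketch: logistic-map watermark bits + LSB substitution on quantized coordinates.
import numpy as np

def chaos_bits(n, x0=0.7, r=3.99):
    """Logistic map x -> r*x*(1-x), thresholded at 0.5 to give watermark bits."""
    bits, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        bits.append(1 if x > 0.5 else 0)
    return np.array(bits, dtype=np.uint8)

def embed_lsb(vertices, bits, step=1e-4):
    """Replace the LSB of each quantized coordinate with a watermark bit."""
    q = np.round(vertices / step).astype(np.int64).ravel()
    k = min(len(q), len(bits))
    q[:k] = (q[:k] & ~1) | bits[:k]
    return q.reshape(vertices.shape) * step

verts = np.random.rand(100, 3)          # stand-in for a 3D model's vertices
wm = embed_lsb(verts, chaos_bits(300))
# Verification would regenerate chaos_bits with the same key (x0, r) and compare LSBs.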
2009 International Conference for Technical Postgraduates (TECHPOS), 2009
The demand for adequate security for electronic data systems has grown over the decades. As security also impacts the performance of wireless networks, data encryption is necessary for sending and receiving information secretly over the network. Since the 1970s, the Data Encryption Standard (DES) has received a substantial amount of attention from academic cryptanalysts. However, in 1998 it was proven insecure, and in October 2000 the National Institute of Standards and Technology (NIST) announced that the Rijndael algorithm had been selected as the Advanced Encryption Standard (AES). Our algorithm is based on AES. In this paper, we examine the performance of our new cipher in MANET and wireless LAN networks and compare its performance with that of AES.
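The paper's modified cipher is not reproduced here; as a hedged baseline, the sketch below times standard AES-CTR, the kind of throughput measurement such a comparison would start from. It assumes the third-party `cryptography` package is installed; the buffer size and mode are arbitrary choices.

import os, time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key, nonce = os.urandom(16), os.urandom(16)
data = os.urandom(1 << 20)                      # 1 MiB of random plaintext

enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
t0 = time.perf_counter()
ct = enc.update(data) + enc.finalize()
dt = time.perf_counter() - t0
print(f"AES-128-CTR: {len(data) / dt / 1e6:.1f} MB/s")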
IJCI. International Journal of Computers and Information, 2015
The demand for real-time databases is increasing. Indeed, most real-time systems are inherently distributed in nature and need to handle data in a timely fashion. Obtaining data from remote sites may take a long time, rendering temporal data invalid. This results in a large number of tardy transactions, with their catastrophic effects. Clustering the database sites can help distributed real-time database systems face the challenge of meeting their time requirements. Reducing a large network of sites into many clusters, each with a smaller number of sites, effectively decreases the response time, resulting in better satisfaction of time constraints. In this paper, we introduce a clustering algorithm for distributed real-time databases that depends on both the communication time cost and the timing properties of the data. The results show the effectiveness of the proposed approach in achieving lower communication time, higher database performance, and better satisfaction of timing requirements.

Keywords: clustering; database; real-time; distributed systems

I. INTRODUCTION
Recently, the demand for real-time databases has been increasing. Applications such as e-commerce, mobile communication, accounting, information services, medical monitoring, nuclear reactor control, traffic control systems, and telecommunications are examples of applications that require real-time data support [1]. A real-time database system (RTDBS) is defined in [2] as a database system that includes all the features of a traditional database system while enforcing real-time constraints or deadlines. According to [3], the time constraints can be at the data level, in the form of a time-validity attribute that makes temporal data lose validity after the elapse of some pre-specified time interval, or at the transaction level, in the form of deadlines used by real-time scheduling and concurrency control. Real-time systems are often classified by the type of deadline: hard, firm, or soft, depending on the value of the computation when a deadline is missed. If a hard deadline is missed, a large or infinite penalty results; when a firm deadline is missed, the computation returns no value; while some value may still be obtained for a period of time if a soft deadline is missed [1]. A transaction that executes outside its deadline boundaries has less value or may damage the system, depending on the type of deadline associated with it [4]. Like any information system, the database is the main component of a real-time information system. However, most real-time systems are inherently distributed in nature, and such critical systems always need to deal with their data in a timely fashion [5]. Sometimes data required at a particular location is not available and must be obtained from a remote site. This may take a long time, consuming the validity duration of the data and making it invalid, which leads to a large number of tardy transactions (transactions that miss their deadlines).
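As a hedged illustration of the core idea (not the paper's exact algorithm), the sketch below greedily groups database sites so that the pairwise communication cost inside a cluster stays below a validity bound, the constraint the abstract motivates. The cost matrix and the threshold are invented for the example.

import numpy as np

def cluster_sites(cost, max_cost):
    """cost[i][j] = communication time between sites i and j (symmetric)."""
    clusters = []
    for site in range(len(cost)):
        for c in clusters:
            if all(cost[site][m] <= max_cost for m in c):
                c.append(site)      # joins the first cluster it is close to
                break
        else:
            clusters.append([site])
    return clusters

rng = np.random.default_rng(0)
pts = rng.random((8, 2))            # site coordinates as a proxy for cost
cost = np.linalg.norm(pts[:, None] - pts[None], axis=-1) * 100   # ms, illustrative
print(cluster_sites(cost, max_cost=40.0))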
Color hiding refers to the process whereby chromaticity values are processed so as to be hidden in the achromatic channel; this concept appears in the literature as color protection. In this paper we propose a new color hiding system based on decolorization, which refers to reducing the colors to just a few color seeds to be used in the colorization process. The proposed decolorization system depends on extracting the color seeds using morphological operations. The proposed Morphological Decolorization System (MDS) can extract very few seeds compared with other methods and yields high-quality colorization. The seeds are then hidden in the luminance channel after encoding, using least significant bit (LSB) substitution with very few bit planes. The results show very high quality color retrieval with a high chromatic compression ratio compared with other methods in the literature.
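A minimal sketch of the hiding step only (the morphological seed extraction is not reproduced): pack encoded color-seed bytes into the lowest bit plane of the luminance channel. The image size, payload, and use of a single bit plane are illustrative assumptions.

import numpy as np

def hide_bits(luma, payload_bytes):
    """Write payload bits into the LSB plane of an 8-bit luminance image."""
    bits = np.unpackbits(np.frombuffer(payload_bytes, dtype=np.uint8))
    flat = luma.ravel().copy()
    assert bits.size <= flat.size, "payload too large for one bit plane"
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(luma.shape)

luma = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
seeds = bytes([12, 200, 45, 99])    # stand-in for encoded color seeds
stego = hide_bits(luma, seeds)
recovered = np.packbits(stego.ravel()[:32] & 1).tobytes()
assert recovered == seeds           # seeds survive the round trip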
Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation, yet the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets, using detectors generated with a multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative-selection-based detector generation. The approach is evaluated on the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors, with an accuracy of 96.1% compared with competing machine learning algorithms. © 2014 Production and hosting by Elsevier B.V. on behalf of Cairo University.
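A sketch of the negative-selection idea the approach draws on: candidate detectors (here, random points with a fixed radius) are kept only if they do not cover any "self" (normal-traffic) sample. The multi-start/GA refinement described in the paper is not reproduced, and the radii, dimensions, and counts are illustrative.

import numpy as np

def generate_detectors(self_samples, n_detectors, radius, dim, rng):
    detectors = []
    while len(detectors) < n_detectors:
        cand = rng.random(dim)
        # reject candidates whose hypersphere covers a self sample
        if np.min(np.linalg.norm(self_samples - cand, axis=1)) > radius:
            detectors.append(cand)
    return np.array(detectors)

rng = np.random.default_rng(1)
normal = rng.normal(0.5, 0.05, (200, 4)).clip(0, 1)   # stand-in normal traffic
det = generate_detectors(normal, n_detectors=50, radius=0.3, dim=4, rng=rng)
anomaly = np.array([0.95, 0.05, 0.9, 0.1])
print("alert:", bool((np.linalg.norm(det - anomaly, axis=1) < 0.3).any()))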
2014 9th International Conference on Computer Engineering & Systems (ICCES), 2014
Clustering multi-dense, large-scale, high-dimensional datasets is a challenging task due to the scalability limits of most clustering algorithms. Nowadays, data collection tools produce large amounts of data, so fast and scalable algorithms are a vital requirement for clustering such data. In this paper, a fast and scalable algorithm called dimension-based partitioning and merging (DPM) clustering is proposed. In DPM, data is partitioned into small dense volumes while processing each dimension's value range. Next, noise is filtered out using the dimensional densities of the generated partitions. Finally, a merging process is invoked to construct clusters based on the partitions' boundary data samples. The DPM algorithm automatically detects the number of data clusters based on three insensitive tuning parameters, which reduces the burden of its usage. Performance evaluation on different datasets demonstrates the extreme speed and scalability of the proposed algorithm, along with its clustering accuracy, compared with other large-scale clustering competitors.
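A hedged sketch of DPM's first phase as the abstract describes it: sweep each dimension, bin the values, and keep only the cells whose density clears a threshold, leaving small dense volumes. The bin count and density threshold are illustrative tuning values, not the paper's.

import numpy as np

def dense_cells(data, bins=20, min_frac=0.01):
    """Map each sample to its per-dimension bin; drop cells below min_frac."""
    idx = np.stack([np.digitize(data[:, d], np.histogram_bin_edges(data[:, d], bins))
                    for d in range(data.shape[1])], axis=1)
    cells, counts = np.unique(idx, axis=0, return_counts=True)
    return cells[counts >= min_frac * len(data)]

rng = np.random.default_rng(2)
blob1 = rng.normal(0, 0.5, (500, 3))
blob2 = rng.normal(5, 0.5, (500, 3))       # two well-separated dense regions
print(len(dense_cells(np.vstack([blob1, blob2]))), "dense cells survive")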
Clustering multi-dense, large-scale, high-dimensional numeric datasets is a challenging task due to the high time complexity of most clustering algorithms. Nowadays, data collection tools produce large amounts of data, so fast algorithms are a vital requirement for clustering such data. In this paper, a fast clustering algorithm, called Dimension-based Partitioning and Merging (DPM), is proposed. In DPM, data is first partitioned into small dense volumes during the successive processing of the dataset's dimensions. Then, noise is filtered out using the dimensional densities of the generated partitions. Finally, a merging process is invoked to construct clusters based on partition boundary data samples. The DPM algorithm automatically detects the number of data clusters based on three insensitive tuning parameters, which reduces the burden of its usage. Performance evaluation of the proposed algorithm on different datasets shows its speed and accuracy compared with other clustering competitors.
Cloud computing is becoming the next-generation architecture of enterprise IT. In contrast to traditional solutions, cloud computing moves application software and databases to large data centers, where the management of the data and services may not be fully trustworthy. This unique feature, however, raises many new security challenges which have not been well understood. In cloud computing, both
IJCI. International Journal of Computers and Information
This paper presents a comparison of two metaheuristic approaches, Differential Evolution (DE) and Particle Swarm Optimization (PSO), for training a feed-forward neural network to predict daily stock prices. Stock market prediction is the act of trying to determine the future value of a company's stock or another financial instrument traded on an exchange; successful prediction of a stock's future price could yield significant profit. The feasibility, effectiveness, and generic nature of both the DE and PSO approaches are demonstrated, and the two are compared in terms of prediction accuracy and convergence characteristics. The proposed model is based on the study of historical data and technical indicators and on the application of neural networks trained with the DE and PSO algorithms. The results presented in this paper show the potential of both algorithms for decision making in stock markets, with DE giving better accuracy than PSO.
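A hedged sketch of one of the two trainers: plain PSO searching the weight vector of a one-hidden-layer network on a toy series. The swarm size, inertia, and acceleration coefficients are common textbook defaults, and the sine target is a stand-in for a price series, not the paper's data.

import numpy as np

def net(w, x, h=5):
    W1, b1, W2 = w[:h].reshape(h, 1), w[h:2*h], w[2*h:3*h]
    return np.tanh(x @ W1.T + b1) @ W2

def mse(w, x, y):
    return np.mean((net(w, x) - y) ** 2)

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 50)[:, None]
y = np.sin(6 * x[:, 0])                         # stand-in target series
dim, n = 15, 30
pos, vel = rng.normal(0, 1, (n, dim)), np.zeros((n, dim))
pbest, pval = pos.copy(), np.array([mse(p, x, y) for p in pos])
for _ in range(200):
    g = pbest[pval.argmin()]                    # global best particle
    vel = 0.7 * vel + 1.5 * rng.random((n, dim)) * (pbest - pos) \
                    + 1.5 * rng.random((n, dim)) * (g - pos)
    pos = pos + vel
    val = np.array([mse(p, x, y) for p in pos])
    better = val < pval
    pbest[better], pval[better] = pos[better], val[better]
print(f"best MSE: {pval.min():.4f}")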
IJCI. International Journal of Computers and Information
Real-time systems refer to systems that have real-time requirements for interacting with a human operator or other agents on similar timescales. Efficient simulation of real-time systems requires a model that is accurate enough to accomplish the simulation objective and is computationally efficient. In this paper, a real-time modeling system for dynamic systems is studied. Real-time modeling can normally be classified into hardware and software systems, but this work focuses on software techniques and systems. Finally, a demonstration real-time simulator is presented for a complex dynamic system, namely a small nuclear fusion device (the Egyptor Tokamak). The obtained results agree well with published work. Such a simulator can be considered an imperative requirement for predictive control tasks.
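A generic sketch of the software side described here: a fixed-step simulation loop paced against the wall clock so that model time tracks real time. The plant (a damped oscillator) is an illustrative stand-in, not the Tokamak model.

import time

def real_time_loop(steps=50, dt=0.02):
    x, v = 1.0, 0.0                       # state: position, velocity
    t_next = time.perf_counter()
    for _ in range(steps):
        a = -4.0 * x - 0.5 * v            # illustrative dynamics
        v += a * dt
        x += v * dt
        t_next += dt
        sleep = t_next - time.perf_counter()
        if sleep > 0:
            time.sleep(sleep)             # pace to real time; overruns just skip
    return x

print(f"state after 1 s of real time: {real_time_loop():.3f}")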
IJCI. International Journal of Computers and Information
Mobile Worldwide Interoperability for Microwave Access (Mobile WiMAX) is a wireless metropolitan area network technology based on the IEEE 802.16e standard, designed to enable mobile Internet from the physical layer to the network layer. In this paper, a comparative study of load balancing techniques in Mobile WiMAX is presented: the Spare Capacity Procedure (SCP) and the WiMAX QoS-Aware Load Balancing Protocol (WQLP). These techniques are used when network congestion occurs in Mobile WiMAX networks, and they represent the major trends of load balancing in Mobile WiMAX IEEE 802.16e technology. Both use directed handovers, initiated by the serving base station, for load balancing (LB) among cells. The performance of the load balancing techniques has been analyzed and evaluated through extensive simulation. The simulation shows that WQLP outperforms SCP in load distribution, while SCP is faster than WQLP in the handover process. The elaborated performance analysis yields a set of advantages and disadvantages for each technique. This evaluation study represents an important entry point for choosing the technique best able to distribute load among BSs and guarantee QoS for all MSs using real-time applications.
IJCI. International Journal of Computers and Information
Clinical records contain a massively heterogeneous set of data types, generally written as free-form notes without a linguistic standard. Other forms of medical data include medical images with or without metadata (e.g., CT, MRI, radiology), audio (e.g., transcriptions, ultrasound), video (e.g., surgery recordings), and structured data (e.g., laboratory test results, age, year, weight, billing). Consequently, retrieving knowledge from these data is not a trivial task: handling their heterogeneity, in addition to their size and complexity, is a challenge. The main purpose of this paper is to propose a framework with a twofold aim. First, it provides a semantic-based integration approach, which resolves the heterogeneity issue during the integration of healthcare data from various data sources. Second, it provides a semantic-based medical retrieval approach with enhanced precision. Our experimental study on medical datasets demonstrates significant accuracy and speedup of the proposed framework over existing approaches.
2016 7th International Conference on Information and Communication Systems (ICICS), 2016
Wireless sensor networks are characterized by a large number of nodes deployed randomly over the network area. In many real applications, the random deployment process produces a non-uniform distribution of nodes. This means that there is no constant density over any unit area and therefore no constant number of neighbours for each node. With a non-uniform distribution, the network appears as if it were divided into a set of sub-regions, each with a different density level, ranging from high-density areas through medium-density ones to empty areas. For this reason, topology extraction techniques are needed to help sensors discover the layout of the network around them. Such techniques assist in figuring out the skeleton of the whole network; they can help in discovering holes and can guide a redeployment process toward a fully covered area. Our aim is to define a simple distributed technique that allows all sensors in the network to share information and extract the network layout, by defining the closed boundaries of the sub-regions of different density levels that form the network. Previous topology extraction techniques need networks with very high density and deal only with special deployment figures; many of them require a uniform distribution, which is not always applicable in real situations. Our proposed technique is simple, works at lower density than previously proposed techniques, and does not need special deployment figures.
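A simplified, centralized stand-in for the distributed idea: classify grid cells of the deployment area by node density to expose empty, sparse, and dense sub-regions (and thus candidate holes). The cell size, thresholds, and skewed deployment are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(4)
nodes = rng.random((400, 2)) ** 1.5        # skewed, hence non-uniform deployment
grid, _, _ = np.histogram2d(nodes[:, 0], nodes[:, 1],
                            bins=10, range=[[0, 1], [0, 1]])
levels = np.digitize(grid, [1, 5])         # 0 = hole, 1 = sparse, 2 = dense
print("hole cells:", int((levels == 0).sum()),
      "dense cells:", int((levels == 2).sum()))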
2016 8th International Conference on Knowledge and Smart Technology (KST), 2016
Sensor nodes are characterized by limited processing, memory, and battery resources. These constraints motivate researchers to propose power-aware communication protocols. Clustering helps toward this purpose: it organizes the massive number of deployed sensors in the network so as to minimize energy consumption. Different categories of clustering techniques have been proposed; one of them is density-based clustering, which mainly depends on measuring the density around nodes before grouping them into clusters. This paper proposes an energy-efficient density-based clustering technique that aims to balance the energy consumption among all clusters. This is done by adapting the transmission range of each cluster head to a suitable value according to the density around it. Simulation results for the proposed technique show its effectiveness, as it achieves lower power consumption and longer network lifetime when compared with other density-based clustering techniques.
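A sketch of the adaptation rule the abstract describes: a cluster head with few neighbours (low local density) widens its transmission range, and a crowded one shrinks it, evening out per-cluster energy drain. The target density and the range bounds are illustrative, not the paper's settings.

import numpy as np

def adapt_range(neighbour_count, target=10, r=30.0, r_min=15.0, r_max=60.0):
    """Scale range so expected coverage approaches the target neighbour count."""
    scale = np.sqrt(target / max(neighbour_count, 1))   # covered area grows with r^2
    return float(np.clip(r * scale, r_min, r_max))

for n in (3, 10, 40):
    print(f"{n:>2} neighbours -> range {adapt_range(n):.1f} m")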
2016 8th International Conference on Knowledge and Smart Technology (KST), 2016
The need to improve privacy in data publishing is becoming more important as data grows very fast. Traditional methods for privacy-preserving data publishing cannot prevent privacy leakage, which drives continuing research into better methods. K-anonymity and l-diversity are well-known techniques for preserving data privacy, but they cannot prevent similarity attacks because they do not take into consideration the semantic relations between the sensitive attributes of categorical data. In this paper, we propose an approach to categorical data preservation based on domain-based semantic rules to overcome similarity attacks. Experimental results of the proposed approach, focused on categorical data, are presented. The results show that semantic anonymization increases the privacy level with some effect on data utility.
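A toy sketch of the domain-based idea: lifting sensitive categorical values to their semantic domain reveals when an l-diverse group is still vulnerable because all its values are semantically similar (e.g., three lung conditions). The taxonomy below is invented for illustration.

DOMAIN = {"asthma": "respiratory", "bronchitis": "respiratory",
          "pneumonia": "respiratory", "gastritis": "digestive",
          "ulcer": "digestive", "migraine": "neurological"}

def similarity_attack_possible(group):
    """True if every sensitive value in the group shares one semantic domain."""
    return len({DOMAIN[v] for v in group}) == 1

print(similarity_attack_possible(["asthma", "bronchitis", "pneumonia"]))  # True
print(similarity_attack_possible(["asthma", "ulcer", "migraine"]))        # False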
New Encryption Schema Based on Swarm Intelligence Chaotic Map, by Reda M. Hussein, Hatem S. Ahmed, and Wail F. Abd El-Wahed (Information Systems Dept.; OR & DSS Dept., Computer Science Dept.) ...