Papers by Mohammad Hjouj Btoush
International Journal of Medical Engineering and Informatics
Journal of Software Engineering and Applications, 2012
The development of multimedia and digital imaging has led to a high quantity of data being required to represent modern imagery. This demands large disk space for storage and long transmission times over computer networks, both of which are relatively expensive. These factors establish the need for image compression. Image compression addresses the problem of reducing the amount of space required to represent a digital image, yielding a compact representation and thereby reducing the image storage and transmission time requirements. The key idea is to remove the redundancy of the data present within an image to reduce its size without affecting its essential information. This paper is concerned with lossless image compression. The proposed approach combines a number of existing techniques and works as follows: first, the well-known Lempel-Ziv-Welch (LZW) algorithm is applied to the image at hand; its output is then forwarded to a second step, where the Bose-Chaudhuri-Hocquenghem (BCH) error detection and correction algorithm is used. To improve the compression ratio, the proposed approach applies the BCH algorithm repeatedly until "inflation" is detected. The experimental results show that the proposed algorithm achieves an excellent compression ratio without losing data when compared to standard compression algorithms.
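As a concrete illustration of the dictionary-coding first stage the abstract describes, here is a minimal LZW compressor sketch; it covers only the LZW step, not the BCH post-processing or the inflation check, and the function name and test string are illustrative, not the paper's implementation.

```python
def lzw_compress(data: bytes) -> list[int]:
    """Minimal LZW: emit dictionary indices for the longest known prefixes."""
    # Start with all single-byte strings in the dictionary (codes 0-255).
    dictionary = {bytes([i]): i for i in range(256)}
    next_code = 256
    current = b""
    output = []
    for byte in data:
        candidate = current + bytes([byte])
        if candidate in dictionary:
            current = candidate              # grow the matched prefix
        else:
            output.append(dictionary[current])
            dictionary[candidate] = next_code  # learn the new string
            next_code += 1
            current = bytes([byte])
    if current:
        output.append(dictionary[current])
    return output

codes = lzw_compress(b"TOBEORNOTTOBEORTOBEORNOT")  # 24 bytes in, 16 codes out
```

Repeated substrings (`TO`, `BE`, `OR`, …) are replaced by dictionary codes, which is where the compression comes from.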
Fifth International Conference on Information Technology: New Generations (itng 2008), 2008
The paper compares different data compression algorithms on text files: LZW, Huffman, fixed-length code (FLC), and Huffman after using fixed-length code (HFLC). We compare these algorithms on text files of different sizes in terms of the compression scales of size, ratio, time (speed), and entropy. Our evaluation reveals that, initially, for smaller files the simplest algorithm, namely LZW, performs...
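For the Huffman algorithm in this comparison, the compressed size is fully determined by the per-symbol code lengths. A minimal sketch of computing those lengths, assuming a standard heap-based tree construction rather than the paper's own implementation:

```python
import heapq
from collections import Counter

def huffman_code_lengths(text: str) -> dict:
    """Build a Huffman tree and return the code length (in bits) per symbol."""
    freq = Counter(text)
    if len(freq) == 1:                     # degenerate single-symbol input
        return {next(iter(freq)): 1}
    # Heap entries carry a tie-breaking counter so dicts are never compared.
    heap = [(f, i, {sym: 0}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)    # two least-frequent subtrees
        f2, _, d2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

lengths = huffman_code_lengths("aaabbc")
```

For "aaabbc" this yields 9 bits total, versus 12 bits for a 2-bit fixed-length code, which is the kind of gap the size and ratio scales measure.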
Concepts, Methodologies, Tools and Applications (4 Volumes)
Without doubt, information and communication technologies (ICTs) continue to change the world to varying degrees from one place to another, depending on a number of technological and socio-economic factors such as physical access to technology, education, geographical location, ...
Moment functions of two-dimensional image intensity distributions are used as descriptors of shape in a variety of applications. In this work, an improvement in moment computation time is pursued through the design and implementation of a parallel algorithm that speeds up moment computation.
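A minimal sketch of the raw geometric moments in question; the row-wise partial sums are mutually independent, which is the property a parallel decomposition can exploit. The function name and the toy image are illustrative.

```python
def raw_moment(image, p, q):
    """Raw geometric moment m_pq = sum over pixels of x^p * y^q * I(x, y).
    Each row's partial sum is independent of the others, so rows can be
    distributed across workers and the partial results added at the end."""
    total = 0.0
    for y, row in enumerate(image):
        row_sum = sum((x ** p) * intensity for x, intensity in enumerate(row))
        total += (y ** q) * row_sum
    return total

# Shape descriptor example: centroid from zeroth- and first-order moments.
img = [[0, 1, 0],
       [1, 2, 1],
       [0, 1, 0]]
m00 = raw_moment(img, 0, 0)
centroid = (raw_moment(img, 1, 0) / m00, raw_moment(img, 0, 1) / m00)
```

For the symmetric cross-shaped image above the centroid lands on the centre pixel.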
International Conference on E-Learning, E-Business, Enterprise Information Systems, & E-Government, 2008
This paper examines what are known as the 'maturity models' or 'stage-ladder models', which all share stages of growth or maturity with one common point: the implication that e-services evolve in a series of stages moving upward, embodying the notion of continual process improvement. However, this paper suggests a new model, called the six-dimension model (6I), that provides a framework for a detailed description of what each stage involves in terms of e-service characterization. It attempts to provide a common, standard characterization of the features of each stage and thereby enhances previous models.
Air quality forecasting has acquired great significance in environmental science due to air pollution's adverse effects on humans and the environment. The artificial neural network is one of the most common soft computing techniques that can be applied to modeling such complex problems. This study designed an air quality forecasting model using three-layer FFNNs and a recurrent Elman network to forecast PM10 air pollutant concentrations one day in advance in Yilan County, Taiwan. The optimal model is then selected based on testing performance measurements (RMSE, MAE, r, IA and VAF) and learning time. The study used an hourly historical data set from 1/1/2009 to 31/12/2011 collected by the Dongshan station. The data was fully pre-processed, cleared of missing and outlier values, and then transformed into daily average values. The final results showed that the three-layer FFNN with the One Step Secant (OSS) training algorithm achieved better results than the Elman network with Gradient Descent adaptive learning...
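The testing performance measurements named above are standard error statistics. Minimal reference implementations of three of them (RMSE, MAE, and Willmott's index of agreement IA), assuming equal-length observation and prediction series; the sample PM10 values are illustrative, not the study's data:

```python
import math

def rmse(obs, pred):
    """Root mean square error."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    """Mean absolute error."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def index_of_agreement(obs, pred):
    """Willmott's index of agreement (IA), in [0, 1]; 1 is a perfect fit."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - p) ** 2 for o, p in zip(obs, pred))
    den = sum((abs(p - mean_obs) + abs(o - mean_obs)) ** 2
              for o, p in zip(obs, pred))
    return 1 - num / den

observed  = [31.0, 28.5, 40.2]   # e.g. daily PM10 averages (made-up values)
predicted = [29.8, 30.1, 37.5]
scores = (rmse(observed, predicted), mae(observed, predicted),
          index_of_agreement(observed, predicted))
```

Lower RMSE/MAE and IA closer to 1 indicate the better model, which is how the FFNN and Elman candidates are compared.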
The rapid rise in the delivery of e-services, which involve communications and transactions between government, at various levels, and citizens, has led to a pressing need to develop models of user satisfaction that gauge the extent to which e-services meet users' needs and expectations. This research aims to contribute to a growing major domain in e-government research, specifically in developing countries, that examines e-service delivery to users. The model used in this research, the 6I Model, was developed by the researchers at an earlier stage. The results of this research, which was conducted within the Jordanian context, suggest that there is value in utilizing a robust measure of users' perception of their level of satisfaction with the e-services presented to them. Keywords: Public E-services, Jordan, 6I Model, Current Status, Desired Status
The complexity of today's e-Business environments makes traditional security approaches and ad hoc security solutions insufficient to thwart the increasing number of security breaches. These approaches, which perceive security from one dimension, are unable to provide adequate ...
The aim is to build a system that takes a source text as input and produces a summary containing sentences that preserve the main theme of the source. We use a statistical approach to the problem of Arabic text summarization. Much research on the Arabic text summarization problem focuses on how the writers of articles write; here we build a system that takes advantage of how readers read an article and comment on it. Our approach assigns a score to each segment of an article depending on the location of its sentences, the size of each sentence, and term frequency in the article and in readers' comments. Our system was tested by 100 human evaluators. We gave each evaluator a copy of the summary produced by our system, together with a question about the connectivity of ideas and sentences in the <article, summary> pair, and asked them to evaluate the summary by answering with rejected, not-related, satisfactory, good, or accepted, to test our approach which focuses on...
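The scoring idea above can be sketched as follows. This is a toy version combining the three signals the abstract names (location, sentence size, term frequency over article plus comments); the equal weighting and the cap on sentence length are illustrative assumptions, not the paper's actual formula.

```python
from collections import Counter

def summarize(sentences, comments, k=1):
    """Score each sentence by location, size, and term frequency over the
    article plus reader comments, then keep the top k in document order."""
    words = [s.lower().split() for s in sentences]
    tf = Counter(w for ws in words for w in ws)                # article terms
    tf.update(w for c in comments for w in c.lower().split())  # reader comments
    n = len(sentences)

    def score(i):
        location = (n - i) / n               # earlier sentences score higher
        size = min(len(words[i]), 20) / 20   # cap very long sentences
        freq = sum(tf[w] for w in words[i]) / len(words[i])
        return location + size + freq        # equal weights: an assumption

    top = sorted(range(n), key=score, reverse=True)[:k]
    return [sentences[i] for i in sorted(top)]
```

Sentences whose terms recur in readers' comments are pulled toward the top, which is the reader-side signal the approach exploits.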
A great deal of research work in the field of information systems security has been based on a positivist paradigm. Applying the reductionist concept of the positivist paradigm to information security means missing the bigger picture; this lack of holism could be one of the reasons why security is still overlooked, comes as an afterthought, or is perceived from a purely technical dimension. We need to reshape our thinking and attitudes towards security, especially in a complex and dynamic environment such as e-Business, to develop a holistic understanding of e-Business security in relation to its context, as well as considering all the stakeholders in the problem area. In this paper we argue the suitability of, and need for, a more inductive, interpretive approach and a qualitative research method to investigate e-Business security. Our discussion is based on a holistic framework of enquiry, the nature of the research problem, the underlying theoretical lens, and the complexity of e-Business...
E-Business is more than just combining business with Internet technologies. Diverse issues face organisations endeavouring to adopt e-Business, and e-Business security is one of the major issues facing e-Business organisations. This issue is usually overlooked, comes as an ...
International Journal of Knowledge and Learning
International Journal of Advanced Computer Science and Applications, 2016
The aim of this study is to build a tool for Part of Speech (POS) tagging and Named Entity Recognition for the Arabic language; the approach used to build this tool is a rule-based technique. The POS tagger comprises two phases: the first passes each word through a lexicon lookup, and the second is a morphological phase; the tagset is (Noun, Verb, Determiner). The named-entity detector applies rules to the text and assigns the correct label to each word; the labels are Person (PERS), Location (LOC), and Organization (ORG).
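A minimal sketch of the two-phase rule-based tagging described above: lexicon lookup first, then rough morphological rules. The lexicon entries and the specific affix rules below (definite article "ال" as a noun marker, imperfect-verb prefixes) are illustrative stand-ins, not the study's actual lexicon or rule set.

```python
# Phase-1 lexicon: hand-built word -> tag entries (illustrative examples).
LEXICON = {"كتب": "Verb", "هذا": "Determiner", "مدرسة": "Noun"}

def pos_tag(tokens):
    """Two-phase rule-based tagging over the (Noun, Verb, Determiner) tagset."""
    tags = []
    for tok in tokens:
        if tok in LEXICON:                 # phase 1: lexicon lookup
            tags.append(LEXICON[tok])
        elif tok.startswith("ال"):         # phase 2: definite article -> Noun
            tags.append("Noun")
        elif tok.startswith(("ي", "ت")):   # imperfect-verb prefix (rough rule)
            tags.append("Verb")
        else:
            tags.append("Noun")            # default fallback
    return tags
```

A named-entity pass would layer similar trigger-word rules on top to emit PERS, LOC, and ORG labels.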
Journal of Computer and Communications
In this paper, we analyze the complexity and entropy of different data compression algorithms: LZW, Huffman, fixed-length code (FLC), and Huffman after using fixed-length code (HFLC). We test these algorithms on files of different sizes and conclude that LZW is the best on all the compression scales we tested, especially on large files, followed by Huffman, HFLC, and FLC, respectively. Data compression is still an important research topic, with many applications and uses. We therefore suggest continuing research in this field: trying to combine two techniques to reach a better one, or using another source mapping (Hamming), such as embedding a linear array into a hypercube, together with other good techniques like Huffman, to try to reach good results.
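Entropy, one of the scales compared above, bounds how well any of these coders can do. A minimal sketch of the zero-order Shannon entropy of a file's byte distribution, which is the usual baseline such comparisons use:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Zero-order Shannon entropy in bits per byte: the lower bound on the
    average code length of any symbol-by-symbol lossless code."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A file of one repeated byte has entropy 0 (maximally compressible per symbol), while uniformly distributed bytes give 8 bits per byte (incompressible by a per-symbol code); real text files fall in between, which is why Huffman beats FLC on them.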
Research Journal of Applied Sciences, Engineering and Technology, 2015