Dr-Mohmmad Alshraideh - Academia.edu

Papers by Dr-Mohmmad Alshraideh

Research paper thumbnail of Automatic Test Data Generation for Java Card Applications Using Genetic Algorithm

The main objective of software testing is to have the highest likelihood of finding the most faults with a minimum amount of time and effort. Genetic Algorithms (GAs) have been successfully used by researchers in software testing to automatically generate test data. In this paper, a GA is applied using the branch coverage criterion to generate the smallest possible set of test data for testing JSC applications. Results show that applying the GA achieves better performance in terms of the average number of test data generations, execution time, and percentage of branch coverage.
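The core idea can be sketched in a few lines: a GA that searches for an input covering a branch, guided by a branch-distance fitness. This is a minimal illustration, not the paper's implementation; the predicate `x == 42`, the population size, and the operators are all illustrative assumptions.

```python
import random

def branch_distance(x, target=42):
    # Distance to satisfying the branch predicate `x == target`; 0 means the branch is covered.
    return abs(x - target)

def ga_generate_test_input(pop_size=20, generations=1000, seed=1):
    rng = random.Random(seed)
    population = [rng.randint(0, 1000) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=branch_distance)
        if branch_distance(population[0]) == 0:
            return population[0]              # target branch covered
        survivors = population[: pop_size // 2]   # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            child = (p1 + p2) // 2            # arithmetic crossover
            child += rng.randint(-10, 10)     # mutation
            children.append(child)
        population = survivors + children
    return min(population, key=branch_distance)
```

Because the fitness decreases smoothly toward the covering input, the search terminates quickly compared with random testing of the same branch.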

Research paper thumbnail of Recognition Process of Handwritten and Typed Arabic letters

Research paper thumbnail of Search-based Software Test Data Generation for String Data using program-Specific Search Operators

This paper presents a novel approach to automatic software test data generation, where the test data is intended to cover program branches which depend on string predicates such as string equality, string ordering, and regular expression matching. A search-based approach is assumed, and some potential search operators and corresponding evaluation functions are assembled. Their performance is assessed empirically by using them to generate test data for a number of test programs. A novel approach of using search operators based on programming-language string operators and parameterized by string literals from the program under test is introduced. These operators are also assessed empirically in generating test data for the test programs and are shown to provide a significant increase in performance.
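Two ingredients of such an approach can be sketched concretely: an edit-distance evaluation function for string-equality predicates, and a program-specific operator that splices a string literal from the program under test into a candidate. The function names and the splice behaviour are illustrative assumptions, not the paper's exact operators.

```python
import random

def edit_distance(s, t):
    # Levenshtein distance, usable as a fitness for the branch predicate `s == t`.
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        cur = [i]
        for j, ct in enumerate(t, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (cs != ct)))  # substitution
        prev = cur
    return prev[-1]

def splice_literal(candidate, literal, rng=random):
    # Program-specific search operator: overwrite part of the candidate
    # with a string literal harvested from the program under test.
    pos = rng.randrange(len(candidate) + 1)
    return candidate[:pos] + literal + candidate[pos + len(literal):]
```

A search step would then mutate candidates with `splice_literal` and keep those with lower `edit_distance` to the target string.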

Research paper thumbnail of Maintenance-Oriented Classifications of EFSM Transitions

Extended Finite State Machine (EFSM) modeling is a widely used technique to model state-based systems. Although EFSM models are usually used mainly to simplify the design and implementation of systems, their use can be extended to enhance and speed up system maintenance (e.g. error localization, performance enhancement, change management). In this paper we present a classification approach for EFSM transitions based on their criticality during maintenance. The purpose of this classification is to give the system maintenance team a tool for estimating the criticality level of each transition in the EFSM model, and consequently to allow them to better plan and manage the change process according to the identified criticality of the transitions involved in the required change. Our classification approach is based on the transitions' complexity as well as the dependencies between the transitions in the model. An empirical study shows that the classification can be used to enhance and speed up the maintenance process for a required change.

Research paper thumbnail of A Complete Automation of Unit Testing for JavaScript Programs

Problem statement: Program testing is expensive and labor intensive, often consuming more than half of total development costs, and yet it is frequently not done well and the results are not always satisfactory. The objective of this paper is to present an automatic test data generation tool that aims to completely automate unit testing of JavaScript functions. Methodology: In order to use the proposed tool, the tester annotates the files that contain the class to be tested and specifies the test data coverage criterion to be used, either branch coverage or mutation analysis. The tool is integrated into the JavaScript compiler, and test generation is invoked by a command line option. The code to be tested is parsed into an abstract syntax tree, from which the test tool generates a program dependency graph for the function under test. If mutation analysis coverage is required, the abstract syntax tree for a meta-mutant program is also generated. To provide guidance for the test data search, the function under test is instrumented in accordance with the coverage criterion: branch predicate expressions are always instrumented, and in the case of mutation coverage, mutated statements are instrumented as well. Compilation then continues from the modified abstract syntax tree to generate instrumented executables that are loaded into the test data search module. Results: The experiments done in our study using the proposed tool for branch coverage show that the most effective result for string equality was obtained using the edit distance fitness function, while no significant difference was found among the fitness functions for string ordering. Through exhaustive mutation coverage, 8% of the mutants were found to be equivalent.
Conclusion: Complete automation reduces the cost of software testing dramatically and also facilitates continuous testing. It is reported that at least 50% of total software development costs are due to testing, and that 10-15% of development time is wasted on frequent stops for regression testing. Automation will also help eliminate cognitive biases that have been found in human testers. Acknowledgment: The researchers would like to express their gratitude to the anonymous referees for their valuable and helpful comments and suggestions in improving the study.

Research paper thumbnail of Interactive RPC Binding Model

This paper proposes a new Remote Procedure Call (RPC) binding model in which some interaction occurs between the three parties of the system (client, server, and binder). The new model uses the server status and load volume as parameters to be exchanged between the parties. The experimental results revealed that the new model is able to decrease the number of RPCs retransmitted due to response timeouts, increasingly so as the request rate increases, and to decrease the average execution time of an RPC (the time needed to send the RPC and receive the reply). Moreover, it was also able to improve load balancing between the servers.

Research paper thumbnail of Using Program Specic Search Operators in Test Data Generation

There has recently been a great deal of interest in search-based test data generation, with many local and global search algorithms being proposed. In this paper, program operations, in the form of program-specific search operators, are used to increase the performance of test data generation. The efficacy and performance of the proposed testing approach are assessed and validated using a variety of sample programs, and the empirical investigation is shown to give more than an eightfold increase in performance.

Research paper thumbnail of Skin Cancer Recognition by Using a Neuro-Fuzzy System

Skin cancer is the most prevalent cancer in the light-skinned population and is generally caused by exposure to ultraviolet light. Early detection of skin cancer has the potential to reduce mortality and morbidity. There are many diagnostic technologies and tests to diagnose skin cancer; however, many of these tests are extremely complex and subjective and depend heavily on the experience of the clinician. To obviate these problems, image processing techniques, a neural network (NN) system, and a fuzzy inference system were used in this study as promising modalities for the detection of different types of skin cancer. The accuracy rate of diagnosing skin cancer using the hierarchical neural network was 90.67%, while the neuro-fuzzy system yielded a slightly higher accuracy of 91.26% in diagnosing the skin cancer type. The sensitivity of the NN in diagnosing skin cancer was 95%, while the specificity was 88%. Skin cancer diagnosis by the neuro-fuzzy system achieved a sensitivity of 98% and a specificity of 89%.
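For readers unfamiliar with the two reported metrics, they are simple ratios over the confusion matrix. The counts below are illustrative only, chosen to reproduce the NN's reported 95%/88% figures; they are not the study's actual data.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    # Sensitivity = TP / (TP + FN): fraction of actual cancers detected.
    # Specificity = TN / (TN + FP): fraction of non-cancers correctly cleared.
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts yielding the NN's reported 95% sensitivity / 88% specificity:
sens, spec = sensitivity_specificity(tp=95, fn=5, tn=88, fp=12)
```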

Research paper thumbnail of A robotic intelligent wheelchair system based on obstacle avoidance and navigation functions

Powered wheelchairs offer a means of independent mobility for older adults and patients who are unable to walk and cannot propel a manual wheelchair. One concern is that some of these adults and patients are unable to drive a powered wheelchair safely or properly. Intelligent wheelchairs offer an approach to address this problem by self-controlling the movement and direction of the wheelchair without intervention from the patient or older adult. This paper describes the development and evaluation of a robotic intelligent wheelchair system (RIWS) based on obstacle avoidance and navigation functions that allows patients to navigate to scheduled outpatient appointments in a hospital safely and accurately. Experimental results show that all scheduled destinations were successfully reached by our RIWS.

Research paper thumbnail of Three-Dimensional Dynamic Based Borrowing Scheme for Wireless Cellular Networks

Mobility metrics of wireless networks such as link availability, number of neighboring nodes, link duration, link state, and link stability make it difficult to provide a node with quality of service guarantees. Previous research on Quality of Service (QoS) for cellular networks, especially for handling handoff connections, was based on flat 2D hexagonal cells. In reality, however, base station antenna coverage is a 3D space, and there exists a blind spot: the area just above and below the radiating antenna. In this paper we introduce the concept of the Blind Spot (BS), in which there are no signals to initiate a call or accept a handoff; in the BS, the signal power equals zero, so even if there is enough bandwidth to initiate or accept a handoff call, it will be blocked or dropped, respectively. We present an implementation of a Static Borrowing Scheme (SBS) and extend the dynamic-rate based borrowing scheme [1] into a 3-dimensional structure, which we call the 3-Dimensional Dynamic Based Borrowing Scheme (3D DBBS). The proposed resource-sharing technique ensures continuity for both originating and handoff connections in 3D cellular networks. It aims to minimize the blocking probability of originating calls by minimizing the dropping probability of handoff requests and maximizing channel utilization. The results revealed that 3D DBBS outperformed the static based schemes by 5% on average, even when the blind spot of the base station antenna is taken into consideration. When moving to a 3D space, the simulation results showed that 3D DBBS outperformed the static scheme by 2% on average. As a result, considering nodes in a 3D space will provide a better QoS guarantee, as the blocking and dropping probabilities are decreased and, thus, bandwidth utilization is increased.

Research paper thumbnail of How Technology Affects Our Life: The Case of Mobile Free Minutes in Jordan

In this paper, we investigate the effects of the free minutes offered by mobile line service providers on mobile customers (users) in Jordan. We analyzed data collected from 298 users, each of whom completed a survey (questionnaire) with 22 questions. For the purpose of analysis, the survey questions are grouped into three categories: negative effects, fairness of rules, and cost. We studied the effects in these three categories based on four main user attributes: age, gender, qualification, and nature of work. We found that age and gender had no statistically significant effect on the negative effects, fairness of rules, or cost, while for work and qualification, there was an impact of work (housewives) on the fairness of the applied rules and on the total responses, and users with university degrees had an impact on the negative effects, fairness of rules, and cost.

Research paper thumbnail of An Efficient Priority Based Routing Technique That Maximizes the Lifetime and Coverage of Wireless Sensor Networks

Recent developments in sensor technologies have made wireless sensor networks (WSNs) very popular in the last few years. A limitation of most popular sensors is that sensor nodes have a limited battery capacity, which lowers the lifetime of the WSN. This raises the need to develop energy-efficient solutions to keep the WSN functioning for the longest possible period of time. Because most of a node's energy is spent on data transmission, many routing techniques have been proposed in the literature to extend the network lifetime, such as the Online Maximum Lifetime heuristics (OML) and capacity maximization (CMAX). In this paper, we introduce an efficient priority-based routing power management heuristic that increases both coverage and lifetime by managing power at the sensor level. We accomplish this by setting a priority metric and by dividing the node energy into two ratios: one for the sensor node's own originated data, the other for data relayed from other sensors. This heuristic, called pERPMT (priority Efficient Routing Power Management Technique), has been applied to two well-known routing techniques. Results from extensive simulation runs revealed the superiority of the new methodology, pERPMT, over existing heuristics: pERPMT increases the lifetime by up to 77% and 54% when compared to OML and CMAX, respectively.
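The energy-splitting idea above can be sketched as a node that refuses relays once its relay budget is exhausted, so its own sensing (and hence coverage) survives longer. The class name, the 70/30 split, and the unit costs are illustrative assumptions, not the paper's values.

```python
class SensorNode:
    """Sketch of splitting a node's battery into two budgets (pERPMT idea)."""

    def __init__(self, battery, own_ratio=0.7):
        self.own_budget = battery * own_ratio          # energy reserved for self-originated data
        self.relay_budget = battery * (1 - own_ratio)  # energy reserved for relaying others' data

    def send_own(self, cost):
        # Transmit the node's own reading if its own budget allows it.
        if self.own_budget >= cost:
            self.own_budget -= cost
            return True
        return False

    def relay(self, cost):
        # Relay a neighbor's packet only from the relay budget, so heavy relay
        # traffic cannot drain the energy needed for the node's own sensing.
        if self.relay_budget >= cost:
            self.relay_budget -= cost
            return True
        return False
```

A router using such nodes would prefer relays through nodes with remaining relay budget, which is where the priority metric would enter.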

Research paper thumbnail of Development of a decision support system to predict physicians’ rehabilitation protocols for patients with knee osteoarthritis

To design a medical decision support system (MDSS) that would accurately predict the rehabilitation protocols prescribed by physicians for patients with knee osteoarthritis (OA) using only their demographic and clinical characteristics. The demographic and clinical variables for 170 patients receiving one of three treatment protocols for knee OA were entered into the MDSS. Demographic variables in the model were age and sex. Clinical variables entered into the model were height, weight, BMI, affected side, severity of knee OA, and severity of pain. All patients in the study received one of three treatment protocols for patients with knee OA: (a) hot packs, followed by electrotherapy and exercise, (b) ice packs, followed by ultrasound and exercise, and (c) exercise alone. The resilient back propagation artificial neural network algorithm was used, with ten-fold cross-validation. It was estimated that the MDSS is able to accurately predict the treatment prescribed by the physician for 87% of the patients. We developed an artificial neural network-based decision support system that can viably aid physicians in determining which treatment protocol would best match the anthropometric and clinical characteristics of patients with knee OA.
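The ten-fold cross-validation used above partitions the 170 patients so that every patient appears in exactly one test fold. A minimal sketch of such a partition (the round-robin assignment is an illustrative choice, not necessarily the study's):

```python
def k_fold_indices(n, k=10):
    # Assign indices 0..n-1 round-robin to k test folds; each index lands
    # in exactly one test fold, and the rest form that fold's training set.
    folds = [list(range(i, n, k)) for i in range(k)]
    return [(sorted(set(range(n)) - set(test)), test) for test in folds]
```

For 170 patients this yields ten (train, test) splits of 153 and 17 patients; the reported 87% is the accuracy averaged over the ten held-out folds.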

Research paper thumbnail of Multi Standard Accreditation as Expert System Tool in Jordan University Hospitals

Research paper thumbnail of Multiple-Population Genetic Algorithm for Solving Min-Max Optimization Problems

A min-max optimization problem involves simultaneous minimization and maximization of the same objective function during the same optimization run. Existing approaches for solving min-max problems using genetic algorithms mainly focus on maintaining a single population of candidates. In this paper, we explore a new approach for using genetic algorithms (GAs) to solve min-max problems. The approach uses a two-population GA that pursues the maximum and minimum goals in separate search processes using distinct island populations. The advantage of the suggested approach is that its ability to explore a greater variety of execution paths increases the search efficiency under certain conditions. By applying it to a collection of benchmark problems, it has been shown experimentally that the proposed multiple-population algorithm outperforms the single-population algorithm in terms of the number of executions, execution time, performance improvement, and efficiency.
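The two-population idea can be sketched as two islands evolving independently on the same objective, one selecting for low values and one for high. The operators (arithmetic crossover, Gaussian mutation) and all parameters are illustrative assumptions, not the paper's configuration.

```python
import random

def two_population_search(f, lo, hi, generations=300, pop_size=20, seed=3):
    """Two island populations over [lo, hi]: one minimizes f, one maximizes it."""
    rng = random.Random(seed)
    islands = {"min": [rng.uniform(lo, hi) for _ in range(pop_size)],
               "max": [rng.uniform(lo, hi) for _ in range(pop_size)]}
    for _ in range(generations):
        for goal, pop in islands.items():
            # Rank by fitness in the direction this island cares about.
            pop.sort(key=f, reverse=(goal == "max"))
            survivors = pop[: pop_size // 2]
            children = []
            while len(survivors) + len(children) < pop_size:
                p1, p2 = rng.sample(survivors, 2)
                child = (p1 + p2) / 2 + rng.gauss(0, (hi - lo) * 0.05)
                children.append(min(hi, max(lo, child)))  # clip to the domain
            islands[goal] = survivors + children
    return min(islands["min"], key=f), max(islands["max"], key=f)
```

On f(x) = (x - 2)^2 over [0, 10], the min island converges near x = 2 and the max island near x = 10, each without interfering with the other's search.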

Research paper thumbnail of Applying Genetic Algorithms to Test JUH DBs Exceptions

Databases represent an essential part of software applications; many organizations use a database as a repository for large amounts of current and historical information. In this context, testing database applications is a key issue that deserves attention. An SQL exception handling mechanism can increase the reliability of the system and improve the robustness of the software, but the exception handling code used to respond to exceptional conditions tends to be a source of system failures, and it is difficult to test exception handling by traditional methods. This paper presents a new technique that combines mutation testing and a global optimization-based search algorithm to test exception code in the Jordan University Hospital (JUH) database application. Mutation testing is used to speed up the raising of exceptions, while the global optimization technique automatically generates test cases using a fitness function that depends on the range of data related to each query. We try to achieve coverage of three types of PL/SQL exceptions: No_Data_Found (NDF), Too_Many_Rows (TMR), and Others. The results show that the TMR exception is not always covered, due to the existence of a primary key in the query; uncovered statuses also appear in nested exceptions.

Research paper thumbnail of A Medical Decision Support System for ENT Disease Diagnosis using Artificial Neural Networks

Ear, nose, and throat (ENT) diseases are among the most common diseases in the world, and the quality of life of patients decreases if they have any of them. ENT diseases are among the most challenging for ENT doctors to diagnose, because they all have many similar symptoms and signs, so there is always a high chance of misdiagnosis. To prevent this, we should increase the accuracy of the diagnosis of these diseases. The aim of this paper is to develop a decision support system to predict the diagnosis of common ENT diseases in patients. We have developed a Medical Decision Support System (MDSS) for the diagnosis of three ENT diseases using an Artificial Neural Network (ANN); these three diseases are chronic infectious rhinosinusitis, otitis externa, and pharyngitis. We used a Multilayer Perceptron (MLP) feed-forward neural network in developing the system. The input layer includes thirty-eight variables, while the output layer contains one neuron which represents one of the three ENT diseases. An iterative process is used to determine the number of hidden layers and the number of neurons in each one, and the system is trained with the resilient back propagation algorithm (Rprop). Multiple experimental models were completed with different activation functions, such as the Linear Activation Function (LF), the Hyperbolic Tangent Sigmoid Activation Function (TANH), and the Log-Sigmoid Activation Function. In particular, 10-fold cross-validation schemes are used to assess the generalization of the proposed system. The experimental results obtained 95.41% classification accuracy for correct diagnosis in our system. The data were taken from 240 patients who, according to their medical records, were all suffering from one of the three ENT diseases and were treated at the ENT clinics at Jordan University Hospital (JUH).

Research paper thumbnail of Branch coverage testing using anti-random technique

Software testing is one of the most important, costly, and time-consuming phases in software development. Anti-random testing chooses the test case whose total distance from all previous test cases is maximal, using the Hamming distance and Cartesian distance as measures of difference. In this paper, the authors present an anti-random technique to achieve high branch coverage in white-box testing, based on the hypothesis that any two test values separated by a small distance mostly discover the same errors and faults. Experimental results show that anti-random testing yields acceptable results, but the target branch coverage is not achieved in all cases. We executed the algorithm 60 times over ten different programs and found that coverage was achieved for eight of the programs, with runs performing well in terms of execution time.
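The anti-random selection rule above is easy to state in code: among the candidate inputs, pick the one whose total Hamming distance from every previously chosen test case is largest. The 8-bit width and the exhaustive candidate set are illustrative assumptions for small inputs.

```python
def hamming(a, b, width=8):
    # Hamming distance between the binary encodings of two test values.
    return bin((a ^ b) & ((1 << width) - 1)).count("1")

def next_antirandom(chosen, candidates):
    # Anti-random criterion: maximize total distance from all previous tests,
    # so the new test is as different as possible from everything tried so far.
    return max(candidates, key=lambda c: sum(hamming(c, p) for p in chosen))
```

Starting from the test value 0, for example, the rule selects 255 (binary 11111111) next, the input most unlike anything already executed.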

Research paper thumbnail of A Genetic Algorithm Approach for Breaking of Simplified Data Encryption Standard

A genetic algorithm (GA) is a search algorithm for solving optimization problems; due to its robustness, it offers benefits over traditional optimization techniques in searching an n-dimensional surface. In today's information age, information transfer has increased exponentially; hence, security, confidentiality, and authentication have become important factors in multimedia communications. Encryption is an effective technique for preserving the confidentiality of data in Internet applications, while cryptanalysis is the technique of recovering the plaintext or key from ciphertext without being its intended sender or receiver. In this paper, a GA with an improved crossover operator was used for the cryptanalysis of the Simplified Data Encryption Standard (S-DES). Results have shown that the GA performs better than a brute-force search technique in breaking the S-DES key.
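In such a GA, each individual is a candidate 10-bit S-DES key, and crossover recombines key bits from two parents. The paper's improved operator is not specified here; the sketch below shows a standard two-point crossover on bit lists for illustration only.

```python
import random

def two_point_crossover(p1, p2, rng):
    # Standard two-point crossover on 10-bit S-DES key candidates (bit lists):
    # the child inherits the middle segment from p2 and the rest from p1.
    i, j = sorted(rng.sample(range(1, len(p1)), 2))
    return p1[:i] + p2[i:j] + p1[j:]
```

Fitness would typically score how well text decrypted under the candidate key matches expected plaintext statistics, so that recombined keys sharing correct bit segments are rewarded.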

Research paper thumbnail of A multiple-population genetic algorithm for branch coverage test data generation

The software testing phase in the software development process is considered a time-consuming process. In order to reduce the overall development cost, automatic test data generation techniques based on genetic algorithms have been widely applied. This research explores a new approach for using genetic algorithms as test data generators to execute all the branches in a program. In the literature, existing approaches for test data generation using genetic algorithms mainly focus on maintaining a single population of candidate tests, where the computation of the fitness function for a particular target branch is based on the closeness of the input execution path to the control dependency condition of that branch. The new approach utilizes acyclic predicate paths of the program's control flow graph containing the target branch as goals of separate search processes using distinct island populations. The advantage of the suggested approach is its ability to explore a greater variety of execution paths and, under certain conditions, to increase search effectiveness. When applied to a collection of programs with a moderate number of branches, it has been shown experimentally that the proposed multiple-population algorithm outperforms the single-population algorithm significantly in terms of the number of executions, execution time, time improvement, and search effectiveness.

Research paper thumbnail of Automatic Test Data Generation for Java Card Applications Using Genetic Algorithm

The main objective of software testing is to have the highest likelihood of finding the most faul... more The main objective of software testing is to have the highest likelihood of finding the most faults
with a minimum amount of time and effort. Genetic Algorithm (GA) has been successfully used by
researchers in software testing to automatically generate test data. In this paper, a GA is applied
using branch coverage criterion to generate the least possible set of test data to test JSC applications.
Results show that applying GA achieves better performance in terms of average number of
test data generations, execution time, and percentage of branch coverage.

Research paper thumbnail of Recognition Process of Handwritten and Typed Arabic letters

Research paper thumbnail of Search-based Software Test Data Generation for String Data using program-Specific Search Operators

This paper presents a novel approach to automatic software test data generation, where the test d... more This paper presents a novel approach to automatic software test data generation, where the test data is intended to cover program branches which depend on string predicates such as string equality, string ordering and regular expression matching. A search-based approach is assumed and some potential search operators and corresponding evaluation functions are assembled. Their performance is assessed empirically by using them to generate test data for a number of test programs. A novel approach of using search operators based on programming language string operators and parameterized by string literals from the program under test is introduced. These operators are also assessed empirically in generating test data for the test programs and are shown to provide a significant increase in performance.

Research paper thumbnail of Maintenance-Oriented Classifications of EFSM Transitions

Extended Finite State Machine modeling is a widely used technique to model state-based systems. A... more Extended Finite State Machine modeling is a widely used technique to model state-based systems. Although EFSM models are usually mainly used to simplify the design and implementation of the systems, their use can be extended to enhance and speed up system maintenance (e.g. Error localization, performance enhancement, change management, etc.). In this paper we present a classification approach for EFSM transitions based on their criticality during maintenance. The purpose of this classification is to give the system maintenance team a tool for estimating criticality level for each transition in the EFSM model and consequently to allow them to better plan and manage the change process according the identified criticality of the transitions involved in the required change. Our classification approach is based on transitions' complexity as well as the dependencies between the transitions in the model. An empirical study shows that the classification can be used to enhance and speed up the maintenance process for a required change.

Research paper thumbnail of A Complete Automation of Unit Testing for JavaScript Programs

Problem statement: Program testing is expensive and labor intensive, often consuming more than ha... more Problem statement: Program testing is expensive and labor intensive, often consuming more than half of the total development costs, and yet it is frequently not done well and the results are not always satisfactory. The objective of this paper is to present an automatic test data generation tool that aims to completely automate unit testing of JavaScript functions. The methodology: In order to use the proposed tool, the tester annotates the files that contain the class to be tested. Moreover, the tester must specify the test data coverage criterion to be used, either branch coverage or mutation analysis. However, the tool is then integrated into the JavaScript compiler and test generation is invoked by a command line option. Also, the code to be tested is parsed into an abstract syntax tree from which the test tool generates a program dependency graph for the function under test. However, if mutation analysis coverage is required, the abstract syntax tree for a meta-mutant program is also generated. To provide guidance for the test data search, the function under test instrumented in accordance with the coverage criterion. Branch predicate expressions are always instrumented, in the case of mutation coverage, mutated statements are also instrumented. Compilation then continues from the modified abstract syntax tree to generate instrumented executables that were loaded into the test data search module. Results: The experiment done in our study by using the proposed tool for branch coverage shows that the most effective result for string equality was obtained using the edit distance fitness function, while no significant difference was found in the fitness function for string ordering. Through exhaustive mulation coverage 8% are found to be equivalent. 
Conclusion: By having a complete automation it reduces the cost of software testing dramatically and also facilitates continuous testing. It is reported that at least 50% of the total software development costs is due to testing, and 10-15% of development time is wasted due to frequent stops for regression testing. Automation will also help get rid of cognitive biases that have been found in human testers. Acknowledgment: The researcher would like to express their gratitude to the anonymous referees for their valuable and helpful comments and suggestions in improving the study.

Research paper thumbnail of Interactive RPC Binding Model

This paper proposes a new Remote Procedure Call (RPC) binding model, in which some interaction be... more This paper proposes a new Remote Procedure Call (RPC) binding model, in which some interaction between the three parties of the system occurs (client, server and binder). The new model uses the server status and the load volume parameters to be exchanged between the parties. The experimental results revealed that the new model is able to decrease the number of retransmitted RPCs due to response time out increasingly when the request rate increase, and decreased the average execution time for a RPC (time needed to send RPC and receive the reply). Moreover, it was also able to increase the load balancing between the servers.

Research paper thumbnail of Using Program Specific Search Operators in Test Data Generation

There has recently been a great deal of interest in search-based test data generation, with many local and global search algorithms being proposed. In this paper, program-specific search operators, built from the operations of the program under test, are used to increase performance in the generation of test data. The efficacy and performance of the proposed testing approach are assessed and validated using a variety of sample programs, and the empirical investigation shows more than an eightfold increase in performance.
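One family of program-specific operators can be sketched as a mutation that splices string literals harvested from the program under test into a candidate input. This is an illustrative sketch under my own naming, not the paper's actual operator set:

```python
import random

def literal_splice(candidate, literals, rng=random.Random(0)):
    """Program-specific mutation: replace a random substring of the
    candidate input with a string literal taken from the program
    under test, so the search can jump directly to values that the
    branch predicates actually compare against."""
    if not literals:
        return candidate
    lit = rng.choice(literals)
    if not candidate:
        return lit
    start = rng.randrange(len(candidate) + 1)
    end = rng.randrange(start, len(candidate) + 1)
    return candidate[:start] + lit + candidate[end:]
```

For a branch such as `if cmd == "commit":`, harvesting `"commit"` from the source and splicing it into candidates reaches the branch far faster than character-level mutation alone, which is the intuition behind the reported speedup.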

Research paper thumbnail of Skin Cancer Recognition by Using a Neuro-Fuzzy System

Skin cancer is the most prevalent cancer in the light-skinned population, and it is generally caused by exposure to ultraviolet light. Early detection of skin cancer has the potential to reduce mortality and morbidity. There are many diagnostic technologies and tests to diagnose skin cancer. However, many of these tests are extremely complex and subjective and depend heavily on the experience of the clinician. To obviate these problems, image processing techniques, a neural network system (NN), and a fuzzy inference system were used in this study as promising modalities for the detection of different types of skin cancer. The accuracy rate of diagnosing skin cancer using the hierarchical neural network was 90.67%, while the neuro-fuzzy system yielded a slightly higher accuracy of 91.26% in diagnosing skin cancer type. The sensitivity of the NN in diagnosing skin cancer was 95%, while the specificity was 88%. Skin cancer diagnosis by the neuro-fuzzy system achieved a sensitivity of 98% and a specificity of 89%.
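For reference, the sensitivity and specificity figures quoted above are computed from the confusion counts in the standard way:

```python
def sensitivity(tp, fn):
    """True positive rate: the fraction of actual cancer cases
    that the classifier correctly flags (tp / (tp + fn))."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True negative rate: the fraction of benign cases
    that the classifier correctly clears (tn / (tn + fp))."""
    return tn / (tn + fp)
```

A 95% sensitivity therefore means that out of every 100 actual cancer cases, 95 are detected and 5 are missed.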

Research paper thumbnail of A robotic intelligent wheelchair system based on obstacle avoidance and navigation functions

Powered wheelchairs offer a means of independent mobility for older adults and patients who are unable to walk and cannot propel a manual wheelchair. One concern is that some of these users are unable to drive a powered wheelchair safely or properly. Intelligent wheelchairs address this problem by self-controlling the movement and direction of the wheelchair without user intervention. This paper describes the development and evaluation of a robotic intelligent wheelchair system (RIWS) based on obstacle avoidance and navigation functions that allows patients to navigate to scheduled outpatient appointments in a hospital safely and accurately. Experimental results show that all scheduled destinations were successfully reached by our RIWS.

Research paper thumbnail of Three-Dimensional Dynamic Based Borrowing Scheme for Wireless Cellular Networks

Mobility metrics of wireless networks such as link availability, number of neighboring nodes, link duration, link state, and link stability make it difficult to provide a node with quality-of-service guarantees. In previous research on Quality of Service (QoS) for cellular networks, especially for handling handoff connections, designs were based on flat 2D hexagonal cells. In reality, however, base station antenna coverage is a 3D space, and there exists a blind spot: the area just above and below the radiating antenna. In this paper we introduce the concept of the Blind Spot (BS), in which there are no signals to initiate a call or accept a handoff one; in the BS, the signal power equals zero, so even if there is enough bandwidth, an originating call will be blocked or a handoff call dropped. We present an implementation of a Static Borrowing Scheme (SBS) and extend the dynamic-rate based borrowing scheme [1] into a 3-dimensional structure, which we call the 3-Dimensional Dynamic Based Borrowing Scheme (3D DBBS). The proposed technique for resource sharing ensures continuity for both originating and handoff connections in 3D cellular networks. It aims to minimize the blocking probability of originating calls by minimizing the dropping probability of handoff requests and maximizing channel utilization. The results revealed that 3D DBBS outperformed the static based schemes by 5% on average even when the blind spot of the base station antenna is taken into consideration. When moving to a 3D space, the simulation results showed that 3D DBBS outperformed the static scheme by 2% on average. As a result, considering nodes in a 3D space yields better QoS guarantees, as the blocking and dropping probabilities are decreased and the bandwidth utilization is increased.

Research paper thumbnail of How Technology Affects Our Life: The Case of Mobile Free Minutes in Jordan

In this paper, we investigate the effects of the free minutes offered by mobile line service providers on mobile customers (users) in Jordan. We analyzed data collected from 298 users, each of whom completed a survey (questionnaire) of 22 questions. For the purpose of analysis, the survey questions were grouped into three categories: the negative effects, the fairness of rules, and the cost. We studied the effects in these three categories with respect to four main user attributes: age, gender, qualification, and nature of work. We found that age and gender had no statistically significant effect on the negative effects, the fairness of rules, or the cost. With respect to work and qualification, however, there was an impact of work (housewives) on the fairness of the applied rules and on the total responses, and university-degree users showed an impact on the negative effects, the fairness of rules, and the cost.

Research paper thumbnail of An Efficient Priority Based Routing Technique That Maximizes the Lifetime and Coverage of Wireless Sensor Networks

Recent developments in sensor technologies have made wireless sensor networks (WSN) very popular in the last few years. A limitation of most popular sensors is that sensor nodes have a limited battery capacity, which lowers the lifetime of the WSN. This raises the need to develop energy-efficient solutions to keep the WSN functioning for the longest possible period of time. Because most of a node's energy is spent on data transmission, many routing techniques have been proposed in the literature to extend the network lifetime, such as the Online Maximum Lifetime heuristics (OML) and capacity maximization (CMAX). In this paper, we introduce an efficient priority-based routing power management heuristic that increases both coverage and lifetime by managing power at the sensor level. We accomplish this by setting a priority metric and by dividing the node energy into two ratios: one for the sensor node's own originated data and the other for data relayed from other sensors. This heuristic, called pERPMT (priority Efficient Routing Power Management Technique), has been applied to two well-known routing techniques. Results from extensive simulation runs revealed the superiority of the new methodology pERPMT over existing heuristics: pERPMT increases the lifetime by up to 77% and 54% when compared to OML and CMAX, respectively.
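The energy-splitting idea described above can be sketched as a per-node budget check. The class name, the 60/40 split, and the `spend` helper are illustrative assumptions for exposition, not the paper's parameters:

```python
class SensorNode:
    """Sensor node whose battery is split into a budget for its own
    readings and a separate budget for relaying other nodes' traffic,
    so heavy relay load cannot drain the node's ability to report
    its own sensed data."""
    def __init__(self, energy, own_ratio=0.6):
        self.own_budget = energy * own_ratio
        self.relay_budget = energy * (1.0 - own_ratio)

    def spend(self, cost, relaying):
        """Charge a transmission against the appropriate budget.
        Returns False when that budget is exhausted, signaling the
        routing layer to steer that kind of traffic elsewhere."""
        if relaying:
            if self.relay_budget < cost:
                return False
            self.relay_budget -= cost
        else:
            if self.own_budget < cost:
                return False
            self.own_budget -= cost
        return True
```

Keeping the two budgets separate is what preserves coverage: a node on a popular relay path still retains energy to report from its own sensing region.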

Research paper thumbnail of Development of a decision support system to predict physicians’ rehabilitation protocols for patients with knee osteoarthritis

To design a medical decision support system (MDSS) that would accurately predict the rehabilitation protocols prescribed by physicians for patients with knee osteoarthritis (OA) using only their demographic and clinical characteristics. The demographic and clinical variables for 170 patients receiving one of three treatment protocols for knee OA were entered into the MDSS. Demographic variables in the model were age and sex. Clinical variables entered into the model were height, weight, BMI, affected side, severity of knee OA, and severity of pain. All patients in the study received one of three treatment protocols for patients with knee OA: (a) hot packs, followed by electrotherapy and exercise, (b) ice packs, followed by ultrasound and exercise, and (c) exercise alone. The resilient back propagation artificial neural network algorithm was used, with ten-fold cross-validation. It was estimated that the MDSS is able to accurately predict the treatment prescribed by the physician for 87% of the patients. We developed an artificial neural network-based decision support system that can viably aid physicians in determining which treatment protocol would best match the anthropometric and clinical characteristics of patients with knee OA.

Research paper thumbnail of Multi Standard Accreditation as Expert System Tool in Jordan University Hospitals

Research paper thumbnail of Multiple-Population Genetic Algorithm for Solving Min-Max Optimization Problems

A min-max optimization problem requires simultaneous minimization and maximization of the same objective function during the same optimization run. Existing approaches for solving min-max problems using genetic algorithms mainly focus on maintaining a single population of candidate solutions. In this paper, we explore a new approach to using genetic algorithms (GAs) to solve min-max problems. The approach uses a two-population GA that pursues the maximum and minimum goals as separate search processes using distinct island populations. The advantage of the suggested approach is that its ability to explore a greater variety of execution paths increases the search efficiency under certain conditions. Applying this to a collection of benchmark problems, it has been shown experimentally that the proposed multiple-population algorithm outperforms the single-population algorithm in terms of the number of executions, execution time, performance improvement, and efficiency.
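The two-population idea can be sketched as two independent island searches over the same objective, one selecting for large values and one for small. This is a toy real-valued illustration with assumed operators (truncation selection, averaging crossover, Gaussian mutation), not the paper's algorithm:

```python
import random

def island_search(f, maximize, rng, pop_size=20, gens=60, lo=-10.0, hi=10.0):
    """One island: a simple real-valued GA pushing f in one direction."""
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f, reverse=maximize)
        parents = pop[:pop_size // 2]               # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2 + rng.gauss(0, 0.5)  # crossover + mutation
            children.append(min(hi, max(lo, child)))
        pop = parents + children                     # parents survive (elitism)
    return max(pop, key=f) if maximize else min(pop, key=f)

def min_max(f, seed=0):
    """Run the two islands as separate searches over the same objective
    and return (minimizer, maximizer)."""
    rng = random.Random(seed)
    return island_search(f, False, rng), island_search(f, True, rng)
```

Because each island keeps its own population, the minimizing search cannot be crowded out by high-fitness maximizing individuals, which is the core benefit the two-population design claims over a single shared population.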

Research paper thumbnail of Applying Genetic Algorithms to Test JUH DBs Exceptions

Databases represent an essential part of software applications. Many organizations use a database as a repository for large amounts of current and historical information. In this context, testing database applications is a key issue that deserves attention. The SQL exception handling mechanism can increase the reliability of the system and improve the robustness of the software, but the exception handling code used to respond to exceptional conditions tends to be a source of system failures, and it is difficult to test exception handling with traditional methods. This paper presents a new technique that combines mutation testing and a global-optimization-based search algorithm to test exception code in the Jordan University Hospital (JUH) database application. Mutation testing is used to speed up the raising of exceptions, and the global optimization technique automatically generates test cases using a fitness function that depends on the range of data related to each query. We try to achieve coverage of three types of PL/SQL exceptions: No_Data_Found (NDF), Too_Many_Rows (TMR), and Others. The results show that the TMR exception is not always covered, due to the existence of a primary key in the query; uncovered statuses also appear in nested exceptions.

Research paper thumbnail of A Medical Decision Support System for ENT Disease Diagnosis using Artificial Neural Networks

Ear, nose and throat (ENT) diseases are among the most common diseases in the world, and the quality of life of patients decreases if they have any of them. ENT diseases are among the most challenging for ENT doctors to diagnose, because they share many similar symptoms and signs, so there is always a high chance of misdiagnosis; to prevent this, the accuracy of diagnosis of these diseases should be increased. The aim of this paper is to develop a decision support system to predict the diagnosis of common ENT diseases in patients. We developed a Medical Decision Support System (MDSS) for the diagnosis of three ENT diseases using an Artificial Neural Network (ANN); the three diseases are chronic infective rhinosinusitis, otitis externa, and pharyngitis. We used a Multilayer Perceptron (MLP) feed-forward neural network in developing the system. The input layer includes thirty-eight variables, while the output layer contains one neuron which indicates one of the three ENT diseases. An iterative process was used to determine the number of hidden layers and the number of neurons in each, and the system was trained with the resilient back propagation algorithm (Rprop). Multiple experimental models were evaluated with different activation functions: the Linear Activation Function (LF), the Hyperbolic Tangent Sigmoid Activation Function (TANH), and the Log-Sigmoid Activation Function. A 10-fold cross-validation scheme was used to assess the generalization of the proposed system. The experiments obtained 95.41% classification accuracy for correct diagnoses in our system. The data were taken from 240 patients who, according to their medical records, were suffering from one of the three ENT diseases and were treated at the ENT clinics at Jordan University Hospital (JUH).
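The three activation functions compared in these experiments are standard; as a minimal reference sketch:

```python
import math

def linear(x):
    """Linear activation (LF): the identity, unbounded output."""
    return x

def tanh_act(x):
    """Hyperbolic tangent sigmoid (TANH): output in (-1, 1)."""
    return math.tanh(x)

def log_sigmoid(x):
    """Log-sigmoid: output in (0, 1), commonly used for the
    output neuron when the target is a class probability."""
    return 1.0 / (1.0 + math.exp(-x))
```

The bounded, smooth shapes of TANH and log-sigmoid are what make them usable with gradient-based trainers such as Rprop, whereas the linear function serves mainly as a baseline.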

Research paper thumbnail of Branch coverage testing using anti-random technique

Software testing is one of the most important, costly and time-consuming phases in software
development. Anti-random testing chooses the test case whose total distance from all
previous test cases is the maximum, using the Hamming distance and Cartesian distance as
measures of difference. In this paper, the authors present an anti-random technique to
achieve high branch coverage in white-box testing, based on the hypothesis that two test
values separated by a small distance mostly discover the same errors and faults. Experimental
results show that anti-random testing yields acceptable results, but the target of branch
coverage is not achieved in all cases. The algorithm was executed 60 times over ten different
programs, and coverage was achieved for eight of the programs with high performance in
terms of execution time.
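The anti-random selection rule described above, using the Hamming distance over fixed-width bit patterns, can be sketched as follows (the helper names are illustrative):

```python
def hamming(a, b):
    """Hamming distance between the bit patterns of two integers:
    the number of bit positions in which they differ."""
    return bin(a ^ b).count("1")

def next_anti_random(candidates, previous):
    """Anti-random choice: pick the candidate whose total Hamming
    distance from all previously executed test cases is maximal,
    so each new test is as different as possible from history."""
    return max(candidates, key=lambda c: sum(hamming(c, p) for p in previous))
```

Starting from a history containing `0b0000`, the rule prefers `0b1111` over `0b0001`, since the former flips every bit of the previous test.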

Research paper thumbnail of A Genetic Algorithm Approach for Breaking of Simplified Data Encryption Standard

A genetic algorithm (GA) is a search algorithm for solving optimization problems; due to its robustness, it offers benefits over other optimization techniques in searching an n-dimensional surface. In today's information age, information transfer has increased exponentially; hence, security, confidentiality, and authentication have become important factors in multimedia communications. Encryption is an effective technique for preserving the confidentiality of data in Internet applications: plaintext is encoded into ciphertext in such a way that it cannot be interpreted by anyone except the sender and receiver, while cryptanalysis attempts to recover the plaintext or the key without that privileged knowledge. In this paper, a GA with an improved crossover operator was used for the cryptanalysis of the Simplified Data Encryption Standard (S-DES). Results have shown that the GA performs better than a brute-force search in breaking the S-DES key.
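A crossover over candidate keys (represented as bit strings, matching the 10-bit S-DES key) can be sketched as a two-point variant; this illustrative operator stands in for, and is not necessarily identical to, the paper's improved crossover:

```python
import random

def two_point_crossover(parent_a, parent_b, rng=random.Random(1)):
    """Exchange the middle segment between two equal-length bit-string
    keys, producing two children that mix key material from both
    parents while preserving every bit position's combined content."""
    assert len(parent_a) == len(parent_b)
    i, j = sorted(rng.sample(range(len(parent_a) + 1), 2))
    child1 = parent_a[:i] + parent_b[i:j] + parent_a[j:]
    child2 = parent_b[:i] + parent_a[i:j] + parent_b[j:]
    return child1, child2
```

Each child's fitness would then be scored by how closely decrypting the ciphertext with that candidate key reproduces expected plaintext statistics, driving the population toward the true key.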

Research paper thumbnail of A multiple-population genetic algorithm for branch coverage test data generation

The software testing phase in the software development process is considered a
time-consuming process. In order to reduce the overall development cost, automatic test
data generation techniques based on genetic algorithms have been widely applied. This
research explores a new approach for using genetic algorithms as test data generators to
execute all the branches in a program. In the literature, existing approaches for test data
generation using genetic algorithms are mainly focused on maintaining a single-population
of candidate tests, where the computation of the fitness function for a particular target
branch is based on the closeness of the input execution path to the control dependency
condition of that branch. The new approach utilizes acyclic predicate paths of the program’s
control flow graph containing the target branch as goals of separate search processes
using distinct island populations. The advantage of the suggested approach is its
ability to explore a greater variety of execution paths and, in certain conditions, to increase
the search effectiveness. When applied to a collection of programs with a moderate number
of branches, it has been shown experimentally that the proposed multiple-population
algorithm outperforms the single-population algorithm significantly in terms of the number
of executions, execution time, time improvement, and search effectiveness.