Fua'ad Abdulrazzak - Academia.edu
Uploads
Papers by Fua'ad Abdulrazzak
Journal of Computer Science, Mar 1, 2010
Problem statement: The communications domain has utilized protocols for a wide spectrum of applications, including Medium Access Control (MAC) protocols. MAC protocols have been extensively researched from several angles, including their implementation in Wave Division Multiplexing (WDM) networks and Mobile Ad hoc Networks (MANET). The relevance of intelligence in sustaining the prerequisites for dynamic reconfiguration has gained considerable attention in MANET. Approach: The implementation of Token Ring in MANET can be correlated to its complementary implementation in IP networks. In this paper, the limitations of the Token Ring algorithm for IP networks in the context of intelligent processing have been researched extensively, and an enhanced Token Ring protocol governed by intelligent processing has been implemented. The core of the new protocol is the circulation mechanism of the token. In contrast to the traditional circulatory mechanism, a software agent is designed in this research to act as an intelligent circulatory agent. The developed software agent implements prioritized token access subject to the traffic type. Each station is coupled with a software agent, and the agents cohesively collaborate to assign the token. Results: The proposed agent and the enhanced Token Ring implementation have been extensively verified through simulation experiments. A complete circulation of the ring is defined as all nodes being visited at least once. Discrete-event simulation models were developed and deployed for performance analysis. The results acquired validated the improvement of the new software-agent-based implementation. The performance metrics studied were average delay and average buffer utilization.
Conclusion: The proposed algorithm achieves an ideal balance between the complexity of intelligent processing and the versatility of managing the token ring.
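The abstract describes agents that collaborate to grant the token based on traffic type. A minimal sketch of that idea, with assumed traffic classes and priority values (the paper does not specify them), could look like this:

```python
from dataclasses import dataclass

# Traffic classes and priorities are illustrative assumptions, not taken
# from the paper. Lower number = higher priority.
PRIORITY = {"voice": 0, "video": 1, "data": 2}

@dataclass
class Station:
    name: str
    traffic: str   # type of traffic currently queued at this station
    frames: int    # frames waiting to be sent

def circulate(stations, rounds=1):
    """One 'complete circulation': every station is visited at least once.
    The agents collaborate by granting the token to stations carrying
    higher-priority traffic first (a stand-in for the paper's agent logic)."""
    visit_order = []
    for _ in range(rounds):
        for st in sorted(stations, key=lambda s: PRIORITY[s.traffic]):
            visit_order.append(st.name)
            st.frames = max(0, st.frames - 1)  # token holder transmits one frame
    return visit_order

ring = [Station("A", "data", 3), Station("B", "voice", 2), Station("C", "video", 1)]
print(circulate(ring))  # → ['B', 'C', 'A']: the voice station gets the token first
```

Real delay and buffer-utilization figures would, as in the paper, require a discrete-event simulation rather than this single-pass sketch.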
2019 First International Conference of Intelligent Computing and Engineering (ICOICE)
5G technologies are expected to connect people, objects, data, applications, transport systems, and cities in intelligent networked communication environments. An enormous amount of data must move much faster, be delivered reliably to many devices, and be handled with minimal delay. TCP protocols will therefore be required to effectively utilize the large air-link capacity and provide the end-to-end performance required by future networks and mmWave (Millimeter Wave) technology. In this paper, the implemented framework of mmWave modeling has been analyzed using the ns-3 simulator. The framework is demonstrated through several simulation scenarios to analyze the performance of TCP protocols over mmWave using three main performance measurements: Round Trip Time (RTT), Congestion Window size (CWnd), and Throughput. The achieved results show that 5G would provide internet connections 40 times faster, and coverage four times greater worldwide, than current 4G.
2019 First International Conference of Intelligent Computing and Engineering (ICOICE)
In recent years, there has been a general tendency in the field of networking toward programmable networks. One of the most important types of networks to have gained attention is the Software Defined Network (SDN), which is based on the principle of separating the control plane from the data plane. This in turn makes network control centralized and programmable through a centralized controller managed by a protocol called OpenFlow. When the standard OpenFlow protocol is used, the time spent creating communication channels becomes evident each time packets are transmitted. Therefore, this research proposes an enhanced algorithm to improve network performance by integrating the idea of userspace, which was successful in operating systems, and activating it within SDN networks. The proposed OpenFlow based on UserSpace (OFUS) has been analyzed and compared to OpenFlow-enabled SDN networks in terms of Round Trip Time (RTT) and network throughput as performance measurements. The comparative analysis was made using the open-source Mininet network emulator on four types of topologies. The results show that the enhanced OFUS algorithm improves both RTT and throughput compared to OpenFlow-enabled networks. The RTT improvements in single, linear, tree, and custom topologies are 40.3%, 36.2%, 51%, and 20.4%, whereas the throughput improvements are 65.4%, 57%, 104.2%, and 25.6%, respectively.
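The channel-setup cost the abstract refers to is the OpenFlow table-miss path: the first packet of a flow triggers a packet-in to the controller, which then installs a flow entry. A toy model of that cost (illustrative only; OFUS itself is not specified in the abstract, and the delay value is an assumption) could be:

```python
import time

class Switch:
    """Toy model of OpenFlow table-miss handling: the first packet of a flow
    pays a controller round trip; later packets hit the cached flow entry."""
    def __init__(self, controller_delay=0.001):
        self.flow_table = {}                    # match -> action
        self.controller_delay = controller_delay
        self.controller_hits = 0

    def forward(self, match):
        if match not in self.flow_table:
            self.controller_hits += 1           # packet-in to the controller
            time.sleep(self.controller_delay)   # channel-setup cost
            self.flow_table[match] = "output:1" # controller installs a rule
        return self.flow_table[match]

sw = Switch()
for _ in range(100):
    sw.forward(("10.0.0.1", "10.0.0.2"))
print(sw.controller_hits)  # → 1: only the first packet pays the setup cost
```

Any scheme that shortens or avoids this setup path, as OFUS aims to, reduces RTT for short flows, which is consistent with the per-topology improvements reported above.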
In a distributed system, a single database may be spread physically across computers in multiple locations; this is called a distributed database. One of the most serious problems in distributed databases is deadlock, a state of the system in which transactions wait for one another indefinitely. This paper presents a new algorithm to detect multi-resource deadlocks using a directed graph. The proposed algorithm builds on the algorithms by Brain M. Johnston and Himanshi Grover. In previous algorithms, there are no criteria for deciding which transaction should be aborted early to reduce the repeated time of detections. The proposed algorithm makes this decision using the incoming and outgoing requests of transactions in the graph as the criteria for selecting the transaction to be detected and aborted early. It ensures that only one transaction will detect the deadlock cycle. All true deadlocks are detected in finite time, and no false or undetected deadlocks are reported.
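The core of any such detector is cycle detection in a wait-for graph, followed by victim selection. A minimal sketch, assuming a degree-based criterion (most incoming plus outgoing requests) in the spirit of the abstract; the actual distributed protocol is not reproduced here:

```python
def find_cycle(wait_for):
    """DFS over a wait-for graph {txn: set of txns it waits on}.
    Returns one deadlock cycle as a list of transactions, or None."""
    visiting, done = set(), set()
    def dfs(node, path):
        visiting.add(node)
        path.append(node)
        for nxt in wait_for.get(node, ()):
            if nxt in visiting:                  # back edge => cycle found
                return path[path.index(nxt):]
            if nxt not in done:
                found = dfs(nxt, path)
                if found:
                    return found
        visiting.discard(node)
        done.add(node)
        path.pop()
        return None
    for start in list(wait_for):
        if start not in done:
            cycle = dfs(start, [])
            if cycle:
                return cycle
    return None

def pick_victim(wait_for, cycle):
    """Abort the transaction with the most incoming + outgoing requests,
    mirroring the degree-based criterion described in the abstract."""
    indeg = {t: 0 for t in cycle}
    for src, dsts in wait_for.items():
        for d in dsts:
            if d in indeg:
                indeg[d] += 1
    return max(cycle, key=lambda t: indeg[t] + len(wait_for.get(t, ())))

# T1 -> T2 -> T3 -> T1 is a deadlock; T4 also waits on T3.
g = {"T1": {"T2"}, "T2": {"T3"}, "T3": {"T1"}, "T4": {"T3"}}
cycle = find_cycle(g)
print(sorted(cycle), pick_victim(g, cycle))  # → ['T1', 'T2', 'T3'] T3
```

Aborting T3 here releases two waiters at once, which is exactly why a degree criterion reduces repeated detection rounds compared with an arbitrary choice.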
Journal of Ambient Intelligence and Humanized Computing, 2020
Data fusion is a fundamental research topic, especially in the Internet of Things (IoT). A massive quantity of data is increasingly being generated by heterogeneous sensors, which makes data integration more difficult. A noticeable body of research has attempted to mitigate the incompatibility between the collected data to facilitate meaningful data integration between machines by using semantic web technologies. However, there are still some critical issues, including scalability and measurement unit conflicts. Therefore, this paper proposes a scalable semantic data fusion framework that aims at improving the scalability of data fusion and at detecting and reconciling measurement unit conflicts. This framework is fully implemented to demonstrate its scalability during the process of data fusion and its ability to handle measurement unit conflicts. Two experiments were conducted to evaluate the scalability and effectiveness of the proposed framework using a real dataset collected from different sensors. To evaluate scalability, a set of queries was adapted and the average response time of each query's execution was calculated, whereas the total number of conflicts detected and resolved by the proposed framework was used to evaluate effectiveness. Experimental results show that the proposed framework improves the scalability of data fusion among heterogeneous sensors' data and is effective in detecting and resolving data unit conflicts.
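A measurement unit conflict, as discussed above, arises when sensors report the same quantity in different units. A minimal sketch of detection and reconciliation; the unit registry and the choice of Celsius as the canonical unit are assumptions, not the framework's actual ontology:

```python
# Conversion registry: every supported unit maps to the assumed canonical
# unit (Celsius). Illustrative only.
TO_CELSIUS = {
    "celsius":    lambda v: v,
    "fahrenheit": lambda v: (v - 32) * 5 / 9,
    "kelvin":     lambda v: v - 273.15,
}

def fuse_temperatures(readings):
    """readings: list of (value, unit) pairs from heterogeneous sensors.
    Detects unit conflicts, converts all values to Celsius, returns
    (number of conflicts resolved, fused mean in Celsius)."""
    units = {u for _, u in readings}
    conflicts = len(units) - 1 if len(units) > 1 else 0
    in_c = [TO_CELSIUS[u](v) for v, u in readings]
    return conflicts, sum(in_c) / len(in_c)

conflicts, mean_c = fuse_temperatures([(20.0, "celsius"),
                                       (68.0, "fahrenheit"),
                                       (293.15, "kelvin")])
print(conflicts, round(mean_c, 2))  # → 2 20.0: all three sensors agree on 20 °C
```

Without the conversion step, naively averaging the raw values would fuse 20, 68, and 293.15 into a meaningless number, which is the failure mode the framework's conflict handling prevents.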
2021 International Conference of Technology, Science and Administration (ICTSA), 2021
The cluster heads (CHs) are distributed randomly in the Low Energy Adaptive Clustering Hierarchy (LEACH) protocol. Some sensor nodes may be placed far from the CHs and thus are not covered, which causes excessive routing energy consumption for the isolated nodes. Because the CHs are selected randomly, some nodes can die quickly due to the extra workload. By using Spider Monkey Optimization (SMO), we can elect the CHs and design cluster-based routing algorithms. SMO mimics the social behaviors of spider monkeys, an example of a fission-fusion system, to choose the optimal route. In this paper, we propose the Optimal Cluster Head Selection Based on Spider Monkey Optimization (OCHSMO) algorithm. The proposed algorithm uses the SMO mechanism to solve the problems of cluster routing in WSNs. It improves the LEACH protocol in terms of energy consumption, system lifetime, stability period, and quality of the network. Experimental results show that the OCHSMO algorithm outperforms the LEACH algorithm on all performance measures. MATLAB was used to evaluate the performance.
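The random CH selection being criticized is LEACH's standard threshold rule: each eligible node becomes a CH when its uniform draw falls below T(n) = p / (1 - p * (r mod 1/p)). A sketch of that baseline (the node IDs and parameter values are illustrative; OCHSMO's SMO-based replacement is not reproduced here):

```python
import random

def leach_threshold(p: float, r: int) -> float:
    """Standard LEACH cluster-head election threshold T(n) for nodes that
    have not been a CH in the last 1/p rounds: T(n) = p / (1 - p*(r mod 1/p))."""
    return p / (1 - p * (r % int(1 / p)))

def elect_cluster_heads(node_ids, p=0.1, r=0, rng=None):
    """Each eligible node draws a uniform number and becomes a CH if the
    draw falls below T(n). This is the random selection OCHSMO replaces
    with an SMO search over candidate CH sets."""
    rng = rng or random.Random(0)
    t = leach_threshold(p, r)
    return [n for n in node_ids if rng.random() < t]

print(leach_threshold(0.1, 0))  # → 0.1: round 0 of the epoch, T = p
print(leach_threshold(0.1, 9))  # → 1.0: last round, every remaining node elects
```

Because T(n) ignores node position and residual energy, distant nodes can be left uncovered and overloaded CHs die early, which is exactly the problem statement above.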
International Journal of Advanced Research in Computer Science and Software Engineering, 2017
Many banks around the world are starting to offer banking services through mobile phones, and many studies have been proposed to help banks increase their number of customers. Despite the effort found in the state of the art, the adoption rate of mobile banking applications has not yet reached the expected level. Therefore, the aim of this study is to analyse the most well-known and accepted models in order to provide a comprehensive understanding of their impact on the adoption of mobile banking applications. Furthermore, this study also aims to explore the most influential factors that have been used to shape the behavioural intention to adopt mobile banking applications. As a result of this survey study, some critical recommendations are stated clearly.
International Journal of Intelligent Networks, 2021
Realistic and lifelike 3D-reconstruction of virtual humans has various exciting and important use cases. Our and others' appearances have notable effects on ourselves and our interaction partners in virtual environments, e.g., on acceptance, preference, trust, believability, behavior (the Proteus effect), and more. Today, multiple approaches for the 3D-reconstruction of virtual humans exist. They vary significantly in the degree of achievable realism, the technical complexity, and, finally, the overall reconstruction cost involved. This article compares two 3D-reconstruction approaches with very different hardware requirements. The high-cost solution uses a typically complex and elaborate camera rig consisting of 94 digital single-lens reflex (DSLR) cameras. The recently developed low-cost solution uses a smartphone camera to create videos that capture multiple views of a person. Both methods use photogrammetric reconstruction and template fitting with the same template model and differ in their adaptation to the method-specific input material. Each method generates high-quality virtual humans ready to be processed, animated, and rendered by standard XR simulation and game engines such as Unreal or Unity. We compare the results of the two 3D-reconstruction methods against each other in an immersive virtual environment in a user study. Our results indicate that the virtual humans from the low-cost approach are perceived similarly to those from the high-cost approach regarding perceived similarity to the original, human-likeness, beauty, and uncanniness, despite significant differences in the objectively measured quality. The perceived feeling of change of one's own body was higher for the low-cost virtual humans. Quality differences were perceived more strongly for one's own body than for other virtual humans.