Ayman EL-SAYED | Menoufia University
Papers by Ayman EL-SAYED
International Journal of Computational Intelligence Systems
In recent times, the digital twin (DT) has become an emerging and key technology for smart industrial control systems and Industrial Internet of Things (IIoT) applications. The DT now serves as a significant tool that can generate large datasets for real-time fault prediction and diagnosis in critical industrial applications, with the support of powerful artificial intelligence (AI). The physical assets of a DT can produce system performance data that is close to reality, which offers remarkable opportunities for machine fault diagnosis under measured fault conditions. Therefore, this study presents an intelligent and efficient AI-based fault diagnosis framework using new hybrid optimization and machine learning models for industrial DT systems, namely the triplex pump model and the transmission system. The proposed hybrid framework utilizes a combination of optimization techniques (OT) such as the flower pollination algorithm (FPA), particle swarm algorithm (P...
International Journal of Computational Intelligence Systems
In recent times, the Internet of Things (IoT) and Deep Learning Models (DLMs) can be utilized to develop smart agriculture that efficiently determines the exact location of the diseased part of a leaf on farmland. Convolutional neural networks (CNNs) are no exception, having achieved the latest accomplishments in many aspects of human life and in the farming sector. Semantic image segmentation is considered a core problem in computer vision. Despite tremendous progress in applications, nearly all semantic image segmentation algorithms fail to achieve sufficiently sharp results because of a lack of detail sensitivity, problems in assessing the global similarity of image pixels, or both. Post-processing refinement methods, a critical means of correcting the underlying flaws mentioned above, rely almost entirely on Conditional Random Fields (CRFs). Therefore, plant disease prediction plays an important role in the premature notificat...
Menoufia Journal of Electronic Engineering Research
The diagnosis of chest disease depends in most cases on a complex combination of clinical data and images. Owing to this complexity, there is ongoing debate between researchers and doctors about the most efficient and accurate method for chest disease prediction. The purpose of this research is to enhance the initial handling of patient data to obtain an early diagnosis of the disease. The main problem in such diagnosis is the quality and quantity of the images. In this paper, this problem is addressed by utilizing preprocessing methods such as augmentation and segmentation, and by experimenting with different machine learning techniques for feature selection and classification. The experiments have been conducted on three different data sets. As the results show, the recognition accuracy using the SVM algorithm in the classification stage, the VGG16 model for feature extraction, and LDA for dimension reduction is 67% without image pre-processing; with pre-processing, the accuracy increases to 89%. Using a two-layer NN, the recognition accuracy is 69.3%; for the same model, the accuracy rises to 96% when image pre-processing techniques are added.
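The reported pipeline chains a pretrained CNN feature extractor, a dimension-reduction step, and a classifier. The sketch below illustrates that chain with VGG16 features, LDA, and an SVM; the dummy arrays, image size, and class count are placeholder assumptions, and the paper's augmentation and segmentation steps are omitted.

```python
# A minimal sketch of the feature-extraction -> dimension-reduction -> classification
# pipeline described above (VGG16 features, LDA, SVM). Image loading, augmentation and
# segmentation are omitted; the dummy arrays below stand in for a real chest X-ray set.
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC

# Pretrained VGG16 as a fixed feature extractor (global average pooling -> 512-d vectors).
backbone = VGG16(weights="imagenet", include_top=False, pooling="avg",
                 input_shape=(224, 224, 3))

def extract_features(images):
    """images: float array of shape (n, 224, 224, 3) with values in [0, 255]."""
    return backbone.predict(preprocess_input(images), verbose=0)

# Dummy data: 30 "images", 3 disease classes (placeholders, not the paper's datasets).
X_img = np.random.rand(30, 224, 224, 3) * 255.0
y = np.repeat([0, 1, 2], 10)

X_feat = extract_features(X_img)                       # (30, 512) deep features
lda = LinearDiscriminantAnalysis(n_components=2)       # reduce to n_classes - 1 dims
X_red = lda.fit_transform(X_feat, y)
clf = SVC(kernel="rbf").fit(X_red, y)                  # SVM classification stage
print("training accuracy:", clf.score(X_red, y))
```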
Menoufia Journal of Electronic Engineering Research
Estimation and control in the crowd counting field are among the most important tasks in applications related to surveillance systems and the monitoring of people, aiming to provide complete safety, especially in large groups containing huge numbers of individuals. Throughout human history there have been events that gather huge crowds in one place; for example, religious occasions such as the HAJJ season, which occurs every year, have caused accidents that lead to death. To avoid and prevent all types of accidents related to these huge crowds, this paper presents an improved technique to estimate the density of people in one place, helping decision makers to monitor and control overcrowded pedestrians in large gatherings. The proposed technique depends on the thermal bands of the human body, exploiting the large temperature variance between bare skin and skin covered with clothes. It also reports the whole temperature range for each frame in the video. The essential characteristic of the crowd counting technique is that it does not require previously stored and trained data; it uses a live video stream as input. Also, it does not require any intervention from individuals. The approach depends on capturing the thermal features of an individual. The results of this technique prove to be highly accurate, and the experimental results demonstrate the effectiveness of the approach.
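As a rough illustration of the thermal idea, the sketch below thresholds a frame of per-pixel temperatures to a band expected for exposed skin and counts the remaining blobs with OpenCV; the temperature band, minimum blob area, and synthetic frame are assumptions for illustration, not values from the paper.

```python
# Rough illustration: count people in one thermal frame by thresholding the
# skin-temperature band and counting connected blobs. All constants are assumed.
import numpy as np
import cv2

def count_people_thermal(frame_celsius, t_low=30.0, t_high=38.0, min_area=150):
    """frame_celsius: 2-D float array of per-pixel temperatures for one video frame."""
    mask = ((frame_celsius >= t_low) & (frame_celsius <= t_high)).astype(np.uint8) * 255
    # Morphological opening removes small speckle before counting blobs.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    n_labels, _, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    # Label 0 is the background; keep only blobs large enough to be a person.
    return sum(1 for i in range(1, n_labels) if stats[i, cv2.CC_STAT_AREA] >= min_area)

# Synthetic frame: ambient 22 C with two warm "bodies".
frame = np.full((120, 160), 22.0)
frame[20:60, 30:50] = 34.0
frame[70:110, 100:125] = 35.5
print(count_people_thermal(frame))  # -> 2
```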
Multimedia Tools and Applications, 2021
One of the known problems in security systems is identifying persons based on certain signatures. Biometrics have been adopted in security systems to identify persons based on physiological or behavioral characteristics they possess. Photoplethysmography (PPG) is a physiological signal used to describe the volumetric change of blood flow in peripherals with heartbeats. PPG signals have gained the interest of researchers in the last few years because they are acquired non-invasively and are easily captured from fingertips by emerging IoT sensors. This paper presents a PPG-based approach to identify persons using a neural network classifier. Firstly, PPG signals are captured from a number of persons using IoT sensors. Then, unique features are extracted from the captured PPG signals by estimating the Mel-Frequency Cepstral Coefficients (MFCCs). These features are fed into an Artificial Neural Network (ANN), which is first trained and then used for person identification. A dataset of PPG signals from 35 healthy persons was collected to test the performance of the proposed approach. Experimental results demonstrate 100% and 98.07% accuracy using the hold-out method and 10-fold cross-validation, respectively.
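A minimal sketch of the MFCC-plus-ANN pipeline is shown below, using librosa for the MFCCs and a small scikit-learn MLP as the ANN; the sampling rate, window sizes, network size, and synthetic signals are assumptions standing in for real sensor recordings.

```python
# Minimal sketch of the MFCC-feature + ANN identification pipeline described above.
# Sampling rate, frame sizes and network size are illustrative assumptions.
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

SR = 125  # assumed PPG sampling rate (Hz)

def ppg_mfcc_features(signal, sr=SR, n_mfcc=13):
    """Return a fixed-length feature vector: time-averaged MFCCs of one PPG segment."""
    mfcc = librosa.feature.mfcc(y=signal.astype(float), sr=sr, n_mfcc=n_mfcc,
                                n_fft=256, hop_length=128, n_mels=20)
    return mfcc.mean(axis=1)

# Synthetic stand-in data: 5 "persons", 20 ten-second segments each.
rng = np.random.default_rng(0)
X, y = [], []
for person in range(5):
    base_hr = 1.0 + 0.1 * person                      # per-person heart-rate proxy
    for _ in range(20):
        t = np.arange(10 * SR) / SR
        sig = np.sin(2 * np.pi * base_hr * t) + 0.1 * rng.standard_normal(t.size)
        X.append(ppg_mfcc_features(sig))
        y.append(person)
X, y = np.array(X), np.array(y)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X, y)                                         # ANN trained on MFCC features
print("training accuracy:", clf.score(X, y))
```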
Journal of Internet Technology, 2019
Management of cloud computing resources is critical, especially when several cloudlets are submitted to the cloud simultaneously. Therefore, it is very important to use highly efficient cloudlet scheduling techniques to guarantee efficient utilization of computing resources. This paper presents a two-phase approach, called SAAC, for scheduling cloudlets onto Virtual Machines (VMs) in a cloud computing environment to balance the workload on the available VMs and minimize the makespan (i.e., the completion time at the most heavily loaded VM). In the first phase, the SAAC approach applies Simulated Annealing (SA) to find a near-optimal scheduling of the cloudlets. In the second phase, the SAAC approach improves the cloudlet distribution by applying Ant Colony Optimization (ACO), taking the solution obtained by SA as the initial solution. The SAAC approach overcomes the computational time complexity of the ACO algorithm and the low solution quality of SA. The prop...
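The Simulated Annealing phase can be pictured as a random search over cloudlet-to-VM assignments that accepts worse makespans with a temperature-controlled probability. The sketch below shows that phase only, with illustrative cloudlet lengths, VM speeds, and cooling schedule; the ACO refinement phase is omitted.

```python
# Compact sketch of an SA search for a cloudlet-to-VM assignment with a small makespan.
# Cloudlet lengths, VM speeds and the cooling schedule are illustrative only.
import math
import random

random.seed(1)
cloudlet_len = [random.randint(1000, 10000) for _ in range(40)]   # instruction counts
vm_mips = [500, 1000, 1500, 2000]                                 # VM speeds

def makespan(assign):
    load = [0.0] * len(vm_mips)
    for c, vm in enumerate(assign):
        load[vm] += cloudlet_len[c] / vm_mips[vm]
    return max(load)

def sa_schedule(iters=20000, t0=50.0, alpha=0.9995):
    cur = [random.randrange(len(vm_mips)) for _ in cloudlet_len]
    best, temp = list(cur), t0
    for _ in range(iters):
        cand = list(cur)
        cand[random.randrange(len(cand))] = random.randrange(len(vm_mips))  # move one cloudlet
        delta = makespan(cand) - makespan(cur)
        if delta < 0 or random.random() < math.exp(-delta / temp):          # accept rule
            cur = cand
            if makespan(cur) < makespan(best):
                best = list(cur)
        temp *= alpha                                                        # cool down
    return best

best = sa_schedule()
print("near-optimal makespan found by SA:", round(makespan(best), 2))
```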
Menoufia Journal of Electronic Engineering Research, 2020
Reliability assessment of a digital dynamic system using traditional Fault Tree Analysis (FTA) is difficult. This paper addresses the dynamic modeling of safety-critical complex systems such as the digital Reactor Protection System (RPS) in Nuclear Power Plants (NPPs). The digital RPS is a safety system utilized in NPPs for safe operation and shutdown of the reactor in emergency events. A quantitative reliability analysis of a digital RPS with a 2-out-of-4 architecture, based on a state transition diagram, is presented in this paper. The study assesses the effects of independent hardware failures, Common Cause Failures (CCFs), and software failures on the failure of the RPS by calculating the Probability of Failure on Demand (PFD). The results prove the validity of the proposed method in analyzing and evaluating the reliability of the digital RPS and also show that CCFs and longer detection times are the main contributors to the PFD of the digital RPS.
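To see how independent failures and CCFs combine into a PFD for 2-out-of-4 voting, the small calculation below uses a beta-factor approximation rather than the paper's state-transition model; the channel unavailability and beta value are illustrative assumptions. It also shows why the CCF term tends to dominate.

```python
# Small numeric illustration (not the paper's state-transition model): for 2-out-of-4
# voting, the system fails to act when 3 or more of the 4 channels are unavailable,
# or when a common cause failure disables all channels at once. Numbers are assumed.
from math import comb

q_channel = 1e-3   # assumed per-channel unavailability on demand
beta = 0.05        # assumed fraction of channel failures that are common cause

q_ind = (1 - beta) * q_channel           # independent part of channel unavailability
q_ccf = beta * q_channel                 # common-cause part (fails all channels)

# Independent contribution: probability that >= 3 of 4 channels are down.
pfd_ind = sum(comb(4, k) * q_ind**k * (1 - q_ind)**(4 - k) for k in (3, 4))
pfd = pfd_ind + q_ccf
print(f"PFD(2oo4) ~= {pfd:.3e}  (CCF term dominates: {q_ccf:.1e} vs {pfd_ind:.1e})")
```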
The International Conference on Electrical Engineering, 2016
In a distributed computing environment, efficient task scheduling is essential to obtain high performance. A central goal in designing and developing task scheduling algorithms is to achieve a better makespan. Several task scheduling algorithms have been developed for homogeneous and heterogeneous distributed computing systems. In this paper, we propose a static task scheduling algorithm, namely ELDPT, that optimizes the performance of the Leveled DAG Prioritized Task (LDPT) algorithm to efficiently schedule tasks on homogeneous distributed computing systems. The ELDPT algorithm aims to improve system performance by minimizing the schedule length.
Communications on Applied Electronics, 2016
The coordination of the simultaneous execution of transactions in a multiuser database system may violate the consistency, performance, and correctness of the whole database. Data must be kept consistent and correct, and concurrency control techniques try to balance these characteristics while taking the time and performance of transactions into account. In this paper we discuss various concurrency control techniques, their advantages and disadvantages, and present a comparative study between them.
We determined the minimum multicast tree by finding the Minimum Number of Transmissions (MNT) using a Minimum Cost Tree (MCT) and a virtual-mesh-based approach. The proposed algorithm is compared to the Shortest Path Tree (SPT) algorithm, the one that most multicast routing protocols in MANETs use, on the metrics of number of transmissions (bandwidth), join speed, and throughput.
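The sketch below contrasts the two tree types on a toy topology: a shortest path tree built as the union of source-to-receiver paths versus an approximate minimum cost (Steiner) tree, comparing edge counts as a rough proxy for the number of transmissions; the random graph and node choices are illustrative only.

```python
# Toy comparison of a Shortest Path Tree vs an approximate Minimum Cost (Steiner) Tree.
import networkx as nx
from networkx.algorithms.approximation import steiner_tree

G = nx.connected_watts_strogatz_graph(25, 4, 0.3, seed=7)   # illustrative topology
source, receivers = 0, [5, 11, 17, 23]

# Shortest Path Tree: the union of individual source-to-receiver shortest paths.
spt_edges = set()
for r in receivers:
    path = nx.shortest_path(G, source, r)
    spt_edges.update(frozenset(e) for e in zip(path, path[1:]))

# Approximate minimum cost tree spanning the source and all receivers.
mct = steiner_tree(G, [source] + receivers)

print("SPT edges (proxy for transmissions):", len(spt_edges))
print("MCT edges (proxy for transmissions):", mct.number_of_edges())
```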
International Journal of Computer Applications, 2014
Efficient task scheduling is essential for obtaining high performance in heterogeneous distributed computing systems, and several algorithms have been proposed for such systems. In this paper, a new static scheduling algorithm called the Highest Communicated Path of Task (HCPT) algorithm is proposed to efficiently schedule tasks on heterogeneous distributed computing systems. Our algorithm is based on the list-scheduling technique. It not only focuses on reducing the makespan but also provides better performance than other algorithms in terms of speedup and efficiency. It consists of three phases: a level-sorting phase, a task-prioritizing phase, and a processor-selection phase. Theoretical analysis of the HCPT algorithm against other algorithms on a Directed Acyclic Graph (DAG) shows better performance.
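The sketch below shows a generic list scheduler in the spirit of the phases named above (priority computation, priority-ordered traversal, earliest-finish-time processor selection); the priority rule and the toy DAG are illustrative and are not the exact HCPT definitions.

```python
# Generic list-scheduling sketch on a toy DAG with two heterogeneous processors.
# dag: task -> list of (child, communication cost); cost[task][p]: execution time on p.
dag = {"A": [("B", 4), ("C", 2)], "B": [("D", 3)], "C": [("D", 1)], "D": []}
cost = {"A": [3, 4], "B": [5, 3], "C": [4, 4], "D": [2, 3]}
parents = {t: [u for u, kids in dag.items() if any(c == t for c, _ in kids)] for t in dag}

def rank(task):
    """Upward rank: average cost plus the heaviest (comm + rank) path to an exit task."""
    avg = sum(cost[task]) / len(cost[task])
    return avg + max((w + rank(c) for c, w in dag[task]), default=0.0)

order = sorted(dag, key=rank, reverse=True)             # task-prioritizing phase
proc_free = [0.0, 0.0]                                  # when each processor becomes idle
finish, placement = {}, {}

for t in order:                                         # processor-selection phase
    best = None
    for p in range(len(proc_free)):
        # A task starts after its parents finish (+ comm cost if placed on another processor).
        ready = max((finish[u] + (0 if placement[u] == p
                                  else dict(dag[u])[t]) for u in parents[t]), default=0.0)
        eft = max(ready, proc_free[p]) + cost[t][p]     # earliest finish time on p
        if best is None or eft < best[0]:
            best = (eft, p)
    finish[t], placement[t] = best
    proc_free[best[1]] = best[0]

print("schedule length (makespan):", max(finish.values()), "| placement:", placement)
```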
Bulletin of Sociological Methodology/Bulletin de Méthodologie Sociologique, 1992
Journal of Network and Systems Management, 2018
Cloudlet scheduling is one of the most important issues facing the cloud computing environment. This paper presents a new efficient approach, called the Traveling Salesman Approach for Cloudlet Scheduling (TSACS), to solve the cloudlet-scheduling problem. The main idea is to convert the cloudlet-scheduling problem into an instance of the Traveling Salesman Problem (TSP) and then apply one of the TSP solution strategies to solve it. The proposed approach consists of three phases: a clustering phase, a converting phase, and an assignment phase. In the clustering phase, the approach converts the large cloudlet-scheduling problem into a smaller cluster-scheduling problem to minimize its computation time complexity. In the converting phase, the approach formulates the cluster-scheduling problem as an instance of the TSP. In the assignment phase, the approach schedules the clusters onto the available virtual machines using the nearest neighbor algorithm. The proposed approach is evaluated using CloudSim, and the results are compared with those obtained by the most recent algorithms. The results show that the proposed approach enhances overall system performance in terms of schedule length, balancing degree, and time complexity. In addition, the proposed TSACS overcomes the oscillation problem of existing cloudlet-scheduling algorithms.
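The assignment phase names the nearest neighbor heuristic, sketched below on abstract points with a distance function: start somewhere and repeatedly visit the closest unvisited node. How TSACS derives the distances from cloudlet clusters and VMs is defined in the paper, not here.

```python
# Nearest-neighbor TSP heuristic on abstract points; the point set is a placeholder.
import math
import random

random.seed(2)
points = [(random.random(), random.random()) for _ in range(8)]

def dist(a, b):
    return math.hypot(points[a][0] - points[b][0], points[a][1] - points[b][1])

def nearest_neighbor_tour(start=0):
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist(tour[-1], j))  # greedy: closest next node
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

tour = nearest_neighbor_tour()
length = sum(dist(tour[i], tour[i + 1]) for i in range(len(tour) - 1))
print("visit order:", tour, "| open-tour length:", round(length, 3))
```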
Multimedia Tools and Applications, 2022
Nowadays, deep learning achieves higher levels of accuracy than ever before. This evolution makes deep learning crucial for safety-critical applications, such as self-driving cars, and helps consumers meet most of their expectations. Further, Deep Neural Networks (DNNs) are powerful approaches employed to solve problems in healthcare, advertising, marketing, computer vision, speech processing, and natural language processing. DNNs have made marvelous progress in these different fields, but training such DNN models requires a lot of time, a vast amount of data, and, in most cases, a great deal of computation. Selling such pre-trained models is a profitable business model, but sharing them without the owner's permission is a serious threat. Unfortunately, once the models are sold, they can easily be copied and redistributed. This paper first presents a review of how digital watermarking technologies are helpful in the copyright protection of the D...
IEEE Access, 2020
Nowadays, there is a strong demand for increasing the protection of resource-constrained devices such as Radio Frequency Identification (RFID) systems. Current cryptographic algorithms are sufficient for high-resource desktop computers. RFID systems are commonly used in high-security applications such as access control systems, transaction banking systems, and payment systems. An attacker may attempt to mislead RFIDs for unauthorized access to services without payment or to circumvent security mechanisms by detecting a secret password. The biggest challenge in RFID systems is how to ensure successful protection against such infringements. Lightweight cryptography can provide security assurance for protecting RFID systems. This article presents a new ultra-lightweight cryptography algorithm for RFID systems called SLIM. SLIM is a 32-bit block cipher based on the Feistel structure, since block ciphers are the most widely used cryptographic primitives and provide very tight protection for IoT devices. The key challenge in designing a lightweight block cipher is balancing performance, cost, and security. SLIM, like all symmetric block ciphers, uses the same key for encryption and decryption. The proposed algorithm has excellent performance in both hardware and software environments, with a limited implementation area, an acceptable cost/security trade-off for RFID systems, and energy-efficient behaviour. SLIM has demonstrated high immunity against the most effective linear and differential cryptanalysis attacks and has a sufficient margin of defence against these attacks. Index terms: RFID, block ciphers, lightweight cryptography, Feistel ciphers, cryptanalysis.
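A generic Feistel structure on a 32-bit block, sketched below, shows why the same structure serves both encryption and decryption (the round keys are simply applied in reverse); the toy round function and key schedule are placeholders and are not SLIM's actual design.

```python
# Generic Feistel sketch on a 32-bit block (16-bit halves). The round function and
# key schedule below are toy placeholders, NOT SLIM's actual design.
MASK16 = 0xFFFF

def round_f(half, k):
    """Toy round function: XOR with the round key, rotate-left by 5, add a constant."""
    x = (half ^ k) & MASK16
    x = ((x << 5) | (x >> 11)) & MASK16
    return (x + 0x9E37) & MASK16

def feistel_encrypt(block32, round_keys):
    left, right = (block32 >> 16) & MASK16, block32 & MASK16
    for k in round_keys:
        left, right = right, left ^ round_f(right, k)   # classic Feistel swap
    return (left << 16) | right

def feistel_decrypt(block32, round_keys):
    left, right = (block32 >> 16) & MASK16, block32 & MASK16
    for k in reversed(round_keys):                      # same structure, reversed keys
        left, right = right ^ round_f(left, k), left
    return (left << 16) | right

keys = [0x1A2B, 0x3C4D, 0x5E6F, 0x7081]                 # toy 4-round key schedule
pt = 0xDEADBEEF
ct = feistel_encrypt(pt, keys)
assert feistel_decrypt(ct, keys) == pt                  # decryption inverts encryption
print(hex(ct))
```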
Complex & Intelligent Systems, 2021
Communication between sensors spread throughout healthcare systems may cause some of the transferred features to be missing. Repairing the data problems of sensing devices with artificial intelligence technologies has facilitated the Medical Internet of Things (MIoT) and its emerging applications in healthcare. MIoT has great potential to affect a patient's life. The volume of data collected from smart wearable devices increases dramatically with data collected from millions of patients suffering from diseases such as diabetes. However, sensor or human errors lead to some values being missing from the data. The major challenge of this problem is how to predict these values so that the performance of the data analysis model stays within a good range. In this paper, a complete healthcare system for diabetics is used, and two new algorithms are developed to handle the crucial problem of missing data from MIoT wearable sensors. The proposed work is based on the integration of Random Forest, ...
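One way to picture Random Forest-based repair of missing sensor values is scikit-learn's iterative imputer driven by a forest regressor, sketched below on synthetic, correlated vital-sign data; this illustrates the general idea only and is not the paper's two proposed algorithms.

```python
# Minimal sketch: fill missing wearable-sensor readings with a Random Forest-based
# iterative imputer. The synthetic "vital sign" columns are placeholders.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (enables the API)
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 200
glucose = rng.normal(120, 25, n)
heart_rate = 60 + 0.2 * glucose + rng.normal(0, 5, n)     # correlated vital signs
activity = rng.normal(5000, 1500, n)
X = np.column_stack([glucose, heart_rate, activity])

X_missing = X.copy()
mask = rng.random(X.shape) < 0.1                          # ~10% values lost in transfer
X_missing[mask] = np.nan

imputer = IterativeImputer(estimator=RandomForestRegressor(n_estimators=50, random_state=0),
                           max_iter=5, random_state=0)
X_filled = imputer.fit_transform(X_missing)

rmse = np.sqrt(np.mean((X_filled[mask] - X[mask]) ** 2))
print("imputation RMSE on the artificially removed values:", round(rmse, 2))
```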
Menoufia Journal of Electronic Engineering Research, 2021
Traffic congestion is a crucial problem affecting cities around the globe, and it is only getting worse as the number of vehicles increases significantly. Traffic signal controllers are considered the most important mechanism for controlling traffic, specifically at intersections, and the field of Machine Learning introduces advanced techniques that can provide more flexibility and adaptiveness to traffic control. Efficient traffic controllers can be designed using a reinforcement learning (RL) approach, but the major problems of the RL approach are the exponential growth of the state and action spaces and the need for coordination. We use real traffic data from 65 intersections in the city of Ottawa to build our simulations and show that clustering the network using hierarchical techniques has great potential to reduce the state-action pairs significantly and enhance overall traffic performance.
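The clustering step can be pictured as grouping nearby intersections with agglomerative clustering so that one agent controls a cluster rather than the whole network, which shrinks the joint state-action space. The sketch below uses SciPy on random coordinates as a stand-in for the Ottawa network.

```python
# Sketch of hierarchical clustering of intersections into control groups.
# The coordinates are random placeholders, not the Ottawa data used in the paper.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
coords = rng.uniform(0, 10, size=(65, 2))        # 65 intersections: (x, y) positions

Z = linkage(coords, method="ward")               # hierarchical clustering tree
labels = fcluster(Z, t=8, criterion="maxclust")  # cut the tree into 8 clusters

# If each intersection has |A| phases, a joint controller over a cluster of size k
# faces |A|**k actions; clustering keeps k small instead of k = 65.
sizes = np.bincount(labels)[1:]
print("cluster sizes:", sizes.tolist())
print("largest joint action space (|A|=4):", 4 ** int(sizes.max()), "vs 4**65 without clustering")
```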
Menoufia Journal of Electronic Engineering Research, 2020
An analysis of the Stable Election Protocol (SEP) in wireless sensor networks reveals several problems with the network structure: unbalanced node distribution per cluster, random deployment of advanced nodes, unreasonable cluster-head selection, and an undefined time period for which a node remains a cluster head. Therefore, an optimal energy LEACH protocol (OLEP) is proposed. In this algorithm, the network structure is constructed using a Voronoi tessellation, which balances node deployment. Based on SEP, the node's residual energy is added to the cluster-head election. Information is transmitted between network nodes in multi-hop mode. Simulation experiments show that, compared to the classical LEACH and SEP, this algorithm balances the energy consumption of the entire network and significantly prolongs the network lifetime.
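The residual-energy weighting can be pictured with a simplified LEACH/SEP-style threshold in which a node's election probability is scaled by its remaining energy, as sketched below; the threshold form and energy values are illustrative and are not OLEP's exact formula.

```python
# Toy sketch of residual-energy-weighted cluster-head election: depleted nodes are
# elected less often. Simplified LEACH/SEP-style rule, not OLEP's exact formula.
import random

random.seed(4)
P_OPT = 0.1                                       # desired fraction of cluster heads
nodes = [{"id": i, "e_init": 0.5, "e_res": random.uniform(0.1, 0.5)} for i in range(100)]

def elects_as_head(node, rnd):
    period = int(round(1 / P_OPT))                           # rotation period in rounds
    base = P_OPT / (1 - P_OPT * (rnd % period))              # rotating election threshold
    threshold = base * (node["e_res"] / node["e_init"])      # residual-energy weighting
    return random.random() < threshold

heads = [n["id"] for n in nodes if elects_as_head(n, rnd=3)]
print(f"round 3: {len(heads)} cluster heads elected:", heads[:10], "...")
```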
Menoufia Journal of Electronic Engineering Research, 2008
Many network applications require transmitting the same single copy of data packets concurrently to many destinations; this is called multicasting. Wired and infrastructure-based wireless networks are supported by many multicast routing protocols, but applying this concept in Mobile Ad hoc wireless NETworks (MANETs) is a big challenge. Ad hoc wireless networks are composed of self-organized mobile nodes that can move arbitrarily without any pre-existing communication infrastructure or base stations, which produces a dynamic and unpredictable network topology. Many proposals have been introduced to solve the multicast support problem in MANETs. In this paper, multicast routing protocols for MANETs proposed in recent years are classified according to different viewpoints, such as multicast topology, topology initialization, topology maintenance, core or coreless approach, and dependency on unicast routing protocols.
Menoufia Journal of Electronic Engineering Research, 2008
A method for face detection and recognition using the geometrical relationships of faces is presented in this paper. A gray-level image, thresholded to produce a binary, edged image, is required for the face detection step; the head outline is then found and extracted to perform the recognition process. A pre-processing step removes noise and produces an accurate outline of the head curve. The system is composed of a face detection stage, presented previously, which provides good results while maintaining a low computational cost. After the integration of the two stages, several improvements are proposed that increase the face detection and recognition rate and the overall performance of the system. Application of these techniques has led to measurements of characteristic features of the human face with sufficient accuracy to merit later inclusion in a full package for automated facial recognition. Good results have been obtained.
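The front end described above (threshold to a binary, edged image, then extract the head outline) can be pictured with OpenCV as sketched below on a synthetic image; the geometric recognition stage is not reproduced, and the specific operators are assumptions.

```python
# Bare-bones sketch: threshold a gray-level image, build an edge map, and extract the
# largest contour as the candidate head outline. The synthetic "head" is a stand-in.
import numpy as np
import cv2

# Synthetic gray image: a bright ellipse (head) on a dark background, plus mild noise.
img = np.zeros((200, 160), dtype=np.uint8)
cv2.ellipse(img, (80, 100), (45, 65), 0, 0, 360, 180, thickness=-1)
img = cv2.GaussianBlur(img, (5, 5), 0) + np.random.randint(0, 20, img.shape, dtype=np.uint8)

blurred = cv2.medianBlur(img, 5)                         # pre-processing: noise removal
_, binary = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
edges = cv2.Canny(binary, 50, 150)                       # binary, edged image

contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
head_outline = max(contours, key=cv2.contourArea)        # largest outline ~ head curve
print("head outline points:", len(head_outline), "| enclosed area:", cv2.contourArea(head_outline))
```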