Hikmat A. M. Abdeljaber | Prince Sattam Bin Abdulaziz University
Papers by Hikmat A. M. Abdeljaber
International Journal of Advanced Computer Science and Applications
Search engines are the instruments for website navigation and search, because the Internet is vast and has expanded greatly. By continuously downloading web pages for processing, search engines provide search facilities and maintain indices for web documents; this downloading process is referred to as online crawling. This paper proposes a solution to the network traffic problem in a migrating parallel web crawler. The primary benefit of a parallel web crawler is that it performs analysis locally, where the data resides, rather than inside the web search engine repository. As a result, network load and traffic are greatly reduced, which improves the performance, efficacy, and efficiency of the crawling process. Another benefit of migrating to a parallel crawler is that, as the web grows, crawling operations must be parallelized to retrieve web pages more quickly, and the crawler will produce pages of excellent quality. When the crawling process moves to a host or server with a specific domain, it begins downloading pages from that domain. Incremental crawling maintains the quality of downloaded pages and keeps the pages in the local database up to date. The crawler is implemented in Java, and the implemented model supports all aspects of a three-tier, real-time architecture. This paper presents an implementation of migrating parallel web crawling in which changes in page content and structure are detected with neural network-based change detection techniques; this yields high-quality pages, and change detection ensures that new pages are always downloaded. The crawling process follows one of two strategies: either crawlers are given generous permission to communicate with one another, or they are not permitted to communicate at all; both strategies increase network traffic. Here, a fuzzy logic-based system that predicts the load at a specific node and the path of network traffic is presented and implemented in MATLAB using the fuzzy logic toolbox.
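The load-prediction component described above is implemented in MATLAB with the fuzzy logic toolbox; purely as an illustration, the sketch below expresses the same idea in plain Python. The two inputs (queue length and link utilisation), the membership breakpoints, and the four rules are assumptions made for this example, not values taken from the paper.

```python
# Illustrative Mamdani-style fuzzy load predictor (a stand-in, not the paper's MATLAB model).
# Assumed inputs: crawler queue length and link utilisation; output: predicted node load (0..1).

def tri(x, a, b, c):
    """Triangular membership with feet a and c and peak b (a <= b <= c)."""
    if x < a or x > c:
        return 0.0
    if x == b:
        return 1.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def node_load(queue_len, utilisation):
    # Fuzzify the two inputs (breakpoints are illustrative).
    q_low, q_high = tri(queue_len, 0, 0, 500), tri(queue_len, 200, 1000, 1000)
    u_low, u_high = tri(utilisation, 0.0, 0.0, 0.6), tri(utilisation, 0.4, 1.0, 1.0)

    # Rule base: (firing strength, crisp load level for that rule).
    rules = [
        (min(q_low, u_low), 0.2),    # both low   -> light load
        (min(q_low, u_high), 0.5),   # busy link  -> medium load
        (min(q_high, u_low), 0.6),   # long queue -> medium-high load
        (min(q_high, u_high), 0.9),  # both high  -> heavy load
    ]
    total = sum(strength for strength, _ in rules)
    # Weighted-average defuzzification; 0 when no rule fires.
    return sum(strength * level for strength, level in rules) / total if total else 0.0

if __name__ == "__main__":
    print(node_load(queue_len=800, utilisation=0.85))  # -> 0.9, a heavily loaded node
```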
International Journal of Emerging Technologies in Learning (iJET)
Peer and self-assessment open opportunities to scale assessments in online classrooms. This article reports our experiences of using AsPeer, a peer assessment system, with two iterations of a university online class. We observed that peer grades correlated highly with staff-assigned grades. On average, the peer grade fell within the instructor's grade range for 21.0% of student submissions, and within the next two ranges for 49.0%. We performed three experiments to improve the accuracy of peer grading. First, we observed grading bias and introduced a data-driven feedback mechanism to inform peers of it; students aided by this feedback were more mindful and graded with better accuracy. Second, we observed that the rubric was inefficient at conveying intent to students; simplified guiding questions improved assessment accuracy for 89% of students. Third, we encouraged peers to provide personalized qualitative feedback along with their rating. We provided them with feed...
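The headline agreement check between peer and staff grades is a plain Pearson correlation; the sketch below shows that computation on invented sample grades (the grade lists are not from the study).

```python
# Illustrative check of peer-vs-staff grade agreement (sample data is invented).
from statistics import mean, stdev

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

staff = [78, 62, 90, 55, 84, 71]
peer_median = [75, 65, 88, 58, 80, 74]   # e.g. median of several peer ratings per submission
print(round(pearson(staff, peer_median), 3))
```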
International Journal of Computer Applications in Technology
Computers, Materials & Continua
In recent times, financial globalization has drastically increased in different ways to improve the quality of services with advanced resources. The successful application of bitcoin Blockchain (BC) techniques has led stockholders to be concerned with the return and risk of financial products, and to focus on predicting the return rate and risk rate of those products. Therefore, an automatic bitcoin return rate prediction model becomes essential for BC financial products. Newly designed machine learning (ML) and deep learning (DL) approaches pave the way for such return rate prediction methods. This study introduces a novel Jellyfish search optimization based extreme learning machine with autoencoder (JSO-ELMAE) for return rate prediction of BC financial products. The presented JSO-ELMAE model designs a new ELMAE model for predicting the return rate of financial products. Besides, the JSO algorithm is exploited to tune the parameters of the ELMAE model, which in turn boosts the classification results; this optimal parameter adjustment helps the ELMAE model predict bitcoin return rates. The experimental validation of the JSO-ELMAE model was executed and the outcomes were inspected from many aspects. The experimental values demonstrated the enhanced performance of the JSO-ELMAE model over recent state-of-the-art approaches, with a minimal RMSE of 0.1562.
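JSO-ELMAE combines an extreme learning machine autoencoder with jellyfish search tuning; the sketch below implements only the plain ELM regression core, to show the idea of random hidden weights plus a least-squares output layer. The hidden-layer size, the synthetic data, and the class name are assumptions for illustration, not the paper's model.

```python
# Minimal extreme learning machine (ELM) regressor sketch -- illustrates the ELM core only,
# not the paper's autoencoder variant or its jellyfish search parameter tuning.
import numpy as np

class ELMRegressor:
    def __init__(self, n_hidden=64, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))  # sigmoid activations

    def fit(self, X, y):
        # Random, untrained input weights; only the output layer is solved for.
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ y   # least-squares output weights
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Toy usage with synthetic "return rate" data (invented for illustration).
X = np.random.default_rng(1).normal(size=(200, 5))
y = X[:, 0] * 0.3 - X[:, 2] * 0.1 + 0.05
model = ELMRegressor(n_hidden=32).fit(X[:150], y[:150])
rmse = float(np.sqrt(np.mean((model.predict(X[150:]) - y[150:]) ** 2)))
print(round(rmse, 4))
```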
IEEE Access, 2021
The manual process of scoring short answers to Arabic essay questions is exhausting, susceptible to error, and consumes instructors' time and resources. This paper explores the longest common subsequence (LCS) algorithm as a string-based text similarity measure for effectively scoring short answers to Arabic essay questions. To achieve this effectiveness, the longest common subsequence is modified by developing weight-based measurement techniques and is implemented together with Arabic WordNet for scoring Arabic short answers. Experiments conducted on a dataset of 330 students' answers reported a Root Mean Square Error (RMSE) value of 0.81 and a Pearson correlation r value of 0.94. The experimental findings show improvements in accuracy over similar studies. Moreover, statistical analysis shows that the proposed method scores students' answers similarly to a human estimator.
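The core similarity signal here is the LCS length between a student answer and a model answer; a minimal sketch follows. The paper's weight-based modifications and Arabic WordNet lookups are omitted, and the English token lists are placeholders.

```python
# Sketch of longest common subsequence (LCS) as a string-based similarity score.
# The paper's weighting scheme and Arabic WordNet lookup are omitted; the token lists
# below are placeholders for a student answer and a model answer.

def lcs_len(a, b):
    """Length of the longest common subsequence of two token sequences."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[-1][-1]

def lcs_similarity(student_tokens, model_tokens):
    """Normalise LCS length by the model-answer length to get a 0..1 score."""
    return lcs_len(student_tokens, model_tokens) / max(len(model_tokens), 1)

student = "the protocol retransmits lost packets after a timeout".split()
model = "lost packets are retransmitted after a timeout expires".split()
print(round(lcs_similarity(student, model), 2))
```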
PeerJ Computer Science, 2021
Semantic Text Similarity (STS) has several important applications in the field of Natural Language Processing (NLP). The aim of this study is to investigate the effect of stemming on text similarity for the Arabic language at sentence level. Several Arabic light and heavy stemmers as well as lemmatization algorithms are used in this study, with a total of 10 algorithms. Standard training and testing data sets from the SemEval-2017 international workshop are used (Task 1, Track 1, Arabic ar–ar). Different features are selected to study the effect of stemming on text similarity based on different similarity measures. Traditional machine learning algorithms are used, such as Support Vector Machines (SVM), Stochastic Gradient Descent (SGD), and Naïve Bayes (NB). Compared to the original text, using the stemmed and lemmatized documents in the experiments achieves enhanced Pearson correlation results. The best results are attained when using the Arabic light stemmer (ARLSTem) and Farasa light stemmers...
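One representative similarity feature of the kind fed to the SVM/SGD/NB models is a cosine score computed over stemmed tokens; the sketch below illustrates it with a placeholder stemmer. The paper evaluates real Arabic stemmers such as ARLSTem and Farasa, which are not reimplemented here.

```python
# Illustrative feature sketch: cosine similarity between two sentences after stemming.
# identity_stem is a stand-in for ARLSTem / Farasa / a lemmatizer.
from collections import Counter
from math import sqrt

def identity_stem(token):
    return token  # placeholder: a real stemmer would map e.g. "cars" -> "car"

def cosine_similarity(sent_a, sent_b, stem=identity_stem):
    ca = Counter(stem(t) for t in sent_a.split())
    cb = Counter(stem(t) for t in sent_b.split())
    dot = sum(ca[t] * cb[t] for t in ca)
    norm = sqrt(sum(v * v for v in ca.values())) * sqrt(sum(v * v for v in cb.values()))
    return dot / norm if norm else 0.0

# A feature like this would be fed, together with others, to SVM / SGD / NB learners.
print(round(cosine_similarity("the car stopped suddenly", "the cars stop suddenly"), 2))
```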
International Journal of Advanced Computer Science and Applications, 2018
User profiles play a critical role in personalizing user search. They assist search systems in retrieving relevant information from the web with the user's needs in mind. Researchers have presented a vast number of profile-based approaches that aim to improve the effectiveness of information retrieval. However, these approaches are syntactic-based and fail to achieve user satisfaction, in the sense that the search results do not meet user preferences because the search is keyword-based rather than semantic-based. Exploiting user profiles together with semantic web technology for personalization might be a step forward for future retrieval systems. By adopting a profiling approach and using ontology-base characteristics, a semantic-based method using heuristics and the KNN algorithm is proposed. It searches the ontology-base domains horizontally and vertically to discover and extract the concept closest to the meaning of the query keyword. The extracted concept is used to expand the user query, personalizing the search result and presenting customized information to individuals.
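The expansion step can be pictured as a nearest-neighbour lookup of the query keyword against ontology concepts; the sketch below shows that idea with an invented three-concept "ontology" and toy vectors, not the paper's ontology base or its horizontal/vertical search.

```python
# Sketch of the query-expansion idea: rank ontology concepts against the query keyword
# with a k-nearest-neighbour lookup over toy vectors. Concepts, vectors, and related
# terms are invented for illustration.
from math import sqrt

ontology = {                      # concept -> (toy embedding, related terms)
    "jaguar_animal": ([0.9, 0.1, 0.0], ["feline", "wildlife", "predator"]),
    "jaguar_car":    ([0.1, 0.9, 0.1], ["vehicle", "sedan", "engine"]),
    "panther":       ([0.8, 0.2, 0.1], ["feline", "big cat"]),
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

def expand_query(keyword_vec, k=2):
    ranked = sorted(ontology.items(),
                    key=lambda item: cosine(keyword_vec, item[1][0]),
                    reverse=True)
    best_concept, (_, terms) = ranked[0]      # closest concept and its expansion terms
    neighbours = [name for name, _ in ranked[:k]]
    return best_concept, terms, neighbours

# Query vector leaning toward the "animal" sense of the keyword (values are made up).
concept, extra_terms, knn = expand_query([0.85, 0.15, 0.05])
print(concept, extra_terms, knn)
```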
Wireless Personal Communications, 2012
A new hybrid algorithm termed Reduction Estimation-Combinatorial Optimisation-Reduction Approximation is proposed to identify an optimal base station placement as a subset of a known set of potential sites to provide wireless communication services to a given set of stationary users. Both forward and reverse link signal-to-interference ratios are considered, and the performance of the proposed algorithm is reported for several representative case studies and compared with Brute Force Search and existing optimisation algorithms (including Genetic, Ngadiman and Greedy algorithms). For the cases considered, the proposed algorithm is found to be superior to the existing algorithms in that it can yield an optimal deployment (equivalent to Brute Force Search) in an acceptable time.
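For scale, the Brute Force Search baseline mentioned above amounts to scoring every k-site subset of the candidate set; the sketch below does exactly that, with a simple distance threshold standing in for the forward/reverse-link SIR computation and with invented coordinates.

```python
# Brute-force placement sketch in the spirit of the Brute Force Search baseline:
# choose the k sites from the candidate set that cover the most users. A distance
# threshold is a stand-in for the paper's SIR calculations; all coordinates are invented.
from itertools import combinations
from math import hypot

candidate_sites = [(0, 0), (4, 0), (0, 4), (4, 4), (2, 2)]
users = [(1, 1), (3, 1), (1, 3), (3, 3), (4, 2)]
COVER_RADIUS = 2.5   # proxy for "SIR above threshold"

def covered(sites):
    return sum(any(hypot(ux - sx, uy - sy) <= COVER_RADIUS for sx, sy in sites)
               for ux, uy in users)

def best_placement(k):
    return max(combinations(candidate_sites, k), key=covered)

placement = best_placement(k=2)
print(placement, covered(placement), "of", len(users), "users covered")
```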
Security and Communication Networks
The purpose of automatic text summarisation technology is to condense a given text while properly conveying the main information of the original in a summary. At present, generative text summarising approaches, on the other hand, restructure the original language and introduce new words when constructing summary sentences, which can easily lead to incoherence and poor readability. This research proposes an XAI (explainable artificial intelligence)-based, reinforcement learning-driven text summarization approach for social IoT-based content. Furthermore, standard supervised training based on labelled data to improve the coherence of summary sentences has substantial data costs, which restricts practical applications. To address this, a ground-truth-dependent text summarization (generation) model (XAI-RL) is presented for coherence augmentation. On the one hand, based on the encoding result of the original text, a sentence extraction identifier is generated, and...
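As a point of reference only, the sketch below is a bare-bones extractive baseline (score sentences by word overlap with the document, keep the top-k); it is not the XAI-RL model, and the sample text is invented.

```python
# Bare-bones extractive baseline (not the paper's XAI-RL model): score each sentence by
# word overlap with the whole document and keep the top-k as the "extracted" summary.
from collections import Counter

def summarize(text, k=2):
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    doc_counts = Counter(w.lower().strip(".") for w in text.split())

    def score(sentence):
        words = [w.strip(".") for w in sentence.lower().split()]
        return sum(doc_counts[w] for w in words) / len(words)

    ranked = sorted(sentences, key=score, reverse=True)[:k]
    # Keep the chosen sentences in their original order.
    return ". ".join(s for s in sentences if s in ranked) + "."

doc = ("Sensors in the plant stream readings to the gateway. The gateway batches readings "
       "and forwards them to the cloud. Operators review anomalies in a dashboard. "
       "The dashboard also stores historical readings for audits.")
print(summarize(doc, k=2))
```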
Scientific Programming
Energy waste adds significantly to increased costs in the automobile manufacturing industry, which is subject to energy-use restrictions and taxation from national and international policy makers, as well as constraints and charges from national energy suppliers. Energy monitoring is essential for energy saving, since it enables organizations to make operational changes that reduce energy consumption and costs. The primary challenge in energy monitoring is the need to integrate manufacturing and energy monitoring and control devices that support diverse communication protocols and are widely distributed over a large area. One of the most significant challenges in the advancement of the Internet of Things (IoT) has been powering billions of connected devices. Evaluating digital services in terms of the energy footprint of the Internet normally requires models of the energy intensity of the Internet. A typical approach to modelling the energy intensity is to consolidate assess...
Scientific Programming, 2022
The time series forecasting system can be used for investments in a safe environment with minimized chances of loss. The Holt–Winters algorithm follows several procedures and observes the multiple factors applied to the neural network. The final module helps filter the system to predict the various factors and provides a rating for the system. This research work uses a real-time dataset of fifteen stocks as input to the system and, based on these data, predicts or forecasts future stock prices of different companies belonging to different sectors. The dataset includes approximately fifteen companies from different sectors; the system forecasts their prices, on the basis of which the user can decide whether or not to invest in a particular company, and the forecasting gives an accurate result for the customer's investments.
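A minimal additive Holt–Winters (triple exponential smoothing) forecaster is sketched below; the smoothing constants, season length, and toy price series are assumptions for illustration, not the paper's fifteen-stock configuration.

```python
# Minimal additive Holt-Winters (triple exponential smoothing) sketch. The smoothing
# constants, season length, and toy price series are illustrative only.

def holt_winters_additive(series, m, alpha=0.3, beta=0.1, gamma=0.2, horizon=4):
    # Crude initialisation from the first two seasons (requires len(series) >= 2*m).
    level = sum(series[:m]) / m
    trend = (sum(series[m:2 * m]) - sum(series[:m])) / (m * m)
    season = [series[i] - level for i in range(m)]

    for t in range(m, len(series)):
        prev_level = level
        level = alpha * (series[t] - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (series[t] - level) + (1 - gamma) * season[t % m]

    t = len(series)
    # h+1 steps ahead: extrapolate the trend and reuse the latest seasonal for that phase.
    return [level + (h + 1) * trend + season[(t + h) % m] for h in range(horizon)]

prices = [10, 12, 14, 11, 11, 13, 15, 12, 12, 14, 16, 13]   # toy series with period 4
print([round(p, 2) for p in holt_winters_additive(prices, m=4)])
```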
Deep learning has paved the way for critical and revolutionary applications in almost every field of life. Ranging from engineering to healthcare, machine learning and deep learning have left their mark as state-of-the-art technologies that deliver reasonably high benchmarked solutions. Incorporating neural network architectures into applications has become a common part of any software development process. In this paper, we perform a comparative analysis of different transfer learning approaches in the domain of hand-written digit recognition. We use two performance measures, loss and accuracy, then visualize the results for the training and validation datasets and reach a unified conclusion. This paper aims to address the drawbacks of the electronic whiteboard with a simultaneous focus on the suitable model selection procedure for the digit recognition problem.
International Journal of Emerging Technologies in Learning (iJET), 2017
In the educational community, the assessment process focuses on learning, teaching, and outcomes, and provides information for improving learning and teaching. Therefore, a well-established assessment process plays a vital role in improving program outcomes, which in turn results in fulfilling program educational objectives. However, such a process entails setting well-defined course learning objectives, program outcomes, and program educational objectives. In addition, an effective assessment method is needed for measuring the extent to which program outcomes meet the criteria of an academic accreditation body. This measurement is performed by mapping course learning objectives to program educational objectives via program outcomes. Such mapping for a single academic accreditation body is a straightforward process and involves no complications; however, a coherent assessment method is required for multiple academic accreditation bodies. The approach of mapping program outcomes across criter...
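The mapping chain described here (course learning objectives → program outcomes → program educational objectives) can be pictured as nested lookup tables; the sketch below rolls up a toy attainment figure per educational objective. All identifiers, mappings, and scores are invented for illustration.

```python
# Toy outcome-attainment roll-up (course objectives -> program outcomes -> educational
# objectives). Names, mappings, and scores are invented for illustration only.
clo_to_po = {                 # course learning objective -> program outcomes it supports
    "CS101.1": ["PO1"], "CS101.2": ["PO1", "PO3"], "CS205.1": ["PO3"],
}
po_to_peo = {"PO1": ["PEO1"], "PO3": ["PEO1", "PEO2"]}
clo_scores = {"CS101.1": 0.82, "CS101.2": 0.74, "CS205.1": 0.91}   # assessed attainment

def attainment_per_peo():
    po_scores = {}
    for clo, pos in clo_to_po.items():
        for po in pos:
            po_scores.setdefault(po, []).append(clo_scores[clo])
    peo_scores = {}
    for po, vals in po_scores.items():
        for peo in po_to_peo[po]:
            peo_scores.setdefault(peo, []).append(sum(vals) / len(vals))
    return {peo: round(sum(v) / len(v), 2) for peo, v in peo_scores.items()}

print(attainment_per_peo())
```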
Wireless Communications and Mobile Computing
According to recent research, attacks on USIM cards are on the rise. In a 5G setting, attackers can also employ counterfeit USIM cards to circumvent the identity authentication of typical standard applications and steal user information. Under the assumption that the USIM can be replicated, the identity authentication processes of common mobile platform applications are investigated. An identity authentication tree is generated by examining application behaviour for user login, password reset, and sensitive operations. We tested 58 typical applications in 7 categories, including social communication and personal health, and found that 29 of them needed only the SMS verification code received by the USIM card to pass authentication. In response to this problem, it is recommended to enable two-step verification and to use USIM anti-counterfeiting methods to complete the verification.
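A toy version of the classification behind the finding (29 of 58 apps accept an SMS code alone) is sketched below; the application names and factor sets are invented.

```python
# Toy check mirroring the finding that SMS-only verification is a single point of failure
# when a USIM is cloned. Application names and factor sets are invented.
apps = {
    "chat_app":   {"sms_otp"},
    "health_app": {"sms_otp", "password"},
    "bank_app":   {"sms_otp", "password", "device_binding"},
}

def vulnerable_to_usim_clone(factors):
    # An attacker holding a cloned USIM receives the SMS code, so a flow that accepts
    # the SMS one-time code alone is flagged as at risk.
    return factors == {"sms_otp"}

print([name for name, f in apps.items() if vulnerable_to_usim_clone(f)])
```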
The information on the Web is increasing tremendously. A number of search engines have been developed for searching Web information and retrieving relevant documents that satisfy inquirers' needs. Search engines still present inquirers with irrelevant documents among the search results, since the search is text-based rather than semantic-based. The information retrieval research area has presented a number of approaches and methodologies, such as profiling, feedback, query modification, and human-computer interaction, for improving search results. Moreover, information retrieval has employed artificial intelligence techniques and strategies, such as machine learning heuristics, tuning mechanisms, user and system vocabularies, and logical theory, for capturing users' preferences and using them to guide the search based on semantic rather than syntactic analysis. Although a valuable improvement has been recorded in search results, the survey has shown that search engine users are...