Abubakar Zakari - Academia.edu

Papers by Abubakar Zakari

Research paper thumbnail of The Big Data Technology: Assessing the Impact in the Banking Industry

Open Journal of Science and Technology

Big data refers to data whose volume makes it difficult to analyze, process, and store with traditional database technologies. It has long been adopted in business and finance, where large numbers of bank transactions are executed daily. The emergence of big data in the banking industry has driven substantial technical improvements, but its processing also causes disruption in the industry. Big data analytics is the process of using algorithms and software tools to extract useful business information from a dataset. This study adopts a big data analytics process to investigate the disruption caused by big data processing in the banking industry. The study identified, acquired, and extracted a banking-industry dataset, which was analyzed using MapReduce to examine fraud committed in the course of processing large amounts of data. The findings show that government employees commit more of this crime than private-sector employees. Finally, based on ...
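The MapReduce analysis described above can be illustrated with a minimal map/reduce-style aggregation in plain Python. The records, sector names, and counts below are hypothetical, not taken from the study's dataset; the sketch only shows the mechanics of counting fraud incidents per employee sector.

```python
from functools import reduce
from collections import Counter

# Hypothetical transaction records: (employee_sector, fraud_flag).
records = [
    ("government", 1), ("private", 0), ("government", 1),
    ("private", 1), ("government", 0), ("government", 1),
]

# Map phase: emit a (sector, 1) pair for each fraudulent transaction.
mapped = [(sector, 1) for sector, fraud in records if fraud]

# Reduce phase: sum the emitted counts per sector key.
def reducer(acc, pair):
    sector, count = pair
    acc[sector] += count
    return acc

fraud_by_sector = reduce(reducer, mapped, Counter())
print(dict(fraud_by_sector))  # {'government': 3, 'private': 1}
```

In a real MapReduce deployment the map and reduce phases would run distributed over partitions of the dataset, but the key-grouping logic is the same.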

Research paper thumbnail of Software Requirements Negotiation: a Systematic Literature Review

Software requirements negotiation (SRN) is one of the most essential stages of software requirements engineering. SRN involves stakeholder interaction to reach a mutual understanding of the requirements for a software project under development. Increased research interest in requirements engineering has produced a large body of literature in the SRN domain, and there is a need to investigate the breadth of techniques, processes, and evaluation mechanisms used in the SRN research community. This study aims to examine and identify the existing methods, processes, evaluation mechanisms, quantity of publications, publication trends, and demographics shaping the SRN research domain. To accomplish this aim, we used an evidence-based systematic approach, and 67 relevant studies were ultimately selected from the search process based on the formulated research questions. Our results show a broad range of promising SRN techniques, including agent-based negotiation, TAICOS, WikiWinWin, and WinBook. However, ...

Research paper thumbnail of IPv4 and IPv6 Protocols: A Comparative Performance Study

2019 IEEE 10th Control and System Graduate Research Colloquium (ICSGRC), 2019

Advances in wireless technologies allow devices to access the internet from virtually anywhere around the globe, and the demand for reliable communication through voice and video streaming is significantly high. Internet Protocol version 4 (IPv4) is the most widely used internet protocol today, but due to the shortage of IPv4 addresses in the information technology (IT) world, a new protocol, Internet Protocol version 6 (IPv6), was introduced to address this issue. This paper performs a comparative study of the performance of IPv4 and IPv6 on voice and video network traffic using performance metrics such as jitter, throughput, and packet loss. Accordingly, a testbed environment was set up with two hosts in client-server mode, and two scenarios (voice and video) were configured to analyze the performance of the protocols. The results of this study show that IPv6 performed better than IPv4 in both experimental scenarios.
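The three metrics named above can be computed from per-packet traces. The sketch below uses hypothetical timestamps and a simplified jitter definition (mean absolute difference of consecutive one-way delays, in the spirit of RFC 3550); it is illustrative, not the paper's measurement procedure.

```python
# Hypothetical per-packet trace: (seq_no, send_time_s, recv_time_s or None if lost).
packets = [
    (1, 0.00, 0.020), (2, 0.02, 0.042), (3, 0.04, None),
    (4, 0.06, 0.085), (5, 0.08, 0.101),
]
PACKET_SIZE_BITS = 1200 * 8  # assumed payload size

received = [(s, tx, rx) for s, tx, rx in packets if rx is not None]

# Packet loss: fraction of packets never received.
loss = 1 - len(received) / len(packets)

# One-way delays of the packets that arrived.
delays = [rx - tx for _, tx, rx in received]

# Jitter: mean absolute difference between consecutive delays (simplified).
jitter = sum(abs(b - a) for a, b in zip(delays, delays[1:])) / (len(delays) - 1)

# Throughput: delivered bits over the observation window.
duration = received[-1][2] - received[0][1]
throughput_bps = len(received) * PACKET_SIZE_BITS / duration

print(f"loss={loss:.0%} jitter={jitter * 1000:.1f} ms throughput={throughput_bps:.0f} bps")
```

Comparing these numbers for identical voice/video flows carried over IPv4 and over IPv6 is the essence of the experimental design described in the abstract.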

Research paper thumbnail of Software Fault Localization : Issues and Limitations

Fault localization is the most challenging and tedious activity in program debugging, and it is vital to maintaining software quality. Existing fault localization techniques assume that a program contains only one fault during the localization process; realistically, a program failure can be caused by multiple active faults. For a program with multiple faults, interference between faults can mask failures, causing existing fault localization techniques to lose much of their effectiveness. Techniques that try to solve this problem suffer from a lack of scalability and high computational complexity. In this paper, we discuss and analyze existing fault localization techniques and their application to programs with multiple faults, and highlight some of their issues and limitations. Based on our findings, existing fault localization techniques for multiple faults remain shallow, and more improvement is needed. Neve...
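To make the single-fault assumption concrete: most spectrum-based techniques rank statements by a suspiciousness formula computed from coverage. The sketch below uses the Ochiai metric, one common such formula (the statements and coverage counts are hypothetical); with multiple interfering faults, these counts get polluted across faults, which is exactly the problem the paper discusses.

```python
import math

def ochiai(ef: int, ep: int, total_failed: int) -> float:
    """Ochiai suspiciousness: ef / sqrt(total_failed * (ef + ep)).

    ef/ep: number of failed/passed tests that cover the statement.
    """
    if ef == 0 or total_failed == 0:
        return 0.0
    return ef / math.sqrt(total_failed * (ef + ep))

# Hypothetical coverage spectrum: statement -> (failed covering it, passed covering it).
spectrum = {"s1": (3, 1), "s2": (1, 4), "s3": (0, 5)}
total_failed = 3

ranking = sorted(spectrum, key=lambda s: ochiai(*spectrum[s], total_failed), reverse=True)
print(ranking)  # ['s1', 's2', 's3'] - most suspicious first
```

Under the single-fault assumption, the developer inspects statements from the top of this ranking down until the fault is found.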

Research paper thumbnail of A Systematic Mapping Study of the Empirical Explicit Aspect Extractions in Sentiment Analysis

IEEE Access, 2020

Aspect-based sentiment analysis (ABSA) has been one of the most vibrant research areas of the last decade. The exponential increase in aspect-based sentiment research has generated massive interest in advanced explicit aspect extraction (EAE) techniques and, with it, a large body of literature in the EAE domain. This study aims to investigate and identify the existing approaches, techniques, types of research, quantity of publications, publication trends, and demographics shaping the EAE research domain over the last decade (2009-2019). Accordingly, an evidence-based systematic methodology was adopted to capture all the relevant studies. The main findings are: 1) there has been a considerable and continuous rise in EAE research activity around the globe in the last five years, particularly in Asia, the Middle East, and Europe; 2) EAE research has been very limited among African countries, which needs to be addressed given its role in business intelligence and its semantic value; 3) three research facets were identified, i.e. solution research, validation research, and evaluation research, with solution research receiving the most attention; and 4) the study highlights EAE challenges as well as feasible recommendations for future work.
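Explicit aspect extraction targets aspects named directly in the text ("battery", "screen"). A minimal frequency-based sketch, in the spirit of early frequency-based EAE approaches but with a hand-made stop list standing in for real noun-phrase mining, looks like this (the reviews, stop list, and threshold are all illustrative assumptions):

```python
import re
from collections import Counter

# Hypothetical product reviews; explicit aspects appear as surface words.
reviews = [
    "The battery life is great but the screen is dim",
    "Screen quality is poor and the battery drains fast",
    "Love the battery, hate the price",
]

# Crude filter standing in for POS tagging / noun-phrase extraction.
STOPWORDS = {"the", "is", "but", "and", "a", "fast", "great", "poor",
             "dim", "love", "hate", "life", "quality", "drains"}

# Frequency-based candidate extraction: frequent non-stopword tokens
# are treated as explicit aspect candidates.
tokens = re.findall(r"[a-z]+", " ".join(reviews).lower())
counts = Counter(t for t in tokens if t not in STOPWORDS)

MIN_SUPPORT = 2  # an aspect must be mentioned at least twice
aspects = [t for t, c in counts.items() if c >= MIN_SUPPORT]
print(aspects)  # ['battery', 'screen']
```

Solution-research papers in the surveyed literature replace each crude step here (tokenization, candidate filtering, support threshold) with linguistically or statistically stronger components.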

Research paper thumbnail of Calculating carbon emissions from personal travelling: insights from a top-down analysis of key calculators

Environmental Science and Pollution Research, 2020

Personal travelling contributes to the emission of greenhouse gases, which causes long-term damage to the climate. There is wide consensus that, to reduce the associated negative environmental impacts, improvements in vehicle efficiency will not be enough: behavioural changes are also needed. For this, individuals should be able to measure their travel-related carbon emissions, which can be determined using personal carbon footprint calculators, tools that proliferated during the previous decade. However, various research questions related to such calculators remain unanswered in the published literature. This paper therefore investigates, through a top-down analysis, how key transport-based calculators account for emissions from personal transport-related activities. Ten such calculators are examined through a set of formulated research questions to analyse their scope, calculation approach, transparency, consistency of results, communication methods, and platform differences. The results reveal that the calculators have varying granularity, limited transparency, significantly inconsistent results in some cases, and do not fully engage end users. Based on the limitations identified, recommendations are proposed through a taxonomy to guide policy-makers towards improving such tools.
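At their core, such calculators multiply travel distance by a per-mode emission factor. The sketch below shows that core calculation; the factor values are illustrative placeholders, not figures from any of the ten calculators studied, and real tools differ precisely in scope choices such as occupancy handling or radiative-forcing uplifts for flights.

```python
# Illustrative emission factors in kg CO2e per passenger-km (placeholder values).
EMISSION_FACTORS = {
    "car_petrol": 0.192,
    "rail": 0.041,
    "flight_short_haul": 0.255,
}

def trip_emissions(mode: str, distance_km: float, passengers: int = 1) -> float:
    """Per-person kg CO2e for one trip; car emissions are shared among occupants."""
    total = EMISSION_FACTORS[mode] * distance_km
    return total / passengers if mode.startswith("car") else total

# A 100 km petrol-car trip shared between two people.
print(round(trip_emissions("car_petrol", 100, passengers=2), 2))  # 9.6
```

The paper's finding of inconsistent results follows directly from this structure: two calculators using different factor tables or different occupancy assumptions will report different footprints for the same trip.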

Research paper thumbnail of Digital Hadith authentication: Recent advances, open challenges, and future directions

Transactions on Emerging Telecommunications Technologies, 2020

The Holy Quran and Hadith are the two main sources of legislation and guidance shaping the lives of Muslims. The daily activities, sayings, and deeds of the Holy Prophet Muhammad (PBUH) are called Hadiths, and Hadiths are the optimal practical explanation of the Holy Quran. Technological advancements in information and communication technologies (ICT) have revolutionized every field of daily life, including the digitization of the Holy Quran and Hadith. Online Hadith content is obtained from different sources, so alterations and the fabrication of fake Hadiths are feasible. Authenticating this online Hadith content is a complex and challenging task and a crucial area of study in Islam. Few Hadith authentication techniques and systems have been proposed in the literature. In this study, we survey the techniques and systems proposed for Hadith authentication, and we identify a classification, open challenges, and future research directions related to Hadith authentication.

Research paper thumbnail of Multiple fault localization of software programs: A systematic literature review

Information and Software Technology, 2020

Context: Multiple fault localization (MFL) is the act of identifying the locations of multiple faults (more than one fault) in a faulty software program. It is known to be more complicated, tedious, and costly than the traditional practice of presuming that a program contains a single fault. Due to increasing interest in MFL in the research community, a broad spectrum of MFL debugging approaches and solutions have been proposed and developed. Objective: The aim of this study is to systematically review existing research on MFL in the software fault localization (SFL) domain, and to identify, categorize, and synthesize the relevant studies. Method: Using an evidence-based systematic methodology, we identified 55 studies relevant to four research questions. The methodology provides a rigorous and repeatable evidence-based selection and evaluation process. Result: The review shows that research on MFL is gaining momentum, with stable growth over the last five years. Three prominent MFL debugging approaches were identified: one-bug-at-a-time debugging (OBA), parallel debugging, and multiple-bug-at-a-time debugging (MBA), with OBA utilized the most. Conclusion: The study concludes with identified research challenges and suggestions for future research. Although MFL is of growing concern, existing solutions in the field are not yet mature. Studies using real faults in their experiments are scarce, and concrete solutions that reduce MFL debugging time and cost by adopting approaches such as MBA are also few; both require more attention from the research community.
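The OBA approach named above is an iterative loop: localize the most suspicious fault, fix it, re-run the test suite, and repeat until no failures remain. A minimal sketch of that control flow, with `localize` and `fix` as hypothetical stand-ins for a real ranking technique and a real patch:

```python
def oba_debug(faults, localize, fix):
    """One-bug-at-a-time (OBA) loop: localize -> fix -> re-test -> repeat.

    `faults` models the set of faults still causing failures; `localize`
    and `fix` are hypothetical stand-ins for a ranking technique and a
    developer's patch. Returns the number of debug iterations used.
    """
    iterations = 0
    while faults:                                     # failing tests remain
        suspect = localize(faults)                    # pick the top-ranked fault
        fix(suspect)                                  # patch it
        faults = [f for f in faults if f != suspect]  # model the test-suite re-run
        iterations += 1
    return iterations

remaining = ["f1", "f2", "f3"]
fixed = []
n = oba_debug(remaining, localize=lambda fs: fs[0], fix=fixed.append)
print(n, fixed)  # 3 ['f1', 'f2', 'f3']
```

The cost the review highlights is visible in the loop: one full localize-fix-retest cycle per fault, which parallel and MBA approaches try to amortize.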

Research paper thumbnail of A Systematic Mapping Study of the Empirical MOOC Literature

IEEE Access, 2019

Massive open online courses (MOOCs) have revolutionized today's education by offering a globally accessible form of online learning. Over the years, MOOCs have been an attractive research area and have yielded an ample number of research publications. However, existing review studies of MOOCs cover only short periods or focus on a specific theme. As such, a systematic mapping methodology was adopted to provide a fine-grained overview of the MOOC research domain by identifying the quantity, types of research, available results, and publication trends in the educational aspects of MOOCs from 2009 to 2018. Key findings show that: I) MOOC research has been on the rise since MOOCs became mainstream in 2011; II) MOOC research largely resides in the United States and a few European countries; III) most MOOC studies focus on learners' completion, dropout, and retention. In addition, we propose some recommendations for future research on MOOCs.

Research paper thumbnail of Parallel debugging: An investigative study

Journal of Software: Evolution and Process, 2019

In the simultaneous localization of multiple software faults, parallel debugging approaches are widely used. The effectiveness of a parallel debugging approach is critically determined by the type of clustering algorithm and the distance metric used. However, clustering algorithms that group failed tests by the similarity of their execution profiles using distance metrics such as Euclidean, Jaccard, and Hamming distance are considered problematic and inappropriate. In this paper, we first conduct an investigative study of this problematic parallel debugging approach, which applies the k-means clustering algorithm (grouping failed tests by execution-profile similarity) with the Euclidean distance metric, measuring its localization effectiveness with three similarity-coefficient-based fault localization techniques. Second, we compare its effectiveness with the one-bug-at-a-time (OBA) debugging approach and a state-of-the-art parallel debugging approach named MSeer. The empirical evaluation is conducted on 540 multiple-fault versions of eight medium-sized to large-sized subject programs containing two, three, four, and five faults. Our results suggest that clustering failed tests by execution-profile similarity with distance metrics such as Euclidean distance is indeed problematic and reduces effectiveness in localizing multiple faults.
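The mechanics of the approach under study can be sketched as plain k-means over binary coverage vectors with Euclidean distance. Everything below is hypothetical (four failed tests, four statements, deterministic seeding rather than random initialization); the paper's point is that on real programs such profiles do not separate this cleanly by causative fault.

```python
import math

# Hypothetical binary execution profiles of failed tests (1 = statement covered).
profiles = {
    "t1": [1, 1, 0, 0], "t2": [1, 1, 1, 0],   # cover fault A's region
    "t3": [0, 0, 1, 1], "t4": [0, 1, 1, 1],   # cover fault B's region
}

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(points, seeds, iters=10):
    """Plain k-means over execution profiles, seeded from named tests."""
    centroids = [points[s][:] for s in seeds]
    clusters = []
    for _ in range(iters):
        # Assignment step: each failed test joins its nearest centroid.
        clusters = [[] for _ in centroids]
        for name, p in points.items():
            i = min(range(len(centroids)), key=lambda c: euclidean(p, centroids[c]))
            clusters[i].append(name)
        # Update step: recompute each centroid as its members' mean profile.
        for i, members in enumerate(clusters):
            vecs = [points[m] for m in members]
            centroids[i] = [sum(col) / len(vecs) for col in zip(*vecs)]
    return clusters

print(kmeans(profiles, seeds=["t1", "t3"]))  # [['t1', 't2'], ['t3', 't4']]
```

Each resulting cluster would then be handed to one similarity-coefficient-based technique, on the assumption that its failed tests share a single causative fault, which is precisely the assumption the study challenges.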

Research paper thumbnail of A Community-Based Fault Isolation Approach for Effective Simultaneous Localization of Faults

IEEE Access, 2019

During program testing, software programs may be discovered to contain multiple faults. Multiple faults can reduce the effectiveness of existing fault localization techniques because of the complex relationship between faults and failures. Ideally, faults are isolated into fault-focused clusters, each targeting a single fault, so that developers can localize them simultaneously in parallel. However, the relationship between faults and failures is not easily identified and depends heavily on clustering accuracy; existing clustering algorithms cannot effectively isolate failed tests to their causative faults, which hinders localization effectiveness. This paper proposes a new approach that uses a divisive network community clustering algorithm to isolate faults into separate fault-focused communities, each targeting a single fault. A community weighting and selection mechanism is also proposed to prioritize the most important fault-focused communities for the available developers to debug simultaneously in parallel. The approach is evaluated on eight medium-sized to large-sized subject programs (tcas, replace, gzip, sed, flex, grep, make, and ant). Overall, 540 multiple-fault versions of these programs were generated, each containing two to five faults. The experimental results demonstrate that the proposed approach performs significantly better, in terms of localization effectiveness, than two other parallel debugging approaches for locating multiple faults in parallel.
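Divisive community clustering can be illustrated with the Girvan-Newman algorithm as implemented in networkx. This is a generic sketch of the technique family, not the paper's specific algorithm: the graph, test names, and topology below are hypothetical, with nodes standing for failed tests and edges for execution-profile similarity.

```python
import networkx as nx
from networkx.algorithms.community import girvan_newman

# Hypothetical failure-similarity graph: nodes are failed tests, edges link
# tests with similar execution profiles.
G = nx.Graph()
G.add_edges_from([
    ("t1", "t2"), ("t2", "t3"), ("t1", "t3"),   # failures caused by fault A
    ("t4", "t5"), ("t5", "t6"), ("t4", "t6"),   # failures caused by fault B
    ("t3", "t4"),                               # weak cross-community link
])

# Divisive community detection: repeatedly remove the edge with the highest
# betweenness until the graph splits; the first split yields the candidate
# fault-focused communities.
communities = next(girvan_newman(G))
print([sorted(c) for c in communities])
```

The bridge edge t3-t4 carries all cross-community shortest paths, so it has the highest betweenness and is removed first, splitting the failed tests into one community per (assumed) fault; each community can then be assigned to a developer.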

Research paper thumbnail of Software fault localisation: a systematic mapping study

IET Software, 2018

Software fault localisation (SFL) is recognised as one of the most tedious, costly, and critical activities in program debugging. Due to the increase in software complexity, there is strong interest in advanced SFL techniques that help software engineers locate program bugs, and this interest has produced a large body of literature in the SFL research domain. This study investigates the overall research productivity, demographics, and trends shaping the landscape of the SFL research domain; it also classifies existing fault localisation techniques and identifies trends in the field. Accordingly, a systematic mapping study of 273 selected primary studies was conducted, adopting an evidence-based systematic methodology to ensure coverage of all relevant studies. The results show that the SFL research domain has been gaining attention since 2010, with an increasing number of publications per year. Three main research facets were identified, i.e. validation research, evaluation research, and solution research, with solution research receiving the most attention; various contribution facets were identified as well. Overall, the general demographics of the SFL research domain are highlighted and discussed.

Research paper thumbnail of Towards Improving the Security of Low-Interaction Honeypots: Insights from a Comparative Analysis

Lecture Notes in Electrical Engineering, 2017


Research paper thumbnail of A Hybrid Three-Phased Approach in Requirement Elicitation

Lecture Notes in Electrical Engineering, 2017


Research paper thumbnail of The Big Data Technology: Assessing the Impact in the Banking Industry

Open Journal of Science and Technology

Big data is a form of data with increased volume, difficult to analyze, process, and store using ... more Big data is a form of data with increased volume, difficult to analyze, process, and store using traditional database technologies. It has long been adopted in business and finance where a large number of bank transaction are executed daily. The emergence of big data in banking industry results to large proportion of technical improvements in the industry. However, its processing causes disruption in the banking industry. Big data analytics is the process that involves using algorithms and software tools to extract useful business information from the dataset. This study adopts big data analytics process to investigates the disruption due to big data processing in the banking industry. The study identifies, acquired, and extracted dataset of the banking industry which was analyzed using MapReduce based fraud committed due to processing of large amount of data. findings show that government employee commit more crime in comparison with the private sector employees. Finally, based on ...

Research paper thumbnail of Software Requirements Negotiation: a Systematic Literature Review

Software requirements negotiation (SRN) is one of the most essential stages of software requireme... more Software requirements negotiation (SRN) is one of the most essential stages of software requirements engineering. SRN involves the stakeholder's interaction to reach a mutual understanding of the requirements for developing a software project. The increased research interest in requirements engineering has resulted in huge literature in the SRN domain. There is a need to investigate the broad of techniques, processes, and evaluation mechanisms used in the SRN research community. This study aims to examine and identify the existing methods, processes, evaluation mechanisms, quantity of publications, publication trends and demographics shaping SRN research domain. To accomplish our aim, we used an evidence-based systematic approach, and 67 relevant studies were ultimately chosen from the search process based on the formulated research questions. Our study result shows broad and promising SRN techniques that include agent-based negotiation, TAICOS, wikiwinwin and winbook. However, ...

Research paper thumbnail of IPv4 and IPv6 Protocols: A Comparative Performance Study

2019 IEEE 10th Control and System Graduate Research Colloquium (ICSGRC), 2019

The advancement in wireless technologies has allowed devices to access the internet from virtuall... more The advancement in wireless technologies has allowed devices to access the internet from virtually anywhere around the globe. However, the demand for reliable communication through voice and video streaming is significantly high. Internet protocol version 4 (IPv4) is a widely used internet protocol version in the world today, but due to the shortage of IPv4 addresses in the information technology (IT) world, a new internet protocol was introduced coined Internet protocol version 6 (IPv6) to address this issue. This paper aims to perform a comparative study on the performance of IPv4 and IPv6 on voice and video network traffic flow using performance metrics such as jitter, throughput, and packet loss. Accordingly, a testbed experimental environment is set-up with two hosts in client-server mode. Therefore, two scenarios (voice/video) were set-up to analyze the performance of the protocols. The result of this study shows that IPv6 has performed better than IPv4 in both experimental scenarios (voice/video).

Research paper thumbnail of Software Fault Localization : Issues and Limitations

Fault localization is the most challenging and tedious activity in program debugging. This activi... more Fault localization is the most challenging and tedious activity in program debugging. This activity is vital in maintaining software quality. Existing fault localization techniques assume that a program contains only one fault during the localization process. Realistically, a program failure can be caused by multiple active faults. However, for a program with multiple faults, interference do occur between faults that mask failures, this causes the existing fault localization techniques to lose their localization effectiveness with a great margin. Hence, techniques that try to solve this problem are suffering from lack of scalability and high computational complexity. In this paper, we discussed and analyzed the existing fault localization techniques and their application to multiple fault programs, and highlight some of their issues and limitations. Based on our findings, existing fault localization techniques based on multiple faults are shallow and more improvement is needed. Neve...

Research paper thumbnail of A Systematic Mapping Study of the Empirical Explicit Aspect Extractions in Sentiment Analysis

IEEE Access, 2020

Aspect-based sentiment analysis (ABSA) is described as one of the most vibrant research areas ove... more Aspect-based sentiment analysis (ABSA) is described as one of the most vibrant research areas over the last decade. However, due to the exponential increase in aspect-based sentiment researches, there is a massive interest in advanced explicit aspect extraction (EAE) techniques. This interest brings about a huge amount of literature in the EAE domain. This study aims to investigate and identify the existing approaches, techniques, types of research, quantity of publications, publication trends and demographics shaping the EAE research domain in the last decade (2009-2019). Accordingly, an evidence-based systematic methodology was adopted to effectively capture all the relevant studies. The main findings revealed that, 1) there is considerable and continuous rise of EAE research activities around different parts of the globe in the last five years, particularly Asia, Middle-East, and European countries; 2) EAE research has been very limited among African countries which need to be addressed due its role on business intelligence as well as semantic values; 3) three research facets were highlighted based on this study, i.e. solution research, validation research, and evaluation research, in which solution research gets the highest attention; and finally 4) the EAE challenges, as well as feasible future recommendations, were highlighted in this study.

Research paper thumbnail of Calculating carbon emissions from personal travelling: insights from a top-down analysis of key calculators

Environmental Science and Pollution Research, 2020

Personal travelling unfavourably contributes to the emissions of greenhouse gases, which adversel... more Personal travelling unfavourably contributes to the emissions of greenhouse gases, which adversely causes long-term damage to the climate. In order to reduce the associated negative impacts of such activities on the environment, there is a wide consensus that enhancements and innovations in the efficiency of vehicles will not be enough, but behavioural changes are needed. For this, individuals should be able to measure their travel-related carbon emissions, and such emissions could be determined by using personal carbon footprint calculators, which proliferated during the previous decade. However, various research questions related to such calculators are yet to be answered in published literature. As such, this paper investigates how key transport-based calculators account for emissions from personal transport-related activities following a top-down analysis. In this endeavour, ten such calculators are investigated through a set of formulated research questions to analyse their scope, calculation approach used, transparency, consistency of results, communication methods utilized and platform differences. Results revealed that the calculators have varying granularity, have limited transparency, provide significantly inconsistent results in some cases and are not fully engaging end users. Based on limitations identified, recommendations have been proposed through a taxonomy to guide policy-makers towards improving such tools.

Research paper thumbnail of Digital Hadith authentication: Recent advances, open challenges, and future directions

Transactions on Emerging Telecommunications Technologies, 2020

The Holy Quran and Hadith are the two main sources of legislation and guidelines for Muslims to s... more The Holy Quran and Hadith are the two main sources of legislation and guidelines for Muslims to shape their lives. The daily activities, sayings, and deeds of the Holy Prophet Muhammad (PBUH) are called Hadiths. Hadiths are the optimal practical descriptions of the Holy Quran. Technological advancements of information and communication technologies (ICT) have revolutionized every field of daily life, including digitizing the Holy Quran and Hadith. Available online contents of Hadith are obtained from different sources. Thus, alterations and fabrications of fake Hadiths are feasible. Authentication of these online available Hadith contents is a complex and challenging task and a crucial area of study in Islam. Few Hadith authentication techniques and systems are proposed in the literature. In this study, we have surveyed all techniques and systems, which are proposed for Hadith authentication. Furthermore, classification, open challenges, and future research directions related to Hadith authentication are identified.

Research paper thumbnail of Multiple fault localization of software programs: A systematic literature review

Information and Software Technology, 2020

Context: Multiple fault localization (MFL) is the act of identifying the locations of multiple fa... more Context: Multiple fault localization (MFL) is the act of identifying the locations of multiple faults (more than one fault) in a faulty software program. This is known to be more complicated, tedious, and costly in comparison to the traditional practice of presuming that a software contains a single fault. Due to the increasing interest in MFL by the research community, a broad spectrum of MFL debugging approaches and solutions have been proposed and developed. Objective: The aim of this study is to systematically review existing research on MFL in the software fault localization (SFL) domain. This study also aims to identify, categorize, and synthesize relevant studies in the research domain. Method: Consequently, using an evidence-based systematic methodology, we identified 55 studies relevant to four research questions. The methodology provides a systematic selection and evaluation process with rigorous and repeatable evidence-based studies selection process. Result: The result of the systematic review shows that research on MFL is gaining momentum with stable growth in the last 5 years. Three prominent MFL debugging approaches were identified, i.e. One-bug-at-a-time debugging approach (OBA), parallel debugging approach, and multiple-bug-at-a-time debugging approach (MBA), with OBA debugging approach being utilized the most. Conclusion: The study concludes with some identified research challenges and suggestions for future research. Although MFL is becoming of grave concern, existing solutions in the field are less mature. Studies utilizing real faults in their experiments are scarce. Concrete solutions to reduce MFL debugging time and cost by adopting an approach such as MBA debugging approach are also less, which require more attention from the research community.

Research paper thumbnail of A Systematic Mapping Study of the Empirical MOOC Literature

IEEE Access, 2019

Massive open online courses (MOOCs) have revolutionized todays education by offering a global acc... more Massive open online courses (MOOCs) have revolutionized todays education by offering a global accessible form of online learning. Over the years, MOOCs have been an attractive research area and have yielded an ample amount of research publications. However, the existing review studies in MOOCs are characterized by short year coverage or focusing on a specific theme. As such, a systematic mapping methodology was adopted to provide a fine-grain overview of MOOC research domain by identifying the quantity, types of research, available results and publication trends in educational aspects of MOOCs from 2009 to 2018. Key findings show that I) MOOC research have been on the rise since MOOCs became mainstream in 2011. II) MOOC research largely resides in the United States and few European countries. II) Most of MOOC studies focused on addressing learners' completion/dropout/retention. In addition, we proposed some recommendations for future research on MOOCs.

Research paper thumbnail of Parallel debugging: An investigative study

Journal of Software: Evolution and Process, 2019

In the simultaneous localization of multiple software faults, parallel debugging approaches have consistently been utilized. The effectiveness of a parallel debugging approach is critically determined by the type of clustering algorithm and the distance metric used. However, clustering algorithms that group failed tests based on their execution profile similarity with distance metrics such as Euclidean distance, Jaccard distance, and Hamming distance are considered problematic and inappropriate. In this paper, we conduct an investigative study of the usefulness of this problematic parallel debugging approach, which uses the k-means clustering algorithm (grouping failed tests based on their execution profile similarity) with the Euclidean distance metric, on three similarity coefficient-based fault localization techniques in terms of localization effectiveness. Secondly, we compare the effectiveness of the problematic parallel debugging approach with the one-bug-at-a-time debugging approach (OBA) and a state-of-the-art parallel debugging approach named MSeer. The empirical evaluation is conducted on 540 multiple-fault versions of eight medium-sized to large-sized subject programs, with versions containing two, three, four, and five faults. Our results suggest that clustering failed tests based on their execution profile similarity with distance metrics such as Euclidean distance is indeed problematic and contributes to reduced effectiveness in localizing multiple faults.
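The clustering step under study can be sketched as follows: each failed test is represented as a binary statement-coverage vector, and k-means with Euclidean distance groups tests whose profiles are similar, one cluster per suspected fault. This is a minimal illustrative sketch under assumed inputs, not the paper's implementation; the toy profiles, the deterministic farthest-point initialization, and all names are hypothetical.

```python
import math

def euclidean(a, b):
    # Euclidean distance between two coverage vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(points, k, iters=20):
    # Deterministic farthest-point initialization spreads centroids out.
    centroids = [points[0]]
    while len(centroids) < k:
        centroids.append(max(points,
                             key=lambda p: min(euclidean(p, c) for c in centroids)))
    assign = [0] * len(points)
    for _ in range(iters):
        # Assign each failed test to its nearest centroid.
        assign = [min(range(k), key=lambda c: euclidean(p, centroids[c]))
                  for p in points]
        # Recompute each centroid as the mean of its members.
        for c in range(k):
            members = [p for p, a in zip(points, assign) if a == c]
            if members:
                centroids[c] = tuple(sum(col) / len(members)
                                     for col in zip(*members))
    return assign

# Hypothetical binary execution profiles of six failed tests (1 = covered).
# Tests 0-2 exercise statements 0-2, tests 3-5 exercise statements 3-5,
# hinting at two distinct underlying faults.
failed_profiles = [
    (1, 1, 1, 0, 0, 0),
    (1, 1, 0, 0, 0, 0),
    (1, 0, 1, 0, 0, 0),
    (0, 0, 0, 1, 1, 1),
    (0, 0, 0, 1, 1, 0),
    (0, 0, 0, 0, 1, 1),
]
labels = kmeans(failed_profiles, k=2)
```

Each resulting cluster of failed tests would then be handed to a similarity coefficient-based ranking technique, with developers working the clusters in parallel.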

Research paper thumbnail of A Community-Based Fault Isolation Approach for Effective Simultaneous Localization of Faults

IEEE Access, 2019

During program testing, software programs may be discovered to contain multiple faults. Multiple faults in a program may reduce the effectiveness of existing fault localization techniques due to the complex relationship between faults and failures. Ideally, faults are isolated into fault-focused clusters, each targeting a single fault, so that developers can localize them simultaneously in parallel. However, the relationship between faults and failures is not easily identified and depends solely on the accuracy of clustering; as such, existing clustering algorithms are not able to effectively isolate failed tests to their causative faults, which hinders localization effectiveness. This paper proposes a new approach that uses a divisive network community clustering algorithm to isolate faults into separate fault-focused communities, each targeting a single fault. A community weighting and selection mechanism is also proposed to help prioritize highly important fault-focused communities for the available developers to debug the faults simultaneously in parallel. The approach is evaluated on eight medium-sized to large-sized subject programs (tcas, replace, gzip, sed, flex, grep, make, and ant). Overall, 540 multiple-fault versions of these programs were generated, each containing two to five faults. The experimental results demonstrate that the proposed approach performs significantly better in terms of localization effectiveness than two other parallel debugging approaches for locating multiple faults in parallel.
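The divisive community step can be illustrated with a toy sketch: failed tests form a graph whose edges link tests with similar execution profiles, and a Girvan-Newman-style procedure repeatedly removes the most "between" edge until the graph splits into fault-focused communities. This is a hedged sketch, not the paper's algorithm: the example graph, the single-shortest-path approximation of edge betweenness, and all names are illustrative assumptions.

```python
from collections import deque
from itertools import combinations

def bfs_path(adj, s, t):
    # One shortest path from s to t via BFS parent tracking.
    prev = {s: None}
    q = deque([s])
    while q:
        u = q.popleft()
        if u == t:
            break
        for v in adj[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    path, u = [], t
    while u is not None:
        path.append(u)
        u = prev[u]
    return path[::-1]

def components(adj):
    # Connected components by BFS flood fill.
    seen, comps = set(), []
    for s in adj:
        if s in seen:
            continue
        comp, q = {s}, deque([s])
        seen.add(s)
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    comp.add(v)
                    q.append(v)
        comps.append(comp)
    return comps

def divisive_split(edges, nodes):
    # Girvan-Newman-style divisive step: remove the edge lying on the most
    # pairwise shortest paths until the graph splits into communities.
    edges = set(edges)
    while True:
        adj = {n: set() for n in nodes}
        for a, b in edges:
            adj[a].add(b)
            adj[b].add(a)
        comps = components(adj)
        if len(comps) > 1:
            return comps
        score = {e: 0 for e in edges}
        for s, t in combinations(nodes, 2):
            path = bfs_path(adj, s, t)
            for a, b in zip(path, path[1:]):
                e = (a, b) if (a, b) in score else (b, a)
                score[e] += 1
        edges.discard(max(score, key=score.get))

# Hypothetical failure-similarity graph: tests 0-2 and 3-5 form two tight
# groups (two suspected faults) joined by one weak bridge edge (2, 3).
nodes = list(range(6))
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
communities = divisive_split(edges, nodes)
```

Removing the bridge edge yields two fault-focused communities, {0, 1, 2} and {3, 4, 5}, which a weighting mechanism could then rank before assigning to developers.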

Research paper thumbnail of Software fault localisation: a systematic mapping study

IET Software, 2018

Software fault localisation (SFL) is recognised as one of the most tedious, costly, and critical activities in program debugging. Due to the increase in software complexity, there is huge interest in advanced SFL techniques that aid software engineers in locating program bugs. This interest has given rise to a large body of literature in the SFL research domain. This study aims to investigate the overall research productivity, demographics, and trends shaping the landscape of the SFL research domain, and also to classify existing fault localisation techniques and identify trends in the field of study. Accordingly, a systematic mapping study of 273 selected primary studies was conducted, adopting an evidence-based systematic methodology to ensure coverage of all relevant studies. The results of this systematic mapping study show that the SFL research domain has been gaining more attention since 2010, with an increasing number of publications per year. Three main research facets were identified, i.e. validation research, evaluation research, and solution research, with the solution research type receiving the most attention. Various contribution facets were identified as well. In totality, the general demographics of the SFL research domain were highlighted and discussed.

Research paper thumbnail of Towards Improving the Security of Low-Interaction Honeypots: Insights from a Comparative Analysis

Lecture Notes in Electrical Engineering, 2017


Research paper thumbnail of A Hybrid Three-Phased Approach in Requirement Elicitation

Lecture Notes in Electrical Engineering, 2017
