Information Technology and Computer Science Research Papers

This study examines individuals’ participation intentions and behaviour on Social Networking Sites (SNSs). For this purpose, the Technology Acceptance Model (TAM) is utilized and extended through the addition of a “perceived social capital” construct, aiming to increase its explanatory power and predictive ability in this context. Data collected from a survey of 1100 participants, distilled to 657 usable sets, were analysed via structural equation modelling to assess the predictive power of the proposed model. The model explains 56% of the variance in “Participation Intentions” and 55% of the variance in “Participation Behaviour”. The contribution of behavioural intention to the model’s explanatory power was the highest amongst the constructs, explaining 28% of usage behaviour, while “Attitude” explains around 11% of SNSs usage behaviour. The findings also show that the “Perceived Social Capital” construct has a notable impact on usage behaviour; this impact arises indirectly through its direct effect on “Attitude” and “Perceived Usefulness”. The contribution of “Perceived Social Capital” to the model’s explanatory power was the third highest amongst the constructs; alone, it explains around 9% of SNSs usage behaviour.

In modern, crowded traffic, it becomes essential to save human life by reaching the hospital as soon as possible. This paper describes an advancement in tracking the presence of an ambulance amidst traffic using a GPS tracking system that calculates the position of the ambulance relative to the traffic signal, replacing the RF transmitter-receiver circuitry. The paper also retains the image-capturing technology from the base paper (“Impact of Image Processing in Saving the Human Life by Automating Traffic Signals”) to identify the presence of the ambulance, thereby changing the signal to green automatically in favour of the ambulance.
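The distance computation at the heart of such a system can be illustrated with the haversine formula for the great-circle distance between two GPS fixes. This is an illustration only; the paper's exact positioning computation is not given in the abstract, and the coordinates below are hypothetical.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two GPS fixes given in degrees."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical fixes: traffic signal at the origin, ambulance 1 degree of longitude away.
d = haversine_km(0.0, 0.0, 0.0, 1.0)  # roughly 111 km at the equator
```

A controller could compare this distance against a threshold to decide when to pre-empt the signal.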

Due to the increasing need for automatic traffic monitoring, vehicle license plate detection is of high interest for automatic toll collection, traffic law enforcement and parking lot access control, among other applications. In this paper, a sliding-window approach based on Histogram of Oriented Gradients (HOG) features is used for Brazilian license plate detection. This approach consists of scanning the whole image in a multiscale fashion so that the license plate is located precisely. The main contribution of this work is a thorough study of the best setup for HOG descriptors for the detection of Brazilian license plates, to which HOG had never been applied before. We also demonstrate the reliability of the method, which achieves a recall higher than 98% (with a precision higher than 78%) on a publicly available data set.
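The sliding-window idea can be sketched with a toy orientation-histogram descriptor, a heavily simplified stand-in for the full HOG pipeline (no cell/block normalization, and no trained classifier; the paper's actual HOG parameters are not reproduced here):

```python
import numpy as np

def orientation_histogram(patch, bins=9):
    """Toy HOG-style descriptor: magnitude-weighted histogram of unsigned gradient orientations."""
    patch = patch.astype(float)
    gx = np.zeros_like(patch)
    gy = np.zeros_like(patch)
    gx[:, 1:-1] = patch[:, 2:] - patch[:, :-2]    # central differences in x
    gy[1:-1, :] = patch[2:, :] - patch[:-2, :]    # central differences in y
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientation in [0, 180)
    idx = (ang / (180.0 / bins)).astype(int) % bins
    hist = np.zeros(bins)
    np.add.at(hist, idx.ravel(), mag.ravel())
    return hist / (hist.sum() + 1e-9)

def sliding_windows(image, win=16, step=8):
    """Yield (row, col, descriptor) for every window; a real detector repeats this over scales."""
    for r in range(0, image.shape[0] - win + 1, step):
        for c in range(0, image.shape[1] - win + 1, step):
            yield r, c, orientation_histogram(image[r:r + win, c:c + win])

# Demo: a vertical step edge produces horizontal gradients, so orientation bin 0 dominates.
edge = np.zeros((16, 16))
edge[:, 8:] = 1.0
h = orientation_histogram(edge)
windows = list(sliding_windows(np.zeros((32, 32))))
```

In the actual pipeline, each window's descriptor would be scored by a classifier (e.g. a linear SVM) trained on plate versus non-plate examples, with the image rescaled to cover multiple plate sizes.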

This paper presents the results of an online survey conducted to analyse the use of the social web in the context of daily news. It focuses on users' motivation and habits in news consumption. Moreover, users' news behaviour was divided into three purposes, namely news consumption, news production and news dissemination, to determine whether usage has a passive or an active character. In a second step, it was examined which social software is used for which purpose. In conclusion, users appreciate social software for features such as interactivity and information that traditional media do not provide. Among social web platforms, users prefer social networking sites as well as video-sharing platforms. Social networking sites also rank first in news production and dissemination.

MapReduce has gained remarkable significance as a prominent parallel data processing tool in the research community, academia and industry with the spurt in the volume of data to be analyzed. MapReduce is used in applications such as data mining and data analytics where massive data analysis is required, but it is still constantly being explored on parameters such as performance and efficiency. This survey explores large-scale data processing using MapReduce and its various implementations, to help the database, research and other communities develop a technical understanding of the MapReduce framework. Different MapReduce implementations are explored and their inherent features are compared on different parameters. The survey also addresses the open issues and challenges raised by building a fully functional DBMS/data warehouse on MapReduce. The various MapReduce implementations are compared with the most popular implementation, Hadoop, and with similar implementations on other platforms.
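The programming model the survey examines can be illustrated with the canonical word-count example, written here in plain Python to mimic the map, shuffle and reduce phases (a real framework such as Hadoop distributes these phases across a cluster; this sketch runs in one process):

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle_phase(pairs):
    """Shuffle: group all emitted values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

counts = reduce_phase(shuffle_phase(map_phase(["big data", "big analysis"])))
# counts == {'big': 2, 'data': 1, 'analysis': 1}
```

The appeal of the model is that the map and reduce functions are pure and independent per key, which is what lets implementations parallelize them transparently.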

Bangladesh is a developing country with a literacy rate of 62.5%, and the challenges in its education sector are huge. There is no doubt that IT is going to bring about a tremendous change in education: the use of ICT makes education more student-centred. Because the world is moving rapidly into digital media and information, the role of ICT in education is becoming more and more important. In developing countries like Bangladesh, the implementation of ICT in the education sector is a great challenge. In this paper, these challenges are discussed and some recommendations are provided. The development of Bangladesh through the implementation of ICT in education is also discussed.

Business Process Management aligns organisational strategy and business operation. The dynamic environment within which organisations operate promotes changes in business processes, a phenomenon known as business process variability. The goal of this research is to review the business process variability literature in order to comprehend this phenomenon and analyse its theoretical foundation. Through a systematic mapping study, 80 primary studies acted as sources of evidence to answer three research questions. By summarizing this theoretical background, we establish a conceptual synthesis of business process variability. We equally describe business process variability approaches and observe whether these were empirically assessed. Finally, we discuss research opportunities in the field. Our study shows that concepts in the business process variability domain are used in an inconsistent manner, demanding a common vocabulary. A significant number of approaches are available, but most of them lack empirical studies. Additionally, our findings provide a diagnosis of the major challenges in the field.

Project management is quite important for executing projects effectively and efficiently, and it is vital to project success. The main challenge of project management is to achieve all project goals while taking into consideration time, scope, budget constraints and quality. This paper identifies general measures for the two specific goals and their ten specific practices of the Integrated Project Management process area in Capability Maturity Model Integration (CMMI). CMMI is a framework for the improvement and assessment of computer information systems. The method we used to define the measures is to apply the Goal Question Metric (GQM) paradigm to the two specific goals and their ten specific practices of the Integrated Project Management process area in CMMI.
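The GQM paradigm itself can be pictured as a small goal-question-metric tree: each goal spawns questions, and each question spawns the metrics that answer it. The entries below are hypothetical placeholders for illustration, not the measures the paper actually derives:

```python
# Hypothetical GQM tree for one specific goal of a project management process area.
gqm = {
    "goal": "Establish the project plan with stakeholder involvement",  # hypothetical goal
    "questions": [
        {
            "question": "How closely does actual effort track planned effort?",
            "metrics": ["planned effort (person-hours)",
                        "actual effort (person-hours)",
                        "effort deviation (%)"],
        },
        {
            "question": "Are stakeholder commitments being obtained on schedule?",
            "metrics": ["commitments planned",
                        "commitments obtained on time"],
        },
    ],
}

def all_metrics(tree):
    """Flatten the tree: every metric traces back through a question to the goal."""
    return [m for q in tree["questions"] for m in q["metrics"]]
```

The value of the structure is traceability: a metric with no question, or a question with no goal, signals measurement for its own sake.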

Of the four elementary operations, division is the most time-consuming and expensive operation in modern-day processors. This paper uses techniques based on the ancient Indian Vedic mathematics system to derive a generalized algorithm for BCD division that is more time-efficient and optimised than the conventional algorithms in the literature. It has also been observed that the algorithm exhibits remarkable results when executed on traditional mid-range processors with numbers of up to 15 digits (50 bits). The present form of the algorithm can divide numbers having 38 digits (127 bits), which can be further enhanced by simple modifications.
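The abstract does not reproduce the Vedic algorithm itself; for contrast, the conventional baseline it is compared against can be sketched as schoolbook long division over a BCD-style list of decimal digits:

```python
def bcd_long_division(dividend_digits, divisor):
    """Conventional digit-by-digit long division over a list of decimal digits
    (the baseline the paper compares against, not its Vedic algorithm).
    Returns (quotient_digits, remainder) for a small integer divisor."""
    quotient = []
    remainder = 0
    for digit in dividend_digits:
        remainder = remainder * 10 + digit   # bring down the next digit
        quotient.append(remainder // divisor)
        remainder %= divisor
    # Strip leading zeros, keeping at least one digit.
    while len(quotient) > 1 and quotient[0] == 0:
        quotient.pop(0)
    return quotient, remainder

q, r = bcd_long_division([1, 2, 3, 4], 7)  # 1234 = 7 * 176 + 2
```

Working digit by digit like this is exactly what makes conventional BCD division slow: every output digit costs a full multi-digit comparison/subtraction cycle.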

Lane detection and tracking is one of the key features of advanced driver assistance systems. Lane detection is finding the white markings on a dark road; lane tracking uses the previously detected lane markers and adjusts itself according to a motion model. In this paper, a review of lane detection and tracking algorithms developed in the last decade is presented. Several modalities are considered for lane detection, including vision, LIDAR, vehicle odometry, information from the global positioning system, and digital maps. Lane detection and tracking is one of the challenging problems in computer vision. Different vision-based lane detection techniques are explained in the paper, and the performance of different lane detection and tracking algorithms is compared and studied.
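The "white markings on a dark road" idea can be sketched in its most naive form: threshold bright pixels and fit a line to them by least squares. The systems reviewed in the paper use far more robust pipelines (edge detection, Hough transforms, motion models); this is only a toy baseline on a synthetic image:

```python
import numpy as np

def fit_lane(image, threshold=200):
    """Naive lane-marker fit: take pixels brighter than `threshold` and fit
    column = m * row + c by least squares. Returns (m, c)."""
    ys, xs = np.nonzero(image > threshold)
    if ys.size < 2:
        raise ValueError("no lane pixels found")
    m, c = np.polyfit(ys, xs, 1)
    return m, c

# Synthetic 40x40 dark road with one bright marking along column = row + 2.
road = np.zeros((40, 40), dtype=np.uint8)
for y in range(38):
    road[y, y + 2] = 255
m, c = fit_lane(road)  # m close to 1.0, c close to 2.0
```

A tracker would then use the previous frame's (m, c) as the prediction and only search a band around that line, which is the role of the motion model described above.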

The project monitoring and control process is an important process in developing any computer information system. Monitoring and control provides an understanding of the project's progress so that appropriate corrective actions can be taken when the project deviates significantly from the plan. This research identifies general measures for the specific goals and their specific practices of the Project Monitoring and Control process area in Capability Maturity Model Integration (CMMI). CMMI was developed in the USA by the Software Engineering Institute (SEI) at Carnegie Mellon University and is a framework for the assessment and improvement of computer information systems. The procedure we used to determine the measures is to apply the Goal Question Metric (GQM) approach to the two specific goals and their ten specific practices of the Project Monitoring and Control process area in CMMI.

This survey provides the state of the art of research devoted to evolutionary approaches (EAs) for clustering, exemplified with a diversity of evolutionary computations. It provides a nomenclature that highlights some aspects that are very important in the context of evolutionary data clustering. The paper examines the clustering trade-offs addressed by the wide range of Multi-Objective Evolutionary Approach (MOEA) methods. Finally, the study addresses the potential challenges of MOEA design and data clustering, along with conclusions and recommendations for novice and experienced researchers, positioning the most promising paths of future research. MOEAs have had substantial success across a variety of multiobjective optimization problem (MOP) applications, from pedagogical multifunction optimization to real-world engineering design. The survey organizes the developments witnessed in the past three decades in EA-based metaheuristics for solving MOPs and for making significant progress in obtaining high-quality solutions in a single run.
Data clustering is a demanding task whose intricacy is caused by the lack of a unique and precise definition of a cluster. The discrete optimization problem uses the cluster space to derive a solution for multiobjective data clustering. Discovery of a majority or all of the clusters (of arbitrary shapes) present in the data is a long-standing goal of unsupervised predictive learning and exploratory pattern analysis.
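The multiobjective framing can be made concrete: each candidate clustering is scored on conflicting objectives, for example compactness versus number of clusters, and a MOEA keeps the solutions that are not Pareto-dominated. A minimal sketch under those assumed objectives (not a full MOEA, which would also need variation and selection operators):

```python
def wcss(points, labels):
    """Within-cluster sum of squares for 2-D points (compactness, to be minimized)."""
    clusters = {}
    for p, l in zip(points, labels):
        clusters.setdefault(l, []).append(p)
    total = 0.0
    for members in clusters.values():
        cx = sum(x for x, _ in members) / len(members)
        cy = sum(y for _, y in members) / len(members)
        total += sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in members)
    return total

def dominates(obj_a, obj_b):
    """True if a is at least as good on every objective and strictly better on one."""
    return all(a <= b for a, b in zip(obj_a, obj_b)) and \
           any(a < b for a, b in zip(obj_a, obj_b))

points = [(0, 0), (0, 1), (10, 0), (10, 1)]
one_cluster = (wcss(points, [0, 0, 0, 0]), 1)   # few clusters, poor compactness
two_clusters = (wcss(points, [0, 0, 1, 1]), 2)  # tighter, but more clusters
# Neither dominates the other: both sit on the Pareto front.
```

This is the trade-off a single-objective method hides behind a weighted score, and which a MOEA exposes as a front of alternative clusterings.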

Cloud computing is a development of parallel, distributed and grid computing which provides computing capability as a service to clients rather than as a product. Through cloud computing, clients can access software resources, valuable information and hardware devices as a subscribed and monitored service over a network. Due to the large number of requests for access to resources and the service level agreements between cloud service providers and clients, several burning issues in the cloud environment, such as QoS, power, privacy and security, VM migration, and resource allocation and scheduling, need the attention of the research community. Resource allocation among multiple clients has to be ensured as per service level agreements. Several techniques have been invented and tested by the research community for the generation of optimal schedules in cloud computing. A few promising approaches, such as metaheuristic, greedy, heuristic and genetic techniques, have been applied for task scheduling in several parallel and distributed systems. This paper presents a review of scheduling proposals in the cloud environment.
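One member of the greedy family mentioned above can be sketched in a few lines: assign each task, longest first, to the VM that currently finishes earliest (the classic longest-processing-time makespan heuristic). This illustrates the family of approaches surveyed, not a specific proposal from the paper:

```python
def greedy_schedule(task_lengths, num_vms):
    """Longest-processing-time-first greedy assignment.
    Returns (assignment dict vm -> tasks, makespan)."""
    loads = [0.0] * num_vms
    assignment = {vm: [] for vm in range(num_vms)}
    for task in sorted(task_lengths, reverse=True):
        vm = loads.index(min(loads))      # VM that finishes earliest so far
        assignment[vm].append(task)
        loads[vm] += task
    return assignment, max(loads)

_, makespan = greedy_schedule([4, 3, 2, 1], 2)  # loads 4+1 and 3+2 -> makespan 5
```

Metaheuristic and genetic schedulers explore the same assignment space but can escape the local choices a greedy pass locks in, at the cost of more computation.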

The most important aim of software engineering is to improve software productivity and the quality of software products, and further to reduce the cost and time of software development using engineering and management techniques. Broadly speaking, the software engineering initiative was introduced during the software crisis period to describe the collection of techniques that apply engineering and management skills to the construction and support of software processes and products. There is no universally agreed theory for software measurement, yet software metrics are useful for obtaining information on the evaluation of processes and products in software engineering. They help to plan and carry out improvement in software organizations and to provide objective information about project performance, process capability and product quality. Process capability is extremely important for the software industry because the quality of products is largely determined by the quality of the processes. The use of existing metrics and the development of innovative software metrics will be important factors in future software engineering process and product development. Future research work will be based on using software metrics in software development, through which time schedules, cost estimates and software quality can be improved. The permanent application of measurement-based methodologies to the software process and its products provides important and timely management information, together with the use of those techniques to improve the software process and its products. This paper concentrates on an overview of the basics of software measurement and the fundamentals of software metrics in software engineering.

The Least Significant Bit (LSB) algorithm and the Most Significant Bit (MSB) algorithm are steganography algorithms, each with its own demerits. This work therefore proposes a hybrid approach and compares its efficiency with the LSB and MSB algorithms. The LSB and MSB techniques were combined in the proposed algorithm: two bits (the least significant bit and the most significant bit) of the cover image were replaced with the secret message. Comparisons were made between the proposed algorithm, LSB and MSB based on Mean Squared Error (MSE), Peak Signal-to-Noise Ratio (PSNR) and encoding time after embedding in digital images. The combined technique produced a stego-image with less distortion in image quality than the MSB technique, independent of the nature of the hidden data; however, the LSB algorithm produced the best stego-image quality. Larger cover images further improved the combined algorithm's quality, and the combined algorithm had a shorter image and text encoding time. Therefore, a trade-off exists between the encoding time and the quality of the stego-image, as demonstrated in this work.
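The LSB half of the hybrid, together with the MSE/PSNR figures used in the comparison, can be sketched on a flat pixel array. This is a simplification for illustration (a constant hypothetical cover, and only the LSB is flipped; the paper's hybrid also replaces the most significant bit):

```python
import numpy as np

def embed_lsb(cover, bits):
    """Replace the least significant bit of the first len(bits) pixels."""
    stego = cover.copy()
    stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | np.asarray(bits, dtype=np.uint8)
    return stego

def extract_lsb(stego, n):
    """Read back the n embedded bits."""
    return [int(b) for b in (stego[:n] & 1)]

def psnr(cover, stego):
    """Peak signal-to-noise ratio in dB for 8-bit data."""
    mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

cover = np.full(1000, 128, dtype=np.uint8)  # hypothetical flat 8-bit cover
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb(cover, bits)
```

Each changed pixel differs by at most 1, which is why pure LSB gives the best PSNR; flipping the MSB instead changes a pixel by 128, which is the distortion the hybrid trades away for its other properties.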

An enhanced technique of color image encryption based on random matrix key encoding is proposed. To encrypt the color image, a separation into Red, Green and Blue (R, G, B) channels is applied. Each channel is encrypted using a technique…

According to Moore's law, the number of transistors should double every 18 to 24 months. The main factors in increasing the number of transistors are density and die size, and each has a serious physical limitation: the room to increase density may reach zero after a few years, which limits the performance and speed of a microprocessor, while die size cannot be increased every two years and must stay fixed for several years, since changing it would affect the economical side. This article aims to increase the number of transistors, and thereby the performance and speed of the microprocessor, with little or no increase in die size, by constructing a two-layer crystal square for transistors, which allows the number of transistors to be increased a further two times. By applying the new approach, the number of transistors in a single chip will be approximately doubled every 24 months in accordance with Moore's law without rapidly changing the size of the chip (length and width); only the height of the chip must be changed to accommodate the two layers.

In this paper, we apply the backstepping control method to derive new results for adaptive controller and synchronizer design for the Arneodo chaotic system (1980) when the system parameters are unknown. First, we design an adaptive backstepping controller to stabilize the Arneodo system at its unstable equilibrium at the origin. Next, we design an adaptive backstepping controller to achieve global chaos synchronization of identical Arneodo chaotic systems with unknown parameters. MATLAB simulations are presented to illustrate the proposed adaptive backstepping controller and synchronizer design for the Arneodo chaotic system with unknown parameters.

Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditional information security risk analysis uses quantitative and qualitative analysis methods, each of which has some advantages for information risk analysis. The analytic hierarchy process has also been widely used in security assessment. A future research direction may be the development and application of soft computing techniques such as rough sets, grey sets, fuzzy systems, genetic algorithms, support vector machines, Bayesian networks and hybrid models. Hybrid models are developed by integrating two or more existing models. Practical advice for evaluating information security risk is discussed; the suggested approach combines AHP with the fuzzy comprehensive evaluation method.
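The AHP side of such a hybrid can be sketched: criterion weights are derived from a pairwise comparison matrix, here via the common geometric-mean approximation of the priority vector. The judgement matrix below is hypothetical, and the fuzzy comprehensive evaluation step is omitted:

```python
def ahp_weights(pairwise):
    """Geometric-mean approximation of AHP priority weights
    from a square pairwise comparison matrix."""
    n = len(pairwise)
    gmeans = []
    for row in pairwise:
        product = 1.0
        for value in row:
            product *= value
        gmeans.append(product ** (1.0 / n))   # geometric mean of the row
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical judgement: confidentiality risk rated 3x as important as availability risk.
weights = ahp_weights([[1.0, 3.0],
                       [1.0 / 3.0, 1.0]])  # -> approximately [0.75, 0.25]
```

In a full AHP analysis one would also compute the consistency ratio of the matrix before trusting the weights; the fuzzy step would then aggregate expert membership grades per criterion using these weights.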

For properly installing CCTV cameras at any intersection point for traffic surveillance, some parameters need to be determined in order to get maximum benefit. The height and angle of placement of the CCTV camera are used to determine the…

Centralization of services is a global trend in universities worldwide: IT services in a university become a single department providing common IT support for the entire university community, such as staff, students and visitors. Special services required for Computer Science and Information Technology courses are quite often not part of IT service agreements. The shared common computer labs are tightly controlled by IT services and could not be used to deliver specialised courses such as Microsoft Active Directory configuration and management. A dedicated computer lab is no longer a viable option for providing a practical solution with three-year university-leased computers, and the challenge of providing a long-term solution relies heavily on the individual academic staff who deliver such courses. There is an urgent need to find a new long-term working solution for this problem. After extensive searching and research into finding an optimal solution, a portable computer lab using virtualisation is proposed. Instead of using a desktop computer in a common lab, each student enrolled in a course such as Microsoft Active Directory is allocated a customised USB hard drive. The USB drive contains a bootable Windows 7 operating system as well as six Windows Servers operated under Microsoft Windows Virtual PC. This not only allows the students to use a common computer lab but also enables them to fully control their learning with a flexibility that was not achievable in the past. This paper aims to share the knowledge as well as the tricks and tips gained through the process, and to provide step-by-step instructions for building the customised USB hard drive. It is expected that the concept of a ‘portable’ computer lab using virtualisation will provide insight into the delivery of IT courses in the future.

The overall performance of computer systems is better investigated and evaluated when their various components, such as hardware, software and firmware, are considered. A comparative analysis of single-core and multi-core systems was carried out using Intel Pentium G640T 2.4 GHz dual-core, Intel Pentium IV 2.4 GHz single-core and Intel Pentium IV 2.8 GHz single-core systems. The approach was to use benchmarking and stress-testing software to examine the systems' CPU and RAM for performance and stability. In all the tests, the components of the dual-core system rated better than the single-core components; the GFLOP results and execution times for various processes rank the G640T 2.4 GHz dual-core above the Pentium IV 2.4 GHz and 2.8 GHz single-core systems respectively.
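The benchmarking methodology can be sketched generically: time a fixed floating-point workload several times, keep the best run (least interference), and report an estimated FLOP rate. This toy harness is a stand-in for the commercial benchmarking tools the study used, and interpreted Python understates real hardware rates by orders of magnitude:

```python
import time

def flops_benchmark(n=200_000, repeats=5):
    """Time n multiply-add iterations; return (best_seconds, estimated MFLOPS)."""
    best = float('inf')
    for _ in range(repeats):
        acc = 0.0
        start = time.perf_counter()
        for i in range(n):
            acc += 1.0000001 * i        # one multiply and one add per iteration
        elapsed = time.perf_counter() - start
        best = min(best, elapsed)       # best-of-N suppresses scheduler noise
    return best, (2 * n / best) / 1e6   # 2 FLOPs per iteration

seconds, mflops = flops_benchmark()
```

Taking the minimum over repeats, rather than the mean, is the standard way to approximate the machine's undisturbed speed in such comparisons.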

Yet existing research on the delivery of citizen-centric e-government services in developing countries still lacks explanatory power for the following reasons: 1) it focuses either on the supply side of these services or on the demand side separately, so there is not enough research on the integration between them; and 2) it focuses on the results of previous research, thus ignoring the development of theories that fit the new context under investigation by understanding the relationship between the implementation of ICT and social structures in the same context. This study aims to fill these gaps by employing a holistic approach to enable in-depth understanding and to gain valuable insights into the success factors in the delivery of citizen-centric e-government services from multiple perspectives, in the real context of one of the Arab countries, namely Jordan. This would reduce the gap between government strategies and policies on the one hand, and citizens' perceptions on the other, regarding the determinants of the delivery of citizen-centric e-government services in developing countries, allowing a better understanding of the citizens' needs and priorities that must be taken into account by those governments to ensure the success of those services on a large scale. This study is part of a two-phase research project that aims to propose an integrated model of success factors in the delivery of citizen-centric e-government services and then validate it in the context of developing countries in general and Jordan in particular. The first phase, which is beyond the scope of this study, employs the grounded theory method to develop a research model, as well as the survey instruments necessary to validate it. The second phase, which is within the scope of this study, employs a survey questionnaire to validate the research model using multivariate analysis and structural equation modeling (SEM). The results show that factors such as perceived ease of use and perceived usefulness no longer affect the delivery of citizen-centric e-government services in this particular context. The results also show that the perception of organizational loyalty, through more involvement of citizens in the decision-making process of change, as well as awareness-raising campaigns for such e-services, is among the positive and significant factors that affect the delivery of citizen-centric e-government services in the same context. Perceptions of trust, quality and security in e-services all have a significant and positive impact on the delivery of such e-services in this particular context, and website design is also among the factors with a significant and positive impact in the same context. Most importantly, the results show that national e-readiness, including the macro political environment, the legal and regulatory environment, the socio-economic environment and ICT-related infrastructure, is among the factors that are significant but whose impact did not prove positive in this particular context. However, modern Arab culture stands out as a factor that is both significant and positive in this context. Finally, this study concludes by highlighting the theoretical and practical implications, limitations and future directions.

Diabetes is amongst the most common diseases in India. It affects patients' health and also leads to other chronic diseases. Prediction of diabetes plays a significant role in saving lives and cost. Predicting diabetes in the human body is a challenging task because it depends on several factors. A few studies have reported the performance of classification algorithms in terms of accuracy; the results in these studies are difficult and complex for medical practitioners to understand, and they also lack visual aids, as they are presented in pure text format. This survey uses ROC and PRC graphical measures to improve the understanding of results. A detailed parameter-wise discussion and comparison, which other reported surveys lack, is also presented. Execution time, accuracy, TP rate, FP rate, precision, recall and F-measure are used for comparative analysis, and a confusion matrix is prepared for a quick review of each algorithm. Ten-fold cross-validation is used for the estimation of the prediction model. Different sets of classification algorithms are analyzed on a diabetes dataset acquired from the UCI repository.
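The comparison parameters named above all follow directly from a confusion matrix. As a self-contained reminder of their standard definitions (the counts below are illustrative, not from the survey):

```python
def classification_metrics(tp, fp, fn, tn):
    """Standard classifier metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)            # also called the TP rate / sensitivity
    fp_rate = fp / (fp + tn)           # x-axis of the ROC curve
    f_measure = 2 * precision * recall / (precision + recall)
    return {"accuracy": accuracy, "precision": precision, "recall": recall,
            "fp_rate": fp_rate, "f_measure": f_measure}

m = classification_metrics(tp=8, fp=2, fn=2, tn=8)
# precision, recall and F-measure are all 0.8 here; FP rate is 0.2
```

ROC curves plot recall against FP rate while PRC curves plot precision against recall, which is why the survey treats them as complementary graphical views of the same counts.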

Rapid advancement in computing technology has put forth cloud computing as a paramount paradigm in distributed systems. It is essential to fully understand the underlying technologies that make clouds possible. One key technology that makes the cloud popular is virtualization. Even though virtualization technology is not new, the concept of hypervisors combined with virtualization is growing in popularity and is now well understood by many, and a good number of hypervisors are available. This paper discusses the types of virtualization technologies, hypervisors and the configuration of VMs, and analyses the performance of virtual machines using the Kernel-based Virtual Machine (KVM), a Type 2 hypervisor.

This paper presents an approach for the automatic detection of microaneurysms and hemorrhages in fundus images. These lesions are considered the earliest signs of diabetic retinopathy. Diabetic retinopathy is a disease caused by diabetes and is considered the major cause of blindness in the working-age population. The proposed method is based on mathematical morphology and consists in removing components of the retinal anatomy to reach the lesions. The method consists of five stages: a) pre-processing; b) enhancement of low-intensity structures; c) detection of blood vessels; d) elimination of blood vessels; e) elimination of the fovea. The accuracy of the method was tested on a public database of fundus images, where it achieved satisfactory results comparable to other methods from the literature, reporting 87.69% mean sensitivity and 92.44% mean specificity.
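
The mathematical-morphology operations such a method builds on can be illustrated in pure Python on a toy binary image. This is a generic sketch of erosion, dilation and opening, not the paper's pipeline:

```python
def erode(img, se):
    """Binary erosion: a pixel stays 1 only if the structuring element,
    centred on it, fits entirely inside the foreground."""
    h, w = len(img), len(img[0])
    k = len(se) // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ok = True
            for dy in range(-k, k + 1):
                for dx in range(-k, k + 1):
                    if se[dy + k][dx + k]:
                        yy, xx = y + dy, x + dx
                        if not (0 <= yy < h and 0 <= xx < w and img[yy][xx]):
                            ok = False
            out[y][x] = 1 if ok else 0
    return out

def dilate(img, se):
    """Binary dilation: a pixel becomes 1 if any structuring-element
    neighbour is foreground."""
    h, w = len(img), len(img[0])
    k = len(se) // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            hit = False
            for dy in range(-k, k + 1):
                for dx in range(-k, k + 1):
                    yy, xx = y + dy, x + dx
                    if (se[dy + k][dx + k] and
                            0 <= yy < h and 0 <= xx < w and img[yy][xx]):
                        hit = True
            out[y][x] = 1 if hit else 0
    return out

# Opening (erosion then dilation) removes structures smaller than the
# structuring element: the basic idea behind separating thin vessels
# from round lesions such as microaneurysms.
se = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
img = [[0, 0, 0, 0, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 0, 0, 0, 0]]
opened = dilate(erode(img, se), se)
```

Here the 3x3 foreground block survives opening unchanged, while anything thinner than the structuring element would be removed.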

Current multi-document summarization systems can successfully extract summary sentences, but with many limitations, including low coverage, inaccurate extraction of important sentences, redundancy and poor coherence among the selected sentences. The present study introduces a new concept of the centroid approach and reports new techniques for extracting summary sentences from multiple documents. In both techniques, keyphrases are used to weigh sentences and documents. The first summarization technique (Sen-Rich) prefers sentences of maximum richness, while the second (Doc-Rich) prefers sentences from the centroid document. To demonstrate the application of the new summarization system to extracting summaries of Arabic documents, we performed two experiments. First, we applied the ROUGE measure to compare the new techniques with the systems presented at TAC 2011. The results show that Sen-Rich outperformed all systems in ROUGE-S. Second, the system was applied to summarize multi-topic documents. Using human evaluators, the results show that Doc-Rich is superior, with summary sentences characterized by better coverage and more cohesion.
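
The keyphrase-based sentence weighting can be sketched as follows. The scoring rule here is a deliberately crude stand-in for the paper's richness measures, and the sentences and keyphrases are invented for illustration:

```python
def sentence_richness(sentence, keyphrases):
    """Score a sentence by how many keyphrases it contains: a crude
    stand-in for a keyphrase-based richness weight."""
    text = sentence.lower()
    return sum(1 for kp in keyphrases if kp.lower() in text)

def summarize(sentences, keyphrases, n=2):
    """Pick the n highest-scoring sentences, preserving their
    original order in the document."""
    ranked = sorted(sentences,
                    key=lambda s: sentence_richness(s, keyphrases),
                    reverse=True)[:n]
    return [s for s in sentences if s in ranked]

sentences = ["Cats sleep a lot.", "Dogs chase cats.",
             "The weather is nice."]
keyphrases = ["cats", "dogs"]
summary = summarize(sentences, keyphrases, n=2)
```

A real system would add redundancy checks so that two near-identical high-scoring sentences are not both selected.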

One of the major challenges in software engineering today is software cost estimation. It refers to estimating the cost of all activities, including software development, design, supervision, maintenance and so on. Accurate cost estimation of software projects optimizes internal and external processes, so that staff work, effort and overheads are coordinated with one another. In the management of software projects, estimation must be taken into account in order to reduce costs, timing and possible risks and to avoid project failure. In this paper, a decision-support system using a combination of a multi-layer artificial neural network and a decision tree is proposed to estimate the cost of software projects. In the model included in the proposed system, the normalization of factors, which is vital in evaluating effort and cost estimates, is carried out using the C4.5 decision tree. Moreover, testing and training of the factors are done by the multi-layer artificial neural network, and the most optimal values are alloc...
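
Before training, project attributes on very different scales are usually normalized. The sketch below shows plain min-max scaling over hypothetical feature columns; the paper itself drives normalization via a C4.5 decision tree, which is not reproduced here:

```python
def min_max_normalize(rows):
    """Scale each feature column to [0, 1]: a common normalization
    step before feeding project attributes to a neural network."""
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [[(v - l) / (h - l) if h > l else 0.0
             for v, l, h in zip(row, lo, hi)]
            for row in rows]

# Hypothetical project features: [size (KLOC), team size, duration (months)]
projects = [[10.0, 4, 6],
            [50.0, 12, 18],
            [30.0, 8, 12]]
norm = min_max_normalize(projects)
```

The guard `if h > l` keeps constant columns at 0.0 instead of dividing by zero.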

Social networking sites (SNSs) are increasingly becoming a major type of online application that facilitates online social interaction and information sharing among a large number of users. Privacy protection is therefore an important issue in social networking, yet users are not able to easily specify their access control requirements through the available privacy configuration interfaces. An approach based on decision tree learning is proposed to assist online users in composing and managing the access control policies that configure their privacy settings. Moreover, the ontology APIs include a social network ontology (SNO) to capture the information semantics in an SNS and an access control ontology (ACO) that is used to store rules from the classifier, combined with the existing access control rules. A fine-grained OSN access control model based on semantic web technologies is therefore proposed in order to automatically construct access control rules for users' privacy settings wi...

Predicting student performance is a great concern to higher education managements. Such prediction helps to identify and improve students' performance, which several factors may influence. In the present study, we employ data mining processes, particularly classification, to enhance the quality of the higher educational system. Recently, a new direction has been taken to improve classification accuracy by combining classifiers. In this paper, we design and evaluate a fast learning algorithm that combines an AdaBoost ensemble with a simple genetic algorithm, called "Ada-GA", where the genetic algorithm is demonstrated to successfully improve the accuracy of the combined classifier. The Ada-GA algorithm proved to be of considerable usefulness in identifying at-risk students early, especially in very large classes. This early prediction allows the instructor to provide appropriate advising to those students. The Ada-GA algorithm was implemented and tested on the ASSISTments dataset; the results showed that the algorithm successfully improved detection accuracy while reducing the computational complexity.
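
The boosting half of such an approach can be sketched in pure Python. The toy below shows the AdaBoost core (reweighting examples so later learners focus on earlier mistakes) with threshold stumps on invented one-dimensional data; the genetic-algorithm component of Ada-GA is omitted, and this is not the authors' implementation:

```python
import math

def stump(threshold):
    """Decision stump on one feature: predicts +1 above the threshold."""
    return lambda x: 1 if x > threshold else -1

def adaboost(xs, ys, stumps, rounds=3):
    """Minimal AdaBoost core: reweight training points so each round's
    stump focuses on the points earlier stumps misclassified."""
    n = len(xs)
    w = [1.0 / n] * n
    ensemble = []  # (alpha, stump) pairs
    for _ in range(rounds):
        # pick the stump with the lowest weighted error
        errs = [sum(wi for wi, x, y in zip(w, xs, ys) if s(x) != y)
                for s in stumps]
        best = min(range(len(stumps)), key=lambda i: errs[i])
        err = max(errs[best], 1e-10)   # avoid log(0) on a perfect stump
        if err >= 0.5:
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, stumps[best]))
        # upweight misclassified points, then renormalize
        w = [wi * math.exp(-alpha * ys[i] * stumps[best](xs[i]))
             for i, wi in enumerate(w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return lambda x: 1 if sum(a * s(x) for a, s in ensemble) > 0 else -1

# Toy data: label +1 when the feature exceeds 3.5.
xs = [1, 2, 3, 4, 5, 6]
ys = [-1, -1, -1, 1, 1, 1]
clf = adaboost(xs, ys, [stump(t) for t in (1.5, 2.5, 3.5, 4.5)])
```

In Ada-GA, a genetic algorithm would additionally tune the ensemble rather than greedily picking the lowest-error stump each round.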

The proposed work presents a navigation system that detects obstacles and also guides visually impaired people along an appropriate path. The proposed system detects obstacles via an infrared-based detection system and sends vibro-tactile or sound (buzzer) feedback to inform the user of their position. The most common obstacle detection aid used by blind people is the walking stick; its limitation is that it provides no protection near the head. A sensor module fixed on a lightweight cap allows the user to obtain information about obstacles near the head, as well as about the correct path on which to move. When the user combines this system with a cane, he is fully protected from obstacles and can navigate freely in the environment.

In recent years, many new cafés have emerged onto the market. Other than view cafés, beautiful cafés that seem as if they came from Paris or New York have gradually appeared in leisure areas, in quiet residential areas, in alleyways, in peripheral areas, and in local commercial areas. In particular, leisure is trendy at present, and modern restaurants innovate in terms of their food, leisure, and consumption. Unlike traditional restaurants, they are able to develop into cafés with unique styles to attract consumers. Even though not all of these new cafés are successful, cafés are an industry at the forefront of fashion, and many individuals who dream of entrepreneurship want to open one. However, as there are many types of cafés on the market, which type and style of café is the most suitable? An overview of cafés in Taiwan shows that each café offers unique services and functions to attract consumers, which is the key to its sustainable operation. Therefore, this study explores the decisions of companies when choosing a style for their cafés. It uses the analytic hierarchy process (AHP) to explore the selection of café styles, in order to provide references for café operators seeking successful and sustainable operations. Based on a literature review, expert interviews, and AHP, this study intends to provide useful results to the operators of cafés.
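
The core AHP computation, deriving a priority vector from a pairwise-comparison matrix, can be sketched in a few lines. The matrix values below are hypothetical comparisons of invented criteria on Saaty's 1-9 scale, not data from this study:

```python
def ahp_priorities(matrix, iters=50):
    """Approximate the AHP priority vector (the principal eigenvector
    of a pairwise-comparison matrix) by power iteration, normalized
    so the weights sum to 1."""
    n = len(matrix)
    v = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        v = [x / s for x in v]
    return v

# Hypothetical pairwise comparisons of three café-style criteria
# (atmosphere vs. menu vs. location): atmosphere is moderately
# preferred to menu (3) and strongly preferred to location (5).
pairwise = [[1.0,     3.0,     5.0],
            [1 / 3.0, 1.0,     3.0],
            [1 / 5.0, 1 / 3.0, 1.0]]
weights = ahp_priorities(pairwise)
```

In a full AHP study one would also compute the consistency ratio of each matrix before trusting the resulting weights.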

Precision agriculture is an area that lacks cheap technology. Refinement of the production system brings large advantages to the producer, and the use of images makes monitoring a cheaper methodology. Macronutrient monitoring can determine the health and vulnerability of the plant at specific stages. This paper analyses a method based on computational intelligence for image segmentation in the identification of symptoms of plant nutrient deficiency. Artificial neural networks are evaluated for image segmentation and filtering; several variations of parameters and the insertion of impulsive noise were evaluated as well. Satisfactory results are achieved with the artificial neural networks for segmentation, even at high noise levels.

The 8th International Conference on Computer Science and Information Technology (CoSIT 2021) will provide an excellent international forum for sharing knowledge and results in the theory, methodology and applications of Computer Science, Engineering and Information Technology. The conference looks for significant contributions to all major fields of Computer Science and Information Technology, in both their theoretical and practical aspects. The aim of the conference is to provide a platform for researchers and practitioners from both academia and industry to meet and share cutting-edge developments in the field.

Graph isomorphism is one of the classical problems of graph theory for which no deterministic polynomial-time algorithm is currently known, but which has also not been proven to be NP-complete. Several heuristic algorithms have been proposed to determine whether or not two graphs are isomorphic (i.e., structurally the same). In this paper, we analyze the discriminating power of the well-known centrality measures on real-world network graphs and propose to use the sequence (in either non-decreasing or non-increasing order) of the eigenvector centrality (EVC) values of the vertices of two graphs as a precursor step in deciding whether or not to conduct further tests for graph isomorphism. The eigenvector centrality of a vertex in a graph is a measure of the degree of the vertex as well as the degrees of its neighbors. As the EVC values of the vertices are typically highly distinct, we hypothesize that if the non-increasing (or non-decreasing) orderings of the EVC values of the vertices of two test graphs are not the same, then the two graphs are not isomorphic. If two test graphs have an identical non-increasing EVC sequence, they are declared potentially isomorphic and confirmed through additional heuristics. We tested our hypothesis on random graphs (generated according to the Erdos-Renyi model) and observed the hypothesis to be indeed true: graph pairs with the same non-increasing sequence of EVC values were confirmed to be isomorphic using the well-known Nauty software.
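
The precursor test can be sketched in pure Python: compute EVC by power iteration and compare the sorted value sequences of the two graphs. Using the shifted iteration (A + I) is an implementation choice here so that bipartite graphs converge; it is not necessarily the authors' formulation:

```python
def eigenvector_centrality(adj, iters=100):
    """EVC by shifted power iteration on (A + I), which converges even
    on bipartite graphs; adj maps each vertex to its neighbour list."""
    v = {u: 1.0 for u in adj}
    for _ in range(iters):
        nxt = {u: v[u] + sum(v[w] for w in adj[u]) for u in adj}
        norm = max(nxt.values())
        v = {u: x / norm for u, x in nxt.items()}
    return v

def evc_sequences_match(adj1, adj2, tol=1e-6):
    """Precursor test: isomorphic graphs must have identical sorted EVC
    sequences, so a mismatch proves the graphs are non-isomorphic."""
    s1 = sorted(eigenvector_centrality(adj1).values())
    s2 = sorted(eigenvector_centrality(adj2).values())
    return len(s1) == len(s2) and all(abs(a - b) < tol
                                      for a, b in zip(s1, s2))

# A 3-vertex path, the same path relabeled, and a triangle.
p3a = {0: [1], 1: [0, 2], 2: [1]}
p3b = {0: [1, 2], 1: [0], 2: [0]}
tri = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
```

The two path graphs produce identical sorted EVC sequences (potentially isomorphic), while the path and the triangle do not, which already rules out isomorphism without any further test.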

This paper investigated how doctors in remote rural hospitals in South Africa use a computer-mediated tool to communicate with experienced and specialist doctors for professional advice to improve their clinical practice. A case study approach was used. Ten doctors were purposively selected from ten hospitals in the North West Province, and data was collected using semi-structured, open-ended interview questions. The interviewees were asked to state, in their own words, the average number of patients served per week, the processes used in consultation with other doctors, their communication practices using the computer-mediated tool, the transmission speed of the tool, and their satisfaction in using it. The findings revealed that an average of 15 consultations per doctor with a specialist doctor per week took place face to face or by telephone instead of through a computer-mediated tool. Participants cited the slow transmission speed of the Internet, regular loss of Internet connectivity, constant electricity power outages and the lack of e-health application software to support real-time computer-mediated communication as reasons for not using such a tool. The results led to the recommendation of a hybrid cloud computing architecture for improving communication between doctors in hospitals.

A new technique for image retrieval using colour features extracted from images based on a log-histogram is proposed. The proposed technique is compared with the global colour histogram and the histogram of corners. It has been observed that the number of histogram bins used for retrieval comparison is smaller for the proposed technique (log-histogram) than for the Global Color Histogram and the Histogram of Corners. Experimental results on a database of 792 images with 11 classes indicate that the proposed method (log-histogram) significantly improves precision/recall, and that its complexity is lower than that of the Global Color Histogram (GCH) and the Histogram of Corners (HOC).
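
The log-binning idea can be illustrated as follows. This is a generic sketch of logarithmic intensity binning, not the paper's exact histogram definition:

```python
import math

def log_histogram(values, bins, max_value=255):
    """Bin pixel intensities on a logarithmic scale: low intensities
    get finer bins, and far fewer bins cover the full range than in a
    uniform histogram."""
    hist = [0] * bins
    scale = math.log(1 + max_value)
    for v in values:
        b = min(bins - 1, int(math.log(1 + v) / scale * bins))
        hist[b] += 1
    return hist

# Two extreme intensities land in the first and last of only 4 bins.
h = log_histogram([0, 255], 4)
```

Fewer bins per channel is what drives the lower comparison cost relative to a full-resolution global colour histogram.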

A reliable broadcast is a communication primitive used to develop fault-tolerant distributed applications. It eventually delivers messages to all participating sites, irrespective of their ordering. Total order broadcast imposes... more

A femtocell is a small cellular base station for the home: a low-power wireless access point designed for use in residential or small-business environments. This technology is a promising solution for operators to improve network performance, and the deployment of femtocells is key to addressing the bandwidth limitations and coverage issues of conventional mobile network systems. However, the inter-cell handover process becomes more complex, frequent and time-sensitive with the introduction of femtocells in cellular networks. This paper presents the architecture of LTE femtocell networks and investigates the different scenarios in the handover procedure, especially the mobility of a user from a macrocell to a femtocell, which is quite difficult due to the large number of candidate femtocells in the coverage area and the characteristics of the femtocell entity. In order to optimize the handover procedure, a decision policy is proposed that takes advantage of the HeNB Policy Function entity to select the target femtocell based on different constraints, so as to make the optimal handover decision and mitigate unnecessary handovers. We describe the decision policy rules as well as the handover signalling flow, and furthermore present an analytical model for the handover signalling cost.
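
A constraint-based handover decision of this general kind can be illustrated with a deliberately simplified rule. The thresholds, the candidate tuples and the rule itself are hypothetical and do not reproduce the paper's HeNB Policy Function:

```python
def choose_target_femtocell(candidates, user_speed,
                            max_speed=15.0, min_rsrp=-100.0):
    """Hypothetical handover decision rule: filter candidate femtocells
    by signal strength and cell load, reject fast-moving users (who
    would cause ping-pong handovers), then pick the least-loaded cell.
    Each candidate is (cell_id, rsrp_dbm, load_fraction)."""
    if user_speed > max_speed:
        return None  # fast users stay on the macrocell
    eligible = [c for c in candidates
                if c[1] >= min_rsrp and c[2] < 0.9]
    if not eligible:
        return None
    return min(eligible, key=lambda c: c[2])[0]

# Hypothetical candidates: one weak-signal cell is filtered out and the
# least-loaded of the remaining cells is selected.
candidates = [("f1", -95.0, 0.5), ("f2", -80.0, 0.2), ("f3", -110.0, 0.1)]
target = choose_target_femtocell(candidates, user_speed=3.0)
```

A realistic policy would weigh several such constraints jointly rather than filtering and picking on load alone.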