Software Analysis Research Papers - Academia.edu
Program representation is a critical issue in the area of software analysis and software re-engineering. It heavily relates to the portability and effectiveness of the software analysis tools that can be developed. This paper describes an approach that focuses on source code representation schemes in the form of Abstract Syntax Trees that are encoded as XML documents. These XML source
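The AST-as-XML idea can be illustrated with a minimal sketch (not the paper's actual encoding scheme) that serializes Python's own syntax tree into an XML document, mapping node types to elements, scalar fields to attributes, and child nodes to nested elements:

```python
import ast
import xml.etree.ElementTree as ET

def ast_to_xml(node: ast.AST) -> ET.Element:
    """Recursively encode an AST node as an XML element (illustrative scheme)."""
    elem = ET.Element(type(node).__name__)
    for field, value in ast.iter_fields(node):
        if isinstance(value, ast.AST):
            child = ast_to_xml(value)
            child.set("field", field)
            elem.append(child)
        elif isinstance(value, list):
            for item in value:
                if isinstance(item, ast.AST):
                    child = ast_to_xml(item)
                    child.set("field", field)
                    elem.append(child)
        elif value is not None:
            elem.set(field, str(value))  # scalar fields become attributes
    return elem

tree = ast.parse("x = a + b")
xml_doc = ET.tostring(ast_to_xml(tree), encoding="unicode")
print(xml_doc)
```

The resulting document (`<Module><Assign ...>...`) can then be queried with standard XML tooling such as XPath, which is one of the portability arguments such representations make.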
Purpose – The purpose of this paper is to investigate the nine‐point strategic leadership characteristics of Malaysian Quality National Primary School Leaders (QNPSL) and to indicate the implications of these findings for current educational management and leadership practices in their quest for Malaysian quality education. Design/methodology/approach – The study selected 600 senior management team members from 150 schools to complete a seven‐page questionnaire survey and collected 420 completed questionnaires. The study employed SPSS 15.0 and the full‐fledged structural equation modeling (SEM) software Analysis of Moment Structures (AMOS) version 16.0 to confirm the dimensionality and the psychometric properties of the scale, as well as to test the research hypotheses. Findings – Despite what Davies and Davies suggested, the results confirm that the Malaysian QNPSL do not possess three of the nine predetermined characteristics of a strategic leader such as wi...
This is a field report on a comprehensive study of the Atamir Formation from the engineering geology perspective using the related indices. The Atamir Formation of the Cretaceous Period, which has outcropped as thick sequences of grey-knotted sandstone and black shales, is situated in the Kope Dagh zone. A survey of discontinuities together with bedding was carried out to study slope stability. The layers have a general east-west trend with a gentle dip towards the south. Because of the tectonic and stratigraphic differences, and to facilitate surveys related to joint study of the outcrop, the formation in the study region was divided into three units. The lower unit is made of shale, the middle of sandstone, and the upper of marlstone. All three units were studied from the perspective of geomechanical classification, rock mass indices, geological strength, geomechanical indices, and wedge instability analysis under dry and wet conditions, and the results were presented in various images and figures. The Dips software was used to display the rose diagram and stereographic projection of each unit, the Swedge software to analyze instability of the wedges, and the Roctab software to analyze the geomechanical parameters and present the outputs along with the description of each unit.
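The rose diagram produced by Dips is essentially an angular histogram of discontinuity orientations. A minimal sketch of the underlying binning, using hypothetical strike readings and 10° classes, might look like:

```python
import numpy as np

# Hypothetical strike azimuths (degrees) from a discontinuity survey.
strikes = np.array([85, 92, 88, 170, 175, 95, 90, 178, 86, 91])

# Fold azimuths into 0-180 deg (a strike and its reciprocal are equivalent),
# then count readings in 10-degree bins, as a rose diagram does.
folded = strikes % 180
counts, edges = np.histogram(folded, bins=18, range=(0, 180))

dominant_bin = int(np.argmax(counts))
print(f"dominant strike class: {edges[dominant_bin]:.0f}-{edges[dominant_bin + 1]:.0f} deg")
```

Plotting `counts` on a polar axis (e.g. with matplotlib's `projection="polar"`) yields the familiar rose petals; the dominant bin identifies the principal joint set.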
Formal Concept Analysis was originally developed as a subfield of applied mathematics based on the mathematization of concept and concept hierarchy. Only after more than a decade of development did the connections to the philosophical logic of human thought become clearer, and even later the connections to Piaget’s cognitive structuralism, which Thomas Bernhard Seiler convincingly elaborated into a comprehensive theory of concepts in his recent book [Se01]. The main concern of this paper is to show the surprisingly rich correspondences between Seiler’s multifarious aspects of concepts in the human mind and the structural properties and relationships of formal concepts in Formal Concept Analysis. These correspondences make understandable what has been experienced in a great multitude of applications: that Formal Concept Analysis may function in the sense of transdisciplinary mathematics, i.e., it allows mathematical thought to aggregate with other ways of thinking and thereby to support human thought and action.
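In Formal Concept Analysis, a formal concept is a pair (extent, intent) where the intent is exactly the set of attributes shared by the extent's objects, and vice versa. A brute-force sketch over a toy context (the context data is illustrative, not from the paper) can enumerate all formal concepts by closing every attribute subset:

```python
from itertools import combinations

# A tiny formal context: objects -> sets of attributes (hypothetical example).
context = {
    "duck":  {"flies", "swims"},
    "eagle": {"flies", "hunts"},
    "shark": {"swims", "hunts"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Objects having every attribute in attrs (the derivation operator ')."""
    return {g for g, a in context.items() if attrs <= a}

def intent(objs):
    """Attributes shared by every object in objs."""
    if not objs:
        return set(attributes)
    return set.intersection(*(context[g] for g in objs))

# A formal concept is a pair (A, B) with extent(B) == A and intent(A) == B.
concepts = set()
for r in range(len(attributes) + 1):
    for combo in combinations(sorted(attributes), r):
        b = intent(extent(set(combo)))  # closure of the attribute set
        concepts.add((frozenset(extent(b)), frozenset(b)))

print(len(concepts), "formal concepts")
```

Ordered by extent inclusion, these concepts form the concept lattice that the theory studies; real implementations use incremental algorithms rather than this exponential enumeration.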
We outline a new approach to safety analysis in which concepts of computer HAZOP are fused with the idea of software fault tree analysis to enable continuous assessment of an evolving programmable design developed in Matlab-Simulink. We also ...
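The quantitative core of fault tree analysis is propagating basic-event probabilities through AND/OR gates. A minimal evaluator, assuming independent basic events (the tree and numbers are illustrative, not from the paper), can be sketched as:

```python
# Minimal fault tree evaluator: gates are ("AND", children) or ("OR", children),
# leaves are basic-event failure probabilities, assumed independent.
def probability(node):
    if isinstance(node, float):
        return node
    gate, children = node
    probs = [probability(c) for c in children]
    if gate == "AND":
        p = 1.0
        for q in probs:
            p *= q  # all children must fail
        return p
    if gate == "OR":
        p = 1.0
        for q in probs:
            p *= (1.0 - q)  # top survives only if no child fails
        return 1.0 - p
    raise ValueError(f"unknown gate: {gate}")

# Hypothetical tree: top fails if the sensor fails OR both software checks fail.
top = ("OR", [0.01, ("AND", [0.1, 0.2])])
print(probability(top))
```

For software fault trees the leaves are typically failure modes of code units identified by HAZOP-style guideword analysis rather than measured hardware failure rates.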
There are many manufacturers around the world providing power quality (PQ) monitoring and digital fault recording (DFR) solutions. Different monitoring requirements, monitoring practices, and new standardization make for a diverse range of available solutions. The first section of the paper summarizes the requirements and challenges of a modern PQ and DFR monitoring system. The second section is a case study of the Qualitrol and Qenergia systems implemented in Portugal, particularly at EDP and REN; it shows what has been done. Until IEC 61000-4-30 [1] was introduced in 2003, many power quality parameters were not well defined, leading to situations where different instruments produced different results. In 2008 Edition 2 was introduced, and Edition 3 is now in progress. As well as considering IEC 61000-4-30, this paper summarizes the key requirements of the upcoming IEC 62586 [2] product standard for power quality monitoring devices. In the second section of the paper we present how to manage grid measuring points whose characteristics change over time, rather than just managing a fleet of measuring instruments. We discuss how to make different types of measurement devices compatible, access different data sources, handle changes in the standards both in the limit values and in the way values are calculated, and validate PQ and event data. Different ways of representing power quality information are also addressed. Presenting information, and not just raw data from the monitoring system, is essential for efficient management of reports for regulatory and third-party use. The Qualitrol and Qenergia systems at EDP and REN have been developed to suit the existing communication infrastructure and automatic and manual data retrieval, and to integrate, in a multi-vendor way, existing older PQ devices together with new instruments into a single software analysis package. Combining PQ and DFR data also leads to efficiencies when collecting and analyzing results.
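The limit-checking step such systems perform can be sketched as follows: compare 10-minute aggregated RMS voltage values against a tolerance band around nominal, in the spirit of EN 50160-style supply-voltage limits (the readings and the ±10% band here are illustrative, not taken from the paper):

```python
# Sketch: flag 10-minute RMS voltage aggregates outside +/-10% of nominal.
# Values and limits are illustrative only.
NOMINAL_V = 230.0
TOLERANCE = 0.10

readings = [231.2, 229.8, 254.5, 228.0, 206.0, 230.4]  # hypothetical 10-min aggregates

def out_of_limits(v, nominal=NOMINAL_V, tol=TOLERANCE):
    """True when a reading falls outside the tolerance band."""
    return abs(v - nominal) > tol * nominal

violations = [v for v in readings if out_of_limits(v)]
print(f"{len(violations)} of {len(readings)} intervals outside +/-10%")
```

Real IEC 61000-4-30 Class A processing adds flagged-data handling during dips and interruptions, which is one of the compatibility issues the paper discusses when mixing instrument generations.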
In this paper, pile and soil are simulated with Solid-3D elements. In the computational model, the contact surfaces between pile and soil are represented by contact elements in the ANSYS software. The analysis results of this three-dimensional model were compared with earlier analytical models by previous authors such as Brinch Hansen, Broms, Reese and Matlock, Poulos, and K.X. Zavriev.
By the early 1990s the need for reengineering legacy systems was already acute, but recently the demand has increased significantly with the shift toward web-based user interfaces. The demand by all business sectors to adapt their information systems to the Web has created a tremendous need for methods, tools, and infrastructures to evolve and exploit existing applications
Software systems are hard to understand due to the complexity and the sheer size of the data to be analyzed. Software visualization tools are a great help as they can sum up large quantities of data in dense, meaningful pictures. Traditionally, such tools come in the form of desktop applications. Modern web frameworks are about to change this status quo, as building software visualization tools as web applications can help in making them available to a larger audience in a collaborative setting. Such a migration comes with a number of promises, perils, and technical implications that must be considered before starting any migration process. In this paper, we share our experiences in porting two such tools to the web and provide guidelines about the porting. In particular, we discuss promises and perils that go hand in hand with such an endeavor and present a number of technological alternatives that are available to implement web-based visualizations.
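A recurring step when porting a visualization tool to the web is converting the analysis data into the nested JSON that browser-side libraries consume. A small sketch (the metric records and hierarchy shape are hypothetical, in the style of a D3 treemap input) might look like:

```python
import json

# Flat metric records (path, lines of code) -> nested hierarchy for a web treemap.
metrics = [
    ("core/parser.py", 1200),
    ("core/lexer.py", 800),
    ("ui/view.py", 450),
]

root = {"name": "root", "children": []}
for path, loc in metrics:
    node = root
    parts = path.split("/")
    for part in parts[:-1]:
        # Reuse an existing directory node or create it on first sight.
        child = next((c for c in node["children"] if c["name"] == part), None)
        if child is None:
            child = {"name": part, "children": []}
            node["children"].append(child)
        node = child
    node["children"].append({"name": parts[-1], "value": loc})

doc = json.dumps(root)
print(doc)
```

Serving such a document from the analysis backend lets the browser-side code stay a thin rendering layer, which eases exactly the kind of migration the paper describes.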
Failure to find information is common. Anxiety, reformulation of needs and queries, doubts about the quality of results, and satisficing are typical in information seeking (e.g., Kuhlthau 1993; Arora et al. 2008), and especially in exploratory, open-ended information seeking (Mansourian ...
To evaluate the performance (image signal-to-noise ratio) of a digital mammographic system working in 2D planar versus tomosynthesis modality, a contrast-detail phantom was developed embedding a 1 cm Plexiglas plate, including 49 holes of different diameters and depths, between two layers containing a breast-simulating material. The acquisition protocol included 15 low-dose projections (reconstructed 1 mm-thick slices) and a 2D view. Using an automatic software analysis tool, the signal difference to noise ratio (SDNR) was evaluated. SDNR in the DBT images was about a factor of two higher than with FFDM (P
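The SDNR metric is commonly computed as the difference between the mean pixel value in a detail region of interest and the mean of a background region, divided by the background standard deviation. A sketch on synthetic ROI data (the pixel statistics are invented, not the phantom's):

```python
import numpy as np

# SDNR = (mean_signal - mean_background) / std_background, over two ROIs.
rng = np.random.default_rng(0)
background = rng.normal(100.0, 5.0, size=(50, 50))  # background ROI pixels
signal = rng.normal(110.0, 5.0, size=(50, 50))      # detail (hole) ROI pixels

sdnr = (signal.mean() - background.mean()) / background.std()
print(f"SDNR = {sdnr:.2f}")
```

An automated analysis tool would locate the 49 hole ROIs in the phantom image first and then apply this computation per hole, per modality.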
This paper examines the commonalities of manual controllers used in robotics for teleoperation. These include devices ranging from simple joysticks to force-reflecting controllers. The similarities in functionality and behaviour of these controllers are further exploited to develop a reusable software architecture for manual controller interfacing. The development of this architecture is based on object-oriented design. The application of this design philosophy led to the development of a hierarchy of software components that are manual-controller independent and also have a standardized interface. Reusability of these components is supported through generality and extensibility. The key design requirements for this architecture were: open system, reusable, application independent, extensive error handling and safety checking, applicability to real-time control and simulation, and reduction in program development time. This paper discusses the software analysis and design issues that...
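The device-independent hierarchy with a standardized interface can be sketched as an abstract base class whose shared safety checking applies to any concrete controller (class and method names here are illustrative, not the paper's actual design):

```python
from abc import ABC, abstractmethod

class ManualController(ABC):
    """Device-independent base class with a standardized interface."""

    @abstractmethod
    def read_axes(self) -> list[float]:
        """Return normalized axis values, nominally in [-1.0, 1.0]."""

    def safe_axes(self) -> list[float]:
        # Shared safety check: clamp out-of-range readings from any device.
        return [max(-1.0, min(1.0, v)) for v in self.read_axes()]

class SimpleJoystick(ManualController):
    """One concrete device; force-reflecting controllers would subclass too."""

    def __init__(self, raw):
        self.raw = raw

    def read_axes(self):
        return self.raw

stick = SimpleJoystick([0.4, -1.7])
print(stick.safe_axes())
```

Application code written against `ManualController` then works unchanged when a joystick is swapped for a force-reflecting device, which is the reusability argument the abstract makes.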
Assurance of cell line homogeneity and capability of cell contamination detection are among the most essential steps of cell based research. Due to high discriminatory efficiency, low cost and reliability, analysis of short tandem repeats (STR) has been introduced as a method of choice for human cell line authentication. In the present study 13 Combined DNA Index System (CODIS) based STRs along with the gender determination (Amelogenin) gene were utilized to establish a reproducible approach for the authentication of 100 human cell lines deposited in the National Cell Bank of Iran (NCBI), using the polymerase chain reaction (PCR) method. PCR products were subsequently analyzed by polyacrylamide gel electrophoresis (PAGE) and visualized by silver staining followed by gel documentation and software analysis. STR profiles obtained were compared with those of the American Type Culture Collection (ATCC) and the Japanese Collection of Research Bioresource (JCRB) as STR references. We detected 18.8% cross contamination among the NCBI human cell lines. To our knowledge, this is the first report of authentication of human cell lines using the 13 CODIS core STRs combined with Amelogenin.
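The comparison against reference STR profiles amounts to counting shared alleles per locus. A simplified sketch (the profiles and the percent-match formula are illustrative; published match algorithms differ in the denominator they use):

```python
# Sketch of STR profile comparison: percent of reference alleles shared
# with a query profile. Locus names are real CODIS loci; allele values
# are invented for illustration.
reference = {"D5S818": {11, 12}, "TH01": {7, 9}, "TPOX": {8, 11}}
query     = {"D5S818": {11, 12}, "TH01": {7, 9}, "TPOX": {8}}

shared = sum(len(reference[locus] & query.get(locus, set()))
             for locus in reference)
total = sum(len(alleles) for alleles in reference.values())
percent_match = 100.0 * shared / total
print(f"{percent_match:.0f}% match")
```

Profiles scoring above a chosen threshold (often around 80%) are typically treated as the same cell line, while intermediate scores prompt investigation for cross-contamination of the kind the study detected.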