Consistency Checking Research Papers - Academia.edu

The initial step of any software engineering methodology is to form requirements. Recently, a goal-oriented approach to requirements has been proposed and argued to be beneficial. Goals also play a key role in the implementation of proactive software agents. However, although some agent-oriented software engineering methodologies have incorporated (aspects of) goal-oriented requirements engineering, and although they target agent platforms that provide goals as an implementation construct, none of the methodologies provide a goal-oriented design process. We present three modifications to the Prometheus methodology which aim to make it more goal-oriented in its design phases: we introduce a goal coupling diagram, as an adjunct to the existing data coupling diagram; we modify the temporal design artefacts to include goals; and we introduce a number of goal-related consistency checks. Finally, we report on an experimental evaluation comparing the effectiveness of the original and refined methodologies.
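
The goal-related consistency checks mentioned above lend themselves to simple set comparisons between design artefacts. Below is a minimal, hypothetical sketch (not the Prometheus design tool itself) that flags goals referenced in temporal design artefacts, such as scenarios, that never appear in the goal overview, and vice versa; the artefact names and data layout are assumptions for illustration.

```python
# Hypothetical sketch of a goal-related consistency check between design
# artefacts: goals mentioned in temporal artefacts should also appear in the
# goal overview, and every goal should be used somewhere.

def check_goal_consistency(goal_overview, temporal_artefacts):
    """goal_overview: set of goal names; temporal_artefacts: dict mapping
    artefact name -> set of goal names referenced by that artefact."""
    referenced = set().union(*temporal_artefacts.values()) if temporal_artefacts else set()
    issues = []
    for artefact, goals in temporal_artefacts.items():
        for goal in goals - goal_overview:
            issues.append(f"{artefact}: goal '{goal}' is not in the goal overview")
    for goal in goal_overview - referenced:
        issues.append(f"goal '{goal}' is never referenced by any temporal artefact")
    return issues

if __name__ == "__main__":
    overview = {"BookFlight", "NotifyCustomer", "HandlePayment"}
    artefacts = {"BookingScenario": {"BookFlight", "ConfirmSeat"},
                 "PaymentScenario": {"HandlePayment"}}
    for issue in check_goal_consistency(overview, artefacts):
        print(issue)
```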

We present techniques for simulating the motion and deformation of cloth, fabrics or, more generally, deformable surfaces. Our main goal is to be able to simulate any kind of surface without imposing restrictions on shape or geometrical environment. In particular, we consider situations that are difficult with respect to deformations and collisions, such as wrinkled fabric falling on the ground. We have therefore enhanced existing algorithms in order to cope with any possible situation. A mechanical model has been implemented to deal with any irregular triangular mesh, handle high deformations despite rough discretisation, and cope with complex interacting collisions; it should thus deal efficiently with situations where nonlinearities and discontinuities are far from marginal. Collision detection has also been improved to detect self-collisions efficiently and to consider collision orientations correctly despite the lack of surface orientation information from preset geometrical contexts, using consistency checking and correction. We illustrate these features through simulation examples.

The work reported here is part of the PROGRES (PROgrammed Graph Rewriting Systems) project. PROGRES is a very high-level, multi-paradigm language for the specification of complex structured data types and their operations. The data structures are modelled as directed, attributed, node- and edge-labelled graphs (diane graphs). The basic programming constructs of PROGRES are graph rewriting rules (productions and tests) and derived relations on nodes (paths and restrictions). These basic operations may be combined into partly imperative, partly rule-based, complex graph transformations by means of special control structures which respect the nondeterministic nature of graph rewriting rules. PROGRES offers its users a convenient, partly textual, partly graphical concrete syntax and a rich system of consistency checking rules for the underlying calculus of programmed diane-graph rewriting systems. This paper presents the key techniques used for the execution of PROGRES programs. We will discuss...

We present a fully automatic method to detect doctored digital images. Our method is based on a rigorous consistency checking principle of physical characteristics among different arbitrarily shaped image regions. In this paper, we specifically study the camera response function (CRF), a fundamental property in cameras mapping input irradiance to output image intensity. A test image is first automatically segmented into distinct arbitrarily shaped regions. One CRF is estimated from each region using geometric invariants from locally planar irradiance points (LPIPs). To classify a boundary segment between two regions as authentic or spliced, CRF-based cross fitting and local image features are computed and fed to statistical classifiers. Such segment level scores are further fused to infer the image level authenticity. Tests on two data sets reach performance levels of 70% precision and 70% recall, showing promising potential for real-world applications. Moreover, we examine individual features and discover the key factor in splicing detection. Our experiments show that the anomaly introduced around splicing boundaries plays the major role in detecting splicing. Such finding is important for designing effective and efficient solutions to image splicing detection.
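
As a rough illustration of the cross-fitting idea only (the paper's CRF estimation from locally planar irradiance points is far more involved), the sketch below fits a simple gamma-style response curve to intensity samples from two regions and scores how well each region's samples are explained by the other region's fitted curve; a large cross-fit error would suggest the regions were captured under different response functions. The data and the parametric form are assumptions.

```python
# Illustrative cross-fitting consistency score between two image regions.
# A simple gamma curve I = E**g stands in for a real camera response function.
import numpy as np
from scipy.optimize import curve_fit

def gamma_crf(E, g):
    return np.clip(E, 1e-6, 1.0) ** g

def fit_crf(E, I):
    (g,), _ = curve_fit(gamma_crf, E, I, p0=[1.0])
    return g

def cross_fit_error(E_a, I_a, E_b, I_b):
    g_a, g_b = fit_crf(E_a, I_a), fit_crf(E_b, I_b)
    err_ab = np.sqrt(np.mean((gamma_crf(E_a, g_b) - I_a) ** 2))  # A's data under B's curve
    err_ba = np.sqrt(np.mean((gamma_crf(E_b, g_a) - I_b) ** 2))  # B's data under A's curve
    return max(err_ab, err_ba)

rng = np.random.default_rng(0)
E = rng.uniform(0.05, 1.0, 200)
region_a = gamma_crf(E, 0.45) + rng.normal(0, 0.01, E.size)   # camera with gamma 0.45
region_b = gamma_crf(E, 0.80) + rng.normal(0, 0.01, E.size)   # spliced-in region, gamma 0.80
print("cross-fit error:", cross_fit_error(E, region_a, E, region_b))
```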

This paper proposes LEAP: a simple framework for Enterprise Architecture (EA) that views an organization as an engine that executes in terms of hierarchically decomposed communicating components. The approach allows all aspects of the architecture to be precisely defined using standard modelling notations. Because the approach is simple and precisely defined, it can form the basis for a wide range of EA analysis techniques, including simulation, compliance and consistency checking. The paper defines the LEAP framework and provides an overview in terms of a case study. LEAP does not mandate any specific notation; a UML-style notation is used in this paper, and the implications for ArchiMate are analysed.

This paper describes the methods presently used to produce unaveraged radial velocity maps from radar voltage cross spectra measured by a SeaSonde, including a discussion of the multiple signal classification (MUSIC) algorithm as it is applied to SeaSonde data and methods employed to alleviate difficulties associated with the use of measured antenna patterns. We also describe internal consistency checks including visual observation of the radial velocity map, consideration of the computed uncertainties and quantitative tests of radial velocities measured by two radars along the baseline between the systems. Baseline analysis is illustrated by application to two SeaSonde networks, with contrasting results that lead to a better understanding of SeaSonde output.

Skyline queries have recently received a lot of attention due to their intuitive query capabilities. Following the concept of Pareto optimality, all 'best' database objects are returned to the user. However, this often results in unmanageably large result sets, hampering the success of this innovative paradigm. As an effective remedy for this problem, trade-offs provide a natural concept for dealing with incomparable choices. But such trade-offs are not reflected by the Pareto paradigm. Thus, incorporating them into the users' preference orders and adjusting skyline results accordingly needs special algorithms beyond traditional skylining. For the actual integration of trade-offs into skylines, the problem of ensuring the consistency of arbitrary trade-off sets poses a demanding challenge. Consistency is a crucial aspect when dealing with multi-dimensional trade-offs spanning several attributes. If consistency is violated, cyclic preferences may occur in the result set. But such cyclic preferences cannot be handled by information systems in a sensible way. Often, this problem is circumvented by restricting the trade-offs' expressiveness, e.g. by altogether ignoring some classes of possibly inconsistent trade-offs. In this paper, we present a new algorithm capable of efficiently verifying the consistency of any arbitrary set of trade-offs. After motivating its basic concepts and introducing the algorithm itself, we also show that it exhibits superior average-case performance. The benefits of our approach promise to pave the way towards personalized and cooperative information systems.
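
The consistency problem can be pictured as cycle detection in the preference order induced by the trade-offs: an inconsistent set makes some outcome both better and worse than another. The sketch below is a generic cycle check over trade-off edges, not the paper's algorithm, and the edge encoding is an assumption.

```python
# Generic consistency check for a set of trade-offs encoded as directed
# "preferred over" edges between (attribute, value) combinations: the set is
# inconsistent if the induced preference graph contains a cycle.
from collections import defaultdict

def has_cycle(edges):
    graph = defaultdict(list)
    for better, worse in edges:
        graph[better].append(worse)
    WHITE, GREY, BLACK = 0, 1, 2
    colour = defaultdict(int)

    def dfs(node):
        colour[node] = GREY
        for succ in graph[node]:
            if colour[succ] == GREY:          # back edge -> cyclic preference
                return True
            if colour[succ] == WHITE and dfs(succ):
                return True
        colour[node] = BLACK
        return False

    return any(colour[n] == WHITE and dfs(n) for n in list(graph))

trade_offs = [(("price", "low"), ("quality", "high")),
              (("quality", "high"), ("delivery", "fast")),
              (("delivery", "fast"), ("price", "low"))]   # closes a cycle
print("consistent" if not has_cycle(trade_offs) else "inconsistent trade-off set")
```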

The integrated meta-modelling approach SIAT is the central product of the project SENSOR (6th EU Framework Program), which innovates ex-ante impact assessments (IA) of European policy instruments. Analytical focus of this article is how to cope with the ...

When a group of authors collaboratively edits interrelated documents, consistency problems occur almost immediately. Current document management systems (DMS) provide useful mechanisms such as document locking and version control, but often lack consistency management facilities. If at all, consistency is "defined" via informal guidelines, which do not support automatic consistency checks.

Evapotranspiration plays a central role in the water balance of arid and semi-arid areas, as it is the major loss flux. The spatial estimation of actual evapotranspiration is thus of critical importance for hydrologic, environmental and agricultural purposes. This study therefore applied the Surface Energy Balance System (SEBS) algorithm for turbulent heat fluxes to determine actual evapotranspiration in the Upper Manyame catchment in Zimbabwe. The SEBS algorithm was run using atmospherically corrected MODIS satellite imagery on nine clear-sky days spanning 2003 to 2005. SEBS-determined actual evapotranspiration was evaluated based on two criteria over the Harare-Kutsaga and Grasslands research stations. Firstly, a plausibility or physical consistency check was applied, by which SEBS actual evapotranspiration estimates were compared to FAO Penman-Monteith potential evapotranspiration estimates. On average, SEBS actual evapotranspiration estimates were physically consistent. SEBS estimates were 36.5% and 76.5% of potential evapotranspiration over the Harare-Kutsaga and Grasslands stations, respectively. However, on a few days over Grasslands station SEBS actual evapotranspiration estimates were physically inconsistent, as they were greater than potential evapotranspiration. SEBS actual evapotranspiration estimates were also evaluated against actual evapotranspiration from the advection-aridity equation. Over Harare-Kutsaga the mean error showed a slight underestimation of 0.32 mm d⁻¹ by SEBS and the mean absolute error was 0.5 mm d⁻¹. However, over Grasslands station the results were not as good, as the mean error and mean absolute error showed an overestimation of 2.6 mm d⁻¹, which is relatively high. The reasons discussed for the relatively high overestimation errors over Grasslands station include roughness parameterisation, the spatial variability of input temperature data, and heterogeneity. An analysis of the evaporative behaviour of the different land cover types in the catchment showed that water bodies, closed broadleaved deciduous forests, and open broadleaved deciduous forests had relatively higher mean actual evapotranspiration rates, and yet these land cover types constituted less than 5% of the catchment area. Open grasslands were observed to have low evaporative behaviour. Overall, this study showed that SEBS has potential for estimating spatial actual evapotranspiration and providing useful information that can be used for water resources and environmental management and planning.
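
The physical-consistency criterion used here boils down to a simple inequality, namely that actual evapotranspiration should not exceed the potential rate, plus the usual error statistics. A minimal sketch of that check (with made-up daily values, not the study's data) is given below.

```python
# Plausibility check and error statistics for daily evapotranspiration
# estimates (values in mm/day are illustrative, not the study's data).
import numpy as np

et_sebs      = np.array([2.1, 3.4, 4.0, 5.2, 3.8])   # SEBS actual ET
et_potential = np.array([4.5, 4.0, 4.2, 4.9, 5.1])   # FAO Penman-Monteith potential ET
et_reference = np.array([2.4, 3.1, 3.6, 4.7, 3.5])   # e.g. advection-aridity actual ET

inconsistent = et_sebs > et_potential                 # physically implausible days
mean_error = np.mean(et_sebs - et_reference)          # bias (negative = underestimation)
mean_abs_error = np.mean(np.abs(et_sebs - et_reference))

print("physically inconsistent days:", int(inconsistent.sum()))
print(f"mean error: {mean_error:+.2f} mm/day, MAE: {mean_abs_error:.2f} mm/day")
```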

Modeling languages such as UML offer a set of basic models to describe a software system from different views and at different levels of abstraction. Tools supporting an unrestricted usage of these UML models cannot guarantee the consistency between multiple models/views, due to the lack of a formal definition of the semantics of UML diagrams. A better alternative that does allow for automatic consistency checking is modeling according to the single model principle. This approach is based on the conception of a single model, for which different views are constructed, with automatic or semi-automatic generation of, or consistency checking among, these views. Three basic approaches to consistency checking are consistency by analysis, consistency by monitoring and consistency by construction. In this paper we illustrate the consistency by construction approach by means of the conceptual domain modeling approach MERODE and its associated CASE tool MERMAID. We also illustrate how consistency by construction improves the validity and completeness of the conceptual model.

We describe an algorithm for deciding the first-order multisorted theory BAPA, which combines Boolean algebras of sets of uninterpreted elements (BA) and Presburger arithmetic operations (PA). BAPA can express the relationship between integer variables and ...

This position paper comprises six theses that are intentionally formulated in a provocative way to motivate the development of information techniques and systems to support the Warfighter in future operations. The authors are convinced that such C4ISR systems must integrate M&S methods and technology to fulfill the users' operational needs for Alternative Course of Action Analysis, Consistency Checks of Decisions, Planning and Mission Rehearsal, After-Action Analysis, etc. Well-accepted Grand Challenges are the development of methods and algorithms in various simulation domains like Human Behavior Representation, Modeling Uncertainty, Techniques Supporting Decision-Making, etc. Some C4ISR domains that need M&S support will be dealt with in this paper explicitly. However, as important as the development of new methods is, the necessity of using the same conceptual and architectural approaches in the C4ISR & M&S development communities is equally vital for operational systems. Formulat...

We have developed a low cost, WebCam based optical barcode reader, which can extract and decode the sequence on a cluttered background. It is composed of three functions: barcode localization from the raw image, transformation of the localized barcode and decoding the sequence with an intelligent algorithm. The localization method is based on detecting the areas with the maximum density difference in two normal directions. The transformation method, capable of identifying any orientation, is based on the Hough line detection method. The decoding method is based on the peak/valley detection method of the barcode waveform and a consistency checking method. The consistency checking method, a constraint network, employs artificial intelligence searching methods. The algorithms used in the barcode reader have been tested on hundreds of images with an accuracy of more than 99%.
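
Independently of the constraint-network method used in the paper, most linear barcodes also carry a built-in arithmetic consistency check. As an example of the kind of test a decoder can apply, the sketch below verifies the standard EAN-13 check digit (weights 1 and 3 alternating over the first twelve digits); it is a generic illustration, not the authors' algorithm.

```python
# Standard EAN-13 check-digit verification: weighting the first 12 digits
# alternately by 1 and 3, the total plus the check digit must be 0 mod 10.
def ean13_is_consistent(code: str) -> bool:
    if len(code) != 13 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    weighted = sum(d * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits[:12]))
    check = (10 - weighted % 10) % 10
    return check == digits[12]

print(ean13_is_consistent("4006381333931"))  # True: valid EAN-13
print(ean13_is_consistent("4006381333932"))  # False: corrupted last digit
```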

In a central bank statistical information sustains many institutional functions and is a strategic resource for research and decision-making. To serve this purpose, new statistics often have to be created. This complex and dynamic scenario demands a comprehensive and flexible IT solution which should support various kinds of processes and be promptly adapted to new requirements. The solution should be founded on a rigorous and general information model that can describe all the characteristics of the statistical data: the meaning, the properties and the transformation rules to produce other data. Compliance of this information model with international standards is important, in that statistical production requires a high level of co-operation among all the stakeholders (statistical agencies, users, etc.).

A rapidly changing environment, in terms of technology and market, forces companies to keep their business processes aligned with current and upcoming requirements. This is still a major issue in modern process-oriented information systems, where improvements to process models require considerable effort to implement in the technical infrastructure. We address this problem by lifting technical details into BPMN 2.0 process models and present a configuration wizard for these process models in the open-source modeling tool Oryx. This wizard includes a consistency checking mechanism to automatically discover inconsistencies in the data dependencies of a process model. Immediate feedback after changes to the model eliminates a crucial source of errors when configuring or redesigning business process models, leading to more efficient process implementation.
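
One data-dependency check of this kind can be phrased purely on the process graph: every data object a task reads must be written by some task that can precede it. The sketch below shows that idea on a toy task graph; it is an illustration of the general check, not Oryx's mechanism, and the encoding is an assumption.

```python
# Toy data-dependency consistency check on a process model: each data object
# read by a task must be written by at least one of its (transitive) predecessors.

def predecessors(flow, task):
    preds, stack = set(), [t for t, succ in flow if succ == task]
    while stack:
        t = stack.pop()
        if t not in preds:
            preds.add(t)
            stack.extend(p for p, succ in flow if succ == t)
    return preds

def check_data_dependencies(flow, reads, writes):
    issues = []
    for task, needed in reads.items():
        produced = set()
        for p in predecessors(flow, task):
            produced |= writes.get(p, set())
        for obj in needed - produced:
            issues.append(f"task '{task}' reads '{obj}' before any task writes it")
    return issues

flow = [("ReceiveOrder", "CheckStock"), ("CheckStock", "ShipOrder")]
reads = {"CheckStock": {"Order"}, "ShipOrder": {"Order", "Invoice"}}
writes = {"ReceiveOrder": {"Order"}}
for issue in check_data_dependencies(flow, reads, writes):
    print(issue)   # flags the missing 'Invoice' data object
```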

In this paper we introduce a new biometric verification system based on on-line signatures and simulate its operation. For this purpose we have split the MCYT signature database into three subsets: one for classifier training, another for system adjustment and a third one for system testing, simulating enrolment and verification. This context corresponds to real operation, where a new user tries to enrol in an existing system and must be automatically guided by the system in order to detect failure-to-enrol (FTE) situations. The main contribution of this work is the management of FTE situations by means of a new proposal, called intelligent enrolment, which consists of consistency checking in order to automatically reject low-quality samples. This strategy enhances the performance of the system to 22% when 8% of the users are left out. In this situation 8% of the people cannot be enrolled in the system and must be verified by other biometrics or by human abilities. These people are identified with intelligent enrolment and the situation can thus be managed. In addition we also propose a DCT-based feature extractor with threshold coding and discriminability criteria.
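
At its core, the intelligent-enrolment idea is a quality gate on the enrolment set: if a user's samples are not mutually consistent, enrolment is refused. The sketch below shows one simple way to implement such a gate on generic feature vectors; the distance measure, threshold and features are assumptions, not the MCYT system's actual criteria.

```python
# Simple enrolment consistency gate: reject an enrolment set whose samples
# are too dissimilar to one another (threshold and features are illustrative).
import numpy as np

def enrolment_is_consistent(samples, max_mean_distance=1.0):
    """samples: array of shape (n_samples, n_features) for one user."""
    X = np.asarray(samples, dtype=float)
    n = len(X)
    dists = [np.linalg.norm(X[i] - X[j]) for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(dists)) <= max_mean_distance

consistent_user = np.array([[1.0, 2.0], [1.1, 2.1], [0.9, 1.9]])
erratic_user    = np.array([[1.0, 2.0], [4.0, 0.5], [-2.0, 3.5]])
print(enrolment_is_consistent(consistent_user))  # True  -> enrol
print(enrolment_is_consistent(erratic_user))     # False -> failure to enrol (FTE)
```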

In recent years, the significance of automated generation of object-oriented code from UML diagrams has increased due to its benefits, such as cost reduction and accuracy. Consistency checking between UML diagrams, and ensuring accuracy, maintainability, efficiency, and completeness of the generated code, are the main concerns in this area. This paper extends our work on a tool called UJECTOR for automatic generation of executable Java code from UML diagrams. A set of three UML diagrams, i.e., class diagram, sequence diagrams and activity diagrams, is input to the tool to generate completely executable Java code automatically. In the implemented approach, the object-oriented code structure is built from the class diagram, methods' flow of control is created from the sequence diagrams, and object manipulations are included from the activity diagrams. The extension includes a more detailed overview of the code generation tool regarding its architecture and code generation process. Two detailed case studies are presented to validate the generated code from the code generation tool. We also compare UJECTOR with existing research-based, commercial and open-source tools. The evaluation results show that the generated code from UJECTOR is fully functional and understandable.
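
A precondition for this kind of code generation is that the input diagrams agree with each other; for instance, every message in a sequence diagram must correspond to an operation of the receiving class. The sketch below illustrates that particular check on plain dictionaries; it is a generic illustration, not UJECTOR's checker.

```python
# Generic class-diagram / sequence-diagram consistency check: every message
# sent to an object must match an operation declared on that object's class.
def check_messages_against_classes(class_ops, objects, messages):
    """class_ops: class -> set of operation names; objects: object -> class;
    messages: list of (sender, receiver, operation)."""
    issues = []
    for sender, receiver, op in messages:
        cls = objects.get(receiver)
        if cls is None:
            issues.append(f"receiver '{receiver}' has no class in the class diagram")
        elif op not in class_ops.get(cls, set()):
            issues.append(f"message '{op}' to '{receiver}' is not an operation of class '{cls}'")
    return issues

class_ops = {"Order": {"addItem", "total"}, "Customer": {"notify"}}
objects = {"o1": "Order", "c1": "Customer"}
messages = [("c1", "o1", "addItem"), ("o1", "c1", "sendInvoice")]
for issue in check_messages_against_classes(class_ops, objects, messages):
    print(issue)   # flags 'sendInvoice', which Customer does not declare
```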

For active and future Earth observation missions, the availability of near real-time precise orbit information is becoming more and more important. The latency and quality of precise orbit determination results is mainly driven by the availability of precise GPS ephemerides and clocks. In order to have high-quality GPS ephemerides and clocks available in real time, the German Space Operations Center (GSOC) has developed the real-time clock estimation system RETICLE. The system receives data streams with GNSS observations from the global tracking network of the International GNSS Service (IGS) in real time. Using the known station positions, RETICLE estimates precise GPS satellite clock offsets and drifts based on the most recent available ultra-rapid predicted orbits provided by the IGS. The clock offset estimates have an accuracy of better than 0.3 ns and are globally valid. The latency of the estimated clocks is approximately 7 s after the observation epoch.

Over the last decade, Architecture Description Languages (ADLs) have been attracting considerable attention from automotive companies, which consider them one of the key solutions for improving the quality of automotive electronic and software systems. Automotive ADLs like EAST-ADL, AADL, TADL, and AML are being defined not only to address architectural description and representation issues but also to enable requirements traceability and early analysis of a system. Besides the automotive-specific ADLs, SysML and MARTE are emerging as viable modeling approaches for the automotive systems engineering domain as well. However, all these modeling approaches lack the capability of ensuring architectural quality. This paper identifies architectural inconsistency between the different architectural levels as one of the key issues regarding the architectural quality of automotive systems and proposes a rule-based method to enforce consistency between these levels. Since there is no standard ADL for automotive systems, we first evaluated a set of modeling approaches based on automotive-specific modeling requirements and selected SysML, mainly due to its flexibility and its mature and accessible tool support. We modeled a Cruise Control system to demonstrate modeling of an automotive system in SysML and to evaluate the method for architectural consistency checking using SysML.
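
A typical rule of the kind proposed is that every element on a lower architectural level must trace back to some element on the level above it. A minimal, tool-agnostic sketch of such a rule check is shown below; the level names and the traceability encoding are assumptions.

```python
# Rule-based consistency check between two architectural levels: each
# design-level function must realize at least one vehicle-level feature.
def check_level_consistency(features, functions, realizes):
    """features: vehicle-level feature names; functions: design-level function
    names; realizes: dict mapping function -> set of features it traces to."""
    issues, realized = [], set()
    for fn in functions:
        traced = realizes.get(fn, set())
        realized |= traced
        if not traced:
            issues.append(f"function '{fn}' traces to no feature")
        for feat in traced - features:
            issues.append(f"function '{fn}' traces to unknown feature '{feat}'")
    for feat in features - realized:
        issues.append(f"feature '{feat}' is realized by no function")
    return issues

features = {"CruiseControl", "SpeedLimiter"}
functions = {"SpeedController", "RadarInput", "HmiDisplay"}
realizes = {"SpeedController": {"CruiseControl"}, "RadarInput": {"CruiseControl"}}
for issue in check_level_consistency(features, functions, realizes):
    print(issue)   # flags 'HmiDisplay' (untraced) and 'SpeedLimiter' (unrealized)
```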

We measure cosmological parameters using the three-dimensional power spectrum P(k) from over 200,000 galaxies in the Sloan Digital Sky Survey (SDSS) in combination with WMAP and other data. Our results are consistent with a "vanilla" flat adiabatic ΛCDM model without tilt (ns = 1), running tilt, tensor modes or massive neutrinos. Adding SDSS information more than halves the WMAP-only error bars on some parameters, tightening 1σ constraints on the Hubble parameter from h ≈ 0.74 (+0.18, −0.07) to h ≈ 0.70 (+0.04, −0.03), on the matter density from Ωm ≈ 0.25 ± 0.10 to Ωm ≈ 0.30 ± 0.04 (1σ) and on neutrino masses from < 11 eV to < 0.6 eV (95%). SDSS helps even more when dropping prior assumptions about curvature, neutrinos, tensor modes and the equation of state. Our results are in substantial agreement with the joint analysis of WMAP and the 2dF Galaxy Redshift Survey, which is an impressive consistency check with independent redshift survey data and analysis techniques. In this paper, we place particular emphasis on clarifying the physical origin of the constraints, i.e., what we do and do not know when using different data sets and prior assumptions. For instance, dropping the assumption that space is perfectly flat, the WMAP-only constraint on the measured age of the Universe tightens from t0 ≈ 16.3 (+2.3, −1.8) Gyr to t0 ≈ 14.1 (+1.0, −0.9) Gyr.

This paper gives an overview of the current state of the Prague English Dependency Treebank project. It is an updated version of a draft text that was released along with a CD presenting the first 25% of the PDT-like version of the Penn Treebank - WSJ section (PEDT 1.0).

Early phases of software development are known to be problematic, difficult to manage and errors occurring during these phases are expensive to correct. Many systems have been developed to aid the transition from informal Natural Language requirements to semi-structured or formal specifications. Furthermore, consistency checking is seen by many software engineers as the solution to reduce the number of errors occurring during the software development life cycle and allow early verification and validation of software systems. However, this is confined to the models developed during analysis and design and fails to include the early Natural Language requirements. This excludes proper user involvement and creates a gap between the original requirements and the updated and modified models and implementations of the system. To improve this process, we propose a system that generates Natural Language specifications from UML class diagrams. We first investigate the variation of the input language used in naming the components of a class diagram based on the study of a large number of examples from the literature and then develop rules for removing ambiguities in the subset of Natural Language used within UML. We use WordNet, a linguistic ontology, to disambiguate the lexical structures of the UML string names and generate semantically sound sentences. Our system is developed in Java and is tested on an independent though academic case study.
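
The WordNet lookup at the heart of the disambiguation step can be reproduced with off-the-shelf libraries. The snippet below uses NLTK's WordNet interface to retrieve candidate noun senses for the words in a class name split from camel case; it is a simplified stand-in for the system described above (which is implemented in Java) and assumes the NLTK WordNet corpus has been downloaded.

```python
# Look up WordNet senses for the words in a UML class name (simplified
# stand-in for the disambiguation step; requires `nltk.download('wordnet')`).
import re
from nltk.corpus import wordnet as wn

def split_identifier(name):
    # "LibraryMember" -> ["library", "member"]
    return [w.lower() for w in re.findall(r"[A-Z][a-z]*|[a-z]+", name)]

def candidate_senses(class_name):
    senses = {}
    for word in split_identifier(class_name):
        senses[word] = [(s.name(), s.definition()) for s in wn.synsets(word, pos=wn.NOUN)]
    return senses

for word, syns in candidate_senses("LibraryMember").items():
    print(word, "->", syns[0] if syns else "no noun sense found")
```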

This paper describes a feasibility study into the use of a formal requirements modeling method (SCR) to assist with Independent Verification and Validation of change requests for Space Shuttle flight software. The goal of the study was to determine whether a formal modeling technique could automate some of the manual analysis tasks performed on change requests, including consistency checking. To analyze the change request, the key part of the original functionality was modeled in SCR. The model was then updated to reflect the proposed changes. Tool support was used to perform consistency checking and to validate the model against domain properties. The study showed that, as an analysis tool, formal modeling offers some advantages over inspection-based approaches. However, the problem of analyzing change requests is sufficiently different from other requirements modeling tasks that some specialist tools will be needed. The paper ends with a discussion of these needs.

We examine the correlation function ξ of the Sloan Digital Sky Survey (SDSS) Luminous Red Galaxy (LRG) sample at large scales (60 < s < 400 h⁻¹ Mpc) using the final data release (DR7; 105,831 LRGs between 0.16 < z < 0.47). Using mock catalogs, we demonstrate that the observed baryonic acoustic peak and larger scale signal are consistent with ΛCDM at the 1.5σ level. The signal at 155 < s < 200 h⁻¹ Mpc tends to be high relative to theoretical expectations; this slight deviation can be attributed to a bright subsample of the LRGs. Fitting the data to a non-linear, redshift-space, template-based model, we constrain the peak position at s_p = 103.6 (+3.6, −2.4) h⁻¹ Mpc when fitting the range 60 < s < 150 h⁻¹ Mpc (1σ uncertainties measured from the mocks). This redshift-space distance s_p is related to the comoving sound horizon scale r_s after taking into account matter clustering non-linearities, redshift distortions and galaxy clustering bias. Mock catalogs show that the probability that a DR7-sized sample would not have an identifiable peak is at least ∼10%. As a consistency check of a fiducial cosmology, we use the observed s_p to obtain the distance D_V ≡ [(1 + z)² D_A² cz/H(z)]^(1/3) relative to the acoustic scale. We find r_s/D_V(z = 0.278) = 0.1394 ± 0.0049. This result is in excellent agreement with , who examine roughly the same data set but using the power spectrum. Comparisons with other determinations in the literature are also in very good agreement. We have tested our results against a battery of possible systematic effects, finding all effects are smaller than our estimated sample variance.
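
For reference, the distance combination quoted above can be written out explicitly; in the notation of the abstract, with D_A the angular diameter distance, H(z) the Hubble rate and r_s the comoving sound horizon:

```latex
D_V(z) \equiv \left[ (1+z)^2 D_A^2(z)\, \frac{cz}{H(z)} \right]^{1/3},
\qquad
\frac{r_s}{D_V(z=0.278)} = 0.1394 \pm 0.0049 .
```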

An RLC bridge based on an automated synchronous sampling system has been developed using commercially available high-resolution analog-to-digital and digital-to-analog converters. This bridge allows the comparison of any kind of impedance standards in the four-terminal-pair configuration at frequencies between 50 Hz and 20 kHz within a range from 1 Ω to 100 kΩ. An automatic balance of the bridge is carried out using a downhill simplex algorithm. Consistency checks have been realized by comparing resistance, inductance, and capacitance standards at different frequencies. The consistency of the measured voltage ratio is better than 20 μV/V over the whole frequency range and even smaller than 5 μV/V around 1 kHz. Finally, the results of the calibration of a 10-nF capacitance standard have been compared to those obtained using a commercial high-accuracy capacitance bridge. The difference is smaller than the commercial bridge specifications over the whole frequency range.
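
The automatic balance can be viewed as a two-parameter minimisation of the detector signal, which is exactly what the downhill simplex (Nelder-Mead) method is suited for. The sketch below balances a toy bridge model with SciPy's Nelder-Mead implementation; the bridge model, starting point and tolerances are made-up stand-ins, not the instrument described in the paper.

```python
# Toy automatic bridge balance: minimise the magnitude of a simulated detector
# voltage over the in-phase and quadrature balance settings using the
# downhill simplex (Nelder-Mead) method.
import numpy as np
from scipy.optimize import minimize

TRUE_BALANCE = np.array([0.3172, -0.0841])   # unknown "correct" settings

def detector_voltage(x):
    # Made-up linear bridge model: the detector output vanishes at balance.
    offset = x - TRUE_BALANCE
    return abs(offset[0] + 1j * offset[1])

result = minimize(detector_voltage, x0=[0.0, 0.0], method="Nelder-Mead",
                  options={"xatol": 1e-6, "fatol": 1e-9})
print("balance found at:", result.x)
```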

This paper presents an approach to model features and function nets of automotive systems comprehensively. In order to bridge the gap between feature requirements and function nets, we describe an approach to describe both using a SysML-based notation. If requirements on the automotive system are changed by several developers responsible for different features, it is important

Loop Quantum Cosmology (LQC), mainly due to Bojowald, is not the cosmological sector of Loop Quantum Gravity (LQG). Rather, LQC consists of a truncation of the phase space of classical General Relativity to spatially homogeneous situations which is then quantized by the methods of LQG. Thus, LQC is a quantum mechanical toy model (finite number of degrees of freedom) for LQG (a genuine QFT with an infinite number of degrees of freedom) which provides important consistency checks. However, it is a non-trivial question whether the predictions of LQC are robust after switching on the inhomogeneous fluctuations present in full LQG. Two of the most spectacular findings of LQC are that 1. the inverse scale factor is bounded from above on zero volume eigenstates, which hints at the avoidance of the local curvature singularity, and 2. the Quantum Einstein Equations are non-singular, which hints at the avoidance of the global initial singularity. We display the result of a calculation for LQG which proves that the (analogon of the) inverse scale factor, while densely defined, is not bounded from above on zero volume eigenstates. Thus, in full LQG, if curvature singularity avoidance is realized, then not in this simple way. In fact, it turns out that the boundedness of the inverse scale factor is neither necessary nor sufficient for curvature singularity avoidance and that non-singular evolution equations are neither necessary nor sufficient for initial singularity avoidance, because none of these criteria are formulated in terms of observable quantities. After outlining what would be required, we present the results of a calculation for LQG which could be a first indication that our criteria, at least for curvature singularity avoidance, are satisfied in LQG.

A set of linear pathways often does not capture the full range of behaviors of a metabolic network. The concept of 'elementary flux modes' provides a mathematical tool to define and comprehensively describe all metabolic routes that are both stoichiometrically and thermodynamically feasible for a group of enzymes. We have used this concept to analyze the interplay between the pentose phosphate pathway (PPP) and glycolysis. The set of elementary modes for this system involves conventional glycolysis, a futile cycle, all the modes of PPP function described in biochemistry textbooks, and additional modes that are a priori equally entitled to pathway status. Applications include maximizing product yield in amino acid and antibiotic synthesis, reconstruction and consistency checks of metabolism from genome data, analysis of enzyme deficiencies, and drug target identification in metabolic networks.
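
The stoichiometric half of the feasibility condition behind elementary flux modes is simply that steady-state flux vectors satisfy S·v = 0. The sketch below computes a basis of such flux vectors for a tiny made-up toy network with SymPy; enumerating the elementary modes themselves additionally requires irreversibility constraints and an elementarity test, which are beyond this illustration.

```python
# Steady-state feasibility condition underlying elementary flux modes:
# admissible flux vectors v satisfy S @ v = 0 for stoichiometric matrix S.
import sympy as sp

# Toy network, metabolites as rows, reactions as columns:
#   R1: -> A,   R2: A -> B,   R3: B -> ,   R4: A ->
S = sp.Matrix([[1, -1,  0, -1],    # A
               [0,  1, -1,  0]])   # B

for v in S.nullspace():            # basis of the steady-state flux space
    v = v / max(abs(c) for c in v) # normalise for readability
    print(list(v))
    assert S * v == sp.zeros(2, 1) # consistency check: S v = 0
```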

Recently, the spinor helicity formalism and on-shell superspace were developed for six-dimensional gauge theories with (1,1) supersymmetry. We combine these two techniques with (generalised) unitarity, which is a powerful technique to calculate scattering amplitudes in any massless theory. As an application we calculate one-loop superamplitudes with four and five external particles in the (1,1) theory and perform several consistency checks on our results.

Surface reflectance adjusted for atmospheric effects is a primary input for land cover change detection and for developing many higher level surface geophysical parameters. With the development of automated atmospheric correction algorithms, it is now feasible to produce large quantities of surface reflectance products using Landsat images. Validation of these products requires in situ measurements, which either do not exist or are difficult to obtain for most Landsat images. The surface reflectance products derived using data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS), however, have been validated more comprehensively. Because the MODIS on the Terra platform and the Landsat 7 are only half an hour apart following the same orbit, and each of the 6 Landsat spectral bands overlaps with a MODIS band, good agreements between MODIS and Landsat surface reflectance values can be considered indicators of the reliability of the Landsat products, while disagreements may suggest potential quality problems that need to be further investigated. Here we develop a system called Landsat-MODIS Consistency Checking System (LMCCS). This system automatically matches Landsat data with MODIS observations acquired on the same date over the same locations and uses them to calculate a set of agreement metrics. To maximize its portability, Java and open-source libraries were used in developing this system, and object-oriented programming (OOP) principles were followed to make it more flexible for future expansion. As a highly automated system designed to run as a standalone package or as a component of other Landsat data processing systems, this system can be used to assess the quality of essentially every Landsat surface reflectance image where spatially and temporally matching MODIS data are available. The effectiveness of this system was demonstrated using it to assess preliminary surface reflectance products derived using the Global Land Survey (GLS) Landsat images for the 2000 epoch. As surface reflectance likely will be a standard product for future Landsat missions, the approach developed in this study can be adapted as an operational quality assessment system for those missions.
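
The agreement metrics such a checking system computes typically reduce to per-band comparisons of matched reflectance values. The snippet below shows representative metrics (bias, RMSD and correlation) on synthetic matched arrays; it illustrates the kind of calculation involved rather than LMCCS itself, which is implemented in Java.

```python
# Representative agreement metrics between matched Landsat and MODIS surface
# reflectance values for one spectral band (synthetic data for illustration).
import numpy as np

rng = np.random.default_rng(1)
modis   = rng.uniform(0.02, 0.40, 1000)                 # "reference" reflectance
landsat = modis + rng.normal(0.005, 0.01, modis.size)   # matched Landsat pixels

bias = np.mean(landsat - modis)
rmsd = np.sqrt(np.mean((landsat - modis) ** 2))
r = np.corrcoef(landsat, modis)[0, 1]

print(f"bias = {bias:+.4f}, RMSD = {rmsd:.4f}, r = {r:.3f}")
```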

CAD and GIS have been designed for different applications, but a tendency toward convergence has been increasing in recent years. Many different tasks require the integration of GIS and CAD models. Currently there is no system that can easily integrate models from these two different domains. One of the bottlenecks in this integration is the supported data types. While CAD systems have a rich set of different types of shapes, GIS is limited to only points, lines and polygons. In this paper we suggest that a DBMS can play an intermediate role in this integration by providing the missing data types. The two types of systems can selectively access only those data types that they understand. The consistency check is ensured at the DBMS level.

The critical role played by observations during ocean data assimilation was explored when the Regional Ocean Modeling System (ROMS) 4-dimensional variational (4D-Var) data assimilation system was applied sequentially to the California Current circulation. The adjoint of the 4D-Var gain matrix was used to quantify the impact of individual observations and observation platforms on different aspects of the 4D-Var circulation estimates during both analysis and subsequent forecast cycles. In this study we focus on the alongshore and cross-shore transport of the California Current System associated with wind-induced coastal upwelling along the central California coast. The majority of the observations available during any given analysis cycle are from satellite platforms in the form of SST and SSH, and on average these data exert the largest controlling influence on the analysis increments and forecast skill of coastal transport. However, subsurface in situ observations from Argo floats, CTDs, XBTs and tagged marine mammals often have a considerable impact on analyses and forecasts of coastal transport, even though these observations represent a relatively small fraction of the available data at any particular time. During 4D-Var the observations are used to correct for uncertainties in the model control variables, namely the initial conditions, surface forcing, and open boundary conditions. It is found that correcting for uncertainties in both the initial conditions and surface forcing has the largest impact on the analysis increments in alongshore transport, while the cross-shore transport is controlled mainly by the surface forcing. The memory of the circulation associated with the control variable increments was also explored in relation to 7-day forecasts of the coastal circulation. Despite the importance of correcting for surface forcing uncertainties during analysis cycles, the coastal transport during forecast cycles initialized from the analyses has less memory of the surface forcing corrections, and is controlled primarily by the analysis initial conditions. Using the adjoint of the entire 4D-Var system we have also explored the sensitivity of the coastal transport to changes in the observations and the observation array. A single integration of the adjoint of 4D-Var can be used to predict the change that occurs when observations from different platforms are omitted from the 4D-Var analysis. Thus observing system experiments can be performed for each data assimilation cycle at a fraction of the computational cost that would be required to repeat the 4D-Var analyses when observations are withheld. This is the third part of a three-part series describing the ROMS 4D-Var systems. ► The impact of observations on California Current circulation estimates is assessed. ► All observations are found to be important in controlling coastal transports. ► In situ observations have significant impact despite their relatively small number. ► The adjoint of 4D-Var can be used reliably for observing system experiments. ► The impact of uncertainties in the observations on coastal transports is assessed.

Portable systems such as cell phones and portable media players commonly use non-volatile RAM (NVRAM) to hold all of their data and metadata, and larger systems can store metadata in NVRAM to increase file system performance by reducing synchronization and transfer overhead between disk and memory data structures. Unfortunately, wayward writes from buggy software and random bit flips may result in an unreliable persistent store. We introduce two orthogonal and complementary approaches to reliably storing file system structures in NVRAM. First, we reinforce hardware and operating system memory consistency by employing page-level write protection and error correcting codes. Second, we perform on-line consistency checking of the file system structures by replaying logged file system transactions on copied data structures; a structure is consistent if the replayed copy matches its live counterpart. Our experiments show that the protection mechanisms can increase fault tolerance by six orders of magnitude while incurring an acceptable amount of overhead on writes to NVRAM. Since NVRAM is much faster and consumes far less power than disk-based storage, the added overhead of error checking leaves an NVRAM-based system both faster and more reliable than a disk-based system. Additionally, our techniques can be implemented on systems lacking hardware support for memory management, allowing them to be used on low-end and embedded systems without an MMU.
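
The replay-based check can be pictured as follows: keep a log of the transactions applied to the live structure, replay them against a copy of an earlier snapshot, and flag an inconsistency if the two diverge. The toy sketch below applies that idea to a dictionary standing in for a file system metadata structure; it is an illustration of the principle, not the authors' NVRAM implementation.

```python
# Replay-based consistency check: replay the logged transactions on a copy of
# a snapshot and compare with the live structure (a dict stands in for file
# system metadata held in NVRAM).
import copy

def apply_tx(struct, tx):
    op, key, value = tx
    if op == "set":
        struct[key] = value
    elif op == "del":
        struct.pop(key, None)

def is_consistent(snapshot, log, live):
    replayed = copy.deepcopy(snapshot)
    for tx in log:
        apply_tx(replayed, tx)
    return replayed == live

snapshot = {"/etc/passwd": 120}                      # key -> file size (metadata stand-in)
log = [("set", "/tmp/a", 1), ("del", "/etc/passwd", None)]

live = copy.deepcopy(snapshot)
for tx in log:
    apply_tx(live, tx)
live["/tmp/a"] = 999                                 # simulate a wayward write

print(is_consistent(snapshot, log, live))            # False: live copy diverged
```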

There has been considerable past work studying data integration and uncertain data in isolation. We develop the foundations for local-as-view (LAV) data integration when the sources being integrated are uncertain. We motivate two distinct settings for uncertain-data integration. We then define containment of uncertain databases in these settings, which allows us to express uncertain sources as views over a virtual mediated uncertain database. Next, we define consistency of a set of uncertain sources and show intractability of consistency-checking. We identify an interesting special case for which consistency-checking is polynomial. Finally, the notion of certain answers from traditional LAV data integration does not generalize to the uncertain setting, so we define a corresponding notion of correct answers.

Traditionally, fail-silent computers are implemented by using massive redundancy (hardware or software). In this research we investigate whether it is possible to obtain a high degree of fail-silent behavior from a computer without hardware or software replication by using only simple behavior-based error detection techniques. It is assumed that if the errors caused by a fault are detected in time, it will be possible to stop the erroneous computer behavior, thus preventing the violation of the fail-silent model. The evaluation technique used in this research is physical fault injection at the pin level. Results obtained by the injection of about 20000 different faults in two different target systems have shown that 1) in a system without error detection up to 46% of the faults caused the violation of the fail-silent model; 2) in a computer with behavior-based error detection the percentage of faults that caused the violation of the fail-silent model was reduced to values from 2.3% to 0.4%; 3) the results are very dependent on the target system, on the program under execution during the fault injection and on the type of faults.

Though the importance of UML Sequence Diagrams (SQDs) is well recognized by the object-oriented community, they remain a very difficult UML artifact to develop. In this paper we present a multi-level methodology for developing UML Sequence Diagrams. Our methodology is significant in three respects. First, it provides a multi-level procedure to ease the development process. Second, it makes use of certain patterns to ensure the validity of SQDs. Third, it uses consistency checks with the corresponding use-case and class diagrams. Throughout the steps of the method we present rules and patterns demonstrating correct and incorrect diagramming of common situations through examples. The purpose of this study is to serve as a reference guide for novice sequence diagram modelers. This methodology is particularly useful for novice practitioners in this field who face challenges in learning the process of SQD development.

The aim of this position statement is to describe our work to support Web service technology as a contribution to SOA by providing semantically "rich" descriptions of Web services together with methods and techniques to handle and to manipulate them by introducing Semantic Web technologies and additional logical formalisms into the annotation process. The annotation process will thus be enriched by new features and techniques to solve inference tasks (for instance with respect to consistency checking), to allow automated reasoning about annotation contents, to support automated search and complex query design, and to realize information derivation and interpretation based on different foci.

This paper focuses on argumentative diagrams for computer-supported collaborative argumentation-based learning. Thirty pairs of students discussed two cases on the topic of Genetically Modified Organisms (GMOs) via the computer. They communicated via chat. Pairs of students were either asked to collaboratively construct a diagram using argumentative labels to describe the boxes in the diagram, to construct a diagram using argumentative labels to describe the arrows between the boxes, or to collaboratively write a text without using labels. The collaborative process of discussing the topic and constructing the representation (diagram or text) was analysed on how broad and deep students explore the topic of GMOs. Special attention was given to the way students resolve controversy and relate different aspects of the topic under discussion. Preliminary results show that students do not discuss the labels very often and overall are not more aware of controversies and relations in their arguments. They are mostly focused on finishing the diagram or text. Further results will be discussed in Amsterdam.