Data Integrity Research Papers - Academia.edu

With the increasing popularity of object-relational technology, it is becoming important to have a methodology which allows designers to exploit the great modelling power of object-oriented conceptual models (OOCMs) and yet which still facilitates ...

This paper proposes efficient resource allocation techniques for a policy-based wireless/wireline interworking architecture, where quality of service (QoS) provisioning and resource allocation are driven by the service level agreement (SLA). For end-to-end IP QoS delivery, each wireless access domain can independently choose its internal resource management policies to guarantee the customer access SLA (CASLA), while border-crossing traffic is served by a core network following policy rules to meet the transit domain SLA (TRSLA). In particular, we propose an engineered priority resource sharing scheme for a voice/data integrated wireless domain, where the policy rules allow cellular-only access or cellular/WLAN interworked access. With such a resource sharing scheme, the CASLA for each service class is met with efficient resource utilization, and the interdomain TRSLA bandwidth requirement can be easily determined. In the transit domain, traffic load fluctuation from upstream access domains is tackled by an inter-TRSLA resource sharing technique, in which the spare capacity from underloaded TRSLAs can be exploited by overloaded TRSLAs to improve resource utilization. The advantage of this inter-SLA resource sharing technique is that the core network service provider can freely design the policy rules that define underload and overload status, determine the bandwidth reservation, and distribute the spare resources among bandwidth borrowers, while all the policies are supported by a common set of resource allocation techniques.
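The inter-TRSLA sharing idea can be illustrated with a short, hedged sketch: underloaded SLAs donate spare capacity (above a reservation) to a pool that is redistributed to overloaded SLAs in proportion to their deficit. The reservation fraction, SLA names, and numbers below are hypothetical and stand in for the paper's actual policy rules.

```python
# Minimal sketch of inter-TRSLA spare-capacity sharing (illustrative only).
# The reservation policy and SLA names are hypothetical assumptions; the
# paper's actual policy rules and admission control are not reproduced here.

def share_spare_capacity(slas, reserve_fraction=0.1):
    """slas: dict mapping SLA name -> {'contracted': Mbps, 'offered': Mbps}.
    Returns the extra bandwidth granted to each overloaded SLA."""
    donors, borrowers = {}, {}
    for name, s in slas.items():
        load = s["offered"] / s["contracted"]
        if load < 1.0:                      # underloaded: spare above a reservation
            spare = s["contracted"] - s["offered"]
            donors[name] = max(0.0, spare - reserve_fraction * s["contracted"])
        elif load > 1.0:                    # overloaded: needs extra bandwidth
            borrowers[name] = s["offered"] - s["contracted"]

    pool = sum(donors.values())
    need = sum(borrowers.values())
    # Distribute the pooled spare capacity proportionally to each borrower's deficit.
    return {name: (pool * deficit / need if need else 0.0)
            for name, deficit in borrowers.items()}

grants = share_spare_capacity({
    "TRSLA-A": {"contracted": 100.0, "offered": 60.0},   # underloaded
    "TRSLA-B": {"contracted": 100.0, "offered": 130.0},  # overloaded
})
print(grants)   # TRSLA-B receives part of TRSLA-A's spare capacity
```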

Data exchange is the problem of taking data structured under a source schema and creating an instance of a target schema that reflects the source data as accurately as possible. In this paper, we address foundational and algorithmic issues related to the semantics of data exchange and to the query answering problem in the context of data exchange. These issues arise because, given a source instance, there may be many target instances that satisfy the constraints of the data exchange problem.
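A minimal sketch of the core idea, under assumed (hypothetical) source and target schemas: a source-to-target rule populates the target and introduces labeled nulls for values the rule does not determine, which is one reason many target instances can satisfy the same constraints.

```python
# Illustrative sketch of producing one possible target instance from a source
# instance via a simple source-to-target rule, using labeled nulls for values
# the rule does not determine. The schemas and the rule are hypothetical.

import itertools

null_ids = itertools.count(1)

def exchange(source_emps):
    """Rule (informally): Emp(name, dept) -> exists m. Works(name, dept), Manages(m, dept)."""
    works, manages, dept_null = [], [], {}
    for name, dept in source_emps:
        works.append((name, dept))
        if dept not in dept_null:                 # invent one labeled null per department
            dept_null[dept] = f"N{next(null_ids)}"
            manages.append((dept_null[dept], dept))
    return works, manages

works, manages = exchange([("alice", "sales"), ("bob", "sales")])
print(works)    # [('alice', 'sales'), ('bob', 'sales')]
print(manages)  # [('N1', 'sales')]  -- the manager is unknown (a labeled null)
```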

In 1998 Italy introduced a restricted version of an Allowance for Corporate Equity, called the Dual Income Tax. Using a dataset that integrates Italy’s Institute for Statistics enterprise survey data with company accounts, we explore the effects of the Dual Income Tax on enterprise performance in 1998-2001. Firms benefiting from this allowance are simulated through the DIECOFIS corporate tax microsimulation model. The method used to estimate enterprise performance is based on a structural equation model, which allows us to compute a composite indicator from specific factors observed in enterprise activity. We find a positive impact of the Dual Income Tax on enterprise performance, in that companies benefiting from the allowance outperformed non-eligible ones.

Cloud servers provide a platform for enabling convenient, on-demand network access to a shared pool of configurable server resources (memory, networks, storage, CPU, applications, and services) that can be rapidly provisioned and released with minimal management effort or cloud service provider interaction. Cloud servers are now widely used; however, data security is one of the major barriers to the adoption of cloud storage. Users can store data and use it on demand or for their applications without keeping any local copy of the data on their machines. Cloud storage technologies offer the promise of massive cost savings combined with increased IT agility due to the pay-per-use model. However, this technology challenges many traditional approaches to hosting-provider and enterprise application design and management. Users are able to upload data to cloud storage without checking or verifying its integrity. Hence, integrity auditing of cloud data is an important task to ens...
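As an illustration of the general integrity-auditing idea (not the protocol proposed in the paper), a data owner can keep only a small authentication tag locally and later recheck the remotely stored data against it. The key handling and message layout below are assumptions.

```python
# A minimal sketch of client-side integrity auditing for remotely stored data,
# using an HMAC tag kept by the data owner. This illustrates the general idea
# only; the paper's actual auditing scheme is not reproduced here.

import hmac, hashlib, os

def make_tag(key: bytes, blob: bytes) -> bytes:
    """Tag computed before upload; only this small tag is kept locally."""
    return hmac.new(key, blob, hashlib.sha256).digest()

def audit(key: bytes, stored_blob: bytes, expected_tag: bytes) -> bool:
    """Later, retrieve the stored data (or a proof of it) and recheck the tag."""
    return hmac.compare_digest(make_tag(key, stored_blob), expected_tag)

key = os.urandom(32)
data = b"outsourced file contents"
tag = make_tag(key, data)

print(audit(key, data, tag))                  # True: data intact
print(audit(key, data + b"tampered", tag))    # False: integrity violation detected
```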

Cloud computing is one of today’s most exciting technologies owing to its ability to deliver scalable services and to reduce the burden of local hardware and software management while increasing flexibility and scalability. A major feature of cloud services is that users’ data are usually processed remotely on unknown machines. Although this new technology brings many conveniences, it raises several issues: cloud computing servers can be easily compromised, data is no longer under the control of the data owner, and the cloud provider does not enforce the access control over outsourced data desired by the data owner. To address these problems, we propose a new framework known as the Secure Flexible Framework (SFF). The proposed framework allows the data owner to achieve cloud data integrity, tracking of actual data usage, and fine-grained access control over outsourced data. Through this paper, we strengthen the correctness and user’s control of th...
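A hedged sketch of owner-defined fine-grained access control with usage tracking, in the spirit of what the framework above describes; the policy structure, user names, and log format are illustrative assumptions, not the SFF design itself.

```python
# Sketch of owner-defined access control plus usage tracking (assumptions only).

from datetime import datetime, timezone

class OwnerPolicy:
    def __init__(self):
        self.rules = {}      # (user, action) -> set of allowed object names
        self.usage_log = []  # append-only record of every access decision

    def grant(self, user, action, obj):
        self.rules.setdefault((user, action), set()).add(obj)

    def request(self, user, action, obj):
        allowed = obj in self.rules.get((user, action), set())
        # Every request is logged, so the owner can track actual data usage.
        self.usage_log.append((datetime.now(timezone.utc).isoformat(),
                               user, action, obj, "ALLOW" if allowed else "DENY"))
        return allowed

policy = OwnerPolicy()
policy.grant("alice", "read", "report.pdf")
print(policy.request("alice", "read", "report.pdf"))   # True
print(policy.request("bob", "read", "report.pdf"))     # False
print(policy.usage_log)                                # owner-visible audit trail
```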

A data warehouse provides information for analytical processing, decision making and data mining tools. As the concept of the real-time enterprise evolves, the synchronization between transactional data and data warehouses, statically implemented, has been redefined. ...

In this paper a model concept is set up for the simulation of the aquifer system in the over-pumped well fields west of Gedaref city. Assumptions are made about different system elements that would ultimately be confirmed by numerical simulation. Prior estimates of system parameters as well as calibration targets are established to measure confidence in the model results. A typical modelling process starts with data integration and a description of the hydrogeological setup, followed by the definition of the conceptual framework for analysis, and finally the numerical approximation. The conceptual framework represents an important phase in defining the quantitative framework within which a numerical scheme works. It identifies and specifies the different steps that can be taken in the process of formulating, analysing, evaluating and presenting alternative models (KOUDSTAAL, 1992). According to SUN (1994), application of sound hydrologic reasoning during the development of an appropria...

To achieve the goal of poverty reduction, as encapsulated within the Millennium Development Goals, the collection, analysis, and use of geographic information as it relates to the multidimensionality of poverty offers a starting point. The spatial handling of poverty is an emerging paradigm for which research on the spatial modelling of poverty is required. In an attempt to contribute to a better understanding of poverty mapping, this paper examines the suitability of GIS for use in poverty application areas. The GIS analysis functions most appropriate for specific poverty mapping tasks are examined. The uses identified are: data integration of socio-economic, environmental, cultural and other data; delineation of areas lying within a specified threshold distance from selected features or places; deriving further data from spatial analysis for multivariate analysis of poverty; deriving straight-line and network distances; and visualisation and presentation of the results of poverty analysis. ...
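Two of the listed operations, straight-line distance and selection of places within a threshold distance of a feature, can be sketched as follows; the coordinates, place names, and threshold are hypothetical and assume a projected (metric) coordinate system.

```python
# Illustrative sketch of straight-line (Euclidean) distance and threshold
# selection, two of the GIS functions listed above. All values are invented.

import math

def straight_line(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def within_threshold(places, facility, threshold_m):
    """Return the places lying within threshold_m metres of the facility."""
    return [name for name, xy in places.items()
            if straight_line(xy, facility) <= threshold_m]

villages = {"A": (1200.0, 300.0), "B": (4000.0, 4500.0), "C": (900.0, 850.0)}
clinic = (1000.0, 500.0)

print(within_threshold(villages, clinic, 1000.0))   # ['A', 'C']
```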

When computer security violations are detected, computer forensic analysts attempting to determine the relevant causes and effects are forced to perform the tedious tasks of finding and preserving useful clues in large networks of operational machines. To augment a computer crime ...

Electronic Health Records (EHRs) are used to increase the interoperability between healthcare organizations and patient health information while preserving the privacy and confidentiality of patient information. An EHR is structured information that may include text, images, or both, and it aims to offer decentralization, security, openness, and traceability. This systematic review aims to examine and identify the forms of implemented electronic health records together with the available protection and privacy techniques. A number of keywords were used to scan four reliable databases: PubMed, IEEE, Web of Science, and Science Direct. 126 studies were obtained after filtering and scanning phases based on the inclusion/exclusion criteria. The review presents a taxonomy in which the 126 studies are classified into two categories: the first covers the applied processes of authentication, authorization and access control, and the second covers the applied processes of privacy and security for information, images, or both information and images in EHRs. A deep review was then conducted to highlight the challenges, issues, and critical gaps outlined in the academic literature on this subject. The results showed no relevant study that examines and discusses both of the aforementioned categories together. We conclude that EHRs could be enhanced by applying the processes of authentication, authorization, and access control for security, in addition to applying privacy protection to both the information and the images included in an EHR.

The growth of data volumes in science is reaching epidemic proportions. Consequently, the status of data-oriented science as a research methodology needs to be elevated to that of the more established scientific approaches of experimentation, theoretical modeling, and simulation. Data-oriented scientific discovery is sometimes referred to as the new science of X-Informatics, where X refers to any science (e.g., Bio-, Geo-, Astro-) and informatics refers to the discipline of organizing, describing, accessing, integrating, mining, and analyzing diverse data resources for scientific discovery. Many scientific disciplines are developing formal sub-disciplines that are information-rich and data-based, to such an extent that these are now stand-alone research and academic programs recognized on their own merits. These disciplines include bioinformatics and geoinformatics, and will soon include astroinformatics. We introduce Astroinformatics, the new data-oriented approach to 21st century astronomy research and education. In astronomy, petascale sky surveys will soon challenge our traditional research approaches and will radically transform how we train the next generation of astronomers, whose experiences with data are now increasingly more virtual (through online databases) than physical (through trips to mountaintop observatories). We describe Astroinformatics as a rigorous approach to these challenges. We also describe initiatives in science education (not only in astronomy) through which students are trained to access large distributed data repositories, to conduct meaningful scientific inquiries into the data, to mine and analyze the data, and to make data-driven scientific discoveries. These are essential skills for all 21st century scientists, particularly in astronomy as major new multi-wavelength sky surveys (that produce petascale databases and image archives) and grand-scale simulations (that generate enormous outputs for model universes, such as the Millennium Simulation) become core research components for a significant fraction of astronomical researchers.

In body biosensor networks, which may be classified as a specialized type of ad hoc network, it is necessary to ensure the authenticity and freshness of vital medical information. In this work, we propose security solutions to identify attacks on data freshness and preserve message integrity in these networks. We use a permissible round trip time threshold and a computationally feasible authentication solution to address the security threats in the network. We have implemented a prototype framework in GloMoSim to assess and evaluate the robustness of our detection mechanism.
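A minimal sketch of the two checks described above: a permissible round-trip-time threshold for freshness and a MAC for message integrity. The threshold value, key size, and message layout are illustrative assumptions, not the paper's exact scheme.

```python
# Sketch of a freshness check (RTT threshold) plus a MAC integrity check for a
# challenge/response exchange with a body sensor. All parameters are assumed.

import hmac, hashlib, time, os

RTT_THRESHOLD_S = 0.05   # hypothetical permissible round trip time

def send_challenge():
    return os.urandom(8), time.monotonic()

def sensor_reply(key, nonce, reading: bytes):
    msg = nonce + reading
    return msg, hmac.new(key, msg, hashlib.sha256).digest()

def verify(key, nonce, sent_at, msg, tag):
    rtt_ok = (time.monotonic() - sent_at) <= RTT_THRESHOLD_S   # freshness
    mac_ok = hmac.compare_digest(hmac.new(key, msg, hashlib.sha256).digest(), tag)
    nonce_ok = msg.startswith(nonce)                           # reply matches this challenge
    return rtt_ok and mac_ok and nonce_ok

key = os.urandom(32)
nonce, sent_at = send_challenge()
msg, tag = sensor_reply(key, nonce, b"heart_rate=72")
print(verify(key, nonce, sent_at, msg, tag))   # True if prompt and untampered
```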

This paper introduces an interactive video system and its architecture where several systems cooperate to manage the services of interactive video. Each system is specialized according to the data it handles and the functionality it performs. A system can be a database (for billing purposes) or just a video store system (to store the video data) lacking the typical features of a database or an information retrieval system to support indexing and querying of video data. Because quality of service is an important requirement for whole ...

This paper presents a novel approach for automated test data generation of imperative programs containing integer, boolean and/or float variables. Our approach is based on consistency techniques integrating integer and float variables. We handle statement, branch and path coverage ...
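To convey the flavour of constraint-based test data generation, the sketch below applies simple bounds (interval) consistency to a small integer path condition and then picks concrete inputs from the narrowed domains; the paper's actual consistency techniques, which also integrate float variables, are considerably richer than this illustration.

```python
# Very simplified sketch: narrow integer bounds for the path condition
# (x > 3) and (y >= 0) and (x + y == 10), then pick concrete test inputs.
# This is an assumed toy example, not the paper's algorithm.

def propagate(x_lo, x_hi, y_lo, y_hi):
    """Bounds consistency for: x > 3, y >= 0, x + y == 10."""
    changed = True
    while changed:
        changed = False
        new_x_lo = max(x_lo, 4, 10 - y_hi)        # from x > 3 and x = 10 - y
        new_x_hi = min(x_hi, 10 - y_lo)
        new_y_lo = max(y_lo, 0, 10 - new_x_hi)
        new_y_hi = min(y_hi, 10 - new_x_lo)
        if (new_x_lo, new_x_hi, new_y_lo, new_y_hi) != (x_lo, x_hi, y_lo, y_hi):
            x_lo, x_hi, y_lo, y_hi = new_x_lo, new_x_hi, new_y_lo, new_y_hi
            changed = True
    return x_lo, x_hi, y_lo, y_hi

x_lo, x_hi, y_lo, y_hi = propagate(-100, 100, -100, 100)
print((x_lo, x_hi), (y_lo, y_hi))        # narrowed domains: (4, 10) and (0, 6)
print("test input:", x_lo, 10 - x_lo)    # one concrete (x, y) exercising the path
```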