Software Verification and Validation Research Papers

2025

This paper presents a framework for evaluating and comparing defect taxonomies. Six well-known taxonomies are evaluated with it, and the results are shown. We found deficiencies in every taxonomy once it was evaluated with the framework.

2025

Since the invention of the Internet and the mobile smartphone, the number of resources for informal L2 (additional) language learning 'in the digital wilds' has exploded, especially for super-languages like English and Spanish. Why and how should we respond? To apprehend what is going on, understanding the history, practice, and theory of the phenomenon of 'language learning beyond the classroom' (LLBC) is a good place to start. Recognizing what is truly new (and what is not) can help researchers, instructors, and curriculum designers leverage LLBC for formal purposes and identify new opportunities and roles. A variety of recent studies exploring informal and formal practices using social media, digital games, and other emergent digital technologies offer insight and direction. New media, digital literacies, and post-structural identity theories offer useful heuristics, as do traditional and new theories of autonomy and motivation. Turning towards socio-cognitive and ecological interpretations, SLA and psychological theories are responding as well. By examining the new literacies and motivations for LLBC, this talk aims to motivate new research and teaching practices, moving the conversation beyond keeping formal language learning experiences relevant towards making them indispensable.

2025

The area of software verification has grown in importance within software engineering. This is a bibliography on the verification of a specialized class of software: compilers. The citations are sorted by year in reverse chronological order, with the most recent years first. The format of each citation is: authors; title; publisher; year. 2003

2025, XXIV Congreso Argentino de Ciencias de la Computación (La Plata, 2018).

Software is a crucial element in the functionality of devices and industry. Likewise, the operation of an enterprise or organization depends largely on the reliability of the software systems used to support its business processes or particular tasks. Software quality has become a key factor in determining the success of products and enterprises. To achieve a quality software product, several methodologies, techniques, and frameworks have been developed, each tailored to specific areas or characteristics of the software under review. This paper presents a white-box testing framework for object-oriented programming based on Message Sequence Specification. In the context of an object-oriented program, our framework can be used to test the correct order in which the methods of a class are invoked by its clients. The implementation of the framework is based on aspect-oriented programming.
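
The call-order checking described in this abstract can be sketched in a few lines. The following is a minimal Python stand-in for the paper's aspect-oriented implementation: the `File` class, its methods, and the `open (read )*close` specification are hypothetical examples, not taken from the paper, and the decorator plays the role the paper assigns to a woven aspect.

```python
import re

class SequenceChecker:
    """Checks that methods of a class are invoked in an allowed order.

    The allowed order is given as a regular expression over method names,
    a simplified stand-in for a Message Sequence Specification.
    """
    def __init__(self, spec):
        self.spec = re.compile(spec)
        self.trace = []

    def record(self, name):
        self.trace.append(name)

    def conforms(self):
        # The observed call sequence must be a word of the spec's language.
        return self.spec.fullmatch(" ".join(self.trace)) is not None

def monitored(checker):
    """Decorator reporting each call to the checker -- the role the paper
    assigns to an aspect woven around client calls."""
    def wrap(fn):
        def inner(*args, **kwargs):
            checker.record(fn.__name__)
            return fn(*args, **kwargs)
        return inner
    return wrap

# Hypothetical class under test: a file must be opened, read any number
# of times, then closed -- spec: open (read )*close
checker = SequenceChecker(r"open (read )*close")

class File:
    @monitored(checker)
    def open(self): pass
    @monitored(checker)
    def read(self): pass
    @monitored(checker)
    def close(self): pass

f = File()
f.open(); f.read(); f.read(); f.close()
print(checker.conforms())  # → True: the client respected the protocol
```

A client that, say, called `read` before `open` would leave `conforms()` returning False, which is the kind of protocol violation the framework is meant to detect.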

2025, International Conference on Computer Design

We present a method that accepts Interval Temporal Logic (ITL) formulas as specifications and automatically generates state machines. The specification in ITL can also be used as a constraint for a state machine that is an abstraction of an existing sequential circuit, which can be useful for redesign or engineering change. The generated state machines can be further pro-

2025

A simple, rapid, precise, and accurate RP-HPLC method has been developed for the estimation of colchicine in tablet formulations. A validation and quantitation method was established by high-performance liquid chromatography using acetonitrile:methanol:water (32:48:20 v/v; pH adjusted to 5.2 with phosphoric acid) as the mobile phase, pumped through a C18 (250 mm × 4.6 mm, 5 μm) column. The flow rate was 1.2 ml/min and effluents were monitored at 254 nm. The retention time was 7.56 min and the injection volume was set at 20 μl. Linear regression analysis of the calibration curve showed a good relationship, with a correlation coefficient of 0.9998, over the concentration range 17.5-140 μg/ml. The percentage recovery of colchicine was found to be 99.86%. The method is accurate and precise and can be used effectively for the estimation of colchicine in tablet formulations under the reported chromatographic conditions.
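
As a worked illustration of the calibration arithmetic behind figures like the correlation coefficient and percentage recovery reported above, here is a least-squares sketch. The peak-area and spiked-sample values below are hypothetical; only the 17.5-140 μg/ml concentration range comes from the abstract.

```python
# Hypothetical calibration points: concentration (ug/ml) vs. peak area.
conc = [17.5, 35.0, 70.0, 105.0, 140.0]
area = [152.0, 301.0, 603.0, 901.0, 1204.0]   # detector response (hypothetical)

# Ordinary least-squares fit: area = slope * conc + intercept
n = len(conc)
mx = sum(conc) / n
my = sum(area) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, area))
syy = sum((y - my) ** 2 for y in area)
slope = sxy / sxx
intercept = my - slope * mx
r = sxy / (sxx * syy) ** 0.5   # correlation coefficient of the calibration curve

# Percentage recovery: back-calculate a spiked sample's concentration
# from its measured area (values hypothetical).
spiked_area = 600.0
found = (spiked_area - intercept) / slope
recovery = 100.0 * found / 70.0   # 70 ug/ml nominally spiked
```

With well-behaved data, `r` comes out very close to 1 and `recovery` close to 100%, which is what figures such as 0.9998 and 99.86% summarize.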

2025, World journal of analytical chemistry

A sensitive, selective and precise high-performance thin-layer chromatographic method has been developed and validated for the determination of tolperisone hydrochloride in bulk drug and in formulation. The method employed HPTLC aluminium plates pre-coated with silica gel 60 F-254 as the stationary phase, with methanol:ethyl acetate (3:7, v/v) as the solvent system. The Rf value of tolperisone hydrochloride was observed to be 0.50 ± 0.008. Densitometric analysis was carried out in absorbance mode at 261 nm. Linear regression analysis of the calibration plots showed a good linear relationship for tolperisone hydrochloride over the concentration range 50-800 ng/spot. The limit of detection and limit of quantification were found to be 7.57 and 10 ng/spot, respectively. Tolperisone hydrochloride was subjected to acid and alkali hydrolysis, chemical oxidation, wet hydrolysis, dry-heat degradation and sunlight degradation. The degraded-product peaks were well resolved from the pure drug peak, with significant differences in their Rf values. Stressed samples were assayed using the developed HPTLC method. The proposed method was validated with respect to linearity, accuracy, precision and robustness, and was successfully applied to the estimation of tolperisone in bulk drug and tablet dosage forms. Statistical analysis showed that the method is suitable for routine analysis of tolperisone in tablets.

2025, Asian Journal of Pharmaceutical and Clinical Research

Objective: To develop a simple, accurate, precise, and linear reverse-phase high-performance liquid chromatographic (RP-HPLC) method and verify the quantitative estimation (assay) of sodium butyrate in tablets. Methods: The optimized RP-HPLC method uses a YMC Triart C18 reverse-phase column (250 × 4.6 mm; 5 μm), a mobile phase of pH 8.0 sodium dihydrogen phosphate buffer and acetonitrile in the proportion 92:8 v/v, a flow rate of 0.5 ml/min, an injection volume of 10 μL, and a detection wavelength of 210 nm using a UV/PDA detector. Results: The developed method gave sodium butyrate eluting at about 6 min. Sodium butyrate exhibited linearity in the range 119.56-1195.6 μg/ml. Precision was exemplified by a relative standard deviation of 0.30%. Individual recoveries were found to be in the range 97.0-103.0% during accuracy studies. Conclusion: A simple, specific, accurate, precise, and linear RP-HPLC method was developed and verified for the quantitative estimation (assay) of sodium butyrate in tablets, and hence this method can be explored for the analysis of sodium butyrate in tablets in various pharmaceutical industries.

2025, Discover Chemistry

A sensitive, accurate, robust, and precise analytical quality by design (AQbD)-based RP-HPLC (reverse-phase high-performance liquid chromatography) method has been developed for the simultaneous estimation of perindopril erbumine and moxonidine hydrochloride, both used in hypertension treatment. Initially, a three-level factorial design screened various parameters impacting chromatographic responses, followed by Central Composite Design (CCD) for optimizing the critical parameters. This cost-effective isocratic method employed a Shimpack ODS C-18 column (250 mm × 4.6 mm, 5 μm) as the stationary phase. The method, developed and validated in line with ICH Q2(R1) guidelines, utilized a mobile phase comprising methanol:acetonitrile:phosphate buffer (34:30:36 v/v/v) adjusted to pH 3.5 with 1% orthophosphoric acid, at a flow rate of 1.0 mL/min, a column temperature of 40 °C, and UV detection at 214 nm, achieving efficient separation of both drugs. Linearity ranges were 25-125 μg/mL for perindopril erbumine (r² = 0.9996) and 1-5 μg/mL for moxonidine hydrochloride (r² = 0.9993). This newly developed RP-HPLC method effectively enables routine quantitative and simultaneous qualitative analysis of these compounds in bulk and synthetic mixtures. Additionally, the method's greenness and environmental safety were evaluated using eco-analytical metrics (eco-analytical scale, NEMI, GAPI, and AGREE), confirming its ecological suitability.

2024, Theoretical Computer Science

2024

The Transparent Psi Project (TPP) was intended as a potential model and learning experience for optimal research practices. The TPP is, to my knowledge, the only research study in psychology that was designed to use methods that are comparable to my experience in regulated clinical research. These methods include formal software validation, measures to prevent experimenter fraud, developing the operating characteristics for Bayesian analyses, and research audits. TPP will hopefully become a useful model for high quality research methodology.
The purpose of this audit is to apply to the TPP my experience with research audits in regulated clinical research. I initiated the audit after reviewing the final reports of previous audits of TPP by an IT auditor and two research auditors, all with academic backgrounds. In reviewing their final reports, it was clear that their expectations were very different from mine. The previous audits did not have the range of topics, details, emphasis on protocol deviations, and edginess consistent with my expectations for an audit. My hope is to provide an audit that demonstrates useful expectations based on experience outside of academic research.
The goals of a good academic research audit have two components: (a) verify that recognized good research practices were used, and (b) verify that the actions and practices specified in the preregistration and/or protocol were properly implemented.

2024

The lessons and recommendations described here were developed from conducting a research audit of the Transparent Psi Project (TPP). The TPP was intended as a potential model and learning experience for optimal research practices. The topics discussed here are among the potential lessons from the TPP. The recommendations are: (1) Validation of data collection software is essential. (2) Evaluation of power or operating characteristics (inferential errors) is essential, including for sequential Bayesian analyses. (3) An online repository copy of data is very valuable, but easy-to-use, reliable, secure processes remain to be developed. (4) A good research audit is much more extensive than current peer review for publication, including Registered Reports. (5) Quality control (QC) comes before auditing. (6) Good research practice guidelines and/or standard operating procedures (SOPs) for general use would be very valuable for confirmatory psychological research. (7) It is time to implement routine measures to prevent experimenter fraud. (8) Git appears not to be optimal for software tracking to prevent programming fraud.

2024, HAL (Le Centre pour la Communication Scientifique Directe)

This extended abstract draws on a theory of software testing based on formal specifications to point out the benefits and limits of using formal methods to this end. A notion of exhaustive test set is defined according to the semantics of the formal notation, the conformance relation considered, and some testability hypotheses on the system under test. This gives a framework for formalising test selection, test execution, and oracles, and moreover makes explicit the hypotheses underlying test selection strategies, such as uniformity hypotheses or regularity hypotheses. Making these hypotheses explicit provides guidance for complementary proofs, tests, or instrumentations of the system under test. This approach has been applied to various formalisms: axiomatic specifications of data types, model-based specifications, process algebras, transition systems, etc. It provides guiding principles for the development of testing methods given a formal specification notation and an associated conformance/refinement relation. It is at the origin of the development of test environments based on SMT solvers and theorem provers.
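
The idea of an exhaustive test set cut down by a regularity hypothesis can be illustrated with a toy example. The following sketch (entirely hypothetical, not from the abstract) uses the classic stack axioms as the oracle and bounds the test set to stacks of size at most 3 over a two-value domain, a finite approximation of the exhaustive test set:

```python
import itertools

# Implementation under test: stacks as tuples.
def push(s, x): return s + (x,)
def pop(s): return s[:-1]
def top(s): return s[-1]

# Regularity hypothesis: test only stacks of size <= 3 over a small domain.
VALUES = (0, 1)
def stacks_up_to(n):
    for k in range(n + 1):
        yield from (tuple(p) for p in itertools.product(VALUES, repeat=k))

# Oracle: the axioms pop(push(s, x)) == s and top(push(s, x)) == x,
# checked on every stack in the bounded test set.
def conforms():
    for s in stacks_up_to(3):
        for x in VALUES:
            if pop(push(s, x)) != s or top(push(s, x)) != x:
                return False
    return True

print(conforms())  # → True
```

The regularity hypothesis is exactly what justifies trusting the bounded check: if the implementation behaves uniformly beyond size 3, passing this finite set is evidence of conformance to the full (infinite) exhaustive test set.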

2024, Journal of Computer Science and Technology

When building dependable systems by integrating untrusted software components that were not originally designed to interact with each other, architectural mismatches related to assumptions about their failure behaviour are likely to occur. These mismatches, if not prevented during system design, have to be tolerated at runtime. This paper presents an architectural abstraction based on exception handling for structuring fault-tolerant software systems. This abstraction comprises several components and connectors that promote an existing untrusted software element into an idealised fault-tolerant architectural element. Moreover, it is considered in the context of a rigorous software development approach based on formal methods for representing the structure and behaviour of the software architecture. The proposed approach relies on formal specification and verification for analysing exception propagation, and for verifying important dependability properties, such as deadlock freedom, and scenarios of architectural reconfiguration. The formal models are automatically generated using model transformation from UML diagrams: a component diagram representing the system structure, and sequence diagrams representing the system behaviour. Finally, the formal models are also used for generating unit and integration test cases that are used for assessing the correctness of the source code. The feasibility of the proposed architectural approach was evaluated on an embedded critical case study.
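
The "idealised fault-tolerant architectural element" pattern can be sketched informally as a wrapper that gives an untrusted component a disciplined exception interface. This is a minimal Python illustration, not the paper's formal UML/model-transformation approach; the component, exception names, and contract check are all hypothetical:

```python
class InterfaceException(Exception):
    """Signalled back to the client: the request itself was invalid."""

class FailureException(Exception):
    """Signalled back to the client: the component failed internally."""

class IdealisedElement:
    """Promotes an untrusted component into an element whose only
    observable outcomes are a normal response, an interface exception,
    or a declared failure exception."""
    def __init__(self, component):
        self.component = component

    def service(self, request):
        if not isinstance(request, int):        # contract check on the request
            raise InterfaceException(request)
        try:
            return self.component(request)      # normal activity
        except Exception as exc:                # undeclared internal fault
            raise FailureException(str(exc))    # propagate in declared form

def untrusted(x):
    return 10 // x   # hypothetical component; fails internally on x == 0

elem = IdealisedElement(untrusted)
print(elem.service(5))   # → 2 (normal response)
try:
    elem.service(0)
except FailureException:
    print("failure signalled")   # internal fault surfaced as a declared exception
```

The point of the wrapper is architectural: clients never see the component's raw, undeclared failures, so failure-behaviour mismatches between components become explicit, analysable exception flows.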

2024, FOOD ANALYTICAL METHODS

Animal-derived gelatins are widely used in a variety of foods such as biscuits, chewing gum, chocolate, jelly beans and confectionery, as well as in medicines and food supplements. However, high temperatures, chemical treatments and other processes used in production can damage the animal DNA in the gelatin. This makes it difficult to extract DNA from gelatin-containing products and prevents the detection of animal species by methods such as real-time polymerase chain reaction (RT-PCR). This problem clearly demonstrates the need for new and more advanced technologies for species identification: where traditional methods are inadequate, more sensitive and reliable approaches should be developed. In this study, verification studies for species detection were conducted on 124 animal-containing gelatin products that could not be identified by RT-PCR because of DNA degradation, using Quadrupole (Q) Exactive Orbitrap, a new-generation molecular technology. Sixty-two samples containing bovine gelatin were spiked with 1% pork gelatin, and all 62 were found positive for pork by Orbitrap. Similarly, all 62 samples containing pork gelatin spiked with 1% bovine gelatin were detected as bovine-positive. In this context, beef and pork species were successfully detected at an LOD of 1%. The developed method was verified in terms of sensitivity, accuracy, and precision. The limit of quantification (LOQ) was 10.0 μg/L and the limit of detection (LOD) was found to be 1%. Furthermore, the method was successfully applied to positive and negative controls.

2024, International Journal of Science and Research

Payment transaction security is a critical component of modern financial systems, as digital transactions become increasingly prevalent across mobile, online, and in-person platforms. This field encompasses technologies and protocols designed to protect sensitive financial information from unauthorized access, fraud, and cyber threats while ensuring a seamless user experience. Key components include secure cryptographic protocols and tokenization methods that replace card details with secure digital tokens to mitigate the risks of data breaches. The evolution of payment security is driven by regulatory mandates (e.g., PSD2/SCA, GDPR) and industry standards (e.g., PCI-DSS, ISO 8583), which require stringent security measures to safeguard both consumers and financial institutions. Despite these advancements, emerging technologies and the increasing complexity of the digital payments ecosystem pose ongoing challenges. The convergence of mobile wallets, contactless payments, and eCommerce transactions requires adaptable security frameworks that balance high-level protection with ease of implementation and scalability. Future developments will likely focus on enhancing interoperability between payment platforms, refining AI-driven anomaly detection, and adapting cryptographic techniques to counter new types of fraud. Ensuring a secure transaction environment remains essential for fostering consumer trust and supporting the growth of digital payment solutions worldwide.
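
The tokenization idea mentioned above, replacing card details with opaque tokens so that merchant systems never store the real card number, can be sketched as follows. This is an illustrative toy, not a PCI-DSS-compliant design; the `TokenVault` class and token format are hypothetical:

```python
import secrets

class TokenVault:
    """Toy tokenization service: the primary account number (PAN) is
    stored only inside the vault; merchants hold an opaque token that
    cannot be reversed without access to the vault's mapping."""
    def __init__(self):
        self._by_token = {}

    def tokenize(self, pan):
        token = "tok_" + secrets.token_hex(8)   # random, non-derivable from the PAN
        self._by_token[token] = pan
        return token

    def detokenize(self, token):
        # Only the vault (e.g. the payment processor) can do this.
        return self._by_token[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")          # well-known test PAN
print(t.startswith("tok_"), vault.detokenize(t) == "4111111111111111")  # → True True
```

Because the token carries no information about the PAN, a breach of a merchant database that stores only tokens exposes nothing directly usable for fraud, which is the risk-mitigation property the abstract refers to.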

2024, International Journal of Pharmacy and Pharmaceutical Sciences

Objective: To develop a simple, accurate, precise and linear reverse-phase high-performance liquid chromatographic (RP-HPLC) method and verify the quantitative estimation (dissolution) of bromelain in delayed-release tablets. Methods: The optimized RP-HPLC method for both the acid- and buffer-stage dissolutions of delayed-release tablets uses a Zorbax 300 SB-C8 column (150 mm × 4.6 mm; 3.5 μm), mobile phase A of 0.1% trifluoroacetic acid in water and mobile phase B of 0.1% trifluoroacetic acid in acetonitrile in a gradient proportion, a flow rate of 1.0 ml/min, an injection volume of 25 µl, a detection wavelength of 280 nm using a UV/PDA detector, a column temperature of 40 °C, a sample tray/compartment temperature of 5 °C and a run time of 20 min. Results: The developed method gave bromelain eluting at about 6 min. Bromelain exhibited linearity in the range 53.4-800.6 μg/ml (r² = 0.99992). Precision is exemplified by relative standard deviations of 1.3% and 2.3% for the acid and buffer stages, respectively. Percentage recovery of the drug was found to be between 90.0 and 110.0 during accuracy studies. Conclusion: A simple, accurate, precise, and linear RP-HPLC method was developed and verified for the quantitative estimation (dissolution) of bromelain in tablets, and hence this method can be explored for the analysis of bromelain in tablets in various pharmaceutical industries.

2024

Classification is a key ontology reasoning task. Several highly optimised OWL reasoners are designed for different fragments of OWL 2. Combining these delegate reasoners to classify a single ontology offers potential benefits, but these may be offset by overheads or redundant subsumption tests. In this paper, we show that, with the help of atomic decomposition, a known ontology partitioning approach, these redundant subsumption tests can be avoided. We design and implement our classification algorithms and evaluate them empirically.

2024, The Semantic Web - ISWC 2015

OWL 2 DL is a complex logic with reasoning problems that have a high worst-case complexity. Modern reasoners mostly perform very well on naturally occurring ontologies of varying sizes and complexity. This performance is achieved through a suite of complex optimisations (with complex interactions) and elaborate engineering. While the formal basis of the core reasoner procedures is well understood, many optimisations are less so, and most of the engineering details (and their possible effects on reasoner correctness) are unreviewed by anyone but the reasoner developer. Thus, it is unclear how much confidence should be placed in the correctness of implemented reasoners. To date, there is no principled, unit-test-like correctness suite for simple language features and, even if there were, it is unclear that passing such a suite would say much about correctness on naturally occurring ontologies. This problem is not merely theoretical: divergence in behaviour (and thus the known bugginess of implementations) has been observed in the OWL Reasoner Evaluation (ORE) contests, to the point where a simple majority-voting procedure has been put in place to resolve disagreements. In this paper, we present a new technique for finding and resolving reasoner disagreement. We use justifications to cross-check disagreements. Some cases are resolved automatically; others need to be manually verified. We evaluate the technique on a corpus of naturally occurring ontologies and a set of popular reasoners. We successfully identify several correctness bugs across different reasoners, identify causes for most of these, and generate appropriate bug reports and patches to ontologies to work around the bugs.
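
The majority-voting step used in the ORE contests (the baseline the paper improves on) amounts to a small tally over the reasoners' answers. The following sketch is purely illustrative; the reasoner names and the subsumption query are hypothetical:

```python
from collections import Counter

def majority(verdicts):
    """Resolve a disagreement on one entailment question (e.g. 'does C
    subsume D?') by taking the most common verdict; also report which
    reasoners dissent, since those are candidates for bug investigation."""
    (answer, _count), = Counter(verdicts.values()).most_common(1)
    dissenters = [r for r, v in verdicts.items() if v != answer]
    return answer, dissenters

# Hypothetical answers from three reasoners to the same subsumption test.
verdicts = {"ReasonerA": True, "ReasonerB": True, "ReasonerC": False}
answer, dissenters = majority(verdicts)
print(answer, dissenters)  # → True ['ReasonerC']
```

The paper's contribution is precisely that a vote alone cannot say *why* the minority reasoner disagrees; extracting justifications for the disputed entailment turns the dissenting answer into a checkable, reportable bug.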

2024

Very expressive Description Logics in the SH family have worst-case complexity ranging from EXPTIME to double NEXPTIME. In spite of this, they are very popular with modellers and serve as the foundation of the Web Ontology Language (OWL), a W3C standard. Highly optimised reasoners handle a wide range of naturally occurring ontologies with relative ease, albeit with some pathological cases. A recent optimisation trend has been modular reasoning, that is, breaking the ontology into hopefully easier subsets with a hopefully smaller overall reasoning time (see MORe and Chainsaw for prominent examples). However, it has been demonstrated that subsets of an OWL ontology may be harder, even much harder, than the whole ontology. This introduces the risk that modular approaches might have even more severe pathological cases than the normal monolithic ones. In this paper, we analyse a number of ontologies from the BioPortal repository in order to isolate cases where random subsets are harder than the whole. For such cases, we then examine whether the module nearest to the random subset also exhibits the pathological behaviour.

2024

VeriFast is a sound modular formal verification tool for C and Java programs. It accepts programs annotated with preconditions and postconditions written in separation logic as input and verifies the correctness of the code with respect to these annotations. In this paper, we present Automated VeriFast, a new extension, an automation layer on top of VeriFast, that, given a partially annotated program, attempts to incrementally improve the annotations, e.g. by inferring a fix to the specification of a program fragment that fails to verify. Our thesis is that such small, interactive inference steps have practical benefits over non-interactive specification inference approaches, by allowing the user to guide the inference process and by being simpler and therefore more predictable and diagnosable. Our current prototype is able to infer preconditions, postconditions, and loop invariants for some programs that manipulate linked lists.

2024

We propose in this paper an algebraic approach to hardware/software partitioning in the Verilog Hardware Description Language (HDL). We explore a collection of algebraic laws for Verilog programs, from which we design a set of syntax-based algebraic rules to conduct hardware/software partitioning. The co-specification language and the target hardware and software description languages are specific subsets of Verilog. Through this, the correctness of the partitioning process can be verified by an algebra of ...

2024, international journal of chemical sciences

An accurate and precise HPLC method was developed for the simultaneous determination of perindopril and amlodipine. Separation of the drugs was achieved on a reverse-phase C18 column using a mobile phase consisting of phosphate buffer and acetonitrile in the ratio 65:35 v/v. The flow rate was 0.6 mL/min and the detection wavelength was 237 nm. Linearity was observed in the range 10-50 µg/mL for amlodipine and 200-1000 µg/mL for perindopril. The proposed method was validated for linearity, accuracy, precision and robustness, and can be employed to estimate the drug contents in marketed formulations.

2024

A rapid, precise, accurate, specific and sensitive reverse-phase liquid chromatographic method has been developed for the estimation of valsartan in pure form and tablet formulation. The chromatographic method was standardized using an Xterra C18 column (100 × 4.6 mm I.D., 5 µm particle size) with UV detection at 210 nm and a flow rate of 1 ml/min. A mobile phase consisting of a mixture of phosphate buffer pH 3 and acetonitrile in the ratio 50:50 v/v was selected. The proposed method was validated for sensitivity, linearity, accuracy and precision. The retention time for valsartan was 4.450 min. Recovery was within the range 98.6-101.2%. The percentage RSD for the precision and accuracy of the method was found to be less than 2%. This method can be employed for routine quality control analysis of valsartan in tablet dosage forms.

2024

Jakarta workshop participants and GAW APP SAG members • The SAG contributes to enhancing exchanges between the GAW community and the end-user and modelling communities that request atmospheric composition data, especially for near-real-time data applications on regional to global scales. • Possible focus area on the Arctic as a joint study with YOPP, PACES and AMAP?

2024, GI Jahrestagung (1)

In the following, we describe the use of a prototype for the formal verification of ASCET-SD models in the context of the active steering system currently being developed by BMW [EPK+02]. The prototype was used to check safety-relevant properties of the shutdown logic, which is a central component of the active steering control.

2024, Journal of analytical & pharmaceutical research

HPTLC method and chromatographic conditions. Sample application: The standard and formulation samples of luliconazole were spotted on precoated TLC plates in the form of narrow bands 6 mm in length, positioned 8.0 mm from the bottom edge, with a 15 mm left margin and a 10 mm distance between bands. Samples were applied under a continuous drying stream of nitrogen gas at a constant application rate.

2024, World journal of analytical chemistry

A sensitive, selective and precise high-performance thin-layer chromatographic method has been developed and validated for the determination of tolperisone hydrochloride in bulk drug and in formulation. The method employed HPTLC aluminium plates pre-coated with silica gel 60 F254 as the stationary phase, with methanol:ethyl acetate (3:7, v/v) as the solvent system. The Rf value of tolperisone hydrochloride was observed to be 0.50 ± 0.008. Densitometric analysis was carried out in absorbance mode at 261 nm. Linear regression analysis of the calibration plots showed a good linear relationship for tolperisone hydrochloride over the concentration range of 50-800 ng/spot. The limits of detection and quantification were found to be 7.57 and 10 ng/spot, respectively. Tolperisone hydrochloride was subjected to acid and alkali hydrolysis, chemical oxidation, wet hydrolysis, dry-heat degradation and sunlight degradation. The degradation product peaks were well resolved from the pure drug peak, with significant differences in their Rf values. Stressed samples were assayed using the developed HPTLC method. The proposed method was validated with respect to linearity, accuracy, precision and robustness, and was successfully applied to the estimation of tolperisone in tablet dosage forms. Statistical analysis showed that the method is suitable for the routine analysis of tolperisone in tablets.
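Limits of detection and quantification like those reported above are conventionally derived from the calibration line using the ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the regression and S its slope. The sketch below illustrates the computation with an invented calibration data set (not the paper's):

```python
import statistics

# Invented calibration data: concentration (ng/spot) vs. peak area
conc = [50, 100, 200, 400, 600, 800]
resp = [410, 805, 1620, 3190, 4840, 6390]

n = len(conc)
mean_x, mean_y = statistics.mean(conc), statistics.mean(resp)

# Least-squares slope (S) and intercept of the calibration line
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, resp)) \
        / sum((x - mean_x) ** 2 for x in conc)
intercept = mean_y - slope * mean_x

# Residual standard deviation (sigma) of the regression, n-2 dof
residuals = [y - (slope * x + intercept) for x, y in zip(conc, resp)]
sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5

lod = 3.3 * sigma / slope   # limit of detection
loq = 10.0 * sigma / slope  # limit of quantification
```

By construction LOQ/LOD = 10/3.3, so the two limits always scale together for a given calibration.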

2024

Software products cover many aspects of our everyday life, as they are used in different application domains such as communication, healthcare, military, and transportation. Thus, our modern life cannot be imagined without software. The extensive demand for and use of different software products in our day-to-day activities have significantly increased their functionality, size, and complexity. As a result, the number and types of software faults have also increased. Software faults lead not only to financial losses but also to loss of lives. Therefore, faults should be fixed as soon as they are found. Finding the locations of faults in software systems has historically been a manual task known to be tedious, expensive, and time-consuming, particularly for large-scale software systems. Moreover, manual fault localization depends on the developer's experience to find and prioritize code elements that are likely to be faulty. Developers spend almost half or more of their time on finding faults alone. Therefore, there is a serious need for automatic fault localization techniques that help developers effectively find the locations of faults in software systems with minimal human intervention. Software fault localization is a significant research topic in software engineering. Despite starting in the late 1950s, software fault localization research has gained more attention in the last few decades. Researchers and developers have proposed and implemented different types of fault localization techniques. However, Spectrum-based Fault Localization (SBFL) is considered among the most prominent techniques in this respect due to its efficiency and effectiveness, its lightweight and language-agnostic nature, its ease of use, and its relatively low execution-time overhead.
In SBFL, the probability of each program element (e.g., statement, function, or class) being faulty is calculated based on the results of executing test cases and their corresponding code coverage information. Currently, SBFL is not yet widely adopted in the industry as it poses several issues and its performance is affected by several influential factors. Therefore, addressing SBFL issues can lead to improving its effectiveness and making it widely used. This PhD thesis aims to improve the effectiveness of SBFL, the most common fault localization technique, by addressing some of the most important challenges and issues posed by it.
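The core SBFL computation described above can be sketched in a few lines: from a per-test coverage matrix and pass/fail outcomes, each program element receives a suspiciousness score. The sketch below uses the well-known Ochiai formula on an invented toy program; the thesis itself studies many such formulas and influential factors.

```python
import math

# coverage[t][e] = 1 if test t executed program element e (toy data)
coverage = [
    [1, 1, 0, 1],  # test 0 (passed)
    [1, 0, 1, 1],  # test 1 (failed)
    [0, 1, 1, 1],  # test 2 (passed)
]
failed = [False, True, False]  # outcome of each test

def ochiai(coverage, failed):
    """Ochiai suspiciousness: ef / sqrt(total_failed * (ef + ep))."""
    total_failed = sum(failed)
    scores = []
    for e in range(len(coverage[0])):
        # ef / ep: failing / passing tests that executed element e
        ef = sum(1 for t, row in enumerate(coverage) if row[e] and failed[t])
        ep = sum(1 for t, row in enumerate(coverage) if row[e] and not failed[t])
        denom = math.sqrt(total_failed * (ef + ep))
        scores.append(ef / denom if denom else 0.0)
    return scores

scores = ochiai(coverage, failed)
# Rank elements from most to least suspicious for developer inspection
ranking = sorted(range(len(scores)), key=lambda e: -scores[e])
```

Elements covered mainly by failing tests rank first; element 1, covered only by passing tests, scores zero and ranks last.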

2024, International Energy Agency, Annex

2024, International journal of pharmaceutical investigation

Introduction: Favipiravir re-emerged as an antiviral drug at the beginning of the COVID-19 pandemic as an urgent therapeutic alternative for treating COVID-19. A literature survey reveals that there is no reported stability-indicating HPTLC method for this non-pharmacopoeial drug. The current research therefore focuses on the development and validation of a stability-indicating HPTLC (SI-HPTLC) method for the quantification of favipiravir (FVP) in bulk and in a commercial film-coated tablet. Materials and Methods: Silica gel 60 F254 plates were used as the stationary phase, and chromatographic separation was carried out using toluene:methanol:ethyl acetate:formic acid (7:1:2:0.5) as the mobile phase. Densitometric scanning of the well-resolved FVP bands was carried out at 325 nm. The proposed SI-HPTLC method was validated as per the ICH Q2(R1) guidelines. Results: The method was found to be sensitive, selective, accurate and precise, and can be applied to routine in-process quality control testing of commercial film-coated tablets of FVP. FVP was subjected to various stress conditions in a forced degradation study and was found to be most susceptible to alkaline degradation. The alkaline degradants were further subjected to LC-MS analysis in order to reveal the possible degradation pathway of FVP under alkaline conditions. Conclusion: The developed HPTLC method is sensitive, accurate, selective and precise, and can be applied to the quantification of favipiravir in bulk and in commercial film-coated tablets.

2024, arXiv (Cornell University)

A major determinant of the quality of software systems is the quality of their requirements, which should be both understandable and precise. Most requirements are written in natural language, which is good for understandability but lacking in precision. To make requirements precise, researchers have for years advocated the use of mathematics-based notations and methods, known as "formal". Many exist, differing in their style, scope and applicability. The present survey discusses some of the main formal approaches and compares them to informal methods. The analysis uses a set of 9 complementary criteria, such as level of abstraction, tool availability, and traceability support. It classifies the approaches into five categories: general-purpose, natural-language, graph/automata, other mathematical notations, and seamless (programming-language-based). It presents approaches in all of these categories, altogether 22 different ones, including, for example, SysML, Relax, Eiffel, Event-B, and Alloy. The review discusses a number of open questions, including seamlessness, the role of tools and education, and how to make industrial applications benefit more from the contributions of formal approaches. This is the full version of the survey, including some sections and two appendices which, because of length restrictions, do not appear in the submitted version.

2024, ArXiv

Writing requirements for embedded software is pointless unless they reflect actual needs and the final software implements them. In usual approaches, the use of different notations for requirements (often natural language) and code (a programming language) makes both conditions elusive. To address the problem, we propose to write requirements in the programming language itself. The expected advantages of this seamless approach, called AutoReq, include: avoiding the potentially costly mismatches due to the use of different notations; facilitating software change and evolution, by making it easier to update code when requirements change and conversely; benefiting from the remarkable expressive power of modern object-oriented programming languages, while retaining a level of abstraction appropriate for requirements; and leveraging, in both requirements and code, the ideas of Design by Contract, including (as the article shows) applying Hoare-style assertions to express temporal-logic-style proper...
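As a rough analogue of the idea (the article itself targets Eiffel; the class, names and bounds below are entirely hypothetical), a requirement can be stated directly as Design-by-Contract assertions on the routine that implements it, so requirement and code share one notation:

```python
class TankController:
    """Hypothetical embedded controller used only to illustrate the idea."""
    MAX_LEVEL = 100.0  # assumed safety limit, not from the paper

    def __init__(self):
        self.level = 0.0
        self.valve_open = False

    def fill(self, amount):
        # Precondition -- requirement: filling needs an open valve
        assert self.valve_open, "requirement violated: valve must be open"
        assert amount >= 0.0
        old_level = self.level
        self.level = min(self.level + amount, self.MAX_LEVEL)
        # Postconditions -- requirements on the effect of filling
        assert self.level <= self.MAX_LEVEL, "safety: no overflow"
        assert self.level >= old_level, "level never decreases on fill"
```

Because the contract lives next to the code, a change to the requirement (say, a new MAX_LEVEL) and a change to the implementation cannot silently drift apart, which is the seamlessness argument the abstract makes.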

2024, Bulletin of the American Meteorological Society

The Gradient in Longitude of Atmospheric Constituents above the Mediterranean Basin (GLAM) airborne campaign was set up to investigate the summertime variability of gaseous pollutants, greenhouse gases, and aerosols between the western (∼3°E) and eastern (∼35°E) sections of the Mediterranean basin as well as how this connects with the impact of the Asian monsoon anticyclone on the eastern Mediterranean in the mid- to upper troposphere (∼5–10 km). GLAM falls within the framework of the Chemistry–Aerosol Mediterranean Experiment (ChArMEx) program. GLAM used the French Falcon-20 research aircraft to measure aerosols, humidity, and chemical compounds: ozone, carbon monoxide, methane, and carbon dioxide. GLAM took place between 6 and 10 August 2014, following a route from Toulouse (France) to Larnaca (Cyprus) and back again via Minorca (Spain), Lampedusa (Italy), and Heraklion (Crete, Greece). The aircraft flew at an altitude of 5 km on its outbound journey and 10 km on the return leg. G...

2024

Information technology has become a hub of resources, and people's demand grows with each passing day, resulting in high demand for products and services and high expectations for on-time delivery. Organizational strategies make immense use of the Internet and the PC, providing a strong competitive edge in serving customers. Our approach here is primarily focused on reviewing the Software Development Life Cycle (SDLC), from the development of software to its management, examining software development through the various widely known SDLC models. This review covers existing SDLC models including the Waterfall, V-shaped, Spiral, and Agile models, as well as recent models like DevOps. It provides a critical analysis of these models along with a comparative analysis considering various constraints. Finally, it reviews the integration of machine learning with SDLC models to identify future research directions.

2024, Lecture Notes in Computer Science

This paper sketches a hard real-time programming language featuring operators for expressing timeliness requirements in an abstract, implementation-independent way and presents parts of the design and verification of a provably correct code generator for that language. The notion of implementation correctness used as an implicit specification of the code generator pays attention to timeliness requirements. Hence, formal verification of the code generator design is a guarantee of meeting all deadlines when executing generated code.

2024, Lecture Notes in Computer Science

2024, Journal of Materials Processing Technology

In this study, a distributed multi-agent system (DMAS) for sheet metal forming is proposed. The system consists of functional agents such as a die design agent, knowledge management agent, validation and verification agent, product description agent, operation planning agent, calculating and dimensioning agent, artificial intelligence agent, and assembly and disassembly agent. There is also a design mediator. Each of the functional agents is enhanced software and can be considered an expert in classical die design. This paper provides the theoretical framework for future research. The enhanced system has been implemented on a simple sheet metal part, showing that die design of sheet metal parts can successfully be realized using a multi-agent system.

2024, Dimension

The goals of language education in the United States have always been informed by the social, historical, and political contexts in which the instruction takes place. In this paper, we make the case for social justice education in all language classrooms, and we explore the different threads of scholarship that inform social justice in language education. We begin with the Communities and Cultures standards, and then discuss critical pedagogy and transformative learning. Avenues and opportunities are explored for effective social justice instruction at the lesson planning and course design level, and for professional development. This paper concludes with a call to action for all language teachers.

2024, Pacific Asia Conference on Information Systems

Since an increasing number of business rules management solutions (BRMS) are utilized, organizations search for guidance in designing such solutions. As the number of BRMS implementations increases, the number of implementation challenges experienced in practice increases as well. It is therefore important to gain insight into these challenges to help guide future BRMS implementations. Smit, Zoet and Versendaal (2017) described the challenges regarding the elicitation, design and specification of business decisions and business logic; in this study, we identify the main challenges regarding 1) the verification and 2) the validation of business decisions and business logic in the Dutch governmental context. Building on the collection and analysis of two three-round focus groups and two three-round Delphi studies, we report on the 17 challenges experienced by the participants. The presented results provide a grounded basis from which empirical and practical research on best practices can be further explored.

Table 1. Implementation-dependent business rule (in a decision table format)
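One classic verification check on rules in decision-table form is completeness: every combination of condition values must be covered by some rule. A minimal sketch, with invented boolean conditions and rules (not the table from the paper):

```python
from itertools import product

# Each rule maps a combination of boolean conditions to an outcome.
# Conditions here: (is_member, is_senior) -- invented for illustration.
rules = {
    (True,  False): "10% discount",
    (True,  True):  "20% discount",
    (False, True):  "5% discount",
    # (False, False) is deliberately left out -> an incompleteness defect
}

def missing_combinations(rules, n_conditions):
    """Return condition combinations not covered by any rule."""
    space = product([True, False], repeat=n_conditions)
    return [c for c in space if c not in rules]

missing = missing_combinations(rules, 2)  # -> [(False, False)]
```

A verification agent or tool would flag the uncovered combination so that analysts can either add a rule or document the gap as intentional.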

2024, Pacific Asia Conference on Information Systems

Business rules management solutions are widely applied, standalone or in combination with business process management solutions. Yet scientific research on business rules management solutions is limited. The purpose of this paper is to define the business rules management solution problem space. Using contingency theory and relational theory as our lens, we conducted a qualitative study on 39 business rules management solutions. The range of data sources included interviews and document analysis. From the qualitative study, six situational factors have been defined to classify the business rules management solution space: 1) value proposition, 2) approach, 3) standardization, 4) change frequency, 5) n-order compliance, and 6) integrative power of the software environment. The six factors can be clustered in three structures: 1) deep structure, 2) physical structure, and 3) organizational structure. The classification of the problem space provides a framework for the analysis of business rules management solutions.

2024, Indian Journal of Pharmaceutical Education and Research

Context: A simple, selective, precise, and stability-indicating High-Performance Thin-Layer Chromatography (HPTLC) method for the analysis of Metformin Hydrochloride (MET) and Benfotiamine (BENT), both as bulk drugs and in their combined formulation, has been developed. Method: The basic aim of this method is to separate both drugs by HPTLC and measure their spots at 249 nm. The separation was carried out on TLC aluminium sheets of silica gel 60 F254 using benzene:methanol:triethylamine (8.5:1:0.5, v/v/v) as the mobile phase. The stability of MET and BENT was assessed by a forced degradation study. Result: MET and BENT gave distinct and well-defined peaks at Rf 0.26 and 0.72, respectively. Calibration curves were linear in the ranges of 500-3000 and 75-450 ng/spot for MET and BENT, respectively. The method was successfully applied to the tablet formulation. The stability study showed that the chromatograms of samples degraded with acid, base, hydrogen peroxide, light and dry heat contained well-separated spots of pure MET and BENT as well as some additional peaks at different Rf values. Conclusion: The HPTLC method was also able to selectively quantitate Metformin Hydrochloride and Benfotiamine in the presence of their degradation products obtained in the forced degradation study. Hence, the method can be used as stability-indicating. The method was validated as per ICH guidelines and applied to the analysis of a pharmaceutical dosage form containing these two drugs.

2024, Atmospheric Chemistry and Physics

Between December 2005 and 2013, the In-service Aircraft for a Global Observing System (IAGOS) program produced almost daily in situ measurements of CO and O3 between Europe and southern Africa. IAGOS data combined with measurements from the Infrared Atmospheric Sounding Interferometer (IASI) instrument aboard the Metop-A satellite (2008-2013) are used to characterize meridional distributions and seasonality of CO and O3 in the African upper troposphere (UT). The FLEXPART particle dispersion model and the SOFT-IO model, which combines the FLEXPART model with CO emission inventories, are used to explore the sources and origins of the observed transects of CO and O3. We focus our analysis on two main seasons: December to March (DJFM) and June to October (JJASO). These seasons have been defined according to the position of the Intertropical Convergence Zone (ITCZ), determined using in situ measurements from IAGOS. During both seasons, the UT CO meridional transects are characterized by maximum mixing ratios located 10° from the position of the ITCZ above the dry regions inside the hemisphere of the strongest Hadley cell (132 to 165 ppb at 0-5° N in DJFM and 128 to 149 ppb at 3-7° S in JJASO) and decreasing values southward and northward. The O3 meridional transects are characterized by mixing ratio minima of ∼42-54 ppb at the ITCZ (10-16° S in DJFM and 5-8° N in JJASO), framed by local maxima on either side of the ITCZ. The O3 maxima correspond to old high-altitude air masses uplifted from either local or long-distance areas of high O3 precursor emissions (Africa and South America throughout the year, South Asia mainly in JJASO) and must be created during transport by photochemistry.

2024, Atmospheric Measurement Techniques

The In-service Aircraft for a Global Observing System (IAGOS) is a European research infrastructure that equips the Airbus A340/330 with a system for monitoring atmospheric composition. The IAGOS instruments have three different configurations: IAGOS-Core, IAGOS-Measurement of Ozone and Water Vapor by Airbus In-Service Aircraft (IAGOS-MOZAIC) and IAGOS-Civil Aircraft for the Regular Investigation of the atmosphere Based on an Instrument Container (IAGOS-CARIBIC). Since 1994, a total of 17 aircraft have been equipped. In this study, we perform an intercomparison of about 8000 landing and takeoff profiles to compare the O3 and CO measurements performed from these different configurations. The collocated profiles used in the study met various selection criteria. The first was a maximal 1 h time difference between an ascent or descent by two different aircraft at the same airport, and the second was a selection based on the similarity of air masses, as judged from the meteorological data acquired by the aircraft. We provide here an evaluation of the internal consistency of the O3 and CO measurements since 1994. For both O3 and CO, we find no drift in the bias amongst the different instrument units (six O3 and six CO IAGOS-MOZAIC instruments, nine IAGOS-Core Package1 instruments and the two instruments used in the IAGOS-CARIBIC aircraft). This result gives us confidence that the entire IAGOS database can be treated as one continuous program and is therefore appropriate for studies of long-term trends.

2024, Atmospheric Chemistry and Physics Discussions

In situ measurements in the upper troposphere - lower stratosphere (UTLS) are performed in the framework of the European research infrastructure IAGOS (In-service Aircraft for a Global Observing System) for ozone since 1994 and for carbon monoxide since 2002. The flight tracks cover a wide range of longitudes in the northern extratropics, extending from the North American western coast (125° W) to the eastern Asian coast (135° E), and more recently over the northern Pacific ocean. Different tropical regions are also sampled frequently, such as the Brazilian coast, central and southern Africa, southeastern Asia and the western Maritime Continent. As a result, a new set of climatologies for O3 (Aug. 1994-Dec. 2013) and CO (Dec. 2001-Dec. 2013) in the upper troposphere (UT), tropopause layer and lower stratosphere (LS) are made available, including quasi-global gridded horizontal distributions, and seasonal cycles over eight well s...

2024, Atmospheric Chemistry and Physics Discussions

Since 1994, the In-service Aircraft for a Global Observing System (IAGOS) program has produced in situ measurements of the atmospheric composition during more than 51000 commercial flights. In order to help analyze these observations and understand the processes driving the observed concentration distribution and variability, we developed the SOFT-IO tool to quantify source/receptor links for all measured data. Based on the FLEXPART particle dispersion model (Stohl et al., 2005), SOFT-IO simulates the contributions of anthropogenic and biomass burning emissions from the ECCAD emission inventory database for all locations and times corresponding to the measured carbon monoxide mixing ratios along each IAGOS flight. Contributions are simulated from emissions occurring during the last 20 days before an observation, separating individual contributions from the different source regions. The main goal is to supply added-value products to the IAGOS database by evincing the geographica...

2024, Application of Accelerated Stability Studies on Linagliptin by HPTLC

Linagliptin is an inhibitor of the dipeptidyl peptidase-4 (DPP-4) enzyme, which degrades the incretin hormones glucagon-like peptide-1 and glucose-dependent insulinotropic polypeptide, and is used to lower elevated blood sugar levels. A high-performance thin-layer chromatography method was used for the chromatographic separation of linagliptin. As stability testing is a major step in the development of a new drug as well as its formulation, stress degradation studies were carried out according to ICH guidelines. Linagliptin was found to be susceptible to all the analyzed stress conditions. Chromatographic resolution of linagliptin and its degradation products was achieved by using precoated silica gel 60 F254 aluminium plates as the stationary phase and chloroform:methanol (8.5:1.5, v/v) as the optimum mobile phase. Densitometric detection was carried out at 242 nm. The developed method has been successfully applied to the estimation of linagliptin in tablet dosage form.