Solid QA
Abstract
This document describes the Solid Quality Assurance (QA) policy, processes, and procedures. It details the requirements and recommendations for the publication and use of Solid technical reports, test suites, test cases, test assessments, and test reports to improve the quality of technical reports at critical development stages, promote wide deployment and proper implementation of technical reports through open implementation reports, help produce quality test suites, and advance the development and assessment of test cases.
Status of This Document
Table of Contents
- Abstract
- Status of This Document
- 1. Introduction
- 2. Conformance
- 3. Technical Report
- 4. Test Report
- 5. Test Suite
- 6. Test Assessment
- 7. Considerations
- A. Changelog
- B. Acknowledgements
- C. References
- C.1 Normative References
- C.2 Informative References
Introduction
This section is non-normative.
Specifications in the Solid ecosystem, i.e., Solid Technical Reports [SOLID-TECHNICAL-REPORTS], describe how implementations can be interoperable by using Web communication protocols, global identifiers, authentication and authorization mechanisms, data formats and shapes, notifications, and query interfaces.
Writing tests in a way that allows implementations to conform to the requirements of a technical report gives Solid projects confidence that their software is compatible with other implementations. This in turn gives authors of technical reports and software implementers confidence that they can rely on the Solid ecosystem to deliver on the promise of interoperability based on open standards. Implementation and interoperability experience can be verified by reporting on implementations passing open test suites.
The goal of this document is to describe the Solid Quality Assurance (QA) policy, processes, and procedures. The document details the requirements and recommendations for the publication and consumption of Solid technical reports, test suites, test cases, test assessments, and test reports to:
- improve the quality of technical reports at critical stages of their development;
- promote wide deployment and proper implementation of technical reports with open implementation reports;
- help produce quality test suites;
- advance the development and assessment of test cases.
This document is influenced by the W3C Quality Assurance activity work encompassing W3C processes, specification authoring and publishing, and quality assurance, including: W3C Process Document, Variability in Specifications, QA Framework: Specification Guidelines, The QA Handbook, Test Metadata, Evaluation and Report Language (EARL) 1.0 Schema.
This specification is for:
Terminology
This section is non-normative.
The Solid QA defines the following terms. These terms are referenced throughout this document.
A Uniform Resource Identifier (URI) provides the means for identifying resources [RFC3986].
Namespaces
Prefixes and Namespaces
| Prefix | Namespace | Description |
|---|---|---|
| dcterms | http://purl.org/dc/terms/ | [DC-TERMS] |
| doap | http://usefulinc.com/ns/doap# | DOAP |
| earl | http://www.w3.org/ns/earl# | [EARL10-Schema] |
| prov | http://www.w3.org/ns/prov# | [prov-o] |
| rdf | http://www.w3.org/1999/02/22-rdf-syntax-ns# | [rdf-schema] |
| skos | http://www.w3.org/2004/02/skos/core# | [skos-reference] |
| solid | http://www.w3.org/ns/solid/terms# | Solid Terms |
| spec | http://www.w3.org/ns/spec# | Spec Terms |
| td | http://www.w3.org/2006/03/test-description# | Test Description |
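For convenience, these prefixes correspond to the following Turtle declarations, which the examples in this document assume:

```turtle
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix doap: <http://usefulinc.com/ns/doap#> .
@prefix earl: <http://www.w3.org/ns/earl#> .
@prefix prov: <http://www.w3.org/ns/prov#> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
@prefix solid: <http://www.w3.org/ns/solid/terms#> .
@prefix spec: <http://www.w3.org/ns/spec#> .
@prefix td: <http://www.w3.org/2006/03/test-description#> .
```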
Data Formats
This specification uses the RDF language to describe technical reports, test reports, test suites, and test cases. Implementers are encouraged to produce human-readable and machine-readable representations with RDFa in host languages such as HTML and SVG.
Conformance
This section describes the conformance model of the Solid QA.
Normative and Informative Content
All assertions, diagrams, examples, and notes are non-normative, as are all sections explicitly marked non-normative. Everything else is normative.
The key words “MUST”, “MUST NOT”, “SHOULD”, and “MAY” are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here.
The key words “strongly encouraged”, “strongly discouraged”, “encouraged”, “discouraged”, “can”, “cannot”, “could”, “could not”, “might”, and “might not” are used for non-normative content.
Specification Category
The Solid QA identifies the following Specification Categories to distinguish the types of conformance: API, notation/syntax, set of events, processor behaviour, protocol.
Classes of Products
The Solid QA identifies the following Classes of Products for conforming implementations. These products are referenced throughout this specification.
A Technical Report is a document outlining recommendations and the level of conformance that various classes of products, processes, or services can achieve [QA GLOSSARY].
A Test Case is an individual test with a purpose that maps to measurable or testable behaviours, actions, or conditions in a Technical Report [QA GLOSSARY].
A Test Report is a resource that describes the level of conformance of a project’s tests against Test Cases.
A Test Suite is a collection of documents and software designed to verify an implementation's degree of conformance by using Technical Reports and Test Cases, and it generates Test Reports [QA GLOSSARY].
Interoperability
Interoperability of implementations for Test Suite and Technical Report is tested by evaluating an implementation’s ability to consume and process data that conform to this specification.
Technical Report
Technical Report Description
The Solid Technical Reports Contributing Guide provides the recommendations for publishing technical reports following the Linked Data design principles, where significant units of information, such as concepts and requirements, are given an identifier, and described with a concrete RDF syntax. The Spec Terms vocabulary provides classes and properties that can be used to describe any significant unit of information in technical reports, as well as supporting the description of test cases and test reports. The SKOS data model can be used to identify, describe, and link concepts and definitions across technical reports.
Add other requirements from Spec Terms and SKOS.
- One `spec:testSuite` property to refer to a test suite.
- One `spec:implementationReport` property to refer to an implementation report.

Defining URI Templates for the implementation report, e.g., https://solidproject.org/test-reports/{technical-report-short-name}/summary, and the test report, e.g., https://solidproject.org/test-reports/{technical-report-short-name}/{uuid}.
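For instance, a technical report description using these properties and URI templates could look roughly like the following sketch; the test suite URI and the short name "protocol" are illustrative:

```turtle
# Illustrative technical report description; "protocol" stands in
# for {technical-report-short-name} in the URI templates above.
<https://solidproject.org/TR/protocol>
  spec:testSuite <https://example.org/test-suite#suite> ;
  spec:implementationReport <https://solidproject.org/test-reports/protocol/summary> .
```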
Test Report
This section describes the requirements for publishing test reports and summaries.
Test Report Description
solid/test-suite-panel/issues/5
The Test Report Description includes information pertaining to conformance and interoperability of an implementation.
Document metadata:
- One `dcterms:license` property to indicate the license of the report.
- One `dcterms:created` property to indicate the date and time of the report.
- One publication status property (TBD) to state the publication status of the report.
- One activity association property (TBD) to indicate the agent that had a role in the production of the report.
- One submitted by property (TBD) to indicate the entity that proposed the report for publication.
- One approved by property (TBD) to indicate the entity that accepted the report for publication.
- Zero or one `dcterms:description` property to include additional notes about the report from the submitter.
Additional notes about the report can be expressed as a Web Annotation (`oa:Annotation`).

An agent (person or software) that had a role in the production of the test report can be expressed with a PROV-O activity association (`prov:wasAssociatedWith`).
For publication status, some TRs use `pso:holdsStatusInTime` with `pso:withStatus`. Is there something from a common vocabulary? What visual cues should be required to indicate publication status at a glance?
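A minimal sketch of such document metadata, assuming the PROV-O association mentioned above and omitting the TBD properties; the license, timestamp, and agent are illustrative:

```turtle
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

# Illustrative test report metadata
<#report>
  dcterms:license <https://creativecommons.org/licenses/by/4.0/> ;
  dcterms:created "2023-05-01T10:00:00Z"^^xsd:dateTime ;
  prov:wasAssociatedWith <https://example.org/test-harness#service> ;
  dcterms:description "Additional notes from the submitter."@en .
```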
Implementations that are software projects MUST be described with the Description of a Project vocabulary [DOAP].
Description of a project:
- One `rdf:type` property whose object is `doap:Project`.
- Zero or one `doap:name` property to state the name of the project.
- Zero or one `doap:repository` property to indicate the location of the project's source code.
- Zero or one `doap:release` property to indicate the version information of a project release.
- Zero or one `doap:maintainer` property to refer to the maintainers of a project.
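A minimal sketch of such a project description; the project name, repository, release, and maintainer WebID are illustrative:

```turtle
# Illustrative description of a software project under test
<https://example.org/project#this>
  rdf:type doap:Project ;
  doap:name "ExampleServer" ;
  doap:repository [ a doap:GitRepository ; doap:location <https://github.com/example/example-server> ] ;
  doap:release [ a doap:Version ; doap:revision "1.2.3" ] ;
  doap:maintainer <https://example.org/alice#me> .
```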
Note: Test Report Description: Description of a Project
The subject of the `doap:Project` for the Test Report Description coincides with the object of the `earl:subject` in the Test Assertion Description.
To help distinguish project releases in test reports, it is encouraged to use versioned URIs for projects.
While some information about the project can accompany the test report, it is encouraged that projects are self-describing documents.
What should be the recommendation for implementations that are not software projects? Perhaps equivalent to a top concept of `spec:ClassesOfProducts`?
Test assertions:
Test reports MUST incorporate additional information about test criteria indicating test authors and reviewers, review status, version of the test criterion, software and setup used to run the tests, provenance, coverage, and test suite (see also Test Assertion Description).
Test reports with approved status MUST NOT include assertions related to test criteria with rejected review status (`td:rejected`).
Note: Test Report Description: Test Assertion
To help distinguish test criteria, it is encouraged to use versioned URIs for criteria.
To convey the association between a test criterion and its reviews, it is encouraged to use the Web Annotation Vocabulary with the `oa:assessing` motivation.
Should Web Annotation Vocabulary be required to convey the relationship between test criteria and reviews?
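As a sketch of that association, assuming the Web Annotation Vocabulary is used; the review body and target criterion are illustrative:

```turtle
@prefix oa: <http://www.w3.org/ns/oa#> .

# Illustrative review of a test criterion
<#review-1>
  a oa:Annotation ;
  oa:motivatedBy oa:assessing ;
  oa:hasTarget <#test-criterion> ;
  oa:hasBody <https://example.org/reviews/1> .
```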
Test Case Description
- One `rdf:type` property whose object is `td:TestCase`.
- One `spec:requirementReference` property to refer to the specification requirement that the test case is testing.
- One `td:reviewStatus` property to indicate the status of a test (at the time when the test was run) with an object that is one of: `td:unreviewed`, `td:onhold`, `td:assigned`, `td:accepted`, `td:approved`, `td:rejected`.
- Zero or one `td:input` property to indicate the parameters or data that are needed for the test execution.
- One `td:expectedResults` property to indicate the results that a conformant implementation is expected to produce when this test is executed.
- Zero or one `td:preCondition` property to indicate a condition that must be met before the test is executed.
- Zero or one `td:purpose` property to state the reason for the test with additional context.
- Zero or one `dcterms:title` property to provide a human-oriented name for the test.
- Zero or one `dcterms:description` property to provide a description of the nature and characteristic of the test.
- One or more `dcterms:contributor` properties to indicate individuals or organisations that contributed to this test.
`spec:testScript` may need to be `rdfs:subPropertyOf` `td:input` or `td:informationResourceInput`, or use those properties instead.
When referencing a requirement with `spec:requirementReference`, it is strongly encouraged to use versioned URLs of technical reports where available, with preference to URLs covered by a persistence policy.
Example: Test Case

```turtle
@prefix sopr: <https://solidproject.org/TR/2022/protocol-20221231#> .

:server-content-type-reject
  a td:TestCase ;
  spec:requirementReference sopr:server-content-type ;
  td:reviewStatus td:unreviewed ;
  td:expectedResults :server-content-type-expected-result ;
  dcterms:contributor <https://csarven.ca/#i> .
```

A test case for a server rejecting requests, with a reference to expected results.
Test Assertion Description
A Test Assertion indicates measurable or testable statements of behaviour, action, or condition derived from a specification's requirements. A test assertion is stated by an entity carrying out the test, indicates the contextual result of a test produced with a particular process, and is based on a criterion that is used to evaluate an implementation.
Test assertions MUST use the Evaluation and Report Language 1.0 Schema [EARL10-Schema].
Test Suite implementers are encouraged to follow the Developer Guide for Evaluation and Report Language 1.0 [EARL10-Guide].
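A minimal EARL assertion sketch; the asserter and test subject URIs are illustrative, and the test refers to the example test case above:

```turtle
# Illustrative EARL assertion for one test case
<#assertion-1>
  a earl:Assertion ;
  earl:assertedBy <https://example.org/alice#me> ;
  earl:subject <https://example.org/project#this> ;
  earl:test :server-content-type-reject ;
  earl:mode earl:automatic ;
  earl:result [ a earl:TestResult ; earl:outcome earl:passed ] .
```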
Test Report Notification
GitHub repo/directory and/or report sent as an LDN (TBD: either including or requesting the maintainer's approval).
Should Activity Vocabulary and Linked Data Notifications be one of the ways to submit test reports as a notification?
Publication of reports can be pre-authorized for approved tests.
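If Activity Streams and Linked Data Notifications were adopted for submissions, a notification body POSTed to an inbox might look roughly like the following; the actor and report URIs are illustrative:

```turtle
@prefix as: <https://www.w3.org/ns/activitystreams#> .

# Illustrative notification body for submitting a test report via LDN
<> a as:Announce ;
  as:actor <https://example.org/alice#me> ;
  as:object <https://solidproject.org/test-reports/protocol/f81d4fae-7dec-11d0-a765-00a0c91e6bf6> ;
  as:summary "New test report for review"@en .
```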
Project Maintainer Input and Review
The project maintainer will be notified to review the publication of a report; if there is no objection within a certain time (TBD), it can be approved for publication by the submitter (or the test suite panel). During the review process, maintainers will be given the opportunity to provide explanatory notes to go with the report.
Implementation Report Description
The Implementation Report Description refers to the criteria and resolutions set forth by the Group publishing a Technical Report to demonstrate implementation experience, refers to test reports, and provides a summary.
Document metadata:
Similar to Test Report Description document metadata. Reuse or redefine?
Referencing individual test reports:
- One or more `spec:testReport` properties to refer to test reports.
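Following the URI templates above, an implementation report summary could reference individual test reports along these lines; the UUIDs are illustrative:

```turtle
# Illustrative implementation report summary referencing test reports
<https://solidproject.org/test-reports/protocol/summary>
  spec:testReport
    <https://solidproject.org/test-reports/protocol/f81d4fae-7dec-11d0-a765-00a0c91e6bf6> ,
    <https://solidproject.org/test-reports/protocol/7c9e6679-7425-40de-944b-e07fc1f90ae7> .
```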
Test Suite
This section describes the requirements for test suite and test metadata.
solid/test-suite-panel/issues/6
Test Suite Description
Description of test suites (e.g., license, date, written by, reviewed by), the software (e.g., repository, version, maintainers), the setup (e.g., platform, configuration), provenance (e.g., activity, entity, agent), and input about the environment (e.g., combination of subject and setup). Meet the requirements of reporting (issue 5) and the test review checklist (issue 7).
- One `rdf:type` property whose object is `spec:TestSuite` (TBD).
- One or more `spec:testCase` properties to refer to test cases.
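A minimal sketch of a test suite description, reusing the example test case identifier from the Test Case Description section; the suite URI is illustrative and the `spec:TestSuite` class is still TBD as noted above:

```turtle
# Illustrative test suite description (spec:TestSuite is TBD)
<#test-suite>
  rdf:type spec:TestSuite ;
  spec:testCase :server-content-type-reject .
```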
Test Environment
Documentation on:
- how to set up a test environment, how to build the test system (if necessary).
- how to write tests that are, in general, short and self-contained, and perhaps provide best practices and guidelines.
- how to run tests (provide a set of steps to test and obtain the result). Tests may be run in different ways depending on the specification requirement, e.g., command-line, containers, Web browser.
Do preliminary checks against multiple implementations to catch anomalies or to tease out issues with the tests themselves. Test authors and specification authors should coordinate regarding the test design.
Open a pull request with the updated test, mark what it replaces, mark it as to be reviewed, and request reviews.
Notify specification authors and editors (and other groups of contributors) about new tests and request reviews, e.g., by tagging on GitHub.
Tagging the project maintainer (issue 5).
Provide information indicating to what extent the test suite completely or proportionally covers the specification it aims to support (issue 5).
Link to project maintainer’s WebID or GitHub account.
Test Assessment
This section describes the process for authoring and reviewing tests.
solid/test-suite-panel/issues/7
Test Review Policy
- Test review has a URI and its contents are publicly accessible when dereferenced.
- A test reviewer can be anyone (other than the original test author) who has the required experience with the specification. TBD whether at least one reviewer must be the author of the specification.
Test Review Criteria
- The test has a URI and its contents are publicly accessible when dereferenced.
- The test links to specification requirements.
- The CI jobs on the pull request have passed. (TBD)
- It is obvious what the test is trying to test.
- The test passes when it’s supposed to pass.
- The test fails when it’s supposed to fail.
- The test is testing what it thinks it’s testing.
- The specification backs up the expected behaviour in the test.
- The test is automated as - TBD - unless there’s a very good reason for it not to be.
- The test does not use external resources. (TBD)
- The test does not use proprietary features (vendor-prefixed or otherwise).
- The test does not contain commented-out code.
- The test is placed in the relevant location.
- The test has a reasonable and concise (file)name.
- If the test needs to be run in some non-standard configuration or needs user interaction, it is a manual test.
- The title is descriptive but not too wordy.
Considerations
This section details security, privacy, accessibility, and internationalization considerations.
Some of the normative references with this specification point to documents with a Living Standard or Draft status, meaning their contents can still change over time. It is advised to monitor these documents, as such changes might have implications.
Security Considerations
This section is non-normative.
Privacy Considerations
This section is non-normative.
Accessibility Considerations
This section is non-normative.
Internationalization Considerations
This section is non-normative.
Security and Privacy Review
Changelog
This section is non-normative.
The summary of editorial and substantive changes in this section is based on the W3C Process Document Classes of Changes [W3C-PROCESS].
Acknowledgements
The Community Group gratefully acknowledges the work that led to the creation of this specification, and extends sincere appreciation to those individuals that worked on technologies and specifications that deeply influenced our work.
The Community Group would like to thank the following individuals for their useful comments, both large and small, that have led to changes to this specification over the years:
- Alain Bourgeois
- April Daly
- Emmet Townsend
- Hadrian Zbarcea
- Kjetil Kjernsmo
- Michiel de Jong
- Pete Edwards
- Ted Thibodeau Jr
- Tim Berners-Lee
- Wouter Termont
- Yvo Brevoort