Solid QA

Abstract

This document describes the Solid Quality Assurance (QA) policy, processes, and procedures. It details the requirements and recommendations for the publication and use of Solid technical reports, test suites, test cases, test assessments, and test reports to improve the quality of technical reports at critical development stages, promote wide deployment and proper implementation of technical reports through open implementation reports, help produce quality test suites, and advance the development and assessment of test cases.

Status of This Document

Table of Contents

  Abstract
  Status of This Document
  1. Introduction
    1.1 Terminology
    1.2 Namespaces
    1.3 Data Formats
  2. Conformance
    2.1 Normative and Informative Content
    2.2 Specification Category
    2.3 Classes of Products
    2.4 Interoperability
  3. Technical Report
    3.1 Technical Report Description
  4. Test Report
    4.1 Test Report Description
    4.2 Test Case Description
    4.3 Test Assertion Description
    4.4 Test Report Notification
    4.5 Project Maintainer Input and Review
    4.6 Implementation Report Description
  5. Test Suite
    5.1 Test Suite Description
    5.2 Test Environment
  6. Test Assessment
    6.1 Test Review Policy
    6.2 Test Review Criteria
  7. Considerations
    7.1 Security Considerations
    7.2 Privacy Considerations
    7.3 Accessibility Considerations
    7.4 Internationalization Considerations
    7.5 Security and Privacy Review
  A. Change Log
  B. Acknowledgements
  C. References
    C.1 Normative References
    C.2 Informative References

Introduction

This section is non-normative.

Specifications in the Solid ecosystem, collectively Solid Technical Reports [SOLID-TECHNICAL-REPORTS], describe how implementations can be interoperable by using Web communication protocols, global identifiers, authentication and authorization mechanisms, data formats and shapes, notifications, and query interfaces.

Writing tests in a way that allows implementations to conform to the requirements of a technical report gives Solid projects confidence that their software is compatible with other implementations. This in turn gives authors of technical reports and software implementers confidence that they can rely on the Solid ecosystem to deliver on the promise of interoperability based on open standards. Implementation and interoperability experience can be verified by reporting on implementations passing open test suites.

The goal of this document is to describe the Solid Quality Assurance (QA) policy, processes, and procedures. The document details the requirements and recommendations for the publication and consumption of Solid technical reports, test suites, test cases, test assessments, and test reports to:

  * improve the quality of technical reports at critical development stages;
  * promote wide deployment and proper implementation of technical reports through open implementation reports;
  * help produce quality test suites;
  * advance the development and assessment of test cases.

This document is influenced by the W3C Quality Assurance activity work encompassing W3C processes, specification authoring and publishing, and quality assurance, including: W3C Process Document, Variability in Specifications, QA Framework: Specification Guidelines, The QA Handbook, Test Metadata, Evaluation and Report Language (EARL) 1.0 Schema.

This specification is for:

Terminology

This section is non-normative.

The Solid QA defines the following terms. These terms are referenced throughout this document.

URI

A Uniform Resource Identifier (URI) provides the means for identifying resources [RFC3986].

Namespaces

Prefixes and Namespaces

Prefix Namespace Description
dcterms http://purl.org/dc/terms/ [DC-TERMS]
doap http://usefulinc.com/ns/doap# DOAP
earl http://www.w3.org/ns/earl# [EARL10-Schema]
prov http://www.w3.org/ns/prov# [prov-o]
rdf http://www.w3.org/1999/02/22-rdf-syntax-ns# [rdf-schema]
skos http://www.w3.org/2004/02/skos/core# [skos-reference]
solid http://www.w3.org/ns/solid/terms# Solid Terms
spec http://www.w3.org/ns/spec# Spec Terms
td http://www.w3.org/2006/03/test-description# Test Description
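For documents serialized in Turtle, the table above corresponds to the following prefix declarations:

```turtle
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix doap: <http://usefulinc.com/ns/doap#> .
@prefix earl: <http://www.w3.org/ns/earl#> .
@prefix prov: <http://www.w3.org/ns/prov#> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
@prefix solid: <http://www.w3.org/ns/solid/terms#> .
@prefix spec: <http://www.w3.org/ns/spec#> .
@prefix td: <http://www.w3.org/2006/03/test-description#> .
```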

Data Formats

This specification uses the RDF language to describe technical reports, test reports, test suites, and test cases. Implementers are encouraged to produce human- and machine-readable representations with RDFa in host languages such as HTML and SVG.

Conformance

This section describes the conformance model of the Solid QA.

Normative and Informative Content

All assertions, diagrams, examples, and notes are non-normative, as are all sections explicitly marked non-normative. Everything else is normative.

The key words “MUST”, “MUST NOT”, “SHOULD”, and “MAY” are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here.

The key words “strongly encouraged”, “strongly discouraged”, “encouraged”, “discouraged”, “can”, “cannot”, “could”, “could not”, “might”, and “might not” are used for non-normative content.

Specification Category

The Solid QA identifies the following Specification Category to distinguish the types of conformance: API, notation/syntax, set of events, processor behaviour, protocol.

Classes of Products

The Solid QA identifies the following Classes of Products for conforming implementations. These products are referenced throughout this specification.

Technical Report

A Technical Report is a document outlining recommendations and the level of conformance that various classes of products, processes, or services can achieve [QA GLOSSARY].

Test Case

A Test Case is an individual test with a purpose that maps to measurable or testable behaviours, actions, or conditions in a Technical Report [QA GLOSSARY].

Test Report

A Test Report is a resource that describes the level of conformance of a project’s tests against Test Cases.

Test Suite

A Test Suite is a collection of documents and software designed to verify an implementation's degree of conformance by using Technical Reports and Test Cases, and it generates Test Reports [QA GLOSSARY].

Interoperability

Interoperability of Test Suite and Technical Report implementations is tested by evaluating an implementation’s ability to consume and process data that conforms to this specification.

Technical Report

Technical Report Description

The Solid Technical Reports Contributing Guide provides the recommendations for publishing technical reports following the Linked Data design principles, where significant units of information, such as concepts and requirements, are given an identifier, and described with a concrete RDF syntax. The Spec Terms vocabulary provides classes and properties that can be used to describe any significant unit of information in technical reports, as well as supporting the description of test cases and test reports. The SKOS data model can be used to identify, describe, and link concepts and definitions across technical reports.
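As a sketch of this approach, a significant unit of information such as a requirement could be identified and described in RDF. The identifier, statement text, and the property names here are illustrative assumptions drawn from the Spec Terms and SKOS vocabularies, not normative definitions:

```turtle
@prefix spec: <http://www.w3.org/ns/spec#> .
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .

# Hypothetical requirement unit in a technical report:
<#server-content-type>
  a spec:Requirement ;
  spec:statement "Servers MUST reject PUT, POST and PATCH requests without the Content-Type header."@en ;
  spec:requirementSubject <#Server> ;
  spec:requirementLevel spec:MUST ;
  skos:prefLabel "server content type"@en .
```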

Add other requirements from Spec Terms and SKOS.

Define URI Templates for the implementation report, e.g., https://solidproject.org/test-reports/{technical-report-short-name}/summary, and the test report, e.g., https://solidproject.org/test-reports/{technical-report-short-name}/{uuid}.

Test Report

This section describes the requirements for publishing test reports and summaries.

Test Report Description

solid/test-suite-panel/issues/5

The Test Report Description includes information pertaining to conformance and interoperability of an implementation.

Document metadata:

Additional notes about the report can be expressed as a Web Annotation (oa:Annotation).

An agent (person or software) that had a role in the production of the test report can be expressed as a PROV-O activity association (prov:wasAssociatedWith).
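A minimal sketch of both mechanisms, assuming the Web Annotation and PROV-O vocabularies; the agent URI and note text are hypothetical:

```turtle
@prefix oa: <http://www.w3.org/ns/oa#> .
@prefix prov: <http://www.w3.org/ns/prov#> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .

# Activity that produced this test report, associated with an agent:
<#report-generation>
  a prov:Activity ;
  prov:generated <> ;
  prov:wasAssociatedWith <https://example.org/profile#tester> .

# Additional note about the report:
<#note>
  a oa:Annotation ;
  oa:hasTarget <> ;
  oa:hasBody [ rdf:value "Run against the default server configuration."@en ] ;
  oa:motivatedBy oa:commenting .
```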

For publication status, some TRs use pso:holdsStatusInTime with pso:withStatus - is there something from a common vocabulary? What visual cues should be required to indicate publication status at a glance?

Implementations that are software projects MUST be described with the Description of a Project vocabulary [DOAP].

Description of a project:

Note: Test Report Description: Description of a Project

The subject of the doap:Project for Test Report Description coincides with the object of the earl:subject in Test Assertion Description.

To help distinguish project releases in test reports, it is encouraged to use versioned URIs for projects.

While some information about the project can be accompanied with the test report, it is encouraged that projects are self-describing documents.
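A sketch of a self-describing project using a versioned URI, per the DOAP vocabulary; all names and URLs are hypothetical:

```turtle
@prefix doap: <http://usefulinc.com/ns/doap#> .

# Versioned project URI, also usable as the earl:subject of assertions:
<https://example.org/project/ids#v1.0.0>
  a doap:Project ;
  doap:name "Example Solid Server" ;
  doap:homepage <https://example.org/project/> ;
  doap:repository [ a doap:GitRepository ;
    doap:location <https://example.org/project/repo> ] ;
  doap:release [ a doap:Version ;
    doap:revision "1.0.0" ] ;
  doap:maintainer <https://example.org/profile#maintainer> .
```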

What should be the recommendation for implementations that are not software projects? Perhaps equivalent to a top concept of spec:ClassesOfProducts?

Test assertions:

Test reports MUST incorporate additional information about test criteria, indicating test authors and reviewers, review status, version of the test criterion, software and setup used to run the tests, provenance, coverage, and test suite (see also Test Assertion Description).

Test reports with approved status MUST NOT include assertions related to test criteria with rejected review status (td:rejected).

Note: Test Report Description: Test Assertion

To help distinguish test criteria, it is encouraged to use versioned URIs for criteria.

To convey association between a test criterion and its reviews, it is encouraged to use the Web Annotation Vocabulary with the oa:assessing motivation.

Should Web Annotation Vocabulary be required to convey the relationship between test criteria and reviews?
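Under that approach, a review could be linked to a versioned test criterion with the oa:assessing motivation; both URIs below are hypothetical:

```turtle
@prefix oa: <http://www.w3.org/ns/oa#> .

# Annotation linking a review (body) to a versioned test criterion (target):
<#review-1>
  a oa:Annotation ;
  oa:motivatedBy oa:assessing ;
  oa:hasTarget <https://example.org/tests/server-content-type-reject#v1> ;
  oa:hasBody <https://example.org/reviews/42> .
```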

Test Case Description

spec:testScript may need to be rdfs:subPropertyOf td:input or td:informationResourceInput, or use those properties instead.

When referencing a requirement with spec:requirementReference, it is strongly encouraged to use versioned URLs of technical reports where available with preference to URLs covered by a persistence policy.

Example: Test Case

```turtle
@prefix sopr: <https://solidproject.org/TR/2022/protocol-20221231#> .

:server-content-type-reject
  a td:TestCase ;
  spec:requirementReference sopr:server-content-type ;
  td:reviewStatus td:unreviewed ;
  td:expectedResults :server-content-type-expected-result ;
  dcterms:contributor <https://csarven.ca/#i> .
```

A test case for server rejecting requests with reference to expected results.

Test Assertion Description

A Test Assertion indicates measurable or testable statements of behaviour, action, or condition derived from a specification's requirements. A test assertion is stated by an entity carrying out the test, indicates the contextual result of the test, follows a particular process, and is based on a criterion used to evaluate an implementation.

Test assertions MUST use the Evaluation and Report Language 1.0 Schema [EARL10-Schema].

Test Suite implementers are encouraged to follow the Developer Guide for Evaluation and Report Language 1.0 [EARL10-Guide].
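An assertion following [EARL10-Schema] could look like the following; the tester, subject, and test URIs are hypothetical:

```turtle
@prefix earl: <http://www.w3.org/ns/earl#> .
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

# Assertion: who tested what, against which criterion, with what outcome:
<#assertion-1>
  a earl:Assertion ;
  earl:assertedBy <https://example.org/profile#tester> ;
  earl:subject <https://example.org/project/ids#v1.0.0> ;
  earl:test <https://example.org/tests/server-content-type-reject#v1> ;
  earl:mode earl:automatic ;
  earl:result [
    a earl:TestResult ;
    earl:outcome earl:passed ;
    dcterms:date "2023-01-15T10:00:00Z"^^xsd:dateTime
  ] .
```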

Test Report Notification

GitHub repo/directory and/or report sent as an LDN (TBD: either including or requesting the maintainer’s approval).

Should Activity Vocabulary and Linked Data Notifications be one of the ways to submit test reports as a notification?

Publication of reports can be pre-authorized for approved tests.
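If Activity Vocabulary and Linked Data Notifications were adopted for submission, a notification might be shaped as follows; this is only a sketch, and the actor, report, and target URLs are hypothetical:

```turtle
@prefix as: <https://www.w3.org/ns/activitystreams#> .

# LDN payload announcing a test report to a project's inbox:
<#announce>
  a as:Announce ;
  as:actor <https://example.org/profile#tester> ;
  as:object <https://example.org/test-reports/protocol/report-1> ;
  as:target <https://example.org/project/> .
```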

Project Maintainer Input and Review

The project maintainer will be notified to review the publication of a report; if no objection is raised within a certain time (TBD), the report can be approved for publication by the submitter (or the test suite panel). During the review process, maintainers will be given the opportunity to provide explanatory notes to accompany the report.

Implementation Report Description

The Implementation Report Description refers to the criteria and resolutions set forth by the Group publishing a Technical Report in order to demonstrate implementation experience, refers to individual test reports, and provides a summary.

Document metadata:

Similar to Test Report Description document metadata. Reuse or redefine?

Referencing individual test reports:

Test Suite

This section describes the requirements for test suite and test metadata.

solid/test-suite-panel/issues/6

Test Suite Description

Description of test suites (e.g., license, date, written by, reviewed by); the software (e.g., repository, version, maintainers); setup (e.g., platform, configuration); provenance (e.g., activity, entity, agent); input about the environment (e.g., combination of subject and setup). Meet the requirements of reporting (issue 5) and the test review checklist (issue 7).
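A sketch combining these descriptions with the dcterms, doap, and prov vocabularies; all identifiers and values are hypothetical:

```turtle
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix doap: <http://usefulinc.com/ns/doap#> .
@prefix prov: <http://www.w3.org/ns/prov#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

# Test suite description: license, date, authorship, software, provenance:
<#test-suite>
  dcterms:license <https://creativecommons.org/licenses/by/4.0/> ;
  dcterms:created "2023-01-15"^^xsd:date ;
  dcterms:creator <https://example.org/profile#author> ;
  doap:repository [ a doap:GitRepository ;
    doap:location <https://example.org/test-suite/repo> ] ;
  prov:wasGeneratedBy [ a prov:Activity ;
    prov:wasAssociatedWith <https://example.org/profile#maintainer> ] .
```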

Test Environment

Documentation on:

  * Do preliminary checks against multiple implementations to catch anomalies or to tease out issues with the tests themselves. Test authors and specification authors should coordinate regarding the test design.
  * PR the updated test, mark what it replaces, mark it as to be reviewed, and request reviews.
  * Notify specification authors and editors (and other groups of contributors) about new tests and request reviews, e.g., tagging on GitHub.
  * Tag the project maintainer (issue 5).
  * Provide information indicating to what extent the test suite completely or proportionally covers the specification it aims to support (issue 5).
  * Link to the project maintainer’s WebID or GitHub account.

Test Assessment

This section describes the process for authoring and reviewing tests.

solid/test-suite-panel/issues/7

Test Review Policy

Test Review Criteria

Considerations

This section details security, privacy, accessibility and internationalization considerations.

Some of the normative references in this specification point to documents with a Living Standard or Draft status, meaning their contents can still change over time. It is advised to monitor these documents, as such changes might have implications.

Security Considerations

This section is non-normative.

Privacy Considerations

This section is non-normative.

Accessibility Considerations

This section is non-normative.

Internationalization Considerations

This section is non-normative.

Security and Privacy Review

Change Log

This section is non-normative.

The summary of editorial and substantive changes in this section is based on the W3C Process Document Classes of Changes [W3C-PROCESS].

Acknowledgements

The Community Group gratefully acknowledges the work that led to the creation of this specification, and extends sincere appreciation to those individuals who worked on technologies and specifications that deeply influenced our work.

The Community Group would like to thank the following individuals for their useful comments, both large and small, that have led to changes to this specification over the years:

References

Normative References

Informative References