Linked Data Platform 1.0 Test Cases
The Linked Data Platform specification, informally LDP, describes the use of HTTP for accessing, updating, creating and deleting resources from servers that expose their resources as Linked Data. This document introduces the conditions that LDP servers must satisfy in order to be conformant with the specification and presents a common format for describing LDP test results. These test cases both illustrate the features of the specification and can be used for testing conformance. [[LINKED-DATA-PLATFORM]]
Introduction
This document introduces a test suite that can be used to evaluate the conformance of LDP servers to the LDP specification [[LINKED-DATA-PLATFORM]]. The document also presents the design principles that guided the development of the test suite, a testing process, and a common format for describing test results.
The purpose of the test cases is to illustrate the specification features and to help in testing conformance. The provided set of test cases is "incomplete" in the sense that passing all the tests does not prove that a given system conforms to the LDP specification; failing a test does, however, prove that the system does not conform to the specification.
The presented format is intended to facilitate the use of tests by LDP server developers, e.g., in a test harness, as well as the extension of the test suite with new tests. Developers can check the LDP Primer [[LDP-PRIMER]] for concrete examples of inputs and expected outputs that can be used for testing.
Design principles
Generic vs domain-specific servers
There will be two types of servers implementing the LDP specification:
- Generic storage systems that allow interacting with their resources by means of the LDP specification. These servers do not impose any restriction on LDPRs.
- Servers exposing their data using the LDP specification. These servers impose restrictions on LDPRs since they have an underlying business logic and data model.
In order to cover both types of servers, the test suite includes some basic input data as well as a way to provide server-specific input data. It is up to the evaluator to define the specific input data for a certain server, and evaluators must include these input data along with the results when reporting the results of that server.
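For instance, the specific input data for a domain-specific server might be a resource representation that satisfies its data model. The following Turtle snippet is a purely hypothetical example of such input data; the vocabulary and values are invented for illustration.

@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix ex: <http://example.org/vocab#> .

# Hypothetical evaluator-supplied input data: a representation that a
# domain-specific LDP server would accept, e.g., in a resource creation test.
<> a ex:Order ;
  dcterms:title "Test order" ;
  dcterms:created "2014-07-06"^^<http://www.w3.org/2001/XMLSchema#date> .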
Protocol evaluation vs data evaluation
The LDP specification includes restrictions on LDP servers at the protocol level and at the data level. Currently, the restrictions at the data level are minimal and servers are not forced to behave in a particular way when processing LDPR representations. Therefore, the test suite evaluates LDP servers mostly at the protocol level; the only exception is the case of LDPCs, since they are required to include rdf:type, containment and membership statements in their representations, as illustrated below.
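As an illustration of these data-level requirements, the following is a minimal sketch of what a conforming LDPC representation could look like; the container and member IRIs are hypothetical, while the ldp: terms come from the LDP vocabulary.

@prefix ldp: <http://www.w3.org/ns/ldp#> .

# A hypothetical direct container exposing all three kinds of statements.
<http://example.org/container/>
  a ldp:DirectContainer ;                                   # rdf:type statement
  ldp:membershipResource <http://example.org/container/> ;  # membership pattern
  ldp:hasMemberRelation ldp:member ;
  ldp:member <http://example.org/container/member1> ;       # membership statement
  ldp:contains <http://example.org/container/member1> .     # containment statement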
It is out of the scope of the test suite to test LDP servers in terms of the restrictions imposed by their underlying data models.
Test suite coverage
The test suite covers the requirements present in the LDP specification at every compliance level: MUST, SHOULD and MAY. This set of requirements identifies the core subset of the LDP specification, LDP Core from now on, and any LDP server that satisfies these requirements will be an LDP Core conformant server.
It is out of the scope of the test suite to test other levels of conformance in terms of optional capabilities (e.g., paging, patch formats).
Furthermore, the LDP specification [[LINKED-DATA-PLATFORM]] contains a number of requirements that cannot be validated by automated means; these are identified in a coverage report for the [[LINKED-DATA-PLATFORM]]. These requirements will need to be validated by some non-automated method and the results evaluated.
Separation of results and assertions
Instead of defining expected results for tests, which would be dependent on specific implementations, we have defined the assertions to be made over test results. In order to pass a test, all of its assertions must be satisfied.
Separating test outputs from assertions has other benefits: it makes it simpler to report tool results, and assertions can be made by a third party.
Traceability of test cases
Any test case and its produced results and assertions should be related to the documents that are relevant to it (e.g., specifications, use cases, etc.).
Testing process
The LDP Test Cases are defined within Java source code. [[LDP-TESTCASES]] Details about each individual test case, such as whether it can be executed by automated means or only manually, are found in the Java source code annotations. The annotations also record the status of each test case, such as approved by the LDP Working Group, awaiting approval, or not yet implemented. [[LDP-TESTSUITE-COVERAGE]]
- The person or agent in charge of executing the test cases on a specific LDP server will take the test case definitions and run every test case against the LDP server. The execution of the test cases must produce a test execution report for the LDP server, in RDF format, that contains for every test case: the specific inputs used during its execution, the outputs produced, and an assertion of whether the test case passed (a sketch of such a report follows this list). The test execution report must be supplied as defined in the document on implementation conformance reports. [[LDP-CONFORM]]
- A report generator will take all the LDP server execution reports and generate an implementation report that includes the results of all the LDP servers. [[LDP-CONFORM]]
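The EARL vocabulary does not prescribe how the specific inputs and outputs should be recorded; one possibility, sketched below under that assumption, is to attach them in human-readable form using earl:info. The server name and report namespace are hypothetical; the full assertion format is described later in this document.

@prefix earl: <http://www.w3.org/ns/earl#> .
@prefix ldpt: <http://www.w3.org/ns/ldp/test#> .
@prefix : <http://example.org/report#> .

# Hypothetical fragment of a test execution report for one test case.
:Assertion-GetResource a earl:Assertion ;
  earl:subject :SomeServer ;                    # hypothetical server under test
  earl:test ldpt:CommonResource-GetResource ;
  earl:mode earl:automatic ;
  earl:result [
    a earl:OutcomeValue ;
    earl:outcome earl:passed ;
    # Inputs and outputs recorded in human-readable form:
    earl:info "Input: GET on an existing LDPR with Accept: text/turtle. Output: 200 OK with a Turtle representation."
  ] .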
Submitting results
Here is a summary of the steps needed for an assertor to submit the compliance results for an implementation.
- Run the automated test suite [[LDP-TESTCASES]]
- Run the manual tests and update the results within the same EARL results file (see the sketch after this list)
- Email the results file, at minimum the EARL file (Turtle preferred), to public-ldp-comments@w3.org. Be sure to indicate whether it is intended to replace any previously submitted results. It is also helpful to indicate whether these are preliminary results and whether you plan to submit updated results in the future.
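For the manual-test step, one hedged possibility is to add assertions to the same EARL file with earl:mode earl:manual, which EARL defines for tests executed by hand; the assertion and test case names below are hypothetical.

@prefix earl: <http://www.w3.org/ns/earl#> .
@prefix ldpt: <http://www.w3.org/ns/ldp/test#> .
@prefix : <http://example.org/report#> .

# Hypothetical manual result recorded alongside the automated ones.
:Assertion-ManualCheck a earl:Assertion ;
  earl:subject :SomeServer ;            # the same implementation under test
  earl:test ldpt:SomeManualTestCase ;   # hypothetical manual test case
  earl:mode earl:manual ;               # executed by hand rather than by the suite
  earl:result [ a earl:OutcomeValue ; earl:outcome earl:passed ] .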
Describing testing artifacts in RDF
Namespaces used
The following vocabularies are reused for describing the testing artifacts: DOAP (doap), Dublin Core (dcterms) [[DC11]], FOAF (foaf) [[FOAF]], and W3C Test Metadata (td) [[TEST-METADATA]].
All the new entities that are not covered by those vocabularies, as well as the LDP test cases themselves, have been defined under a new namespace (ldpt).
Next we present the definition of these namespaces and of all the namespaces used in the examples.
dcterms: http://purl.org/dc/terms/
doap: http://usefulinc.com/ns/doap#
earl: http://www.w3.org/ns/earl#
foaf: http://xmlns.com/foaf/0.1/
mf: http://www.w3.org/2001/sw/DataAccess/tests/test-manifest#
rdfs: http://www.w3.org/2000/01/rdf-schema#
rdft: http://www.w3.org/ns/rdftest#
td: http://www.w3.org/2006/03/test-description#
ldpt: http://www.w3.org/ns/ldp/test#
Test case description
A test case is defined as an instance of the td:TestCase class and it can be further described using the following properties:
- rdfs:label. The human-readable label of the test.
- mf:name. Like rdfs:label but less human-focused.
- dcterms:title. The name of the test.
- dcterms:description. The description of the test.
- dcterms:contributor. The person (foaf:Person) contributing the test.
- dcterms:subject. The grouping of the test or its compliance level.
- td:reviewStatus. The status of the test; the possible statuses are: td:unreviewed, td:approved or td:rejected.
- rdfs:seeAlso. A link to the specification it refers to.
- td:specificationReference. An excerpt (tn:Excerpt) of the specification that is relevant to the test.
An excerpt is defined as an instance of the tn:Excerpt class and it can be further described using the following properties:
- rdfs:seeAlso. The document where the excerpt is included.
- tn:includesText. The excerpt from the document.
The following example contains the description of one of the LDP test cases.
ldpt:CommonResource-GetResource a td:TestCase ;
  rdfs:label "CommonResource-GetResource" ;
  mf:name "CommonResource-GetResource" ;
  dcterms:title "GET on an LDPR" ;
  dcterms:description "Tests making a GET request on an existing LDPR" ;
  dcterms:contributor :RaulGarciaCastro ;
  td:reviewStatus td:approved ;
  rdfs:seeAlso <http://www.w3.org/TR/ldp#ldpr-get-must> ;
  dcterms:subject "MUST" ;
  td:specificationReference [
    a tn:Excerpt ;
    rdfs:seeAlso <http://www.w3.org/TR/ldp#ldpr-get-must> ;
    tn:includesText "LDP servers MUST support the HTTP GET Method for LDPRs"
  ] .
:RaulGarciaCastro a foaf:Person ;
  rdfs:label "Raúl García-Castro" ;
  owl:sameAs <http://www.garcia-castro.com/#me> .
Test case assertion description
An assertion is defined as an instance of the earl:Assertion class and it can be further described using the following properties:
- earl:subject. The subject (doap:Project) asserted.
- earl:test. The test case (td:TestCase) to which the assertion refers.
- dcterms:date. The date when the assertion was performed.
- earl:assertedBy. The validator (doap:Project) that makes the assertion.
- earl:mode. The execution mode of the validator. In this case it will always be earl:automatic.
- earl:result. The outcome value (earl:OutcomeValue) of the assertion.
The following example contains the description of one test assertion.
:TCR1-Assertion-SomeServer a earl:Assertion ;
  earl:subject :AwesomeLDP ;
  earl:test ldpt:CommonResource-GetResource ;
  earl:assertedBy :AwesomeLDP ;
  earl:mode earl:automatic ;
  earl:result [
    a earl:OutcomeValue ;
    dcterms:date "2014-07-06T09:30:10" ;
    earl:outcome earl:passed
  ] .
:AwesomeLDP a doap:Project, earl:TestSubject, earl:Software, earl:Assertor ;
  doap:name "Awesome LDP" ;
  doap:description "Awesome LDP implementation" ;
  doap:developer [
    a foaf:Person ;
    foaf:mbox <mailto:awesomeldp@example.org> ;
    foaf:name "Dope Developer"
  ] ;
  doap:homepage <http://example.org/AwesomeLDP> ;
  doap:programming-language "JavaScript" .
Change history
- 2014-08-25 Added section on submitting results and fixed references (SS)
- 2014-07-08 Various grammar fixes and removed todos (SS)
- 2014-07-07 Further alignment with current testing approach and namespaces (SS)
- 2014-06-22 Brought inline with separate automated testsuite hosted on GitHub (SS)
- 2014-04-09 Updated according to Last Call Working Draft from 11 March 2014 (FS and RGC)
- 2013-08-27 Updated according to Last Call Working Draft from 30 July 2013 (RGC)
- 2013-06-03 Updated to use ReSpec (RGC)
- 2013-06-03 Implemented changes suggested by Eric Prud'hommeaux (RGC)