Conformance roadmap 2021

[this post is a wiki, you can modify directly if desired]

Discussion thread on the approach to formal conformance specifications, tools, test environments, etc. The openEHR board has agreed to formally support a conformance project that will result in some conformance testing capability by Nov/Dec 2021, in order to support increasing procurement activities. It also wants a roadmap for work in 2022.

Goals

  • enabling procuring organisations to receive ‘standard’ (comprehensible) conformance statements from tendering vendors for their offerings;
  • enabling procuring bodies to perform conformance testing on products to verify the claims made in tender responses, as well as post-integration (does this thing still work when hooked up to product xyz?), on open source offerings, etc;
  • enabling vendors & implementers to self-test and initially self-certify;
  • (eventually) enabling reliable 3rd party testing with certification.

The above can be summarised as: ensure that dubious claims for dubious offerings can be distinguished from genuine products in which substantial investment has been made.

Below is my current view on this, based on the unfinished CNF spec done a couple of years ago; we need the SEC to crystallise a common view, so just treat this as a draft to be worked on.

Principles

I believe we should work to the following principles (feel free to adjust):

  • conformance is concretely tested against ITS specifications, since these are the concrete instantiations of the specifications whose correct implementation we want to test;
    • this means that a product can’t claim to universally conform to e.g. ‘the RM’ or ‘Task Planning’ or whatever, but instead to openEHR REST API v1.1.0, with e.g. JSON, XML etc payloads, or some other ITS artefacts;
  • the service model is useful for capturing pure semantics, particularly pre- and post-conditions, as well as the parts of the RM that are modified, so that we always know the intended semantics of what we have in REST, XSD, etc - this is important because how things are represented in HTTP headers / body, where certain parameters go, and the XSD tricks we need to use can all obscure the underlying semantics a bit;
  • the service model also provides a useful way to name the capabilities being tested, e.g. I_DEFINITION_ADL14.list_all_opts();
    • this may be applied in different concrete ITS forms (e.g. XML OPTs, JSON OPTs, different versions of REST API etc);
    • we can construct the list of capabilities for a component from the service model - known as the Conformance Schedule below (a sketch of such an entry follows this list);
  • we assume that the legal veracity of conformance claims made for products by vendors is the business of commercial contracts, i.e. the conformance framework can only provide a technical basis for testing, not a guarantee that a claim is true, until such time as we have certificated conformance undertaken by a trusted 3rd party org (or openEHR).
  • more?
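
To make the capability-to-ITS mapping concrete, here is a minimal sketch in Python of how a single service-model call could appear as a Conformance Schedule entry together with its concrete ITS bindings. The endpoint, payload types and test ids below are purely illustrative, not taken from any published spec:

    # Hypothetical Conformance Schedule entry (all identifiers illustrative):
    # one abstract service-model call mapped to the concrete ITS forms in
    # which it can be tested, plus the tests that exercise it.
    schedule_entry = {
        "capability": "I_DEFINITION_ADL14.list_all_opts()",   # abstract call
        "its_bindings": [
            {
                "its": "openEHR REST API v1.1.0",
                "request": "GET {baseUrl}/definition/template/adl1.4",
                "payloads": ["application/json", "application/xml"],
            },
        ],
        "tests": ["DEF-ADL14-LIST-01", "DEF-ADL14-LIST-02"],   # hypothetical ids
    }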

Framework structure

We’ve done a fair bit of work on this over the last few years. At the bottom end, I think we have the following concepts:

  • Product
    • component [*]
      • capability [*]
        • test [*]

For a platform, this will be:

  • Platform - e.g. ACME openEHR Platform, including CDR, Terminology, Definitions, Demographics, …
    • service [*] - e.g. EHR service
      • call [*] - e.g. I_EHR.create_ehr()
        • test [*] - e.g. 5 tests to demonstrate this call working in some ITS profile
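
As a minimal sketch only, the hierarchy above could be represented along these lines (Python; class and field names are mine, not from any spec):

    # Sketch of the Product > component > capability > test hierarchy.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Test:
        id: str                  # e.g. "EHR-CREATE-01" (hypothetical id)
        description: str

    @dataclass
    class Capability:
        name: str                # e.g. "I_EHR.create_ehr()"
        tests: List[Test] = field(default_factory=list)

    @dataclass
    class Component:
        name: str                # e.g. "EHR service"
        capabilities: List[Capability] = field(default_factory=list)

    @dataclass
    class Product:
        name: str                # e.g. "ACME openEHR Platform"
        components: List[Component] = field(default_factory=list)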

In the existing Conformance spec, there are the high-level notions of:

  • Conformance Schedule - a list of what can possibly be tested
  • Conformance Profiles - collections of Components / capabilities that make up viable products / business components
  • Conformance Statement - some subset of the Conformance Schedule, usually defined by or related to a profile, that describes what some product actually supports, and indicates which tests it passes
  • Conformance Certificate - a Conformance Statement + report of how tested + certification of having been tested;
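
For illustration only, the relationship between Schedule, Profile and Statement might be sketched as follows (Python, with made-up capability names and results):

    # Sketch: a Conformance Statement as the subset of the Schedule covered by
    # a Profile, with the product's actual test results attached.
    schedule = {"I_EHR.create_ehr()", "I_EHR.get_ehr()",
                "I_DEFINITION_ADL14.list_all_opts()"}        # everything testable
    profile = {"I_EHR.create_ehr()", "I_EHR.get_ehr()"}      # e.g. a 'CDR core' profile
    passed = {"I_EHR.create_ehr()"}                           # tests this product passed

    statement = {cap: (cap in passed) for cap in (schedule & profile)}
    print(statement)  # e.g. {'I_EHR.create_ehr()': True, 'I_EHR.get_ehr()': False}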

I am happy to revisit this. @pablo may have suggestions for improvement.

Roadmap

What we need to achieve:

  • a v1.0 Conformance spec, rewritten as necessary and containing Conformance Schedule, Profiles etc as we agree them to be;
  • a set of test suites and test cases covering the Schedule in the spec, for existing ITS, most likely derived from the EhrBase environment (a sketch of one such test case follows this list);
  • an up-to-date Service Model spec (relatively easy - @pablo and I can do this);
  • Medium term: a plan to establish a CI-based tool environment that is hosted & publicly usable and/or locally instantiable (e.g. forked from GitHub);
  • Longer term: a plan for establishing 3rd party testing and certification.
  • anything else?
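
As a taste of what an individual test case might look like in such a suite, here is a minimal sketch in Python using the requests library. The base URL is hypothetical, and the exact endpoint, headers and response fields would be taken from the relevant REST API ITS version:

    # Sketch of one conformance test for I_EHR.create_ehr() over the REST API.
    # BASE_URL is illustrative; a real suite would read it from configuration.
    import requests

    BASE_URL = "http://localhost:8080/openehr/rest/v1"

    def test_create_ehr_returns_201_and_ehr_id():
        response = requests.post(
            f"{BASE_URL}/ehr",
            headers={"Prefer": "return=representation",
                     "Accept": "application/json"},
        )
        assert response.status_code == 201, "create_ehr should return 201 Created"
        assert "ehr_id" in response.json(), "response should contain the new ehr_id"

    if __name__ == "__main__":
        test_create_ehr_returns_201_and_ehr_id()
        print("I_EHR.create_ehr() conformance test passed")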

The vendors represented here most likely have views on all of this, and/or some public IP to contribute to help the work along. That input will be important now.

This work will take a bit of effort, but I think everyone will agree it is needed in order to protect investments and to ensure quality for customers.

One question I have is: do we want to make all the conformance testing materials completely public and free, or make them free only for some level of membership payment to openEHR?


Hi Tom,

great proposal, and hopefully I can contribute now that I am back from my vacation. Just some quick feedback on your last question: I’m in favor of providing free access to openEHR member companies only, as we need further incentives for members.

Cheers,

Birger


– Cross posting from Slack –

Hi all, there is one clarification to make regarding CONFORMANCE in general, CONFORMANCE TESTING and CONFORMANCE STATEMENTs. I think, because of the implementation profile of most of us, we are only looking at CONFORMANCE TESTING and not so much at the CONFORMANCE aspects needed for procurement.

Initially this ‘conformance’ topic started when I wanted to create a CONFORMANCE STATEMENT for EHRServer, as is commonly done for DICOM PACS (every single PACS out there has a C.S. document). But we didn’t have any formal way of doing that in openEHR, so I did my best (https://cloudehrserver.com/pages_en/guide/EHRServer_openEHR_Conformance_v1.1.pdf) and started a discussion with Thomas that led to the current CONFORMANCE spec.

Check this brief description about CONFORMANCE STATEMENTs in DICOM: 6 Purpose of a Conformance Statement

That spec was focused on TESTING, not on a STATEMENT. Though TESTING is required in practice, for a procurement the CONFORMANCE STATEMENT is a legal document stating that what the vendor provides is ‘true’ to that statement, and it is part of the technical proposal. When the vendor is hired, the C.S. becomes part of the contract, and if the client finds something that doesn’t work, the C.S. is the source of truth for claims (technical or legal).

I agree on working on a CONFORMANCE TEST FRAMEWORK, and that ITS is the only way of doing that; we can’t test abstract specifications …

Though, I think we are missing the legal and guidance part for both clients and vendors/providers. Clients should have guidance on creating procurements: they need to know what to ask for and how to ask for it. Then vendors should have a way of providing that information in terms that are comparable to what the client requests, which is basically a standardized CONFORMANCE STATEMENT.

IMO what has been discussed so far about CONFORMANCE LEVELs is something we are trying to pull from the CONFORMANCE TEST FRAMEWORK, but it should really be part of the CONFORMANCE STATEMENT area. It is easy to say “we provide a service to create EHRs” (something easily done in the C.S.), but it is difficult to derive that from test execution, especially when some tests pass and some fail because of minor technical differences between implementations (resolving those matters for 100% technical compatibility between vendors, but that is another topic; for technical and conceptual conformance I don’t think it is required, because the cause could be something trivial like an unsupported datetime format).

That is why I mentioned in the meeting today that we need to harmonize the SERVICE MODEL spec with the REST API spec (but there was no time to expand): I think we can build better guides for clients and vendors on how to construct RFIs/RFPs and proposals at the SERVICE MODEL level of abstraction. That will end up in a CONFORMANCE STATEMENT, though a C.S. has more to it than API and formats - for instance, all the versions of the specifications implemented (RM, AOM, OPT, Schemas, …), so a client could require a minimum version of some spec, etc, etc.

Finally, the CONFORMANCE TEST FRAMEWORK could be used to verify in practical terms what the CONFORMANCE STATEMENT says. That will be of interest to tech advisors on the client side, but managers will only look at the CONFORMANCE STATEMENT (legal, proposal and contract stuff).

In terms of the big picture CONFORMANCE, the scope I have in mind has these elements in a layered design:

  1. Stakeholders: customers, vendors, independent/openEHR conformance experts, …
  2. Processes: RFQs/RFPs/procurements/research/evaluation …
  3. Specifications: guidelines to support Stakeholders on Processes, templates for common documents (RFQ, RFP, conformance statement, …), Service Model, ITS (REST, Schemas), Abstract Technical Framework Design (based on Service Model)
  4. Technical Framework: test cases and test data, technical documentation on implementation decisions for the Abstract Technical Framework Design implemented on a concrete technology (Robot, Spock, OpenAPI, etc.)
  5. Platform: means to run the Technical Framework on concrete implementations and retrieve results, in order to compare the same solution at different points in time and different solutions from different vendors.

As always I will end up with a stupid idea: if we can specify the conformance statement as a computable artifact, we could compare those automatically for different vendors/solutions and publish that info on a vendor-neutral site. It is important to clarify that the conformance statement is not about specific product features but just about what the product implements from the openEHR specs. So even though two products could have similar conformance statements, the specific features each vendor provides will be the differentiator (and of course, the price…)
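
That idea need not be complicated; here is a minimal sketch in Python (purely illustrative capability names and results) of comparing two computable conformance statements side by side:

    # Sketch: comparing two machine-readable conformance statements.
    # A real statement would also carry spec versions (RM, AOM, OPT, ...)
    # as noted above.
    vendor_a = {"I_EHR.create_ehr()": True, "I_EHR.get_ehr()": True,
                "I_DEFINITION_ADL14.list_all_opts()": False}
    vendor_b = {"I_EHR.create_ehr()": True, "I_EHR.get_ehr()": False,
                "I_DEFINITION_ADL14.list_all_opts()": True}

    for cap in sorted(set(vendor_a) | set(vendor_b)):
        print(f"{cap:45} A={vendor_a.get(cap, False)!s:6} B={vendor_b.get(cap, False)!s:6}")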


I agree with this, and as per the above, I think the SM is the basis for creating the capability list for each service - what I have called the Conformance Schedule. We can call this something different if we like. This also provides the basis for RFIs like the Catalan one (main form here).

Possibly what we should try to agree on early are these terms:

So this is mainly @pablo's point 3 in his list above.

And we keep the Conformance Test Framework as the ‘how’ - Pablo’s points 4. and 5.

Let’s stick with the question of terms for now - can we agree on the above definitions, or rework them into a form everyone can agree on?

This is the PDF of the Conformance Architecture shown today. This is clearly work in progress; any comments / suggestions are welcome.

openEHR Conformance Architecture.pdf (113.0 KB)
