Across the Healthcare IT industry, from providers to software developers, and even into the U.S. Senate, we continue to hear that the content of shared Consolidated CDA (C-CDA) documents, in spite of improved guides and constraints, is not meaningfully useful.
There continues to be wide divergence in documents across the vendor and provider communities. For example, we have heard:
- Documents are ginormous (a very technical term).
- We cannot find what we need.
- It’s just a CCD – where’s the narrative?
- Where is the operative note?
- And more…
In prior Allscripts posts we touched on many of these points, such as in the Relevant and Pertinent survey post. Results from this survey will be published soon. However, we now have another opportunity to help improve the content of C-CDA using a new scoring tool for C-CDA R2.1, the version of content that will be coming from 2015 Certified EHR products.
SITE C-CDA R2.1 Scorecard
The SITE C-CDA R2.1 Scorecard is a new opportunity to measure, in a repeatable manner, the quality of shared C-CDA content, and then use feedback to hopefully produce better and more meaningful content. We encourage customers, users and solution designers to use this mechanism as a way to find opportunities for improvement, both big and small.
The Office of the National Coordinator (ONC) and HL7 have been working together to leverage previous work on improving C-CDA content, based on the SMART (Substitutable Medical Apps Reusable Technologies) concept. Allscripts participated in the SMART C-CDA R1.1 project, along with several other EHR vendors. Allscripts submitted many 2012 Certified C-CDA documents, and internally we benefitted from the scoring exercise. Following up on this early work, a group of HL7 Structured Document participants reviewed and updated the rules and heuristics used for scoring content.
Soon, C-CDA R2.1 documents from 2015 Certified EHR products will be available, so consider submitting their content to the new SITE C-CDA R2.1 Scorecard to help ONC and HL7 improve the rules, and to help the industry as a whole improve the content we share. For example, a submitted document might score only a “C” in a particular section because something meaningful is missing. Providing that feedback to the system or vendor producing the document should lead to updated content in a future release and a better overall user experience.
One very important note, and the following comes directly from the announcement of the Scorecard: do not include any Protected Health Information (PHI) or Personally Identifiable Information (PII) in your C-CDA file submissions to the Scorecard. More information on how to de-identify PHI is available in the Scorecard announcement.
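To make the de-identification step concrete, here is a minimal sketch of scrubbing the most obvious patient identifiers from a C-CDA XML file before submission. It uses the standard CDA namespace (`urn:hl7-org:v3`) and the `recordTarget/patientRole` elements defined by the C-CDA specification, but it is illustrative only: real de-identification must follow the HIPAA Safe Harbor or Expert Determination methods and cover far more fields (addresses, telecoms, dates, free-text narrative, and so on) than this example touches.

```python
# Minimal sketch: scrub a few obvious patient identifiers from a C-CDA
# XML string. Illustrative only -- NOT a complete de-identification
# process; HIPAA Safe Harbor requires removing many more identifiers.
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}  # standard CDA namespace
ET.register_namespace("", "urn:hl7-org:v3")  # keep output unprefixed

def scrub_patient_role(xml_text: str) -> str:
    """Redact patient ids, name parts, and birth date in recordTarget."""
    root = ET.fromstring(xml_text)
    for role in root.iterfind(".//hl7:recordTarget/hl7:patientRole", NS):
        # Replace identifier extensions (e.g., MRNs) with a placeholder.
        for id_el in role.iterfind("hl7:id", NS):
            if "extension" in id_el.attrib:
                id_el.set("extension", "REDACTED")
        # Blank out every name part (given, family, etc.).
        for part in role.iterfind(".//hl7:name/*", NS):
            part.text = "REDACTED"
        # Mask the birth date with a fixed placeholder value.
        for birth in role.iterfind(".//hl7:birthTime", NS):
            birth.set("value", "19000101")
    return ET.tostring(root, encoding="unicode")

# Usage example on a tiny CDA-like fragment (hypothetical sample data).
sample = """<ClinicalDocument xmlns="urn:hl7-org:v3">
  <recordTarget><patientRole>
    <id root="2.16.840.1.113883.19.5" extension="98765"/>
    <patient>
      <name><given>Jane</given><family>Doe</family></name>
      <birthTime value="19750501"/>
    </patient>
  </patientRole></recordTarget>
</ClinicalDocument>"""
scrubbed = scrub_patient_role(sample)
```

Running the function over the sample removes the name, MRN extension, and birth date while leaving the document structure intact, so the Scorecard can still evaluate it.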
You may also send us your suggestions or questions on content.
Happy Scoring! And note, we’re not proposing that every document always score an “A.” Our goal and responsibility is to raise awareness about the opportunity to produce more consistently usable documents.
A final thought for wishful thinking: shouldn’t we have content scoring that works for C-CDA, FHIR and the next new shiny object that comes along? Perhaps we, as an industry, will learn enough from this C-CDA-focused project to take the next logical step: a Learning Health System that scores content for relevance and importance as data are collected.
Update Aug. 18, 2016:
We want to make sure everyone understands that the SITE Scorecard is specifically examining and scoring for: “…whether the C-CDA document meets the requirements of the 2015 Edition Health IT Certification for Transitions of Care 170.315(b)(1) criteria.”
Use of C-CDA R2.1 documents is required. These documents for 2015 Certification testing will be forthcoming from Allscripts products according to their specific product schedules.