2. Dr William Goossen, Director, Results 4 Care B.V., De Stinse 15, 3823 VM Amersfoort, The Netherlands
3. Andrew M. Wiesenthal, Director, Deloitte Consulting, San Francisco
Marc Overhage:
* Quit chasing butterflies – don't worry about developing new standards; focus on making existing standards work.
* Recognize and accept that this is trench warfare – healthcare providers have implemented whatever HIT they have in idiosyncratic and unique ways: the terminologies used, which fields are used to record various data, how they define various concepts, the degree of rigor applied to data completeness and quality, and so on. Establishing interoperability therefore requires effort for each provider implementation. This is work that requires some expertise in clinical concepts; the actual message-based interfaces are usually almost trivial to create – hours in most cases, once you have the right person's attention.
* Use the benefits of a network – we have good-enough terminology and messaging concepts available (not perfect, but good enough) for much of the data we need to share to serve as the "target". Google Translate, for example, takes advantage of its many language-x-to-English mappings to use English as a Rosetta stone for translating language x to language y (using English to mediate the translation). In the same way, individual provider terminologies need to be mapped to national standard terminologies (see the sketch after this list).
* Accept mediocrity (in order to get started) – we will never achieve full semantic interoperability if we don't start. Begin with the material that is "simple" and subject to less interpretation: laboratory results, imaging reports, medication records, simple clinical observations like vital signs, assigned diagnoses/problems, etc. Recognize that higher-level structures will take time to sort out, but let's not hold interoperability hostage pending that happening. In fact, moving simple data will greatly facilitate later agreement on higher-level constructs.
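A minimal sketch of the "Rosetta stone" idea from the network point above, assuming two hypothetical local provider terminologies and an invented national standard acting as the hub; every code and name below is illustrative only, not any real code system:

```python
# Sketch: translating between two local provider terminologies via a
# national standard terminology used as a hub (the "Rosetta stone").
# All codes below are invented for illustration.

# Each provider maintains exactly one mapping: local code -> national code.
PROVIDER_A_TO_NATIONAL = {
    "A-GLU": "NAT:GLU-SER",    # provider A's code for serum glucose
    "A-BP-SYS": "NAT:BP-SYS",  # provider A's code for systolic blood pressure
}
PROVIDER_B_TO_NATIONAL = {
    "B_GLUC": "NAT:GLU-SER",
    "B_SYSBP": "NAT:BP-SYS",
}

def translate(code: str, source: dict, target: dict) -> str | None:
    """Map a source-local code to a target-local code via the national hub."""
    national = source.get(code)
    if national is None:
        return None  # unmapped: flag for human review rather than guessing
    # Invert the target mapping on the fly (fine for a sketch; precompute in practice).
    inverse = {nat: local for local, nat in target.items()}
    return inverse.get(national)

# Provider A's glucose code resolved to provider B's code via the hub:
print(translate("A-GLU", PROVIDER_A_TO_NATIONAL, PROVIDER_B_TO_NATIONAL))  # B_GLUC
```

The point of the hub is scale: with N providers, each maintains a single map to the national standard rather than N-1 pairwise maps, which is the network benefit Overhage describes.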
William Goossen:
I did a study of SNOMED CT's coverage of more granular data for stroke assessment, rehabilitation, and care. The moment you include non-diagnostic concepts – e.g. to define functions of hands, fingers, and toes, and nursing care (determined by clinicians to be essential data for continuity of care across organisational borders) – we found coverage of approximately 50%.
Following Cimino's desiderata, I advised the Dutch Ministry of Health that, given that other studies like yours show 90% coverage while our granular use case showed about 50%, we could solve half of our coding problems. So: go ahead with what was available in 2007, and invest, as a priority activity, in a procedure to add new codes when required.
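Coverage in such a study is simply the fraction of clinician-required concepts that have an acceptable match in the terminology. A minimal sketch, with an invented concept list standing in for the stroke-care data set (these items are placeholders, not the study's data):

```python
# Sketch: terminology coverage = matched required concepts / all required concepts.
# The concept list and the "matched" set are invented placeholders.

required_concepts = ["hand function", "finger function", "toe function",
                     "pressure ulcer risk", "swallowing assessment", "gait"]
matched_in_terminology = {"hand function", "gait", "swallowing assessment"}

coverage = sum(1 for c in required_concepts if c in matched_in_terminology) \
           / len(required_concepts)
print(f"coverage: {coverage:.0%}")  # 50% for this toy list
```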
Of course, it is now 2013, we have SNOMED CT, and we are still waiting for that procedure. So I have moved on to CIMI, and one target is to set up the SNOMED CT extension for the missing codes that are required for the detailed clinical models.
The key point is that no standard will ever be perfect, yet we need them. Most important for any standard is to handle every issue through transparent governance.
Andrew M. Wiesenthal
Marc--
Hear, hear! And I would add that the core list of 2500 problems that the NLM has is a great place to start as the "seed crystal" for that Rosetta stone. It was developed from contributions such as the SNOMED CT terms accounting for more than 90% of all the diagnoses and procedures attached to the first 25 million encounters (ambulatory and inpatient) recorded in Kaiser Permanente's electronic health record (I know, because I had that report produced and provided the NLM with the information). About 650 diagnoses accounted for 90% of encounters, varying only slightly by region of the country.
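The "650 codes cover 90% of encounters" figure is a cumulative-frequency calculation over coded encounter data. A hedged sketch of how such a report can be produced (the codes and counts here are fabricated placeholders, not Kaiser Permanente data):

```python
# Sketch: how many of the most frequent diagnosis codes are needed to
# cover 90% of encounters. Codes and counts are invented for illustration.
from collections import Counter

encounter_diagnoses = (["I10"] * 500 + ["E11.9"] * 300 +
                       ["J06.9"] * 150 + ["M54.5"] * 40 + ["R07.9"] * 10)

counts = Counter(encounter_diagnoses)
total = sum(counts.values())

covered, codes_needed = 0, 0
for code, n in counts.most_common():  # iterate codes from most to least frequent
    covered += n
    codes_needed += 1
    if covered / total >= 0.90:
        break

print(f"{codes_needed} codes cover {covered / total:.0%} of encounters")
```

Run over real encounter data, this is the kind of tabulation that shows a short head of codes dominating, which is why a modest core problem list can serve as the seed.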
Reported by Dr. Terry Hannan.