A Very Serious Problem NEHTA Is Not On Top Of, As Best I Can Tell.

For reasons best known to itself, and despite the de-emphasis of this approach in the National E-Health Strategy, NEHTA presses on, insisting that its Individual Electronic Health Record (IEHR) is an important way forward.

As recently as a few days ago, NEHTA still had the IEHR at the centre of its Care Continuum Blueprint and, apparently, of its conceptual model of Australia’s e-health future.


Well, listen up NEHTA – there is an elephant in the room! Read and weep – and yes, her brother is one S. Hawking (a tolerable astrophysicist), I am reliably assured!

Personal view: Mary Hawking

20 Aug 2009

Mary Hawking, a GP and critical friend of NHS IT, asks: are your records fit for sharing?

There has been a lot of work put into sharing information electronically, with an assumption that sharing clinical records will lead to better patient outcomes.

Appropriate sharing of information always has been important; remember the old joke about the referral letter that said: “Dear Consultant, please see and advise, yours sincerely, GP” and the reply: “Dear GP, seen and advised, yours sincerely, Consultant”?

With Electronic Patient Records in the GP sense (meaning all or most information held only electronically), we can provide access to full or summary records to urgent care, secondary care and out of hours to everybody’s benefit – especially the patient’s!

However, putting aside for the moment the privacy, organisational and legal problems involved – have you considered the quality of the information being shared?

Asking the question

Most of the definitions of data quality are from management perspectives, and a lot of good work is being done to improve data quality at this level in the NHS.

However, I am concerned that these standards – even if observed – would be insufficient to ensure that information being shared at single patient record level is fit for purpose; when the purpose is the safe medical care of that individual patient.

For instance, the Quality and Outcomes Framework (QOF) used for GP performance-related pay in the UK produces useful data on the prevalence and management of selected chronic diseases such as diabetes mellitus. But this is aggregated and incentivised data; even in high-performing practices, is the information as good in areas not covered by QOF and at individual patient level?

There are a number of different initiatives being implemented to enable widespread and routine sharing of EPRs or extracts as an essential element in re-organising and improving delivery of routine and urgent care and - especially - allowing access by emergency services and secondary care.

From summary records such as the Summary Care Record, EHI and the Individual Health Record, to single shared records such as TPP SystmOne and Lorenzo, to virtual shared EPRs such as EMIS Web in Liverpool, Tower Hamlets and Gateshead, and the uploaded repositories of entire records such as the Graphnet applications in Hampshire, these all appear to be based on the assumption that the records being shared are fit for the purpose of being shared.

And I can’t find any evidence that they are at present. So what are the potential problems? How can we identify them, establish standards to make sure that EPRs are as useful to the users of the shared record as to their originators, and identify the gaps and the training needed to attain the goals of safe, useful and reliable sharing for the benefit of all concerned?

Some partial answers

Within a practice or organisation, records are held in a way that is fit for purpose for that organisation – in the case of EPRs this is for looking after individual patients, the practice population and managing the business of the practice (including QOF).

There is no need to keep EPRs in a form fit for sharing with other parts of the NHS – even if anyone had agreed the form needed for such sharing.

Single Shared Electronic Patient Records have their own problems – as addressed in a Royal College of General Practitioners report ‘Shared Record Professional Guidance’ – but the issue of data quality when sharing EPRs is wider than that, and affects all forms of shared EPRs.

The IM&T Directed Enhanced Service (one of the payment mechanisms under the new GMS contract) was introduced to improve and accredit data quality in general practice. However, only 70% of practices applied for it – and some will have failed the data accreditation which was one component. So in around a third of practices we either don’t know the quality of the data – or know it did not get accreditation.

The Summary Care Record quite rightly only accepts records from practices holding data accreditation – so does this mean that one third of patients will not be able to have a SCR? How will that affect patient care in local health communities, where the SCR is a fundamental part of urgent care planning and of making medication records – in particular – available on hospital admission?

.....

Links: Shared Record Professional Guidance was published by the RCGP this week. More information about PRIMIS+ is on its website.

About the author: Mary Hawking is a GP in Dunstable, Bedfordshire, with an abiding interest in health informatics and medical records - especially electronic ones - and all the issues surrounding them. She is a member of the NHS Faculty of Health Informatics, the BCS’ PHCSG, a committee member of the EMIS NUG, and level 3 UKCHIP. “Also a believer in networking and discussion!”

Lots more here with some excellent comments as well.

http://www.e-health-insider.com/comment_and_analysis/499/personal_view:_mary_hawking

On a similar track we also have the following.

Healthcare Tech: Can BI Help Save The System?

Initiatives like nationwide, integrated e-medical records won't happen until we get beyond closed, proprietary architectures. Business intelligence is a solid place to start.

By Boris Evelson, InformationWeek
Aug. 20, 2009
URL: http://www.informationweek.com/story/showArticle.jhtml?articleID=219300177

Healthcare IT is a good place to be these days. While IT budgets in many verticals have been tightly reined, healthcare is enjoying multiple government mandates. This has resulted in an infusion of funds to modernize and integrate IT infrastructure, applications, and data.

However, we aren't starting from a high ground. There are multiple challenges to attaining a 21st century-grade IT environment. Among them:

Errors abound: Information management systems and computerized physician order entry (CPOE) applications accounted for a staggering 84% of the 43,372 computer-related medication error records in a 2006 study of the United States Pharmacopeia MEDMARX database.

Proprietary, closed architectures still rule: Hospital information management applications are often based on hierarchical databases that don't speak common query languages like SQL or MDX--the basis for all modern business intelligence tools. Even worse, some of these applications aren't architected with separate data and application logic tiers.

No data transparency: Applications with proprietary, hidden data models don't allow for plug-and-play interfaces with standard data integration technologies like ETL (extract, transform, load) and CDC (change data capture). This environment encourages ex-developers of these proprietary, closed applications to take advantage of their inside knowledge of how these apps work to make a living building custom interfaces for clients.

Incomplete standards: Data exchange standards like HL7 only work for about 80% of the content (and that's for administrative data; coverage is even lower for clinical data). The rest must be custom integrated every time (a small parsing sketch follows this list).

Huge chunks of master data management are missing: MDM, a key to effective BI applications, works mostly for patient information and maybe billing codes, but not for anything else, like drugs (good luck trying to find a standard code for 200mg ibuprofen gel-coated caplets), conditions, and treatments (there's no such thing as a "standard treatment" for a particular ailment--it's all subjective). For example, one senior healthcare IT manager tells me that glucose tests are coded differently in every single lab system she looked at, so her team spent countless hours coding mapping tables (a minimal mapping sketch follows this list).

The world is vendor, not user, centric: True, most of the state-of-the-art (i.e. proprietary) healthcare applications are very powerful and function rich, but few vendors seem to care about integrating with other vendors' applications.
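
To make the HL7 point above concrete, here is a minimal Python sketch (my own illustration, not from the article, and the sample segment is invented): it parses an HL7 v2 OBX result segment. The pipe and caret delimiters and the field positions are fixed by the standard, but the observation code carried in field 3 is locally assigned – which is exactly the "communication standard, not content standard" gap being described.

# A hypothetical OBX (observation result) segment; the "L" flags a local code table.
SAMPLE_OBX = "OBX|1|NM|GLU^Glucose^L||182|mg/dL|70-105|H|||F"

def parse_obx(segment: str) -> dict:
    """Split an OBX segment on the standard HL7 v2 delimiters."""
    fields = segment.split("|")
    code, name, coding_system = (fields[3].split("^") + ["", "", ""])[:3]
    return {
        "observation_code": code,        # locally defined - this is the content problem
        "observation_name": name,
        "coding_system": coding_system,
        "value": fields[5],
        "units": fields[6],
        "reference_range": fields[7],
    }

print(parse_obx(SAMPLE_OBX))

Every receiving system can parse this identically; none of them can tell from the message alone that this lab's "GLU" is the same test as another lab's "GLUC-SER".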
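
And here is the kind of hand-built mapping table the glucose example above implies – again only a sketch under assumed names: each source system's local code is mapped to one agreed reference code (2345-7 is, to the best of my knowledge, the LOINC code for serum/plasma glucose; the source system names and local codes are invented).

# Hypothetical local codes from three lab systems, all meaning the same glucose test.
LOCAL_TO_REFERENCE = {
    ("lab_system_a", "GLU"):      "2345-7",
    ("lab_system_b", "GLUC-SER"): "2345-7",
    ("lab_system_c", "CH0042"):   "2345-7",
}

def normalise(source_system: str, local_code: str) -> str | None:
    """Return the agreed reference code for a local lab code, or None if unmapped."""
    return LOCAL_TO_REFERENCE.get((source_system, local_code))

print(normalise("lab_system_b", "GLUC-SER"))   # -> "2345-7"
print(normalise("lab_system_a", "CH0042"))     # -> None: yet another manual mapping to add

Maintaining tables like this for every code system, in every interface, is where the "countless hours" the quoted manager describes actually go.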

As a result, most healthcare IT executives I talk to name three top challenges that they face every day: integration, integration, and integration. Another healthcare IT exec tells me that it took about three months to write database, application, and GUI logic for a hospital EMR system, but it's taking years and years (still going strong) to integrate pharmacy and lab data even within her own hospital network! Standards like HL7 are purely communication standards, she says, not content standards. And that's the real problem. Until this changes, I don't see a bright future for much-needed initiatives such as:

- Nationwide, integrated EMRs;

- Translational research that links patient care with pharmaceutical research applications, processes, and data;

- Pay for performance, those Medicare- and Medicaid-driven mandates to link procedures and treatments to actual improvements in patient health.

There's no rocket science behind these initiatives, but they won't materialize until we get beyond proprietary and closed architectures.

Much more here:

http://www.informationweek.com/news/healthcare/interoperability/showArticle.jhtml?articleID=219300177

So what this all boils down to is actually not that complex: before any IEHR can even be considered, we need to address the data quality and data content issues in the source systems.

At least in the UK the problem is clearly recognised. Just where, I ask, is the NEHTA document (producing documents, as we all know, being almost its only apparent skill) that clearly identifies this issue and explains how it is to be handled?

A start will certainly involve vastly more insight into, and research on, the quality of the data held by all actors in the e-health domain (that means GPs, specialists and so on, not just the jurisdictions) – and when that is done I confidently predict we will find we are light years away from having data that is ‘fit to share’ in most systems.
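
By way of illustration only (the fields and thresholds below are my assumptions, not anything NEHTA or the NHS has specified), the sort of per-record ‘fit to share’ check such research would need to run across GP and specialist systems might look like this:

from dataclasses import dataclass

@dataclass
class PatientRecord:
    allergies_recorded: bool = False
    medications_coded: bool = False       # coded entries, not free text
    problem_list_coded: bool = False
    last_reviewed_days_ago: int | None = None

def fit_to_share(record: PatientRecord, max_review_age_days: int = 365) -> list[str]:
    """Return the reasons a record fails a basic sharing-quality check (empty list = passes)."""
    failures = []
    if not record.allergies_recorded:
        failures.append("no allergy / adverse reaction status recorded")
    if not record.medications_coded:
        failures.append("medications held as free text, not coded")
    if not record.problem_list_coded:
        failures.append("problem list not coded")
    if record.last_reviewed_days_ago is None or record.last_reviewed_days_ago > max_review_age_days:
        failures.append("record not reviewed within the last year")
    return failures

Run across every record in every source system, even a crude check like this would tell us how far we actually are from data that is safe to push into any shared repository.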

As identified in the National E-Health Strategy, the national approach needs to start with the implementation and use of local systems and the development of information flows between them. As this evolves it will become clear just what information has the quality and integrity to be permitted to flow and what needs to be improved. Only once that improvement is achieved can we even move on to considering any form of shared repository.

To pretend you can do it any other way is just plain silly!

David.
