Former ONC chief on TEFCA and the "dynamite" of FHIR, part 2

In part 1, Dr. Don Rucker, the former National Coordinator for Health IT from 2017-2021, framed the Trusted Exchange Framework and Common Agreement’s reliance on brokered protocols and a page-view document architecture as a costly impediment to modernizing healthcare computing. He also discussed the power of FHIR (Fast Healthcare Interoperability Resources) and leveraging JavaScript Object Notation (JSON).

For example, the Centers for Medicare & Medicaid Services is already using Bulk FHIR with APIs built on JSON for certain information sharing.
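
In practice, a Bulk FHIR export is kicked off with a single asynchronous HTTP request against a FHIR server, as described in the HL7 Bulk Data Access specification. The Python sketch below is illustrative only; the server URL, group ID and access token are hypothetical placeholders, and a real client would authenticate via SMART Backend Services.

```python
import requests

# Hypothetical FHIR server base URL and token, for illustration only.
FHIR_BASE = "https://fhir.example.org/r4"
TOKEN = "example-access-token"

# Kick off an asynchronous Bulk FHIR export for a patient group.
# The server replies 202 Accepted with a Content-Location header
# pointing at a status URL to poll for the finished files.
resp = requests.get(
    f"{FHIR_BASE}/Group/example-group/$export",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/fhir+json",
        "Prefer": "respond-async",
    },
    params={"_type": "Patient,Observation,Claim"},
)
resp.raise_for_status()
status_url = resp.headers["Content-Location"]
print("Poll for results at:", status_url)
```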

Here in Part 2 of this in-depth interview, Rucker discusses the technical limitations of the national healthcare interoperability strategy, value-based care, healthcare data outside of HIPAA and electronic health records, provider burdens, the use of predictive analytics, and more.

Q. What are some fundamental things ONC needs to address with TEFCA before rollout?

A. ONC needs to understand explicitly whether you need brokers for endpoints or whether the country could use what the [21st Century Cures Act] requires of CMS as endpoints.

Do we have to pay for that on every transaction forever? Do we need those toll takers?

Or, since we’re already paying taxes and CMS is already paying everyone in healthcare, can we just use the mechanisms that make logical sense given the dominant role of CMS? That’s a monster question.

The second question is really what classes of API tools there are. The reality is, if we’re going to have lower-cost healthcare – which is ultimately what equity is about, affordable care – first and foremost, we’re going to need to be able to use the programmers in the rest of the economy.

We’re going to need to harness that kid in the garage that is using modern standards-based APIs. We cannot rely on healthcare-specific programming if we’re going to lower costs and increase competition in healthcare.

Part of the TEFCA discussion is, is this approach of brokered [Integrating the Healthcare Enterprise] endpoints really the way to go?

I mean, there are some interesting other side issues here like how patients give consent under HIPAA. Does that still hold in a world where stuff is fully networked? What does consent mean? Do we need explicit consent in TEFCA?

We have to compare and contrast what we’re doing in healthcare with the rest of the computing world.

Let’s not reinvent.

Q. What are some of the technical limitations you’d like to see fixed? For example, the IHE protocol architecture’s inability to generate population-level queries for provider performance or pandemic surveillance?

A. Is something a technical limitation or a fundamental architectural approach? The RESTful stuff is powerful enough – it’s dominated.

On the proposed migration of TEFCA to FHIR – [from a software developer perspective] simplicity is going to win. And that’s where we need to go.
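
Part of what Rucker means by simplicity: a standards-based FHIR query is a single RESTful GET returning JSON that any web developer can parse, with no brokered document exchange in the middle. The sketch below uses a hypothetical endpoint and patient ID.

```python
import requests

# Hypothetical FHIR endpoint and patient ID, for illustration only.
FHIR_BASE = "https://fhir.example.org/r4"

# One plain HTTPS GET returns a JSON Bundle of lab results:
# no brokered document query, no XML envelope to unpack.
resp = requests.get(
    f"{FHIR_BASE}/Observation",
    params={"patient": "example-patient", "category": "laboratory"},
    headers={"Accept": "application/fhir+json"},
)
bundle = resp.json()
for entry in bundle.get("entry", []):
    obs = entry["resource"]
    name = obs.get("code", {}).get("text", "unknown test")
    value = obs.get("valueQuantity", {}).get("value")
    print(name, value)
```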

Q. You’re a big believer in FHIR, obviously, from your time at ONC to your work with 1upHealth. For those who might not know, what is it, what are its biggest benefits and what will it enable in the years ahead?

A. The concept of Bulk FHIR is really simple. We’ve gone, as a country, to the effort of using the United States Core Data for Interoperability (USCDI) for the HIPAA individual right of access.

Let’s reuse it to look at populations of patients. It’s that simple. It’s the same data. Let’s reuse it.

Bulk FHIR will, for the first time ever, allow us – under classic HIPAA treatment, payment and operations, with signed contracts and all of the protections of modern security – to compare apples to apples and look at provider performance on as broad a swath of data as we can.

The point of all of this is we cannot do a clinical trial on every single individual question ever raised in medicine. We just can’t – there are too many questions and too few people. We need to learn from the population that’s out there, what we’re already doing and what works. 

Bulk FHIR is the computable way of doing that.
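
Mechanically, a completed Bulk FHIR export is delivered as newline-delimited JSON (NDJSON) files containing the same USCDI-shaped resources used for individual access, which can then be analyzed across a whole population. The sketch below assumes a hypothetical output file URL from a finished export and simply tallies observation codes.

```python
import json
import requests

# Hypothetical output file URL taken from a completed Bulk FHIR export
# status response; each file is NDJSON, one FHIR resource per line.
output_url = "https://fhir.example.org/bulk-output/Observation.ndjson"

counts = {}
resp = requests.get(output_url, headers={"Accept": "application/fhir+ndjson"})
for line in resp.text.splitlines():
    if not line.strip():
        continue
    obs = json.loads(line)
    # Tally observations by code across the population: the same data
    # elements used for the individual right of access, reused at scale.
    code = obs.get("code", {}).get("text", "unknown")
    counts[code] = counts.get(code, 0) + 1

for code, n in sorted(counts.items(), key=lambda kv: -kv[1])[:10]:
    print(n, code)
```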

Q. How has the CMS proposed rule affected payer-to-payer interoperability, in terms of expanding the types of data that must be exchanged, and does ONC’s approach complement it? If not, what are the holes? 

A. It just riffs off the data that’s already electronically available. It’s not generating net new data feeds; it’s riffing off data that’s already there. All claims data can now be represented – not just in X12, but in FHIR.

It hints at the vision that CMS is working toward, which is how do we integrate claims and clinical data.
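
As one illustration of that direction, claims that historically moved only as X12 transactions can be retrieved as FHIR ExplanationOfBenefit resources – the resource CMS’s Patient Access API work builds on – alongside clinical FHIR resources for the same patient. The payer endpoint and patient ID below are hypothetical.

```python
import requests

# Hypothetical payer FHIR endpoint, for illustration only.
PAYER_FHIR = "https://payer.example.org/fhir/r4"

resp = requests.get(
    f"{PAYER_FHIR}/ExplanationOfBenefit",
    params={"patient": "example-patient"},
    headers={"Accept": "application/fhir+json"},
)
for entry in resp.json().get("entry", []):
    eob = entry["resource"]
    # Each claim carries both clinical coding and what was paid, which is
    # what makes joining claims and clinical data in one standard possible.
    total = next((t.get("amount", {}) for t in eob.get("total", [])), {})
    print(
        eob.get("type", {}).get("text", "claim"),
        total.get("value"),
        total.get("currency"),
    )
```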

And that’s the ultimate question of value-based care. You cannot determine value if you don’t know what you paid for something. You’re going to have lots of quality measures.

Even Bulk FHIR is, in a sense, a nearly global quality measure. But if you don’t know what you paid for the thing, I mean, you can’t really shop for care.

I think the payer-to-payer APIs are part of that. The obvious side part of the payer-to-payer API is the realization that as patients change payers, things like prior auth get dropped. 

Part of that vast unhappiness is you get a new plan and all of a sudden you have to go through the whole commotion you did before. Congress hears that.

Q. What about providers – will they be overly burdened by ONC’s proposed rules?

A. Obviously the single biggest burden is paying for electronic case reporting and then double paying for TEFCA APIs – that probably needs to be totally rewritten. 

Ultimately, patients and providers pay for that. The providers pay for it first and then the patients pay for it. Those are the biggest burdens.

On predictive decision support, also known as the AI rules, I get that academics want to know what each data field is and disclose it and ponder what its fairness is. And obviously, there’s a huge political narrative around which groups of patients we are going to favor or who is unfavored.

I don’t think any clinician that I’ve ever worked with is going to have the time to read these things. I think they’re going to make the same decisions they make on all of the rest of the care, which is, to the best of my knowledge, is this safe? Is it effective?

We have to evaluate every single thing we do as a clinician. Are we giving a med? Are we doing a test? Are we not doing a test? Are we referring somebody to surgery?

I don’t see fundamentally how AI tools are going to be different. Clinicians are going to evaluate these things not by looking at the data fields, but just the way they look at every other piece of medical care.

The reality is people don’t have the time to even read decision alerts today. There’s just too much. So I think the reading load needs to be considered. I’m guessing they’re going to get a lot of comments on that.

I think the other sort of interesting thing, and this is sort of a question for the country and ONC is trying to get at that, is how are we going to deal with these large language models that soak up all information globally available?

I think there’s a mismatch between assuming you can deregulate based on individual fields and the fact that all of the high performance comes from letting the algorithm sort out which fields are helpful.

We can’t look at AI algorithms one field at a time because the whole point of modern AI is to use every data field. You’re not going to be able to make rules around one field at a time, whether for favoring this political group or adjusting that equity issue, because that’s not the way modern AI works.

Q. You noted previously that the vast bulk of inferable information about healthcare is totally outside of HIPAA or EHRs – in our cell phones and GPS and credit card spending. How will this data play a role in population health and value-based care?

A. There are two gaps with data. 

One gap is between claims and clinical data. We’re not going to get value in healthcare until we unite those. The ability to merge claims and clinical data – in one computing platform and data standard – that’s one big gap that we as a country haven’t really closed.

I think each payer does bits and pieces of it in a bespoke way today, but we haven’t done it as a country.

The second gap is the historic divide between, quote, medical data and this other data that you’ve just referred to, which has inferable health information in it.

I think implicit in the Cures Act is, how do we get the rest of the app economy into healthcare? I mean, it doesn’t say that, but I think the real question is we’re constantly sitting on our phones doing stuff now.

How do we integrate the hard data points – our illnesses, the medications, the allergies, lab tests and imaging results? How do we integrate hard, highly curated data points with the softer things that maybe have almost as much information individually, and likely have more information collectively – things like where we shop?

There’s a lot of information in that accelerometer.

I think we need to bridge the gap between consumer apps and healthcare apps, and it can be done. The rules we put in the 21st Century Cures Act ONC rule actually allow it to be done. That entire framework exists now.

Obviously, if we haven’t created any economic incentive for prevention, people are going to use a lot less of that than if there were incentives.

As a country, we’re going to have to move there.

I would always mention that point in terms of privacy protection. Invariably on Capitol Hill you hear somebody say we want to protect patient privacy. I have yet to see a single case where somebody who has claimed they want to protect privacy is not ultimately protecting an opaque business model.

I think HIPAA is actually a very well done law on privacy. I think it’s a nice balance of individual protections and societal affordances. We just need to be mindful at the edges of how we do both of those things.

I’m absolutely convinced both can be done. They can be done with the technology we have today. They’re done every day in healthcare, and they’re certainly done every day in the rest of our lives.

Q. What do you see when you envision the nationwide interoperability/data exchange ecosystem – from primary care to inpatient, LTPAC to public health reporting, and beyond – in the next five to 10 years?

A. This gets to that famous Bill Gates quote: ‘You overestimate the short term and underestimate the long term.’

So I’m going to talk about more of what I wish. Obviously, some of the political dysfunctions are hard to predict.

What I hope is that we have something that riffs off the modern Internet. The modern Internet is a simply awesome set of networking constructs that have been woven together over the last 60 years.

We need to get in the mode of imitating people who actually do services and compute well. 

That’s my hope, that we get there in healthcare and are very careful about rules that sort of potentially prevent that. Fundamentally, this is my concern with the current iteration of what is called TEFCA and the IHE protocol’s document-centric approach.

READ PART ONE OF THIS INTERVIEW

Andrea Fox is senior editor of Healthcare IT News.
Email: [email protected]

Healthcare IT News is a HIMSS Media publication.
