I used to run the analytics department at a public healthcare company and was often asked to explain why our analytics performance made us better than our competitors. I often answered … “it doesn’t”. After the initial surprise, I would explain why value is only created by applying analytics within the environment of a client to deliver measurable impact for a business user. In other words, what matters is not analytics performance, it’s business performance.
In today’s rapidly changing world, this distinction matters more than ever. The business of healthcare is evolving fast – value-based care, new payment models, changing regulations, increasing consumer-centricity, and more. All of that requires new ways of assembling data, technology, and analytics. Analytics itself is progressing by leaps and bounds, driven by big data, new technology and heavy investments. This would be great news if both were aligned. But they’re not.
In fact, the gap between business and analytics in healthcare is growing. The result is a large backlog of requests from business users who seem to be constantly screaming for help. While there are examples of great one-off successes, as a whole the industry has a problem. From what I observe, it’s getting worse.
So why can’t analytics keep up? Clearly the tools themselves are advancing at ever-faster speeds – shouldn’t things be getting easier? The answer is that business is changing faster than its needs can be translated into effective analytics. In other words, the breakdown between analytics and business value happens in the translation from business needs to clear specifications of data, processes, models, workflows, people, and technology that deliver business impact.
The problem, then, is not the analytical tools or techniques themselves. Anybody today can get access to similar data and very good modeling tools. Want a neural network predictive model? Done. More interested in decision trees or random forests? No sweat. Nor is the problem the business community asking for more and more. Their pains are real and driven by the market, as they should be. The problem is the translation gap – the bridge between the two.
Optimizing for analytics often results in subpar real-world performance
Have you ever sat through the entire process of building an analytics model from scratch? From deciding what data is needed, to getting and shaping the data, to dividing populations into cohorts, to the trial-and-error of model building, to tailoring models to fit into the business, and finally to assessing model performance? At each step, we make choices about whether to focus on analytics, the business, or both. Sadly, we choose analytics too often.
Data prepping can cause business relevancy to go out the window
Despite what some may say, we have not solved the data problem. Legacy data systems and new data systems, for very different reasons, challenge even the best companies. For example, in the process of data prepping, members who don’t behave nicely often get dumped, “outliers” get trimmed, and all kinds of other edits change the nature of the data. With each change, we move further away from business reality in order to create a nicely behaved model. But then this nicely behaved model is applied back in the business, which doesn’t usually return the favor.
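To make the pattern concrete, here is a minimal sketch in Python using purely synthetic data; the ages, costs, and trimming threshold are illustrative assumptions, not real claims data. The model looks accurate on the trimmed “lab” data, then systematically misses the very members who drive spend once it is scored against the full population:

```python
# Sketch: a model trained on "cleaned" data meets the messy real population.
# Synthetic data; all numbers are illustrative, not from any real system.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Real population: annual cost is heavy-tailed, as claims data tends to be.
age = rng.uniform(20, 80, 5000)
cost = 100 * age + rng.lognormal(mean=8, sigma=1.5, size=5000)

# "Data prep": trim the outliers so the model behaves nicely in the lab.
kept = cost < np.percentile(cost, 90)
model = LinearRegression().fit(age[kept].reshape(-1, 1), cost[kept])

# Deployment: the business scores everyone, including the members we dropped.
pred = model.predict(age.reshape(-1, 1))
print("Mean error, trimmed 'lab' population:", (pred[kept] - cost[kept]).mean().round())
print("Mean error, full population:         ", (pred - cost).mean().round())
# The fit looks unbiased in the lab and badly under-predicts in the field.
```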
Factor tweaking can increase volatility in real-world use
In the model-building phase, factors are tweaked to generate a stronger model. But tweak just a little too much and – poof – the results go up in smoke. Very subtle changes in methodology often cause widely varying results, but end users never see this. In order to make a stronger prediction, people manipulate input variables (factors) all the time: controlling, creating, modifying, and matching them. Yet after so much manipulation of the factors, which can be critical to squeezing out the best performance, the models are no longer representative of the diversity found in the real world. The result? Fantastic performance in well-controlled “lab” settings, and highly volatile performance in the real world.
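Here is a minimal sketch of how this goes wrong, using entirely synthetic data in which the factors carry no signal at all. Selecting the “strongest” factors against the full dataset, a common and subtle form of tweaking, produces an impressive lab score and a coin flip in the field:

```python
# Sketch: "tweaking" factors until the lab score looks great.
# Purely synthetic; the factors contain no real signal whatsoever.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 50))   # 50 candidate factors, all pure noise
y = rng.integers(0, 2, 300)      # the outcome is a coin flip

# Factor tweaking: keep the factors most correlated with the outcome,
# computed on ALL the data (a common, subtle leak).
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(50)])
best = np.argsort(corr)[-5:]

model = LogisticRegression().fit(X[:200, best], y[:200])
lab = roc_auc_score(y[200:], model.predict_proba(X[200:, best])[:, 1])
print(f"Lab AUC:   {lab:.2f}")   # looks like real predictive power

# Fresh real-world data: the same "strong" factors predict nothing.
X_new, y_new = rng.normal(size=(300, 50)), rng.integers(0, 2, 300)
field = roc_auc_score(y_new, model.predict_proba(X_new[:, best])[:, 1])
print(f"Field AUC: {field:.2f}")  # back to a coin flip
```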
Operational mismatching can neutralize the best analytical model
We often talk about the right person at the right time with the right care. But rarely do I hear about how a mismatch between analytics and operations can result in connecting with the wrong person, at the wrong time, or with the wrong care in the wrong place. When this happens, performance disintegrates. And it happens a lot. Take, for example, a hospital readmissions program with outstanding predictive power, where operational data lags mean we reach people too late. Or reaching a person predicted to have a bad event only to ponder: what am I supposed to do or say now? That’s assuming we even have accurate contact information. An average model that is tightly aligned with well-designed operations will outperform the most advanced model every single time.
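A back-of-the-envelope sketch shows how the timing mismatch plays out; the lag and window figures below are illustrative assumptions, not benchmarks from any real program:

```python
# Sketch: even a perfect readmission model fails if the data arrives late.
# All durations are assumed for illustration.
from datetime import date, timedelta

discharge = date(2024, 3, 1)
claims_lag_days = 14       # assumed lag before the discharge shows up in the data
outreach_delay_days = 3    # assumed time to queue and complete a call
intervention_window = 7    # assumed days post-discharge when outreach still helps

first_contact = discharge + timedelta(days=claims_lag_days + outreach_delay_days)
deadline = discharge + timedelta(days=intervention_window)

print("First possible contact:", first_contact)
print("Last useful contact:   ", deadline)
print("Reached in time?", first_contact <= deadline)  # False: the model never mattered
```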
Performance degradation begins soon after a model is deployed
Healthcare today is in a state of rapid evolution. Toward what exactly is anyone’s guess. We know it is heading towards value-centric, consumer-centric, and provider-centric models. Social determinants are a growing trend, as are behavioral health, personalized care, bundled payments, accountable care, risk sharing, and so on. Almost every month it seems something new comes out. The only hope for analytics to be relevant is to stay aligned with business objectives as they change. We mostly talk about analytics performance in terms of R-squared, the receiver operating characteristic (ROC), and other descriptions of model performance. But those measures assume a static goal or outcome. What if the goalpost is moving? How well does analytics keep up then?
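A simple monitoring sketch illustrates the point. The drift below is simulated by gradually shifting which factor drives risk; in practice you would score real monthly cohorts, but the shape of the decline is the same:

```python
# Sketch: watching a deployed model degrade as the population drifts beneath it.
# Synthetic cohorts; the drift mechanism is an illustrative assumption.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

def monthly_cohort(shift, n=2000):
    """One month of members; `shift` moves the driver of risk over time."""
    x = rng.normal(size=(n, 3))
    logit = x[:, 0] + shift * x[:, 1]   # risk migrates from factor 0 to factor 1
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)
    return x, y

X0, y0 = monthly_cohort(shift=0.0)      # the world as it looked at build time
model = LogisticRegression().fit(X0, y0)

for month, shift in enumerate(np.linspace(0, 2, 6)):
    X, y = monthly_cohort(shift)
    auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
    print(f"Month {month}: AUC = {auc:.3f}")   # slides downward, month after month
```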
Optimizing for business means re-engineering the analytics process
In most companies I have seen, the process works something like this. A business request comes in asking for help. If it gets prioritized against the other burning-fire requests, then a technical team re-states the problem in analytical terms. Success gets defined by analytical performance. Once a model is built to spec, the project is wrapped up and considered a success. Rarely have I seen success measured by real-world results, where the project never ends – it is continually monitored to ensure it keeps performing for the business as intended.
We need to reverse the process. We should be asking what business users find relevant, meaningful, timely, actionable, and significant … in business terms not analytical ones. Then we design operations to ensure those business objectives are addressed to the satisfaction of business users. Only then is analytics introduced into processes that solve specific issues within well-defined operational workflows.
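As a rough sketch of what “success in business terms” might look like in code (every dollar figure and threshold below is a hypothetical placeholder), the test becomes net value delivered, not model statistics:

```python
# Sketch: success defined and monitored in business terms, not analytical ones.
# All dollar figures and thresholds are hypothetical placeholders.

def monthly_business_value(outreaches, readmissions_avoided,
                           cost_per_outreach=50, cost_per_readmission=12000):
    """Net dollars saved this month by acting on the model's predictions."""
    return readmissions_avoided * cost_per_readmission - outreaches * cost_per_outreach

def keep_in_production(value, floor=10_000):
    """Discard the model the moment it stops paying for itself."""
    return value >= floor

# A "strong" model whose predictions operations cannot act on creates no value...
print(monthly_business_value(outreaches=500, readmissions_avoided=1))  # -13000
# ...while a modest model embedded in a well-designed workflow does.
print(monthly_business_value(outreaches=200, readmissions_avoided=6))  # 62000
print(keep_in_production(monthly_business_value(200, 6)))              # True
```

Notice that nothing in that check mentions R-squared or ROC; the model earns its place only by moving a number the business cares about.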
Reversing the process this way requires a new type of business competency, including the skills and processes to leverage analytics. There is an important consequence of this approach: any specific analytics model can and should be discarded at the speed of business. Just as technology undergoes obsolescence, so should analytics. This defines the value of analytics entirely in terms of its relevance to the business. Then and only then can we begin to fully realize the value of analytics in healthcare.