Let’s not overcomplicate this.
The real barrier to effective value analysis in many health systems isn’t the process itself; it’s the quality of the supply chain data that supports it, which often prevents organizations from clearly identifying and fully implementing cost reduction opportunities.
At its core, this is a supply chain data management issue. When data is fragmented across systems, inconsistent, incomplete, difficult to access, or just plain wrong, supply chain teams lose the visibility needed to confidently identify, validate, and execute cost savings opportunities. Instead of operating from a reliable, centralized source of truth, teams are forced to piece together insights from spreadsheets, emails, and disconnected platforms, introducing gaps in accuracy, speed, and accountability. And where accountability breaks down, so does trust.
Without trusted data, clinicians (surgeons, nurses, and other stakeholders) are far less likely to engage with or adopt proposed changes, regardless of the potential savings.
The downstream impact is significant: opportunities become harder to quantify, slower to act on, and inconsistently implemented across the organization. Because value analysis relies entirely on the integrity and credibility of this same data, its effectiveness is inherently constrained. In other words, when supply chain data lacks trust, the entire cost savings engine, value analysis included, struggles to deliver measurable and sustainable results.
The Line You Need to Remember
There’s a phrase I’ve used for years (full credit to Charlie Miceli for saying this to me 20 years ago):
“When you’re doing data analysis and you’re using ERP data as your source, you gotta be careful… sometimes you’re dealing with garbage at the speed of light, and if your data isn’t right, you need to get it right, or you’re cooked.”
And in many, many cases, that’s exactly what I still see happening in supply chain today: low-quality data, zero trust, zero cost reduction opportunities implemented.
By the way, here’s my message: it’s not the people in the process (it rarely is, from what I’ve seen). It’s the data.
The Dangerous Illusion of “Fast Data”
Everyone is moving faster:
- Faster analytics
- Faster dashboards
- Faster reporting
- Faster decision cycles
But back to our issue…
If the underlying data is bad, you’re not improving performance.
You’re just scaling bad decisions faster.
What “Garbage” Actually Looks Like in Supply Chain
Let’s be specific.
This isn’t theoretical. This is what we see every day:
- Item masters filled with inconsistent descriptions
- Duplicate SKUs across facilities
- Missing or unreliable clinical attributes
- No clear functional equivalency between products
- Pricing data that lacks context or comparability
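To make the duplicate-SKU problem concrete, here’s a minimal sketch of the idea behind item-master normalization: uppercase the descriptions, strip punctuation, expand common abbreviations, and group by the normalized result so likely duplicates surface across facilities. The sample rows, SKUs, and abbreviation table are invented for illustration; a real item master would need a far richer normalization and matching pipeline.

```python
import re
from collections import defaultdict

# Hypothetical item-master rows: (facility, sku, description)
ITEM_MASTER = [
    ("HOSP-A", "10-200", "Glove, Nitrile, Exam  LG"),
    ("HOSP-B", "GLV-NIT-L", "GLOVE NITRILE EXAM LARGE"),
    ("HOSP-A", "10-201", "Suture 3-0 Vicryl"),
]

# Tiny illustrative abbreviation table (real ones run to thousands of entries)
ABBREVIATIONS = {"LG": "LARGE", "MED": "MEDIUM", "SM": "SMALL"}

def normalize(description: str) -> str:
    """Uppercase, strip punctuation, expand size abbreviations, and
    collapse whitespace so equivalent descriptions compare equal."""
    tokens = re.sub(r"[^A-Z0-9 ]", " ", description.upper()).split()
    tokens = [ABBREVIATIONS.get(t, t) for t in tokens]
    return " ".join(tokens)

def find_duplicates(rows):
    """Group rows by normalized description; any group with more than
    one entry is a likely duplicate across SKUs and facilities."""
    groups = defaultdict(list)
    for facility, sku, desc in rows:
        groups[normalize(desc)].append((facility, sku))
    return {k: v for k, v in groups.items() if len(v) > 1}

dupes = find_duplicates(ITEM_MASTER)
# The two differently-formatted glove rows collapse into one group.
```

Even this toy example shows why inconsistent descriptions hide duplicates: the same large nitrile exam glove carries two different SKUs at two facilities, and only normalization reveals it.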
And yet…
All of that feeds directly into:
- Value analysis decisions
- Sourcing strategies
- Contracting efforts
- Financial reporting
So What Happens?
You get:
- Dashboards that look clean
- Reports that look credible
- Analytics that look sophisticated
But underneath it all…
The foundation is flawed.
And Here’s Where It Gets Expensive
Let’s translate “garbage at the speed of light” into ROI.
You Accelerate Bad Decisions
- Products get approved without proper comparison
- Pricing opportunities are missed, or at best not maximized
- Standardization decisions are misaligned
Faster decisions… wrong outcomes
You Multiply Inefficiency
- Analysts spend hours fixing data before every meeting
- Teams reconcile conflicting information
- Work gets repeated across departments and locations
High-speed waste at its finest
You Delay Real Savings
- Committees hesitate due to lack of confidence
- Opportunities sit in limbo
- Implementation slows down (this one is inevitable)
Savings are pushed out, or lost entirely
You Lose Trust
This is the one no one talks about.
- Finance questions the numbers
- Clinicians challenge the data
- Supply chain loses credibility
And when trust is gone… execution stops
This Is Why SourceFirst Exists
Not to give you more data.
To give you better data at the source.
SourceFirst addresses the core issue:
- Clean, normalized, thoughtfully enriched item master data
- Standardized product classification
- High-quality identification of functional equivalents
- Reliable pricing benchmarks
- Category-level opportunity insights
The goal: improve data quality, comparability, and financial insight across the value analysis workflow, and in doing so, build integrity and trust.
The ROI Shift (This Is the Point)
When you fix supply chain data quality:
- Decisions happen faster and more accurately
- Opportunities become visible sooner
- Stakeholders align more quickly
- Implementation starts earlier and finishes faster and cleaner
Which leads directly to:
Faster, more reliable margin improvement
The Real Insight
Speed is not the advantage.
Accurate, actionable data is.
Because if you don’t fix the data…
You’re just running garbage at the speed of light.
Bottom Line
What often presents as a value analysis problem in many health systems is, in reality, a supply chain data quality issue, one that limits the organization’s ability to effectively identify, validate, and implement meaningful cost savings opportunities.
And until you fix it:
- You’ll move fast
- You’ll work hard
- You’ll produce reports
But you won’t produce results.
If your team is still cleaning data before making decisions, you’re not accelerating…you’re compensating.
Contact us at Data Leverage Group and we’ll help you solve the problem quickly. We have the data, we have the process, we have the expertise…
