Multi-Site Performance Standardisation
Group EBITDA margin improved from 19% to 27%; underperforming locations grew revenue by 28–45%; strategic investment closed at 2.4x.
The challenge
A 2.5x revenue gap between best and worst locations couldn't be meaningfully addressed without knowing whether the gap was operational, commercial, or structural.
The MSO's management team knew the performance gap existed but lacked the diagnostic granularity to act on it with precision. Each location had developed independent operational processes. Clinical photography was inconsistent, making outcome-based marketing impossible at group level. Patient communication varied by location and by practitioner. Financial reporting was fragmented across three different systems, making true like-for-like comparison unreliable. Marketing spend was locally managed, with no shared data on channel attribution or cost per acquisition. The result was a group that was structured like a platform but operated like 12 independent practices with shared branding.
The system
Decision system built
We designed and implemented a group-wide performance intelligence and standardisation programme combining unified data infrastructure, centralised AI marketing, standardised clinical protocols, and a performance management framework that gave the central team real visibility — and real levers — for the first time.
System components
Unified data platform: single practice management system deployed across all 12 locations, integrating scheduling, clinical documentation, patient communication, and financial reporting into a consistent data model
Group-level marketing intelligence: centralised AI marketing platform aggregating performance data across all locations, applying predictive targeting trained on the group's combined patient dataset, with dynamic budget allocation to highest-performing channels
Standardised AI consultation protocol: facial analysis tools and outcome simulation deployed uniformly across all locations, with practitioner training calibrated to consistent protocol standards
Performance benchmarking layer: location-level KPI dashboard with peer comparison, tracking acquisition cost, conversion rate, utilisation, revenue per patient, and 90-day retention — giving the central team the ability to see where individual locations deviated from group norms
Variance diagnostic framework: structured analytical methodology to separate commercial underperformance (addressable through intervention) from structural disadvantage (addressable through investment or exit decision)
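The benchmarking layer above rests on a simple idea: express each location's KPIs as deviations from group norms, and flag locations that fall outside a tolerance band for diagnostic review. The sketch below is a minimal illustration of that peer-comparison step, not the programme's actual implementation; the location names, KPI values, and the one-sigma threshold are all hypothetical.

```python
from statistics import mean, stdev

# Hypothetical per-location KPIs (illustrative values only, not client data)
locations = {
    "site_a": {"acquisition_cost": 210.0, "conversion_rate": 0.34, "utilisation": 0.81},
    "site_b": {"acquisition_cost": 305.0, "conversion_rate": 0.22, "utilisation": 0.64},
    "site_c": {"acquisition_cost": 198.0, "conversion_rate": 0.36, "utilisation": 0.85},
    "site_d": {"acquisition_cost": 342.0, "conversion_rate": 0.19, "utilisation": 0.58},
}

def peer_deviation(locations, kpi):
    """Z-score of each location's KPI against the group mean."""
    values = [metrics[kpi] for metrics in locations.values()]
    mu, sigma = mean(values), stdev(values)
    return {name: (metrics[kpi] - mu) / sigma for name, metrics in locations.items()}

def flag_outliers(locations, kpi, threshold=1.0):
    """Locations deviating from group norms by more than `threshold` sigmas."""
    return {name: round(z, 2)
            for name, z in peer_deviation(locations, kpi).items()
            if abs(z) > threshold}

print(flag_outliers(locations, "conversion_rate"))  # → {'site_d': -1.03}
```

In practice a flag like this is only the start of the variance diagnostic: a location can deviate for commercial reasons (addressable through intervention) or structural ones (catchment demographics, site constraints), which is why the framework separates the two before any performance conversation.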
How we worked
Engagement scope
Data platform unification, group marketing intelligence, consultation protocol standardisation, performance benchmarking, and variance diagnostic framework across 12 locations.
Timeline
Phased 18-month programme. Unified data platform live across all locations by month 6. Marketing centralisation by month 8. Full benchmarking and standardisation complete by month 14.
Operating model
Central programme management team with location leads as implementation partners. Monthly group performance reviews introduced from month 7. Location general managers retained operational autonomy within the standardised framework.
Outcomes
Business impact & measurable results
Group-level patient acquisition cost fell 34% within 12 months as centralised AI marketing replaced inconsistent local spend
The three underperforming locations improved revenue by 28%, 41%, and 45% respectively, as operational standardisation and group-calibrated marketing replaced locally improvised approaches
Group EBITDA margin improved from 19% to 27%, representing approximately £3.2M in additional annual profit on a group revenue base of approximately £18M
The group received a strategic investment at a valuation 2.4x above the pre-programme level; the investor specifically cited the group's proprietary patient outcome dataset and centralised AI marketing infrastructure as primary valuation drivers
Governance
Trust, collaboration & governance
Variance diagnostic methodology shared with location GMs — no location was penalised for underperformance identified as structural rather than operational
Marketing budget reallocation decisions made transparently, with location-level data supporting every change in spend
Clinical protocol standardisation developed in collaboration with senior practitioners from across the group, not imposed from centre
Investor data room prepared with full methodology documentation — no assumptions presented without supporting analysis
Reframe
The underperforming locations weren't failures of clinical quality — they were failures of information.
Across every engagement, the goal is the same: engineer a system that makes better decisions — faster, more consistently, and at scale — than the process it replaces.
Next steps
Related services
Start a discovery
Most engagements begin with a conversation about context.
We do not send a proposal before we understand the problem. Start by telling us about your decision context — we will identify the highest-leverage intervention areas before any scope is agreed.