The Problem We're Solving
Large-scale retail operations generate enormous volumes of transactional data across order management, inventory, and ERP systems. The problem isn't data volume. It's that planners making demand, replenishment, and S&OP decisions are often working from signals that arrive late, are inconsistently structured, or are disconnected from what's actually happening in operations.
At QVC Group, the Core Retail transformation needs more than modernising individual systems. It needs a coherent data architecture that makes supply chain intelligence possible: one where demand signals, inventory positions, and order flows can feed planning layers reliably and consistently.
What We're Building
Connecting Transactional Systems to Planning Layers
The goal is to define and evolve the architecture that connects QVC's core transactional systems (SAP S/4HANA, IBM OMS, and inventory management) with downstream analytics and planning platforms.
The design principle we've committed to: treat supply chain data not as a reporting output but as a live signal layer. Planners and AI systems alike need clean, timely, and consistent data to make decisions. Anything that corrupts that signal (late batch jobs, inconsistent schemas, siloed pipelines) degrades the quality of every decision downstream.
Data Platform Architecture
We're building on Azure Databricks with Delta Lake as the storage layer, using Spark for high-volume pipeline processing. The architecture we're working toward covers three layers:
- Ingestion: Schema-governed pipelines pulling from SAP S/4HANA, OMS, and inventory systems with defined latency and data quality expectations
- Transformation: Unified data models for demand signals, stock positions, order flows, and fulfilment status, designed to serve both operational visibility and planning consumption
- Serving: Clean interfaces to downstream planning platforms (including enterprise planning systems in the Anaplan/EPM family) and analytics layers
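To make the ingestion layer's "defined latency and data quality expectations" concrete, here is a minimal pure-Python sketch of the kind of record-level contract a schema-governed pipeline could enforce. The field names, SLA threshold, and checks are illustrative assumptions, not the production schema; in the actual platform these expectations would live in the Spark/Delta Lake pipelines themselves.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical contract for one inventory record from an upstream system.
# Field names are illustrative, not the real S/4HANA or OMS schema.
@dataclass
class InventoryRecord:
    sku: str
    location: str
    on_hand: int
    as_of: datetime  # timestamp of the upstream snapshot (UTC)

def validate(record: InventoryRecord, max_age_hours: float = 4.0) -> list[str]:
    """Return a list of quality violations; an empty list means the record passes."""
    violations = []
    if not record.sku:
        violations.append("missing sku")
    if record.on_hand < 0:
        violations.append("negative on-hand quantity")
    # Freshness check: late batch jobs corrupt the signal just as surely
    # as bad values do, so staleness is a first-class violation.
    age = (datetime.now(timezone.utc) - record.as_of).total_seconds() / 3600
    if age > max_age_hours:
        violations.append(f"stale record: {age:.1f}h old exceeds {max_age_hours}h SLA")
    return violations
```

The design point is that freshness and correctness are checked at the same gate: a record that is late fails the contract just as a record with a bad value does.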
AI and ML Planning Capabilities
On top of the data platform, we've identified a set of AI and ML capabilities we want to build to augment planning decisions:
- Demand signal integration: Connecting external demand signals (promotional calendars, seasonality, market trends) with internal order history to enrich forecasting inputs
- Forecasting models: ML-driven demand forecasting that augments planner judgement rather than replacing it, providing probabilistic ranges alongside point estimates
- Adaptive replenishment: Optimisation strategies that account for lead times, inventory positioning, and service level targets across the value chain
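As an illustration of how lead times and service level targets enter a replenishment rule, here is the textbook reorder-point formula. This is a teaching sketch, not the optimisation strategy we are building; it assumes normally distributed, independent daily demand, and all parameter values are hypothetical.

```python
import math
from statistics import NormalDist

def reorder_point(mean_daily_demand: float,
                  demand_std: float,
                  lead_time_days: float,
                  service_level: float) -> float:
    """Classic reorder point: expected demand over the lead time plus
    safety stock sized to a target cycle service level.
    Assumes i.i.d. normally distributed daily demand."""
    z = NormalDist().inv_cdf(service_level)  # e.g. 0.95 -> ~1.645
    safety_stock = z * demand_std * math.sqrt(lead_time_days)
    return mean_daily_demand * lead_time_days + safety_stock
```

Raising the service level raises the z-score and therefore the safety stock, which is exactly the trade-off (inventory cost versus availability) an adaptive policy has to balance per SKU and location.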
S&OP Data Integration
A recurring theme we're seeing in enterprise S&OP is that the process itself gets blamed for failures that originate in the data. We're working closely with business and planning stakeholders to map the data dependencies behind S&OP cycles, identifying where latency, quality, or coverage gaps are driving the weekly firefighting that planning teams describe as normal.
The architectural direction we're pursuing: treat S&OP as a data product: owned, governed, and maintained with the same rigour as customer-facing systems.
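One way to make "S&OP as a data product" operational is a readiness gate: each input to the planning cycle gets an explicit latency and coverage SLO, and the cycle reports which inputs violate it. The source names and thresholds below are illustrative assumptions, sketched to show the shape of such a gate.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical SLOs for the data inputs behind one S&OP cycle.
# Source names and thresholds are illustrative, not QVC's actual SLOs.
SLOS = {
    "order_history":      {"max_latency": timedelta(hours=6), "min_coverage": 0.99},
    "inventory_snapshot": {"max_latency": timedelta(hours=2), "min_coverage": 0.98},
    "promo_calendar":     {"max_latency": timedelta(days=1),  "min_coverage": 0.95},
}

def sop_readiness(observations: dict, now: datetime) -> dict:
    """Compare each input's observed latency and coverage against its SLO.
    Returns a per-source status map a planning cycle could gate on."""
    report = {}
    for source, slo in SLOS.items():
        obs = observations.get(source)
        if obs is None:
            report[source] = "missing"
            continue
        late = now - obs["last_updated"] > slo["max_latency"]
        thin = obs["coverage"] < slo["min_coverage"]
        report[source] = "ok" if not (late or thin) else "violates SLO"
    return report
```

A report like this turns the vague complaint "the S&OP numbers are wrong again" into a named, owned data dependency that missed its SLO, which is where the weekly firefighting can actually be fixed.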
Key Decisions and Trade-offs We've Considered
Reliability before sophistication: We've agreed that the first obligation of a supply chain data platform is to be trustworthy. Before introducing ML or agentic capabilities, the pipelines need to be reliable and the data needs to be correct. Sophisticated models on unreliable data produce confidently wrong answers.
Planning system integration: Rather than building bespoke planning tooling, the architecture is designed to serve enterprise planning platforms (Anaplan and similar EPM systems) through clean, well-governed data interfaces. The goal is to preserve planner workflows while significantly improving the quality of the data they work with.
Legacy system coexistence: Wholesale replacement of legacy systems isn't realistic in large retail operations. The approach we're taking is to coexist with and gradually modernise existing flows, using service decomposition and event-driven integration rather than big-bang migration.
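The coexistence pattern can be sketched in miniature: wrap the legacy write path so that, after the existing update runs, it emits a domain event that modern consumers subscribe to, leaving the legacy system itself untouched. The in-memory bus below is a stand-in for a real broker (Kafka, Azure Event Hubs, or similar), and the topic and payload names are hypothetical.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-memory stand-in for a real message broker."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

def legacy_stock_update(bus: EventBus, sku: str, delta: int) -> None:
    """Wrap the legacy write path: perform the existing update, then emit
    an event so modern consumers see the change without querying the
    legacy database directly."""
    # ... existing legacy update would run here, unchanged ...
    bus.publish("inventory.stock_changed", {"sku": sku, "delta": delta})
```

New pipeline components attach as subscribers, which is what lets decomposition proceed service by service instead of as a big-bang migration.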
Where We Are
This is active, ongoing work. The architecture is being defined and iterated on in parallel with the broader Core Retail transformation. Some pipeline foundations are in place. The planning integration and ML layers are on the roadmap. The direction is clear, but there's real work still ahead.