The Personalization Gap: What Consumers Want and Why It Is So Hard to Deliver (Part 1 of 2)

EXECUTIVE SUMMARY
→ Consumer expectations for personalized AI have already been permanently reset by platforms like Netflix, Spotify, and Amazon, and those expectations are now being applied directly to GenAI tools.
→ Consumers want GenAI to synthesize data across airlines, hotels, and credit cards without being asked to explain their own preferences, a capability that does not yet exist at scale.
→ The trust paradox is real: consumers simultaneously demand deep personalization and strong data protection, and the organizations that deliver both will earn durable competitive advantage.
→ The data infrastructure required to make this vision real is not in place at most organizations, and in many cases has not yet been seriously planned for.
→ Three structural blockers stand in the way: broken consent architecture, fragmented real-time data integration, and cybersecurity exposure that grows with every new data source.
→ Part 2 of this series, The Personalization Imperative, will examine the most promising paths forward, with specific next steps for CMOs, content companies, and agency leaders.
A Note on Where This Comes From
I began my career as a U.S. Army Military Intelligence Officer, leading 256 soldiers on a combat deployment to Iraq. That experience, which came long before I ever opened a data dashboard or sat in a client meeting, taught me something that has shaped every piece of work I have done since: the value of intelligence is not in how much you collect, but in how well it is organized, how quickly it can be acted on, and whether the people and systems receiving it are actually ready to use it.
After leaving the Army, I spent over a decade at Deloitte Digital building practices in data science, customer segmentation, propensity modeling, and marketing technology, advising organizations from $150M to $67B across media, entertainment, technology, and sports. Across every engagement, the same tension surfaced: organizations had more data than they knew what to do with, and consumers wanted experiences that none of those organizations were equipped to deliver. That tension is now moving to the center of the GenAI conversation, and I think it is worth examining directly.
Section 1: What Consumers Actually Want — The Personalization Imperative
GenAI has raised the floor on consumer expectations in a way that most organizations have not yet fully absorbed. The question is no longer whether personalization matters, but whether the industry can deliver personalization at the level consumers now believe is possible and increasingly expect.
BY THE NUMBERS: THE PERSONALIZATION EXPECTATION
71% of consumers expect personalized interactions, and 76% get frustrated when they do not receive them. (McKinsey, 2023)
72% of consumers say they only engage with personalized messaging. (SmarterHQ)
80% of consumers are more likely to purchase from a brand that provides personalized experiences. (Epsilon)
1.1 The Tailored Experience Expectation
The bar for personalization was set high by the platforms consumers interact with every day. Netflix does not ask you what genre you want, because it watches what you watch and builds a feed that reflects your taste, your mood, and your history. Spotify learns from your listening patterns and surfaces music you did not know you needed. Amazon puts what you are likely to buy next in front of you before you have thought to look.
These are not aspirational benchmarks anymore. For consumers, they are the baseline, and when they turn to GenAI tools such as assistants, agents, and co-pilots, they bring those same expectations with them. The expectation is not simply a tool that can answer questions, but one that knows them, adapts to their communication style, anticipates their needs, and surfaces relevant information without being prompted.
Andrew's Lens: Personalization in the GenAI era is not a feature to be added to a product. It is the product itself.
1.2 The "Zero Effort" Data Expectation
There is a second layer to this expectation that is less discussed but equally important: consumers want GenAI to already know them without being asked to explain themselves, their preferences, or their history.
Consider what a consumer's behavioral data actually looks like across the services they use every day. An airline knows their seat preference, historical routes, and whether they prefer morning or evening departures. A hotel chain knows their floor preference, whether they use the gym, and what time they typically check out. A credit card issuer knows their dining patterns, spending by category, and travel cadence. A streaming service knows their watch history, skip behavior, and time-of-day viewing habits.
No single organization holds the full picture. But a GenAI tool that could synthesize across all of them would know more about a consumer's actual preferences and behaviors than the consumer could articulate themselves, and that is precisely the experience consumers are beginning to imagine and, in some cases, to demand. The value of a GenAI assistant will increasingly be measured not by how capable it is in isolation, but by how deeply it can draw on the full context of a person's life, which means the race ahead is not for better models but for better, broader, and more meaningfully connected data.
1.3 The Trust Paradox
The conversation becomes complicated because consumers simultaneously want deep personalization and airtight data protection, and they see no contradiction in holding both expectations at once. From a consumer perspective, this is not irrational. What they are describing is a system that uses their data to serve them rather than to profile them for advertisers, sell their behavioral history to third parties, or expose their most sensitive information to a breach. They want the benefit of being known without the vulnerability of being exposed.
The platforms and organizations that manage to deliver both genuine personalization and genuine protection will earn a level of consumer trust that becomes a durable competitive advantage, while those that get it wrong will find that second chances in the age of GenAI are in short supply. In this environment, data trust is not a compliance issue to be managed by a legal team. It is a strategic one that belongs at the executive level.
BY THE NUMBERS: THE TRUST DIMENSION
86% of consumers are concerned about data privacy, yet 79% are willing to share data in exchange for personalized experiences, provided they trust how it will be used. (Salesforce)
48% of consumers have stopped using a service due to data privacy concerns. (Pew Research)
Andrew's Lens: In over a decade of customer segmentation and propensity modeling work, the most consistent finding was that organizations almost always have more data than they act on. The gap is rarely in what is collected, but in what is connected, governed, and actually used to make decisions. GenAI does not change that underlying challenge. It amplifies it significantly.
Section 2: The Data Problem — Why Personalization at Scale Is Hard
Consumer expectations are clear, and the technology trajectory is pointed in the right direction. So what is actually in the way? The honest answer is that the data infrastructure, which is the plumbing behind the personalization promise, is not ready. In most organizations, it is not even close to ready.
BY THE NUMBERS: THE DATA ACCESS GAP
13% of organizations report having fully integrated, real-time customer data accessible across their systems. (Forrester, 2024)
60%+ of enterprise data goes unused in analytics and decision-making, locked in siloed systems with no path to activation. (Gartner)
$3.1T estimated annual cost of poor data quality to U.S. businesses, a figure that will grow as AI systems are asked to act on that data. (IBM)
2.1 The Consent and Access Architecture Gap
For a GenAI tool to personalize on behalf of a consumer across multiple data sources such as airline preferences, hotel history, and financial behavior, it needs lawful, structured, and auditable access to that data. This is not a technology problem in the conventional sense. It is an architecture and governance problem, and the current consent frameworks are simply not equipped to handle it.
Today's consent model is built for a simpler world. A consumer agrees to terms of service with a single company, that company uses their data within its own systems, and the consent is platform-specific, often opaque, and rarely portable. It was not designed for an AI agent acting on a consumer's behalf across multiple institutions simultaneously, and trying to retrofit that model onto the GenAI personalization use case is like trying to run a modern data pipeline through a system built for paper forms.
What the personalization promise actually requires is a fundamentally different consent architecture, one where consumers can grant a GenAI tool access to specific data types from specific institutions for specific purposes and revoke that access at any time. That kind of permissioned, portable, and auditable consent layer does not yet exist at scale, and building it will require sustained collaboration between technologists, regulators, and the industries that hold the most valuable consumer data.
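To make the shape of that consent layer concrete, here is a minimal sketch in Python of what a single permissioned, revocable, auditable consent grant might look like. This is purely illustrative: no such standard exists yet, and every name here (ConsentGrant, the institution and data-type labels) is a hypothetical, not a reference to any real framework.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional, Tuple

@dataclass
class ConsentGrant:
    """Illustrative record of a consumer granting a GenAI tool scoped data access."""
    consumer_id: str
    institution: str              # e.g. "example-airline"
    data_types: Tuple[str, ...]   # e.g. ("seat_preference", "route_history")
    purpose: str                  # e.g. "travel-personalization"
    granted_at: datetime
    revoked_at: Optional[datetime] = None

    def revoke(self) -> None:
        """Consumer withdraws access; the record is retained for auditability."""
        self.revoked_at = datetime.now(timezone.utc)

    def permits(self, institution: str, data_type: str, purpose: str) -> bool:
        """Access is allowed only while unrevoked and only within the granted scope."""
        return (
            self.revoked_at is None
            and institution == self.institution
            and data_type in self.data_types
            and purpose == self.purpose
        )
```

The essential properties are all visible in this toy: the grant names specific data types from a specific institution for a specific purpose, it can be revoked at any time, and the revoked record survives as an audit trail. A request outside the granted scope, say the same data for ad targeting, is denied by construction rather than by policy review.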
2.2 The Integration and Real-Time Data Challenge
Even when consent exists, the organizations holding valuable consumer data are often not equipped to share it in a way that a GenAI tool could actually use. This is not a theoretical observation; it is something CVA sees directly in client work.
In a recent engagement, CVA mapped a client's full technology stack, and what we found is representative of a much broader pattern: CRM systems disconnected from operations, preference and behavioral data siloed inside individual tools with no shared taxonomy, and no real-time API access between systems. If a sophisticated, growth-oriented organization cannot serve its own internal teams with clean, connected data, it certainly cannot serve an AI agent trying to personalize an experience on its customers' behalf. Across industries including airlines, hospitality, and financial services, the most valuable consumer preference data tends to live in legacy systems and proprietary loyalty platforms that were never designed for external access.
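The "no shared taxonomy" problem is easy to see in miniature. Below is a hypothetical sketch, with invented field names and systems, of how the same seat preference might be recorded three different ways across a CRM, a loyalty platform, and a booking system, and what the normalization layer that almost no organization has actually built would need to do before any AI agent could use the data.

```python
# Hypothetical raw records from three siloed systems, each with its own vocabulary.
crm_record     = {"cust_id": "A123", "seatPref": "window"}
loyalty_record = {"member_no": "A123", "preferred_seat": "WDW"}
booking_record = {"pax_id": "A123", "seat_type": "W"}

# A shared taxonomy maps each silo's field names onto one canonical schema...
FIELD_MAP = {
    "crm":     {"id": "cust_id",   "seat": "seatPref"},
    "loyalty": {"id": "member_no", "seat": "preferred_seat"},
    "booking": {"id": "pax_id",    "seat": "seat_type"},
}
# ...and each silo's coded values onto one canonical value set.
VALUE_MAP = {"window": "window", "WDW": "window", "W": "window",
             "aisle": "aisle",   "AIS": "aisle",  "A": "aisle"}

def normalize(source: str, record: dict) -> dict:
    """Translate one silo's record into the shared canonical schema."""
    fields = FIELD_MAP[source]
    return {
        "consumer_id": record[fields["id"]],
        "seat_preference": VALUE_MAP[record[fields["seat"]]],
    }
```

Only after this translation do the three records agree that they describe the same consumer and the same preference. Multiply this by hundreds of fields and dozens of systems and the scale of the integration work becomes clear.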
Real-time personalization also demands real-time data, and that requirement exposes a second layer of the problem. Preferences change over time and sometimes change quickly. A traveler who preferred window seats for years just had a bad experience and now wants the aisle. Static data serves static personalization, and the dynamic experience consumers expect requires data infrastructure that most organizations have not yet built and in many cases have not yet seriously planned for.
Andrew's Lens: The preference data that consumers imagine GenAI tools can access is, in most cases, not exposed through any API at all.
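The window-seat-to-aisle example above can be made concrete with a toy sketch of the difference between a static snapshot and an event-driven preference store. This is an illustration of the architectural idea, not a production pattern; the class and keys are invented for the example.

```python
from collections import defaultdict

class PreferenceStore:
    """Toy event-sourced preference store: the latest recorded event wins.

    A nightly batch snapshot would keep serving the stale value until the
    next load; recording preference changes as events means the current
    value reflects the change the moment it happens.
    """
    def __init__(self):
        # (consumer_id, preference_key) -> list of values in arrival order
        self._events = defaultdict(list)

    def record(self, consumer_id: str, key: str, value: str) -> None:
        """Append a preference-change event without overwriting history."""
        self._events[(consumer_id, key)].append(value)

    def current(self, consumer_id: str, key: str, default=None):
        """Return the most recent value, or the default if none was ever set."""
        history = self._events.get((consumer_id, key))
        return history[-1] if history else default
```

The point of keeping the full event history rather than a single overwritten field is that yesterday's preference is still available as context, while today's preference is what gets served. Static personalization systems typically store only the first behavior, which is exactly how a years-long window-seat habit keeps overriding last week's change of mind.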
2.3 The Cybersecurity Exposure
The personalization vision, which imagines a GenAI tool aggregating behavioral data from airlines, hotels, credit cards, and health platforms into a unified intelligent layer, is, from a cybersecurity perspective, a remarkably high-value target that the industry has not yet seriously reckoned with.
Every piece of data that flows into a GenAI personalization layer adds to the attack surface. A breach that once would have compromised a single loyalty program now potentially compromises a consumer's full behavioral profile across their financial life, travel patterns, and purchasing behavior. The concentration of data that makes personalization powerful is precisely the same concentration that makes a breach catastrophic, and the economics of that risk have not yet been priced into most product and platform decisions.
The questions of liability and accountability that follow from a breach of this kind remain largely unresolved. If a GenAI tool aggregates data from multiple institutions and is breached, who bears responsibility? None of the current regulatory frameworks were designed to answer that question at the speed or scale at which GenAI is moving, which means the security architecture for personalized AI will need to be built from first principles rather than retrofitted onto frameworks designed for a simpler data environment.
Andrew's Lens: In the Army, we learned that unsecured intelligence is not an asset but a liability, and the same principle applies directly here. A personalization layer built on aggregated consumer data that cannot be adequately protected is not a product advantage but a strategic risk that will eventually surface at the worst possible moment. The organizations that treat security as an afterthought in their technology architecture almost always discover that fact too late to avoid the consequences.
Understanding the Gap Is the First Step Toward Closing It
The consumer expectation for deeply personalized GenAI experiences is already formed and growing rapidly. The data infrastructure, consent architecture, and security frameworks needed to deliver on it are not yet in place, and those two facts together define the moment the industry currently occupies.
What comes next, specifically the concrete steps that organizations, platforms, and policymakers can take to begin closing that gap, is the subject of Part 2 of this series. That paper examines the strategic partnerships, open API frameworks, and data monetization models that represent the most promising directions forward, and it speaks directly to the CMOs, content leaders, and agency executives who will need to make real decisions about this landscape sooner than most expect.
COMING NEXT WEEK → PART 2 OF 2
The Personalization Imperative: A Path Forward for Leaders Who Are Ready to Act
Explores strategic partnerships, open API standards, and data monetization models, with specific next steps for CMOs, content companies, and agency leaders.
Is Your Organization Ready for Personalized AI?
Before you can solve the personalization problem for your customers, you need to understand where your own organization stands. Take CVA's AI Depth Check™, a free 10-minute diagnostic scoring your readiness across workflows, data, pricing, governance, and GenAI adoption.
https://www.coastalviewadvisory.com/genai-depthcheck-content-operations
About the Author
Andrew Chan is Co-Founder and Partner at Coastalview Advisory, Inc. He began his career as a U.S. Army Military Intelligence Officer before spending more than 11 years at Deloitte Digital building practices in data science, customer segmentation, MarTech architecture, and GenAI platforms, with advisory experience across TMT industries.
andrew.chan@coastalviewadvisory.com • linkedin.com/in/andrew-chan-808
Contact us for a no-obligation consultation.




