Early in my career, the technology contracts I signed were fairly straightforward. I knew what I was paying for, how many seats I had and what was included. Even as data volumes grew and systems became more complex, costs were still mostly understandable. You could estimate what it took to store, move and process data. It wasn’t always obvious, but it was knowable.
That sense of clarity is fading. Watching two new data centers rise just outside my neighborhood is a physical reminder of how quickly we’re scaling compute for AI — and of the entirely different cost structure it introduces, one that’s harder to see, harder to predict and harder to control.
Everyone is focused on the upside: productivity, creativity, velocity and new capabilities. But the financial architecture beneath those gains is still immature. Most organizations don’t know the actual cost of a single AI interaction, what drives usage spikes or whether model consumption is aligned to actual value. AI doesn’t behave like the infrastructure we spent decades learning to manage. It behaves like a series of invisible inference events happening everywhere at once, triggered by anyone.
As AI shifts from experiment to core capability inside marketing systems — powering content, personalization, segmentation, decisioning and orchestration — the reckoning becomes inevitable. If marketing and operations leaders don’t build real cost literacy and visibility now, AI will become the fastest-growing, least predictable line item in the martech budget.
Why this is different — and why the data supports it
AI doesn’t just add a new line item to technology budgets — it changes how cost behaves. Recent research shows that AI costs scale faster, less linearly and with far less visibility than previous generations of technology. The 2025 State of AI Cost Management Report found that 84% of companies are already experiencing measurable gross-margin erosion from AI infrastructure, with 26% reporting a margin impact of 16% or higher. More concerning, 80% of enterprises miss their AI infrastructure forecasts by more than 25%, signaling this is not a planning failure but a structural one.
At the infrastructure level, the cost curve itself is steepening. The cost to train the most compute-intensive models has been growing at roughly 2.4x per year, a rate that compounds to more than a tenfold increase over three years. That growth is driven by accelerator hardware, specialized staff, interconnects and energy demands. While most companies aren’t training frontier models directly, these economics cascade downstream through API pricing, hosted platforms and cloud infrastructure.
For most organizations, however, the real exposure comes from inference — the cost of using AI at scale. As systems become more agentic and dynamic, a single request increasingly fans out into multiple model calls, retrieval steps, tool invocations and safety checks. Research on dynamic reasoning systems shows that while these architectures improve flexibility and performance, they also introduce significant overhead in tokens, latency, energy and infrastructure, with diminishing returns as complexity increases.
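To make that fan-out concrete, here is a minimal back-of-envelope sketch of how per-request cost accumulates in an agentic workflow. The step names, token counts and per-token prices below are illustrative assumptions, not actual vendor rates or any specific product’s architecture.

```python
# Illustrative estimate of how one "simple" agentic request fans out into
# multiple model calls. All token counts and prices are hypothetical
# placeholders, not real vendor pricing.

PRICE_PER_1K_INPUT = 0.003   # assumed $ per 1K input tokens
PRICE_PER_1K_OUTPUT = 0.015  # assumed $ per 1K output tokens

# One user request -> planner call, retrieval-augmented calls,
# a tool invocation, a safety check and the final response.
steps = [
    # (name, input_tokens, output_tokens)
    ("planner",          1_500, 300),
    ("retrieval_call_1", 4_000, 500),
    ("retrieval_call_2", 4_000, 500),
    ("tool_invocation",  1_000, 200),
    ("safety_check",     2_000, 100),
    ("final_response",   6_000, 800),
]

def step_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost of a single model call under the assumed per-token prices."""
    return ((input_tokens / 1000) * PRICE_PER_1K_INPUT
            + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT)

per_request = sum(step_cost(i, o) for _, i, o in steps)
print(f"Cost per request: ${per_request:.4f}")

# The line item only becomes visible at volume, e.g. a personalization
# workflow issuing 500,000 such requests in a month.
monthly_volume = 500_000
print(f"Monthly at {monthly_volume:,} requests: ${per_request * monthly_volume:,.2f}")
```

The specific numbers are not the point. The point is that each additional planning, retrieval, tool or safety step multiplies a per-request cost that no one sees on an invoice until the monthly volume arrives.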
Importantly, this cost escalation is not inevitable. Empirical studies show that better agent design and orchestration can materially reduce spend without sacrificing performance. One recent paper demonstrated a 28.4% reduction in operational cost while retaining more than 96% of benchmark performance, underscoring that architecture — not just model choice — is a primary cost driver.
What makes AI especially difficult to manage is that many of its highest costs aren’t where organizations expect them. Beyond model usage, companies routinely underestimate expenses tied to networking, data movement, storage, redundancy, energy, cooling and operational overhead.
The result is a cost structure that is consumption-based, distributed and opaque by default. AI spend does not arrive neatly packaged as a license fee. It accumulates through thousands — or millions — of invisible interactions, triggered by people, workflows and increasingly by other machines.
Dig deeper: AI productivity gains, like vendors’ AI surcharges, are hard to find
Why this becomes a problem at scale
At an individual level, AI is already delivering value. Multiple studies show meaningful productivity gains for knowledge workers — faster drafting, quicker analysis and less time spent on repetitive tasks. That impact is real, visible and easy to feel.
What’s far less common is seeing those gains translate cleanly at the organizational level. Recent research highlights a widening gap between personal productivity benefits and enterprise-wide return. McKinsey’s The State of AI in 2025 reports that while AI adoption is widespread, only a small percentage of companies have successfully scaled AI into production in ways that deliver material financial impact. Many remain stuck in pilots, fragmented deployments or narrowly scoped use cases that don’t compound into durable advantage.
At the same time, spending is accelerating. Most organizations are investing aggressively in AI infrastructure while missing cost forecasts and experiencing margin erosion. This creates a dangerous dynamic: companies feel pressure to keep up in the AI race, even when the path to value isn’t clear.
This is how cost problems emerge quietly. Teams experiment in parallel. Tools proliferate. Usage grows faster than governance. Infrastructure scales before outcomes are well understood. The result isn’t reckless behavior — it’s misalignment. Investment decisions are being made faster than organizations can gain clarity on which AI use cases deserve scale, which should remain constrained and which should be shut down entirely. The risk isn’t that AI fails to deliver value. It’s that value emerges unevenly while cost accumulates everywhere.
This is where marketing organizations sit squarely in the blast radius — and also where they have the most leverage. Marketing teams are often early adopters, high-volume users and constant experimenters, embedding AI into content, personalization, decisioning and testing long before enterprise guardrails are fully formed. Without a transparent cost structure and ownership model, what begins as local efficiency can quickly become a systemic margin issue.
There’s a familiar lesson here. Just as strong brand foundations amplify performance marketing — rather than replace it — AI infrastructure must come before AI scale.
