The original bargain
The internet was supposed to be free. That was the promise. Free search. Free email. Free social networking. Free content. The currency was attention, and the implicit contract was simple: you use the product, we show you ads. Most people accepted this without much thought. Ads were the price of free.
But the bargain changed. Slowly at first, then all at once. The ads became targeted. The targeting required surveillance. The surveillance required data collection at a scale that no one had consented to in any meaningful sense. The free product was no longer funded by showing ads. It was funded by selling access to the attention and identity of the people using it.
How attention became a commodity
The attention economy did not emerge from a single decision. It was the compound effect of incentive structures. Platforms that captured more attention attracted more advertisers. More advertisers funded more features. More features captured more attention. The flywheel rewarded engagement above everything else.
But engagement is not a neutral metric. Outrage engages. Anxiety engages. Insecurity engages. Platforms learned that content that provoked strong negative emotions held attention longer than content that informed or entertained. The algorithms optimised accordingly. Not because anyone decided to make people angry. Because the metric rewarded it.
The cost was trust. Users began to feel manipulated. They were right. The system was designed to hold their attention, not to serve their interests. And the data collected in the process was sold through a supply chain so complex that no individual participant could see or control the full picture.
The surveillance supply chain
Most people imagine that when they see a targeted ad, the advertiser chose to target them specifically. The reality is more diffuse and less controlled. A typical programmatic ad impression passes through five to fifteen intermediaries between the advertiser and the person who sees it. Data management platforms, demand-side platforms, supply-side platforms, ad exchanges, data brokers, identity resolution providers. Each takes a margin. Each handles the data differently. Each operates under different privacy standards.
A study by the Irish Council for Civil Liberties found that the real-time bidding system broadcasts personal data about individuals hundreds of billions of times per day. Location data, browsing history, inferred interests, and device identifiers are transmitted to thousands of companies in the time it takes a webpage to load. The person being profiled has no visibility into this process and no practical ability to stop it.
The trust deficit
The consequences are now visible at scale. A 2024 Edelman survey found that only 38 percent of people trust most brands to do the right thing with their data. Ad blocker adoption exceeds 40 percent in many markets. Cookie consent rates are declining as people learn what they are actually consenting to. Regulatory enforcement is accelerating. The GDPR, CCPA, and their successors are not abstractions. They are the policy response to a system that lost public trust.
The exploitation economy created its own opposition. People do not hate advertising. They hate being surveilled. They hate the feeling that their phone is listening. They hate seeing an ad for something they mentioned in a private conversation. The system taught them to distrust every digital interaction.
The alternative is not less intelligence. It is better architecture.
The solution to the exploitation economy is not to abandon personalisation. Relevance still matters. People still prefer to see content and offers that are useful to them. The problem was never personalisation itself. It was the architecture used to achieve it.
Intent builds intelligence that never requires data to leave the device. The on-device model processes behavioural signals locally and produces a privacy twin: a mathematical representation of intent that carries no personally identifiable information. The twin can be used to deliver relevant offers and content. But it cannot be sold to a data broker, leaked in a breach, or used to build a surveillance profile.
This is not privacy by policy. It is privacy by architecture. The data physically cannot leave the device. There is no supply chain of intermediaries. There is no real-time bidding auction. There is a model that understands intent and a system that responds to it.
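To make the idea concrete, here is a minimal sketch of what a privacy twin could look like in code. Everything here is illustrative, not Intent's actual implementation: the function names, the hashing scheme, and the vector dimension are assumptions chosen to show the principle that raw behavioural events are aggregated locally into an anonymised intent vector, and only that vector ever crosses the device boundary.

```python
import hashlib
import math

def hash_bucket(token: str, dim: int) -> int:
    """Map a signal token to a vector index via a one-way hash.

    The hash is not reversible, so the vector index reveals
    nothing about the original event.
    """
    digest = hashlib.sha256(token.encode()).digest()
    return int.from_bytes(digest[:4], "big") % dim

def build_privacy_twin(events: list[str], dim: int = 16) -> list[float]:
    """Aggregate on-device behavioural events into a normalised
    intent vector (a hypothetical 'privacy twin').

    The output carries relative interest weights, not the events
    themselves: it cannot be unwound into a browsing history, and
    it contains no identifiers.
    """
    vec = [0.0] * dim
    for event in events:
        vec[hash_bucket(event, dim)] += 1.0
    # Normalise so the twin expresses proportions of interest,
    # not absolute activity volume.
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

# The twin is the only artefact that would leave the device.
twin = build_privacy_twin([
    "viewed:hiking-boots",
    "searched:trail-maps",
    "viewed:hiking-boots",
])
```

The design choice the sketch illustrates: because aggregation and hashing happen before anything is transmitted, privacy is a property of the architecture rather than a promise in a policy. A server receiving the vector can match it against relevant offers, but no intermediary can reconstruct who the person is or what they did.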
From exploitation to understanding
The exploitation economy treated people as inventory to be sold. Behavioural intelligence treats them as individuals to be understood. The distinction is not cosmetic. It produces different architectures, different incentive structures, and different outcomes.
When the system is designed to understand intent rather than capture attention, the incentives align with the user. A system that correctly identifies what someone wants does not need to manipulate them into wanting something else. The relevance is genuine. The value exchange is honest.
The internet monetised attention at the cost of trust. The next era will monetise understanding, and the companies that build it on architectures that preserve trust will define the market. The exploitation economy is not inevitable. It is a design choice. And there is a better one available.