I use AI tools like ChatGPT, Gemini and Copilot to explore career plans, responsibilities, ambitions and even moments of self-doubt. It’s not just about finding answers – it’s about gaining clarity by seeing how my ideas are reflected, reframed or expanded.
Millions of people rely on AI for guidance and trust these systems to help them navigate the complexities of life. Yet every time we share, we also teach these systems. Our vulnerabilities — our doubts, hopes and fears — become part of a larger machine. AI doesn’t just help us; it learns from us.
For years, the attention economy has thrived on capturing and monetizing our focus. Social media platforms have optimized their algorithms for engagement, often favoring sensationalism and outrage to keep us scrolling. But now AI tools like ChatGPT represent the next phase. Not only do they attract our attention; they shape our actions.
This development has been labeled the “intention economy,” in which companies collect and commodify users’ intentions — our goals, desires and motivations. As researchers Chaudhary and Penn argue in their Harvard Data Science Review article, “Beware of the intention economy: Collection and commodification of intent via large language models,” these systems don’t just respond to our queries – they actively shape our decisions, often prioritizing corporate profit over personal benefit.
Dig deeper: Are marketers trusting AI too much? How to avoid the strategic trap
Honey, the browser extension acquired by PayPal for $4 billion, illustrates how trust can be quietly exploited. Marketed as a tool to save users money, Honey’s practices tell a different story. In his series “Honey Influencer Scam Revealed,” YouTuber MegaLag reported that the platform redirected affiliate links from influencers to itself, diverting potential earnings while capturing clicks for profit.
Honey also gave retailers control over which coupons users saw, promoting less attractive discounts and steering consumers away from better deals. The influencers who endorsed Honey were unwittingly encouraging their audiences to use a tool that drained their own commissions. By positioning itself as a useful tool, Honey built trust – then capitalized on it for financial gain.
“Honey wasn’t saving you money – it was robbing you while pretending to be your ally.”
– MegaLag
(Note: Some have claimed that MegaLag’s account contains errors; this is a developing story.)
The dynamics we saw with Honey are eerily similar to those of AI tools. These systems present themselves as neutral, with no obvious monetization strategy. ChatGPT, for example, doesn’t bombard users with advertisements or sales offers. It feels like a tool designed solely to help you think, plan and solve problems. Once that trust is established, influencing decisions becomes much easier.
Dig deeper: The ethics of AI-based marketing technology
You don’t have to be a tech expert to protect yourself from hidden agendas. By asking the right questions, you can find out whose interests a platform really serves. Here are five key questions to help you.
Every platform serves someone – but who exactly?
Start by asking yourself:
What to watch out for:
Most digital systems are not truly “free.” If you’re not paying with money, you’re paying with something else: your data, your attention or even your trust.
Ask yourself:
What to watch out for:
Every digital tool has an agenda – sometimes subtle, sometimes not. Algorithms, nudges and design choices shape how you interact with the platform and even how you think.
Ask yourself:
What to watch out for:
Dig deeper: How behavioral economics can be a marketer’s secret weapon
When platforms cause harm — whether through a data breach, mental health impacts or user exploitation — liability often becomes murky.
Ask yourself:
What to watch out for:
A trustworthy system doesn’t hide how it works – it invites inspection. Transparency isn’t just about explaining policies in the fine print; it’s about empowering users to understand and challenge the system.
Ask yourself:
What to watch out for:
Dig deeper: How wisdom makes AI more effective in marketing
We have faced similar challenges before. In the early days of search engines, the line between paid and organic results was blurred until public demand for transparency forced a change. But with AI and the intention economy, the stakes are much higher.
Organizations such as the Marketing Accountability Council (MAC) are already working toward this kind of accountability. MAC evaluates platforms, advocates for regulation and educates users about digital manipulation. Imagine a world where every platform carries a clear, honest “nutrition label” outlining its intentions and mechanisms. That’s the future MAC is trying to create. (Disclosure: I founded MAC.)
Creating a fairer digital future is not the responsibility of companies alone; it is a collective one. The best solutions don’t come from boardrooms but from people who care. That’s why we need your voice to shape this movement.
Dig deeper: The science behind high performing calls to action
Contributing authors are invited to create content for MarTech and are selected for their expertise and contribution to the martech community. Our contributors work under the supervision of the editorial staff, and submissions are reviewed for quality and relevance to our readers. The opinions they express are their own.