In a recent presentation to students at the University of Wisconsin-Madison, I encouraged them to expand their use of artificial intelligence beyond stereotypical applications such as homework and social media. I started the discussion by asking: Can the impact of artificial intelligence be simultaneously overhyped and underhyped?
My conclusion is that yes, it can be both overhyped and underhyped. I came to this conclusion based on a related insight: our view of AI is largely shaped by whether, and when, we invest time in hands-on (DIY) training and experimentation.
It’s always important to experiment with new technologies, but it’s especially important with artificial intelligence. Recently, I’ve been wondering whether my own adoption of AI, and the value I get from it, is too limited. So I decided to devote even more time to DIY experimentation.
Independent experimentation with technologies like AI can change the way you think, help you get through the “trough of disillusionment” that follows any much-hyped technology and deliver greater value for your customers.
Dig deeper: How to implement AI for your marketing team
We’ve all experienced AI vertigo over the last two years. For me personally, the release of ChatGPT arrived at a fortunate moment.
At the end of 2022, I made the decision to step back from a corporate leadership role for the longest hiatus I have ever taken, eventually transitioning into consulting and teaching. I had no idea, of course, that this hiatus would coincide with the launch of ChatGPT. It’s still fascinating that OpenAI initially described it as a “low-key research preview.”
My break meant I had time for independent training, testing and learning. Many friends and colleagues who were still in corporate roles didn’t have time to experiment and quickly labeled AI as overhyped.
Many of these colleagues don’t have the capacity, or the leadership support, to take time to train themselves. I was lucky. I attended events live rather than catching up on demand. I listened to all the podcasts people linked to. But most importantly, I had time to try things out in DIY mode, which convinced me of the value, despite all the limitations.
To illustrate how important DIY time is to creating value, I created my own hype cycle graphic with the different training triggers I believe we all go through.
Gap 1 is the time between the technology’s launch and when you start learning about it. This usually depends on an external training trigger, through your job or your network of colleagues.
Gap 2 isn’t just when you leverage technology, but critically, when you cross what I call the “DIY Chasm.” You have to zigzag your way through limitations and reach those “A-Ha” moments on your own terms.
The cumulative time across Gap 1 and Gap 2 compounds into the gap you feel between your personal productivity and the value current AI capabilities could deliver.
In my previous roles, I would have held similar views about AI being overhyped. The pressure of managing day-to-day projects meant I fell back on established patterns for managing martech. Previously, when you got stuck or hit a limit, a quick search would provide the answer. But if that knowledge base didn’t cover your specific context, you were stuck. We often blamed the software in those cases.
Since genAI is not constrained by those original limitations, further experimentation to help optimize your results is even more critical.
Two years later, I am facing the Gap 3 challenge. My teaching and consulting work leaves less room for DIY experimentation, while the pace of change in artificial intelligence keeps accelerating. There are days when I feel like I’m falling behind. But every time I devote more time to testing, I hit a “wow” moment.
Clearly, I am the limitation, not the AI technology. I’ve found it helpful to stay connected with thought leaders reminding us that this is the “worst AI” we’ll ever work with.
I always credit Scott Brinker with helping shape my views on “citizen martech.” I eagerly dug into the latest MarTech 2025 report from Brinker and Frans Riemersma. But given my limited capacity, I only had time for a quick scan rather than the thorough review I would have preferred. (You will find a summary and download links on the MarTech website.)
A quick scan of the report revealed that the leading use cases for AI focus on content and personalization across various stages of ideation and distribution. The sheer number of mentions of these applications tells us the impact of AI here is not really exaggerated.
But the jury is still out on whether users feel the improvement. This is where my latest AI DIY experimentation revealed some insights for the future of personalization.
Dig deeper: Consumers are not enthusiastic about AI experiences
I will absolutely be back to read Brinker and Riemersma’s report, which runs to over 100 pages, in its entirety. However, I want to share two AI-infused shortcuts that have helped me bridge the value gap while signaling the future of more personalized content.
First, before a recent ride, I loaded Brinker’s chiefmartec articles into ElevenLabs Reader, where an AI-generated voice reads the latest articles aloud. If you haven’t tried this, it’s underrated and will change how you consume long-form content forever.
A screenshot of the ElevenLabs Reader app.
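If you prefer to script something similar yourself, here is a minimal sketch of sending article text to the ElevenLabs text-to-speech API and saving the audio. The Reader app does this for you, so treat this purely as an illustration; the API key, voice ID and model name are placeholders, and you should check the current API documentation before relying on it.

```python
# Minimal sketch (not the Reader app itself): send article text to the
# ElevenLabs text-to-speech API and save the audio for listening on the go.
# The API key, voice ID and model name below are placeholders/assumptions.
import requests

ELEVEN_API_KEY = "your-api-key"   # assumption: your own ElevenLabs key
VOICE_ID = "your-voice-id"        # assumption: any voice ID from your account

def article_to_audio(article_text: str, out_path: str = "article.mp3") -> None:
    """Convert article text to an MP3 via the ElevenLabs TTS endpoint."""
    url = f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}"
    headers = {"xi-api-key": ELEVEN_API_KEY, "Content-Type": "application/json"}
    payload = {"text": article_text, "model_id": "eleven_multilingual_v2"}
    response = requests.post(url, headers=headers, json=payload)
    response.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(response.content)  # response body is the audio bytes

# Example: article_to_audio(open("chiefmartec_post.txt").read())
```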
I then loaded the entire report into Google’s NotebookLM and listened to the Audio Overview it generated.
Audio Overview creates an AI-generated “podcast” with two enthusiastic hosts discussing topics from the source material you’ve selected. NotebookLM lets you upload documents, YouTube links, web pages and more as sources.
I also created a NotebookLM Audio Overview for my most recent series of articles for MarTech.
It is this “native grounding” in your own sources that makes NotebookLM so powerful, while still taking advantage of the underlying model (Gemini) in a multimodal format. As Steven Johnson, one of NotebookLM’s co-creators, said on the Google DeepMind podcast, it’s “a kind of personalized artificial intelligence that’s an expert on the information you care about.”
If you’re interested in NotebookLM, test more than the Audio Overview. NotebookLM allowed me to “talk” with Brinker and Riemersma’s report through a back-and-forth Q&A with the AI.
Screenshot of NotebookLM answering a question using source material.
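To make the grounding idea concrete, here is a rough sketch of the same pattern using the Gemini API directly: the prompt restricts answers to the source text you supply. This is not how NotebookLM is implemented; the model name, prompt wording and function name are my own assumptions for illustration.

```python
# Rough sketch of source-grounded Q&A using the Gemini API directly.
# This is NOT how NotebookLM is built; it only illustrates the idea of
# constraining answers to the sources you supply. Model name, prompt
# wording and function name are assumptions.
import google.generativeai as genai

genai.configure(api_key="your-gemini-api-key")  # assumption: your own key
model = genai.GenerativeModel("gemini-1.5-flash")

def ask_source(question: str, source_text: str) -> str:
    """Answer a question using only the supplied source material."""
    prompt = (
        "Answer the question using only the source material below. "
        "If the answer is not in the source, say you don't know.\n\n"
        f"SOURCE:\n{source_text}\n\nQUESTION: {question}"
    )
    return model.generate_content(prompt).text

# Example: ask_source("What are the leading AI use cases?", report_text)
```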
At this point, I realized that I needed to add a key topic to my 2025 learning plans: the content layer. As Rasmus Houlind has explained, it involves a chain of multiple content-focused LLMs working together across a martech stack to enhance personalization.
I think Houlind would agree with my thoughts on the need for a customer-specific tone, built from data in previous interactions such as emails, meeting notes and more, which I suggested in Part two of my series.
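As a thought experiment, here is a hypothetical sketch of what a minimal content-layer chain could look like: one LLM call drafts a message, and a second rewrites it in the customer’s tone using prior interactions. The single shared model, function names and prompts are all assumptions; a production content layer would likely chain several specialized models and pull tone data from a CDP or CRM.

```python
# Hypothetical sketch of a two-step "content layer" chain: draft a message,
# then rewrite it in the customer's tone using prior interactions.
# The single shared model, function names and prompts are all assumptions.
import google.generativeai as genai

genai.configure(api_key="your-gemini-api-key")  # assumption: your own key
model = genai.GenerativeModel("gemini-1.5-flash")

def draft_message(brief: str) -> str:
    """Step 1: generate a first-pass marketing message from a campaign brief."""
    return model.generate_content(
        f"Write a short marketing email about: {brief}"
    ).text

def adapt_to_customer(draft: str, customer_history: str) -> str:
    """Step 2: rewrite the draft to match the tone of past customer interactions."""
    prompt = (
        "Rewrite the draft below so its tone matches the customer's past "
        "emails and meeting notes.\n\n"
        f"PAST INTERACTIONS:\n{customer_history}\n\nDRAFT:\n{draft}"
    )
    return model.generate_content(prompt).text

# Example usage:
# personalized = adapt_to_customer(
#     draft_message("our new analytics feature"),
#     customer_history="(emails and meeting notes pulled from your CRM)",
# )
```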
This personalized approach to content discovery brings together overarching trends I wrote about in 2024. Crossing the original DIY Chasm with genAI two years ago was well worth it, but that ROI has a shorter shelf life as the AI vertigo continues. I have to plan more dedicated time for DIY chasm-crossing than ever.
With the help of the AI infusions I discussed here, I still managed to gain valuable insights despite the time crunch. Through Q&A-style engagement and my preferred audio formats, AI helped personalize key content insights on my terms.
Now we just need to prioritize efforts to expand these approaches for our customers.
Contributing authors are invited to create content for MarTech and are chosen for their expertise and contributions to the martech community. Our contributors work under editorial oversight, and contributions are checked for quality and relevance to our readers. The opinions they express are their own.