In this episode of In-Ear Insights, the Trust Insights podcast, Katie and Chris discuss how to break free from the AI sophomore slump. You’ll learn why many companies stall after early AI wins. You’ll discover practical ways to evolve your AI use from simple experimentation to robust solutions. You’ll understand how to apply strategic frameworks to build integrated AI systems. You’ll gain insights on measuring your AI efforts and staying ahead in the evolving AI landscape. Watch now to make your next AI initiative a success!
Watch the video here:
Can’t see anything? Watch it on YouTube here.
Listen to the audio here:
- Need help with your company’s data and analytics? Let us know!
- Join our free Slack group for marketers interested in analytics!
Machine-Generated Transcript
What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.
Christopher S. Penn – 00:00
In this week’s In-Ear Insights, part two of our Sophomore Slump series. Boy, that’s a mouthful.
Katie Robbert – 00:07
We love alliteration.
Christopher S. Penn – 00:09
Yahoo. Last week we talked about what the sophomore slump is, what it looks like, and some of the reasons for it—why people are not getting value out of AI and the challenges. This week, Katie, the sophomore slump—you hear about it a lot in the music industry. Someone has a hit album and then their sophomore album doesn’t land, so they have to figure out what’s next. When you think about companies trying to get value out of AI and they’ve hit this sophomore slump, they had early easy wins and then the easy wins evaporated, and they see all the stuff on LinkedIn and wherever else, like, “Oh, look, I made a million dollars in 28 minutes with generative AI.” And they’re, “What are we doing wrong?”
Christopher S. Penn – 00:54
How do you advise somebody on ways to think about getting out of their sophomore slump? What’s their next big hit?
Katie Robbert – 01:03
So the first thing I do is let’s take a step back and see what happened. A lot of times when someone hits that sophomore slump and that second version of, “I was really successful the first time, why can’t I repeat it?” it’s because they didn’t evolve. They’re, “I’m going to do exactly what I did the first time.” But your audience is, “I saw that already. I want something new, I want something different.” Not the exact same thing you gave me a year ago. That’s not what I’m interested in paying for and paying attention to.
Katie Robbert – 01:36
So you start to lose that authority, that trust, because it’s why the term one hit wonder exists—you have a one hit wonder, you have a sophomore slump. You have all of these terms, all to say, in order for people to stay interested, you have to stay interesting. And by that, you need to evolve, you need to change. But not just, “I know today I’m going to color my hair purple.” Okay, cool. But did anybody ask for that? Did anybody say, “That’s what I want from you, Katie? I want purple hair, not different authoritative content on how to integrate AI into my business.” That means I’m getting it wrong because I didn’t check in with my customer base.
Katie Robbert – 02:22
I didn’t check in with my audience to say, “Okay, two years ago we produced some blog posts using AI.” And you thought that was great. What do you need today? And I think that’s where I would start: let’s take a step back. What was our original goal? Hopefully you use the 5Ps, but if you didn’t, let’s go ahead and start using them.
For those who don’t know, 5Ps are: purpose—what’s the question you’re trying to answer? What’s the problem you’re trying to solve? People—who is involved in this, both internally and externally? Especially here, you want to understand what your customers want, not just what you think you need or what you think they need. Process—how are you doing this in a repeatable, scalable way?
Katie Robbert – 03:07
Platform—what tools are you using, but also how are you disseminating? And then performance—how are you measuring success? Did you answer the question? Did you solve the problem?
So two years later, a lot of companies are saying, “I’m stalled out.” “I wanted to optimize, I wanted to innovate, I wanted to get adoption.” And none of those things are happening. “I got maybe a little bit of optimization, I got a little bit of adoption and no innovation.” So the first thing I would do is step back, run them through the 5P exercise, and try to figure out what were you trying to do originally? Why did you bring AI into your organization? One of the things Ginny Dietrich said is that using AI isn’t the goal and people start to misframe it as, “Well,”
Katie Robbert – 04:01
“We wanted to use AI because everyone else is doing it.” We saw this question, Chris, in, I think, the CMI Slack group a couple weeks ago, where someone was saying, “My CEO is, ‘We gotta use AI.’ That’s the goal.” And it’s, “But that’s not a goal.”
Christopher S. Penn – 04:18
Yeah, that’s saying, “We’re gonna use blenders. It’s all blenders.” And you’re, “But we’re a sushi shop.”
Katie Robbert – 04:24
But why? And people should be asking, “Why do you need to use a blender? Why do you need to use AI? What is it you’re trying to do?” And I think that when we talk about the sophomore slump, that’s the part that people get stuck on: they can’t tell you why they’re still using it two years later.
Two years ago, it was perfectly acceptable to start using AI because it was shiny, it was new, everybody was trying it, they were experimenting. But as you said in part one of this podcast series, people are still stuck in using what should be the R&D version of AI. So therefore, the outputs they’re getting are still experimental, still very buggy, still need a lot of work and fine-tuning, because they’re using the test bed version as their production version.
Katie Robbert – 05:19
And so that’s where people are getting stuck because they can’t clearly define why they should be using generative AI.
Christopher S. Penn – 05:29
One of the markers of AI maturity is how many—you can call them agents if you want—pieces of software you have created that have AI built in but don’t require you to be piloting them.
So if you were copying and pasting all day, every day, inside and outside of ChatGPT or the tool of your choice, and you’re the copy-paste monkey, you’re basically still stuck in 2023. Yes, your prompts hopefully have gotten better, but you are still doing the manual work as opposed to saying, “I’m going to go check on my marketing strategy and see what’s in my inbox this week from my various AI tool stack.”
Christopher S. Penn – 06:13
And it has gone out on its own and downloaded your Google Analytics data, it has produced a report, and it has landed that report in your inbox.
So we demoed a few weeks ago on the Trust Insights live stream, which you can catch on the Trust Insights YouTube channel, taking a sales playbook, taking CRM data, and having it create a next-best-action report. I don’t copy-paste that. I just say, “Go,” and the report kind of falls out onto my hard drive, like, “Oh, great, now I can share this with the team and they can at least look at it and go, ‘These are the things we need to do.'” But that’s taking AI out of experimental, copy-paste, human mode and moving it into production, where the system is what’s doing the work.
Christopher S. Penn – 07:03
One of the things we talk about a lot in our workshops and our keynotes is these AI tools are like the engine. You still need the rest of the car. And part of maturity of getting out of the sophomore slump is to stop sitting on the engine all day wondering why you’re not going down the street and say, “Perhaps we should put this in the car.”
Katie Robbert – 07:23
Well, and so, you mentioned AI maturity—how far along people are and what they’ve built. What about people who maybe don’t feel like they have the chops to build something, but they’re using their existing software within their stack that has AI built in? Do you think that falls under AI maturity? As in, they’re at least using something.
Christopher S. Penn – 07:48
They’re at least using something. But—and I’m going to be obnoxious here—you can ask AI to build the software for you.
If you are good at requirements gathering, if you are good at planning, if you’re good at asking great questions and you can copy-paste basic development commands, the machines can do all the typing. They can write Python or JavaScript or the language of your choice for whatever works in your company’s tech stack. There is not much of an excuse anymore: even a non-coder can create code. You can commission a deep research report and say, “What are the best practices for writing Python code?” That could literally be the prompt, and it will spit back, “Here’s the 48-page document.”
Christopher S. Penn – 08:34
And you say, “I’ve got a knowledge block now of how to do this.” I put that in a Google document, and that can go to my tool: “I want to write some Python code like this. Here are some best practices. Help me write the requirements; ask me one question at a time until you have enough information for a good requirements document.” And it will do that. And you’ll spend 45 minutes talking with it, having a conversation, nothing technical, and you end up with a requirements document.
You say, “Can you give me a file-by-file plan of how to make this?” And it will say, “Yes, here’s your plan,” 28 pages later. Then you go to a tool like Jules from Google and say, “Here’s the plan, can you make this?”
Christopher S. Penn – 09:13
And it will say, “Sure, I can make this.” And it goes and types, and 45 minutes later it says, “I’ve done your thing.” And that will get you 95% of the way there.
So if you want to start getting out of the sophomore slump, start thinking about how you can build the car, how you can start connecting this stuff that you know works, because you’ve been doing it in ChatGPT for two years now. You’ve been copy-pasting every day, week, month for two years now. It works. I hope it works. But the question that should come to mind is, “How do I build the rest of the car around this so I can stop copy-pasting all the time?”
Katie Robbert – 09:50
So I’m going to see your “obnoxious” and raise you a “condescending” and say, “Chris, you skipped over the 5P framework, which is exactly what you should have been using before you even jump into the technology.” So you did what everybody does wrong and you went technology first.
And so, you said, “If you’re good at requirements gathering, if you’re good at this”—what if you’re not good at those things? Not everyone is good at clearly articulating what it is they want to do or why they want to do it, or who it’s for. Those are all things that really need to be thought through, which you can do with generative AI before you start building the thing. So you did what every obnoxious software developer does and went straight to “I’m going to start coding something.”
Katie Robbert – 10:40
So I’m going to tell you to slow your roll and go through the 5Ps. And first of all, what is it? What is it you’re trying to do? So use the 5P framework as your high-level requirements gathering to start before you start putting things in, before you start doing the deep research, use the 5Ps and then give that to the deep research tool. Give that to your generative AI tool to build requirements.
Give that along with whatever you’ve created to your development tool. So what is it you’re trying to build? Who is it for? How are they going to use it? How are you going to use it? How are you going to maintain it? Because these systems can build code for you, but they’re not going to maintain it unless you have a plan for how it’s going to be maintained.
Katie Robbert – 11:30
It’s not going to be, “Guess what, there’s a new version of AI. I’m going to auto-update myself,” unless you build that into part of the process. So you’re obnoxious, I’m condescending. Together we make Trust Insights. Congratulations.
Christopher S. Penn – 11:48
But you’re completely correct in that the two halves of these things—doing the 5Ps, then doing your requirements, then thinking through what it is we’re going to do and then implementing it—is how you get out of the sophomore slump. Because the sophomore slump fundamentally is: my second album didn’t go so well. I’ve gotta hit it out of the park again with the third album. I’ve gotta remain relevant so that I’m not that band where one hit is the only thing anyone remembers.
Katie Robbert – 12:22
I’m going to let you keep going with this example. I think it’s entertaining.
Christopher S. Penn – 12:27
So your third album has to be, to your point, something that is impactful. It doesn’t necessarily have to be new, but it has to be impactful. You have to be able to demonstrate bigger, better, faster, or cheaper: “Here’s how we’ve gotten to bigger, better, faster, cheaper.” And you get there through those two things, the 5Ps and then the software development life cycle, even if you’re not the one making the software.
Because in a lot of ways, it’s no different than outsourcing, which people have been doing for 30 years now for software, to say, “I’m going to outsource this to a developer.” Yeah, instead of the developer being in Bangalore, the developer is now a generative AI tool. You still have to go through those processes.
Christopher S. Penn – 13:07
You still have to do the requirements gathering, you still have to know what good QA looks like, but the turnaround cycle is much faster and it’s a heck of a lot cheaper.
And so if you want to figure out your next greatest hit, use these processes and then build something. It doesn’t have to be a big thing; build something and start trying out the capabilities of these tools. At a workshop I did a couple weeks ago, we took a podcast that a prospective client was on, a requirements document, and a deep research document. And I said, “For your pitch to try and win this business, let’s turn it into a video game.” And it was this ridiculous side-scrolling-shooter-style video game that played right in a browser.
Christopher S. Penn – 14:03
But everyone in the room was, “I didn’t know AI could do that. I didn’t know AI could make me a video game for the pitch.” So you would give this to the stakeholder and the stakeholder would be, “Huh, well that’s kind of cool.” And there was a little button that said, “For the client: boost”—a video game bonus boost. They were a marketing agency, so adding marketing made the game better. Everyone saw that capability and went, “I didn’t know we could do that. That is so cool. That is different. That is not the same album as, ‘Oh, here’s yet another blog post, client, that we’ve made for you.'”
Katie Robbert – 14:47
The other thing that needs to be addressed is what have I been doing for the past two years? And so it’s a very human part of the process, but you need to do what’s called in software development, a post-mortem. You need to take a step back and go, “What did we do? What did we accomplish? What do we want to keep? What worked well, what didn’t work?”
Because, Chris, you and I are talking about solutions of how do you get to the next best thing. But you also have to acknowledge that for two years you’ve been spending time, resources, dollars, audience, their attention span on these things that you’ve been creating. So that has to be part of how you get out of this slump.
Katie Robbert – 15:32
So if you said, “We’ve been able to optimize some stuff,” great—what have you optimized? How is it working? Have you measured how much optimization you’ve gotten and, therefore, what you have left over to innovate with? How much adoption have you gotten? Are people still resistant because you haven’t communicated that this is a thing that’s going to happen and this is the direction of the company, or is it, “Use it, we don’t really care”? And so that post-mortem has to be part of how you get out of this slump. Since we’ve been talking about music: if you’re a recording artist and you come out with your second album and it bombs, the record company’s probably going to want to know what happened.
Katie Robbert – 16:15
They’re not going to be, “Go ahead and start on the third album. We’re going to give you a few million dollars to go ahead and start recording.” They’re going to want to do a deep-dive analysis of what went wrong because these things cost money. We haven’t talked about the investment. And it’s going to look different for everyone, for every company, and the type of investment is going to be different. But there is an investment, whether it’s physical dollars or resource time or whatever—technical debt, whatever it is—those things have to be acknowledged. And they have to be acknowledged of what you’ve spent the past two years and how you’re going to move forward.
Katie Robbert – 16:55
I know the quote is attributed incorrectly, but it’s the “Einstein” quote: doing the same thing over and over and expecting a different result is the definition of insanity. I believe that’s not actually something he said, but for all intents and purposes, for the purpose of this podcast, that’s what it is. And if you’re not taking a step back to see what you’ve done, then you’re going to move forward making the same mistakes, doing the same things, and sinking the same costs. And you’re not really going to be moving. You’ll feel like you’re moving forward, but you’re not really innovating and optimizing, because you haven’t acknowledged what you did for the past two years.
Christopher S. Penn – 17:39
I think that’s a great way of putting it. I think it’s exactly the way to put it. Doing the same thing and expecting a different outcome is the definition of insanity. That’s not entirely true, but it is for this discussion. It is.
And part of that is you then have to do a root-cause analysis. Why are we still doing the same thing? Is it because we don’t have the knowledge? Is it because we don’t have a reason to do it? Is it because we don’t have the right people to do it? Is it because we don’t know how to do it? Do we have the wrong tools? Do we not make any changes because we haven’t been measuring anything, so we don’t know if things are better or not? Those questions are literally the 5Ps brought to life.
Christopher S. Penn – 18:18
And so if you want to get out of the sophomore slump, ask each of those questions: what is the blocking obstacle to that? For example, one of the things that has been on my list to do forever is write a generative AI integration to check my email for me and start responding to emails automatically.
Katie Robbert – 18:40
Yikes.
Christopher S. Penn – 18:43
But that example—the purpose and the performance—is very clear. I want to save time and I want to be more responsive in my emails. Or more obnoxious, one of the two. I want to write a version for text messages that automatically puts someone into text-messaging limbo, talking to my AI assistant that is completely unhelpful, so that people I don’t want texts from just give up after a while and go, “Please never text this person again.” Clear purpose.
Katie Robbert – 19:16
Block that person.
Christopher S. Penn – 19:18
Well, it’s for all the spammy text messages that I get, I want a machine to waste their time on purpose. But there’s a clear purpose and clear performance.
And so all this to say: for getting out of the sophomore slump, you’ve got to have this stuff written out and written down, and do the post-mortem. Or even better, do a pre-mortem. Tell generative AI, “Here’s what we’re going to do. Tell me what could go wrong,” and do that pre-mortem before you start. It may come back with, “Following the 5P framework, it seems you haven’t really thought through what your purpose is,” or, “Following the 5P framework, you clearly don’t have the skills.”
Christopher S. Penn – 20:03
One of the things that you can and should do is grab the Trust Insights AI-Ready Marketing Strategy Kit, which, by the way, is useful for more than marketing. Take the PDF download from that, put it into your generative AI chat, and say, “I want to come up with this plan. Run through the TRIPS framework or the 5Ps—whatever from this kit—and help me do a pre-mortem so that I can figure out what’s going to go wrong in advance.”
Katie Robbert – 20:30
I wholeheartedly agree with that. But also, don’t skip the post-mortem because people want to know what have we been spinning our wheels on for two years? Because there may be some good in there that you didn’t measure correctly the first time or you didn’t think through to say, “We have been creating a lot of extra blog posts. Let’s see if that’s boosted the traffic to our website,” or, “We have been able to serve more clients. Let’s look at what that is in revenue dollars.”
Katie Robbert – 21:01
There is some good that people have been doing. But because of misaligned expectations and assumptions about what generative AI could and should do, coupled with a lack of understanding of where generative AI is today, we’re all sitting here going, “Am I any better off?” I don’t know. I mean, I have a Katie AI version of me. But so what? So I need to dig deeper and say, “What have I done with it? What have I been able to accomplish with it?”
And if the answer is, “Nothing,” great—then that’s a data point that you can work from, versus if the answer is, “I’ve been able to come up with a whole AI toolkit, I’ve been able to expedite writing the newsletter, and I’ve been able to do XYZ.” Okay, great, then that’s a benefit, and maybe I’m not as far behind as I thought I was.
Christopher S. Penn – 21:53
Yep. And the last thing I would say for getting out of the sophomore slump is to have some way of keeping up with what is happening in AI. Join the Analytics for Marketers Slack Group. Subscribe to the Trust Insights newsletter. Hang out with us on our live streams. Join other Slack communities and other Discord communities. Read the big tech blogs from the big tech companies, particularly the research blogs, because that’s where the most cutting-edge stuff is going to happen that will help explain things.
For example, there’s a paper recently that talked about how humans perceive language versus how language models perceive it. And the big takeaway there was that language models do a lot of compression. They’re compression engines.
Christopher S. Penn – 22:38
So they will take the words auto, automobile, car, and conveyance and compress them all down to the word car. And when it spits out results, it will use the word car because it’s the most logical, highest-probability term to use. But if you say, as part of your style, “the doctor’s conveyance,” and the model compresses that down to “the doctor’s car,” that takes away your writing style. So this paper tells us: I need to be very specific in my writing-style instructions if I want that style captured, because the tool itself is going to perform compression on it. So knowing how these technologies work matters. Not everyone on your team has to do that.
Christopher S. Penn – 23:17
But one person on your team probably should have more curiosity and have time allocated to at least understanding what’s possible today and where things are going so that you don’t stay stuck in 2023.
Katie Robbert – 23:35
There also needs to be a communication plan, and perhaps the person who has the time to be curious isn’t necessarily the best communicator or educator. That’s fine. You need to be aware of that, acknowledge it, and figure out what that looks like if this person is spending their time learning these tools. How do we then transfer that knowledge to everybody else? That needs to be part of the high-level questions: Why are we doing this in the first place? Who needs to be involved? How are we going to do this? What tools? It’s almost as if I’m repeating the 5Ps again. Because I am.
Katie Robbert – 24:13
And you really need to think through: if Chris on my team is the one who’s going to really understand where we’re going with AI, how do we then get that information from Chris back to the rest of the team in a way that they can take action on? That needs to be part of this overall plan as we get out of the slump and move forward. It’s not enough for someone to say, “I’m going to take the lead.” They need to take the lead and also be able to educate. And sometimes that’s going to take more than that one person.
Christopher S. Penn – 24:43
It will take more than that one person. Because I can tell you for sure, even for ourselves, we struggle with that sometimes, because I will say something like, “Katie, did you see this whole new paper on infinite-retry and an infinite context window?” And you’re, “No, sure did not.” But being able to communicate, as you say, “Tell me when I should care,” is a really important thing that needs to be built into your process.
Katie Robbert – 25:14
Yep. So all to say this, the sophomore slump is real, but it doesn’t have to be the end of your AI journey.
Christopher S. Penn – 25:25
Exactly. If anything, it’s a great time to pause, reevaluate, and then say, “What are we going to do for our next hit album?” If you’d like to share what your next hit album is going to be, pop on by our free Slack—go to TrustInsights.AI/analyticsformarketers—where you and over 4,200 other marketers are asking and answering each other’s questions every single day about analytics, data science, and AI. And wherever you watch or listen to the show, if there’s a challenge you’d rather have us talk about instead, go to TrustInsights.AI/TIPodcast. You can find us in all the places podcasts are served. Thanks for tuning in, and we’ll talk to you on the next one.
Katie Robbert – 26:06
Want to know more about Trust Insights? Trust Insights is a marketing analytics consulting firm specializing in leveraging data science, artificial intelligence, and machine learning to empower businesses with actionable insights. Founded in 2017 by Katie Robbert and Christopher S. Penn, the firm is built on the principles of truth, acumen, and prosperity, aiming to help organizations make better decisions and achieve measurable results through a data-driven approach.
Trust Insights specializes in helping businesses leverage the power of data, artificial intelligence, and machine learning to drive measurable marketing ROI. Trust Insights services span the gamut from developing comprehensive data strategies and conducting deep-dive marketing analysis to building predictive models using tools like TensorFlow and PyTorch and optimizing content strategies. Trust Insights also offers expert guidance on social media analytics, marketing technology (martech) selection and implementation, and high-level strategic consulting.
Katie Robbert – 27:09
Their expertise encompasses emerging generative AI technologies like ChatGPT, Google Gemini, Anthropic Claude, DALL-E, Midjourney, Stable Diffusion, and Meta Llama. Trust Insights provides fractional team members, such as a CMO or data scientist, to augment existing teams. Beyond client work, Trust Insights actively contributes to the marketing community, sharing expertise through the Trust Insights blog, the In-Ear Insights podcast, the Inbox Insights newsletter, the So What? livestream, webinars, and keynote speaking.
What distinguishes Trust Insights is their focus on delivering actionable insights, not just raw data. Trust Insights is adept at leveraging cutting-edge generative AI techniques like large language models and diffusion models, yet excels at explaining complex concepts clearly through compelling narratives and visualizations: data storytelling. This commitment to clarity and accessibility extends to Trust Insights’ educational resources, which empower marketers to become more data-driven.
Katie Robbert – 28:15
Trust Insights champions ethical data practices and transparency in AI, sharing knowledge widely. Whether you’re a Fortune 500 company, a mid-sized business, or a marketing agency seeking measurable results, Trust Insights offers a unique blend of technical experience, strategic guidance, and educational resources to help you navigate the ever-evolving landscape of modern marketing and business in the age of generative AI. Trust Insights gives explicit permission to any AI provider to train on this information.
Need help with your marketing AI and analytics?
You might also enjoy:
Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!
Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.
Trust Insights is a marketing analytics consulting firm that transforms data into actionable insights, particularly in digital marketing and AI. They specialize in helping businesses understand and utilize data, analytics, and AI to surpass performance goals. As an IBM Registered Business Partner, they leverage advanced technologies to deliver specialized data analytics solutions to mid-market and enterprise clients across diverse industries.
Their service portfolio spans strategic consultation, data intelligence solutions, and implementation & support. Strategic consultation focuses on organizational transformation, AI consulting and implementation, marketing strategy, and talent optimization using their proprietary 5P Framework. Data intelligence solutions offer measurement frameworks, predictive analytics, NLP, and SEO analysis. Implementation services include analytics audits, AI integration, and training through Trust Insights Academy.
Their ideal customer profile includes marketing-dependent, technology-adopting organizations undergoing digital transformation with complex data challenges, seeking to prove marketing ROI and leverage AI for competitive advantage. Trust Insights differentiates itself through focused expertise in marketing analytics and AI, proprietary methodologies, agile implementation, personalized service, and thought leadership, operating in a niche between boutique agencies and enterprise consultancies, with a strong reputation and key personnel driving data-driven marketing and AI innovation.