Content Marketing, Engineered Podcast

MAICON Recap: How Marketers are Embracing the Evolving AI Landscape

Written by Wendy Covey | 8/3/23 3:00 PM

What a head-spinning time we had at this year's Marketing AI Conference (MAICON). We're distilling everything we learned - with practical examples of how marketers are utilizing AI - in this week's episode of Content Marketing, Engineered. 


At TREW Marketing, we are no strangers to the power of AI marketing tools to provide intelligence and streamline mundane tasks. However, the ever-evolving nature of AI means there is always something new to discover. From innovative tools to ethical considerations, we knew it was crucial to attend MAICON, hosted by the Marketing AI Institute. TREW's Senior Brand Strategist, Morgan Norris, and I immersed ourselves in three days of learning. Now we're excited to share the key takeaways from the conference, including emerging tools we're eager to experiment with and creative ways marketers are leveraging AI tools and LLMs to drive business results.

In this podcast episode, Morgan and I delve deep into the capabilities of large language models (LLMs) for marketers. We explore how LLMs can streamline workflows, eliminate mundane tasks, and enhance creativity. We discuss generative AI application tools like Jasper, Writer, and MarketMuse, as well as function-specific tools for video, transcription, and more. We also highlight the potential of AI as an aid in rapid strategy development, as demonstrated by Ethan Mollick's ability to create a host of product launch materials in 30 minutes using AI.

Join us as we share our insights and discoveries from MAICON, and get ready to be inspired by the possibilities of AI in marketing.

Resources


Transcript:

On today's episode, I'm bringing you a recap of my time and Morgan Norris' time at MAICON, the Marketing AI Conference held in late July 2023, so that those of you listening in the future have some context for when this was recorded and when the event occurred. Listen, the world of marketing AI is changing rapidly, as you're probably already aware. And this conference was a wonderful opportunity for us to go way deep on what large language models can make possible, what application software for marketers can make possible, and what we marketers ourselves can do to work more efficiently and smarter. What are some of the pitfalls and traps that we need to look out for? What are some of the best practices that we need to implement within our organizations and our teams? And so much more. All of that is included in this episode. Today in particular, we'll tell you about tools we learned about that we'll be studying, we'll tell you about some of those best practices and traps to avoid, and we'll give you a better sense of how this all fits together. Let's do this.

Welcome to Content Marketing, Engineered, your source for building trust and generating demand with technical content.

Here is your host, Wendy Covey.

Hi, and welcome to Content Marketing, Engineered. On each episode, I'll break down an industry trend, challenge or best practice in reaching technical audiences. You'll meet colleagues, friends, and clients of mine who will stop by to share their stories, and I hope that you leave each episode feeling inspired and ready to take action. Before we jump in, I'd like to give a brief shout out to my agency, TREW Marketing. TREW is a full service agency located in beautiful Austin, Texas, serving highly technical companies. For more information, visit trewmarketing.com. And now on with our podcast.

Hey everyone, and welcome to another episode of Content Marketing, Engineered. I'm joined today again with my regular, almost co-host at this point, let's face it, Morgan. Morgan Norris, Senior Brand Strategist from TREW Marketing. Welcome to the show.

Hi, thanks for having me again.

We have spent a lot of time together lately, co-attending a conference out in Cleveland called MAICON. What the heck is MAICON?

MAICON is the Marketing AI Institute's conference. It's been going on for a few years, but it's seen huge growth, which is a nod to how important this topic is to us and, we know, to people listening. I think Paul said they had 250 people or so last year, something like that, under 300. And this year there were about 700 attendees. So those numbers alone speak to the interest in the topic.

Yeah, huge growth, and the diversity of attendees was pretty interesting. I remember one day at our table we went around and met everyone, and there was a woman who worked for a large legal firm, an analyst from Gartner, and a fellow agency that works on pay-per-click. And of course we met manufacturing people and people working for laboratory services.

Oh yeah, accounting. What other industries? I talked to somebody from an accounting firm. I talked to people doing B2C. I talked to a woman who runs the corporate side of a business that has a bunch of franchises for kids' sports; they're trying to figure out how to keep consistent messaging across a hundred franchise websites.

Oh, yeah. Oh, don't forget the realtor from Canada.

Realtor. Oh my gosh, it was everybody.

And then big brands, right? VMware was there, and lots of other enterprise companies. Nissan, too; we got to know them pretty well. So yeah, all over.

That's great.

Yeah. So what we want to do on this episode today is walk everybody through some of the key things we learned, the takeaways. Know going into this that no one has this figured out. It's brand new technology. It's changing every couple of weeks. And so there was a lot of joking about how these slides are going to look so cute and antiquated three or six months from now. So this is a snapshot in time of where we are today, in July 2023. Before we jump into the takeaways, one of the first things I wanted to mention real quick is that several of the presenters talked about AI for marketing with kind of a cooking analogy. And so I thought I would use that and share it with you, for those of you who feel you're behind and don't really understand the AI stuff, and when someone says large language model or LLM, you're like, what the heck is that? So think of it this way: when you cook, you have ingredients, right? If you're making a pizza, you have your tomatoes and pepperoni and mushrooms or whatever, and then you have your cookbook that helps you turn those raw ingredients into recipes.

And then of course you have the dishes that you make, so the pizza, the output of that. And they said you can think of the large language models like ChatGPT as that cookbook that you're using. They have raw materials inside the LLM, so those are your tomatoes and your mushrooms or whatever. The ingredients are loaded up, and then you're giving prompts to ChatGPT or whatever it is; those prompts are the recipes. And then the output, which is that advice or that piece of content or whatever it is that it spits out, that's the dish. So they made the point that if the tomatoes inside the model are rotten, that dish probably isn't going to taste too good. Point being, a lot of these LLMs are crawling across public data. Some of that data is great. Some of that data is poison. Some of that data has a lot of bias in it. So know that the results are going to be skewed because of that, or poisoned, or whatever word you want to use. And then it takes, really, a person to go through and clean it up and pick through that.

Most of us, when we talk about using AI for marketing, are relying upon these publicly traded... listen to me... publicly trained LLMs, so know that there are these inherent issues with them. Morgan, how did I do?

That was a great analogy. That was great.

That was so great. So, speaking of LLMs, I'll start. As we think about the state of AI, one of the very big, high-level things I heard Paul Roetzer, the founder of the Marketing AI Institute, say was that LLMs are the new CRMs in terms of importance to your business. Holy cow.

Yeah. So there's no avoiding this, right? It's this thought that you either get with the program now or later, but it's coming, so you may as well get on the topic now and have thoughts and practices that you're using around it. Right.

What were some of the compelling reasons why companies should adopt this now, in your mind?

Yeah, so part of it is just being able to grow with it. There are new applications being developed all the time, but just the fact that you've got tasks you do all day, every day: your day is made up of a million different tasks. Some of those require great expertise from you, are really proprietary to the things you do, and are what makes your work a really incredible work product. But there are also tasks that you don't have to spend your time on anymore. There are things that these tools will automate for you, or just handle for you, or get you 80% of the way there so you can put your expertise on top of it, so that you can really spend your time doing the most important work, and more of the most important work. And so that was a lot of the focus that we talked through and heard, I think: man, automate the things that don't serve you well.

Yeah. The chief data scientist from Google had this saying of thinking versus thunking. And the idea of thunking would be doing these low-level, repetitive tasks, things that take you away from the thinking and the strategy side of work. And, not to be long-winded about it, she had this example of cats: she had pictures of cats, and asked how you would write a computer program to recognize them. Would it be that they have pointed ears? Would it be that they're fluffy? And then she said, well, what about this breed? This breed has ears that fold down, and this breed is hairless. The point being, it's really complicated to come up with that computer program, but by utilizing something that has data and can do high-level output, you can use that as your program and help train it. So there were two pieces. One was low-level tasks that just aren't a value-add in your life. And the other one was really complicated things, where, wow, if I don't have to do all that code or find someone to code that, think how much more quickly I can innovate and find things to do with that program that can identify cats.

Who knows what you're going to do with that program?

But someone out there knows the reason, right?

Yeah. So, other state-of-AI, big-picture things?

So, some of the big takeaways were around privacy and regulations. The bottom line on regulatory things for AI is that regulation is not going to keep up with the rate of change and the rate of new applications being developed. And so there was a big onus on you, as a company, as an individual, to have kind of a set of ethics and be really transparent about how you use these tools. Don't be afraid to just share what you use and how you use it. You don't have to give away your IP or your prompts or your code or what you use specifically, but be transparent about the tools. And then the other thing in that realm is privacy. Privacy is still a big issue. What happens in most of these large language models is, when you input information in there... so think of ChatGPT. ChatGPT crawled everything on the web up until late 2021; they had to have a stopping point somewhere so they could move forward with the application. But it stopped there. Now, anything you put into ChatGPT, it has that information as well.

So it has that information to pull from when it produces answers for somebody else. What that means is, if you're developing a new product and you're trying to do some competitive research, or say you want some ideas for headlines or a press release that you're going to generate through ChatGPT, you're basically announcing that product to ChatGPT. That's not something you would want to do. So privacy is definitely still an issue, and I think that's a huge reason why you need some kind of internal policy about what you use and how you use it. You need to make sure you don't have people out there going rogue, putting information into one of these models that shouldn't be there. But that's something just to know and think about every time you go do something. For example, we had this: I wanted to play with a tool, put some data in, and then see what kind of analysis I got back. So we used old data; we tested it with something we've already analyzed, that's already out there, that already exists. Just making sure we're not dropping new things in.

There are models that allow for some privacy. With the paid version of ChatGPT, you can flick on a privacy kind of switch. There are some other ones as well, but it's still an issue.

Yeah, something to watch out for. And I wrote down the words "run your queries locally." So if a tool allows you to run those locally, that's something very specific to look for. But I think you're right, Morgan, that it's the paid version of ChatGPT, and I bet other tools might monetize that as well.

Yeah, and that being said, too, honestly, we've primarily interacted with ChatGPT as a large language model, and then we've used other applications. So things like Writer, and playing around with tools like Jasper; those are applications that sit on top of large language models. And I didn't actually realize how many large language models there are, so they were talking about that a lot. One that came up pretty highly regarded was Claude 2, which is a large language model that stands out because, when you use something like ChatGPT, you have prompt limits; you can only put so much information in there. Claude 2 will let you put 100,000 tokens into one prompt. And a token is not a character; a token is roughly a word. So, for example, you could potentially analyze a short novel; you can put that much information in there. That was helpful to know. And then there are a couple of newer ones. Llama was put out by Meta, and that's an open source large language model, so it allows for a lot of application development on top of it.
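To make the token point concrete, here is a minimal sketch of checking whether a document fits in a context window before you paste it into a prompt. It uses OpenAI's tiktoken tokenizer purely as an illustration; Claude uses its own tokenizer, so its counts will differ, and the 100,000-token limit and the file name are just placeholders echoing the example above.

```python
# Minimal sketch: count tokens before sending text to a model.
# tiktoken is OpenAI's tokenizer, used here only as an illustration;
# Anthropic's models tokenize differently, so real counts will vary.
import tiktoken

CONTEXT_LIMIT = 100_000  # the Claude 2 prompt size cited in the talk

def fits_in_context(text: str) -> bool:
    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode(text)
    print(f"{len(text.split())} words -> {len(tokens)} tokens")
    return len(tokens) <= CONTEXT_LIMIT

# Hypothetical file standing in for the "short novel" example.
with open("short_novel.txt") as f:
    print(fits_in_context(f.read()))
```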

But there was definitely still an onus on the user to understand the lineage. Say you're not interacting with a large language model directly; you're interacting with an application that sits on top of it. You need to understand what large language model that application uses, and, for some of the things you talked about earlier, whether they've done anything to handle bias, or where their original information is coming from. So the complication of that underlayer was a new data point for me, I think, along with just being aware of more tools to use.

And rather than feel overwhelmed by all of the LLMs, one of the takeaways I got from this was that all the models seem to be heading in the same direction. They're definitely one-upping each other, but then another one will catch up and leapfrog, and it's back and forth right now. So there was some good advice to just pick one and learn it, learn how to use it, use it for everything. Don't worry about taking one inquiry and putting it into six different LLMs, at least for now. And then over time, take notes: I wish this LLM would do X, Y, and Z. Then you can be on the lookout if the others come out with that and it's your favorite feature. But I got pretty excited about Claude as well after hearing how many people were impressed with it. So I took that away too.

Yeah, so that was a big one. And then really overarching on top of that is this idea of human creativity and the importance of editors; you've got to have a human in the loop. They talked a lot about how AI is not obliterating anyone's job, but the person who can use AI tools effectively will become far more successful. The idea is that maybe right now you have a lot of content writers; those writers need to be able to take on kind of an editing hat. They need to be able to take some kind of material that's created, and you've got to be incredibly particular editors, right, to make sure that your brand is coming through, to make sure that all of your facts are right, and things like that. So the importance of editors, the importance of the human in that AI loop, whatever you're using the tools for, was just huge.

Yeah. Let's talk about SEO a little bit. What did you hear about SEO and the implications of utilizing AI to help with the writing process and how it might hurt or help you with Google Rankings?

Yeah, so there is definitely a consensus... not a consensus, but Google rankings feel up in the air right now. There have been a lot of changes, and there are more changes coming. I think SEO as a standalone term was sort of quiet. Last September we were at Content Marketing World, and there was a lot of talk about using AI to rapidly create a ton of content to master a keyword, right? So you want the term "battery management test system"? Great, here are 20 blog posts you need to write around the really general topic: what is a battery management test system, how do you use a battery management test system. And AI tools could do that. You could create that sort of high-level, really generic content to dominate a keyword. And then Google's helpful content update quickly pushed that out. So it was really interesting to me to see platforms that had doubled down on winning Google rankings by using AI to create massive amounts of content have now pivoted a little bit in their own messaging and said, okay, actually, let's not do that; it's not helpful for your Google rankings anymore.

We're just creating a bunch of really bland content and putting it out there. So instead, they're saying, okay, use these tools that identify key terms to create really strategic, thoughtful, in-depth content around a topic. And for me, that makes me excited, because it really is terrible to try to get answers and be wading through a bunch of generic content. So really, this focus is: hone in, have a really niche topic that you're focusing on, and create truly helpful content in that space. And that gives a lot of credibility to an expert voice. We heard a ton about brand: your voice, your personal brand as an expert, and then your company's brand, and being able to really convey those well, having tight and consistent messaging. And then use AI tools to help you get your message out on different channels. For example, we could take the script from this podcast if we wanted to, and we could use a tool to create a bunch of short video clips.

Or if we just had audio (we have video of this, but if we didn't have video capabilities), we could use an AI tool to put our voices over some kind of engaging B-roll. Something like that. So you can use what you're already creating to unlock new channels and share your thought leadership there. But it was really clear that you're not going to win by using these tools to just create a bunch of junk.

It's not about volume. Yeah, one of the things I heard during the conference, which I'd heard before, but it makes sense, is you need to be adding to the body of content that exists in the world. So it needs to be novel. Like you said, it needs to be helpful but also new: a different take, a deeper take, whatever it is. If you're just repeating what's already out there, then that's not a value-add, and that won't win in search. And so that takes more strategy work, more research on what the existing body of work is and what we think about it. And so the emphasis perhaps moves away from the actual writing of the piece itself and into this ideation process, and then the editorial process looks deeper than just copyediting: are there holes in your story, are there places where you could flesh out even more detail? So think about the writing process more like you would for investigative journalism or an in-depth magazine article, not just the shallow blog posts you've been doing on your B2B site.

Yeah, for sure. That's a huge piece.

Yeah. As we think about AI from a writing or a content generation standpoint, what else did you take away from the conference?

Yeah, so I think the larger applications that have gotten some face time, like Writer and Jasper, are continuing to evolve, which I think is really great. I would say jump in and start using something, because then you can walk alongside them as they evolve the tools, and, like you said, keep a note of what you would like to see these tools do. Likely, if you evaluate them again in three or six months, you'll see some of those new features bubble up. That said, from the content generation space itself, knowing your content strategy pieces, like you were mentioning, is going to be key. So understand what you're trying to communicate, what you're talking about, and what your take is on it, and then, yeah, you can use these tools to help you with the process and to help evaluate your writing too. Some of the tools in Writer specifically have gotten really advanced. It used to be sort of like a Grammarly type of tool, where it would say this could be more concise, or you could shorten this sentence, or give more explanation here, or be sure to define this acronym before you use it, things like that.

But now you can upload your brand voice, your tone, and key terms that you use; you can train it to understand that, okay, when I talk about PXI, yes, it's an acronym, but my audience already understands it, so I don't ever need to define it. Right? And so you can train it with a style guide like that. And I could see, especially across really large organizations, having that consistent kind of brand checker in there would be really helpful.

What a contrast to the startup that presented, called Tomorrow.io. Yes, they're a 200-person company with a marketing team of four and very ambitious growth goals; they're VC-backed, and one of their positions is an AI specialist. And it was interesting, because they showed the original job description, and it was basically a digital marketing specialist, right? Someone who could work with content and social media and email marketing and so forth, kind of a marketing generalist. And they decided, instead of that position, to bring on someone who could go deep on all these different AI tools and work horizontally across sales and marketing and product marketing to help implement these tools, to work more efficiently and get better insights. And they're thrilled with how it's working. One of the things I remember they talked about was: we don't know how to use Premiere Pro, but we can use this other tool called Descript to shortcut some things and create video content and derivative content without having to be an expert in a very sophisticated tool.

Yeah, they showed her tech stack too, what that marketing AI specialist uses, and it was probably 20 tools deep. And what you see is that when people are really leveraging these tools, it's not just one. Often they're using something for ideation, then dropping that into some kind of writing tool to get the words on the page, and then dropping that into Descript to create some kind of visuals.

So you mean the free tools can't do everything? We can't just use ChatGPT and expect it all to be done?

It turns out, no. At the end of the first day, we heard Ethan Mollick, a professor of entrepreneurship at Wharton, and it was amazing. This is covered in an article, you can look it up, but he covered what he could do in 30 minutes to promote a product. He was talking in reference to a product launch. So it was first doing a little bit of competitive analysis with a tool, then coming up with some headlines to promote the product, then taking those headlines and generating some images with Midjourney, and then actually going into a new feature in the paid version of ChatGPT called Code Interpreter, which you can have create CSS code for you. He plugged that code into a web platform and began creating a website. Now, was it perfectly done or anything? No, but it was his example of the kind of rapid generation that you can do using a host of tools. And it was a really cool example, seeing the power of the tools.

Yeah. Well, since we're talking about tools, you already mentioned Writer and Jasper, and of course MarketMuse is another big one that keeps popping up wherever we go, with a lot of content strategy, ideation, and SEO tools built in. What other tools have caught your interest?

Yeah, so there are a few on my list that I'm going to look into this week that I thought were interesting. There's one called Perplexity, and what Perplexity does is help you get data-backed answers specifically, so it focuses on pulling from research reports and things like that. ChatGPT will give you those answers as well, but ChatGPT will tend to summarize them and give you back a single point, whereas Perplexity's job is to serve you up the research as it stands.

And multiple sources for the same idea or concept.

Yeah, so that's what I'm going to look more into. I think that might be a really cool tool. I often find myself starting a project or a content piece for a client and thinking, what's the lay of the land? What data points are out there that I can start with to get my head around this issue? So I think that one might be really helpful. You mentioned Descript, which is an A/V editing tool, so you could use that for podcasts or videos; you typically start with some sort of audio or visual content and then manipulate it as you move forward. There's another one called BrightEdge, which I honestly don't know much about. I saw, like, a 30-second demo, but it's around topic research: you can do some website analysis or keyword research, and you can also look at some trend data. So I thought that one was cool. And then BrandOps is one I wanted to look at too. I'm not sure what their out-of-the-box free tool is, but it focuses on comparing different companies' marketing presence online. So it's some of the research that you would do yourself by looking around, making evaluations, and deciding how you're going to compare and what you're going to grade competitors on as you look through.

And it looks like it serves up a baseline for that, so I'm interested to see what that is like. And then I also talked to the Gloss AI people at the show. I think I've mentioned that tool before, but it's a really neat tool. They're still working on pricing and what that looks like, but basically you can take a long-form video and upload it. I think it would be incredibly valuable for webinars. Say we did a webinar: we put all of this expertise into it, spent a lot of time planning it, got a panel of people on it, and now we've got this hour-long piece of content. Well, you can upload that, and it'll start to pull out clips for you, and you can mash those together with other visuals or the visuals already recorded in the webinar. So it's a cool platform for that if you don't have somebody adept at video editing on your team. Good. Yeah.

So I heard about a lot of helpful tools for research and then helpful tools for derivative content. And it reminds me... so Chris Penn was one of the speakers, and one of the things he emphasized was that everybody is running to this generative, do-the-writing-for-me thing. And he said, I feel like that's the least sophisticated way to use these tools. I guess the idea is that because the writing process takes so much time, it's amazing to see it spit stuff out. But he talked about all these other ways, which you've really already touched on, that are a better use of AI's time than just words on a page. So maybe we'll move on from that, graduate from that.

It's funny. I think that's the easy piece to grasp onto because it's what we already know. We know somebody's got to write something and then you're like, oh, somebody else will do that for me. But it came up in multiple sessions. People are like, this is the least compelling thing about this.

Well, speaking of compelling other uses: when you put yourself in the shoes of, I don't know, a B2B marketing director, what are some of the use cases they should consider AI for?

Yeah. So I think there are some great tools and opportunities, definitely, for ideation. So many people were saying, man, these have actually been great in my work life to go from zero to something, right? And I think a lot of people, too... you've got people working from home in roles that maybe aren't ideal to do by yourself at home, right? They're better when you're sitting next to somebody and have somebody to talk to or bounce ideas off of, and these tools can fill some of those gaps. They can get you started on something. So that's one. And then personalization kept coming up. In contrast to this thought of, oh great, just use this tool to write a bunch of garbage, the key is actually in that personalization aspect, knowing your customer. Somebody mentioned, man, there are a lot of things that we push onto sales: managing a customer relationship, maybe, or managing the nurturing of a customer before they're ready to be a customer. Maybe we've made sales-qualified leads, made those indicators, too early in the process. Let marketing take some of that back.

Right. So with some of that personalized interaction that we can do, we truly don't hand prospects off to sales until they're really ready, and then we let the salesperson do that negotiation, which is what they do best. Right.

Which is that one-to-one interaction. Yeah, I love that. Let's see, another takeaway for B2B companies that I kept hearing was to use these AI tools for intelligence gathering. There was a CMO panel, and I thought this was pretty cool: she said one of the first things she did was ask the LLM. You know, I would study the competitors: what are they doing well, what are their messages? But then I asked, how do our competitors view us?

Interesting.

What a great question. And she said, and we didn't really like what it told us.

Interesting.

And we're working on changing our messaging as a result. She didn't get into the details of how she substantiated that view or how it came across; I have more questions I would have asked her about it. But I thought that was very interesting, and any kind of competitive research like that was great. So that was another one I had.

Yeah, that perception. And that's pretty humbling, to have to sit in a room and say, we can talk about ourselves all we want, or we think that people think this, or we think this is the perception, but what does it actually look like to somebody else? There was also a B2C application. I haven't fully thought through how it all works, but somebody was talking about using AI tools to basically create kind of automated focus groups. The thought there is that the LLM has biases, because the information it's pulling from is biased in certain ways, right? And they talked about some really terrible biases that come up, like it views certain names or genders as preferable in different situations, things like that, that you would not want to replicate in your content. However, you can use it as an AI focus group in kind of that same way, right? How does what's out there perceive us? How is this idea going to track with the main population that's sitting out there? Because it's basically leveling everything it's crawled from the Internet, so you can get a sense of how many people would respond favorably to this or not.

Anyway, I haven't fully fleshed out what that looks like, and they just kind of skirted over it, but it's an interesting thought.

And the idea of giving your LLM, like, okay, you're going to be this persona, and I want the answer... I always thought of it in terms of content generation, of writing: let's write to this persona or in this voice. But I never thought about the research portion of this being, okay, now you're a procurement manager in manufacturing at a Fortune-whatever, and giving it those backgrounds. Okay, in that context, how would you look at this tool, this message?

And that's a plug: on our website, there's a webinar around prompt generation and prompting. And that's just a key takeaway. Anytime you use one of these tools, you need to give it a job, you need to give it a role, tell it who it is, and then ask for what you want. So, definitely helpful.
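For readers who want to see what "give it a job, then ask" looks like outside the chat window, here is a minimal sketch using the OpenAI Python client. It's purely illustrative, not a prompt from the episode or from TREW's library: the model name, the persona, and the product message are all placeholder assumptions.

```python
# Minimal sketch of the "give it a role, then ask" prompt pattern.
# Purely illustrative: model name, persona, and message are placeholders.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

product_message = "Our new battery management test system cuts setup time in half."

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        # First, the job/role: tell the model who it is.
        {
            "role": "system",
            "content": (
                "You are a procurement manager at a Fortune 500 manufacturer "
                "evaluating new test equipment vendors."
            ),
        },
        # Then, ask for what you want.
        {
            "role": "user",
            "content": (
                "Read this product message and list the three questions you "
                "would ask the vendor before agreeing to a first call:\n\n"
                + product_message
            ),
        },
    ],
)
print(response.choices[0].message.content)
```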

Okay, on the subject of prompting, I'm cracking up, because there were several sessions that said your prompts are your IP, and you need a sort of prompt library; you need to capture all of these and share them internally, use really long, specific prompts, and everybody can leverage that. And I was like, okay, I get the IP part. And then I went to a session that said the AIs are going to be so smart soon, the LLMs are going to prompt for you; they're going to give you suggestions on how to make your prompts better. So prompt engineering is going to go out of style as soon as it comes in. I don't know what to think about that long term, but I definitely understand that in the short term you need to learn the mechanics of how to do this well to get a quality result, right?

Yes. Especially if you're trying to come up with any sort of content that's meaningful, or even when I use something as a brainstorm tool, the prompts that we use are pretty in-depth. Otherwise you're going to get a generic, B2C, fourth-grade-reading-level type of output, which is not helpful. I don't want to brainstorm with my fourth grader, right? I want to brainstorm with my colleague.

Yes.

So that's just kind of key. I do think, in the same way that, generally, as a society, we went from longer Google searches to try to get what we want, to now, where it's like we've gotten dumber and we just ask questions, like "Where's pizza?", the AI tools are already advancing to where they'll ask a question back and ask you to specify. I was doing some analysis of research, and there were a couple of times where I asked a question and it asked me a clarifying question back before it gave me an answer. So I think that's probably what Ethan was talking about when he mentioned the AI tools kind of knowing how to prompt themselves.

Yeah.

So I will say, if you're trying to do any type of ideation or content generation, you need pretty robust prompts, but also just create a prompt library. I came out of that conference with a list of prompts I would like to create for our team as a time-saving tool for routine things we do all the time, where there might be, like, two or three customizations they need to drop in depending on what they're doing. But then they can use that prompt to create something. And really, in that case, that prompt is code, right? You're customizing that code and putting it into a machine, and the machine is spitting out something. It's what we do with a website, right? And even now, we don't interact with websites that way; we interact with very graphical user interfaces where we drag and drop modules, enter text, and click the little format button. We don't have to write custom code to format it. And so these tools are going to evolve in that same way.
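To show the "prompt as code with a few customization slots" idea, here is a bare-bones sketch of a shared prompt library. The template text, slot names, and example values are hypothetical illustrations, not TREW's actual prompts.

```python
# Bare-bones prompt "library": reusable templates with a few slots,
# so teammates fill in two or three fields instead of rewriting the
# whole prompt each time. Template text here is hypothetical.
PROMPT_LIBRARY = {
    "blog_outline": (
        "You are a senior content strategist for a B2B company that sells "
        "{product}. Write a detailed outline for a technical blog post aimed "
        "at {persona}, focused on the keyword '{keyword}'. Assume the reader "
        "already knows the basics and go one level deeper than a generic post."
    ),
    "webinar_summary": (
        "You are an editor. Summarize the following webinar transcript for "
        "{persona} in five bullet points, keeping product names intact:\n\n"
        "{transcript}"
    ),
}

def build_prompt(name: str, **slots: str) -> str:
    """Fill a named template's slots and return the finished prompt."""
    return PROMPT_LIBRARY[name].format(**slots)

if __name__ == "__main__":
    print(build_prompt(
        "blog_outline",
        product="automated test systems",
        persona="test engineers",
        keyword="battery management test system",
    ))
```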

Yeah, for sure. All right, Morgan, we've covered a lot of ground here. Let's wrap this up: how are you feeling in general about all this AI stuff, having just immersed yourself in the Kool-Aid of it for a few days?

I'm feeling super excited. I feel like there are a lot of things. In closing, one of the charges was just: take AI with you to all the things that you do, and think, is there a way I could make this easier? Is there a way I could make this better? Is there a way I can make this faster using a tool? Sometimes the answer is going to be no, but I think a lot of times there are low-level things where we can get a big lift, so that we can spend more time doing the things we're best at. And so it made me excited again. I wrote a list of things I think we can have prompts for. I grabbed old research to see what we can get out of it: can we get insights, even initial insights, from it faster? Say you've got a client asking, what can we gain out of this? We have this old white paper that somebody wrote five years ago, and it's 20 pages long. Is it still useful? I can drop that in and say, summarize this for me, give me the key points, in a second, just so I can start to make gut checks on things.

Right. Before I add that white paper to my list and devote 30 minutes to reading the whole thing, I can just get a quick gut check on it. And I think that made me excited. But you've got to be familiar with the tools, or that's always going to feel too hard, right? You'll feel like you're two steps away from it.

And now's the time. Now's the time to start getting familiar with the tools. During the conference, I think I mentioned to you that I wished there was a counterpoint, someone who was a naysayer there saying, why do this now? Why don't I wait six months, when the tools get a little more sophisticated and stop hallucinating and all of that stuff? But it just feels like you'd be too far behind and have even more to figure out, rather than growing with these tools. So I like how you put that, and I like the shift from less doing to more thinking, and having this attitude of what is possible. I'm excited too. I think we'll just continue to talk about it. We heard again and again that the tools six months from now are going to look totally different; we're going to look back at this and be like, oh, that was so crude, the way we used to do things. But by understanding the mechanics of how they all work, I think we'll make smarter decisions. People were encouraging you, if you're looking to subscribe to some of these proprietary tools, to do it month to month.

Don't sign a year-long contract, because who knows how things will change. So I thought that was good advice. So yeah, I'll drop the tools that you mentioned into our resources. Also, I know the Microsoft Copilot video was mentioned a lot, so I'll put that in there, because it's a great demonstration of what is going to be possible. It's not all out there today, but for those of you who are still struggling to wrap your head around how you could use this and how it helps you, that video is pretty interesting.

Yeah, definitely.

What else, Morgan, should I throw in the show notes?

I'll pull Ethan Mollick's piece, his own case study of what he could do in 30 minutes. I thought that was, again, similar to that Copilot video: it shows what's possible and gets your wheels churning on what you could be using these tools for now or in the future.

Cool. All right, well, thanks for taking time out, and now we can catch up on our rest from all the fun we had at MAICON.

Yeah, really.

Thanks for joining me today on Content Marketing Engineered. For show notes, including links to resources, visit trewmarketing.com/podcast. While there, you can subscribe to our blog and our newsletter and order a copy of my book, Content Marketing Engineered. Also, I would love your reviews on this podcast, so please, when you get a chance, subscribe and leave me your review on your favorite podcast subscription platform. Thanks and have a great day.