What happens when speed, scale, and convenience start to erode trust in the images brands rely on to tell their story?
In this episode of Tech Talks Daily, I spoke with Dr. Rebecca Swift, Senior Vice President of Creative at Getty Images, about a growing problem hiding in plain sight: the rise of low-quality, generic, AI-generated visuals and the quiet damage they are doing to brand credibility. Rebecca brings a rare perspective to this conversation, leading a global creative team responsible for shaping how visual culture is produced, analyzed, and trusted at scale.

We explore the idea of AI "sloppification," a term that captures what happens when generative tools are used because they are cheap, fast, and available, rather than because they serve a clear creative purpose. Rebecca explains how the flood of mass-produced AI imagery is making brands look interchangeable, stripping visuals of meaning, craft, and originality. When everything starts to look the same, audiences stop looking altogether, or worse, stop trusting what they see.
A central theme in our discussion is transparency. Research shows that the majority of consumers want to know whether an image has been altered or created using AI, and Rebecca explains why this shift matters. For the first time, audiences are actively judging content based on how it was made, not just how it looks. We talk about why some brands misread this moment, mistaking AI usage for innovation, only to face backlash when consumers feel misled or talked down to.
Rebecca also unpacks the legal and ethical risks many companies overlook in the rush to adopt generative tools. From copyright exposure to the use of non-consented training data, she outlines why commercially safe AI matters, especially for enterprises that trade on trust. We discuss how Getty Images approaches AI differently, with consented datasets, creator compensation, and strict controls designed to protect both brands and the creative community.
The conversation goes beyond risk and into opportunity. Rebecca makes a strong case for why authenticity, real people, and human-made imagery are becoming more valuable, not less, in an AI-saturated world. We explore why video, photography, and behind-the-scenes storytelling are regaining importance, and why audiences are drawn to evidence of craft, effort, and intent.
As generative AI becomes impossible to ignore, this episode asks a harder question. Are brands using AI as a thoughtful tool to support creativity, or are they trading long-term trust for short-term convenience, and will audiences continue to forgive that choice?
Useful Links
Thanks to our sponsors, Alcor, for supporting the show.
[00:00:04] When did visual content get so, well, if we're honest, boring? Scroll down almost any newsfeed right now and you'll spot it instantly. Perfectly polished images that say absolutely nothing. Familiar faces and words that feel slightly off. Endless variations of visuals that look impressive for about half a second, then vanish from your memory forever.
[00:00:28] Welcome to the age of AI slop, my friends. Today's conversation is all about how we got here and, more importantly, how brands can climb back out of this mess. My guest today brings a front row seat to one of the biggest shifts that the creative world has ever faced. Her name is Dr. Rebecca Swift. She's the Senior Vice President of Creative at Getty Images.
[00:00:54] And she leads a global team shaping how brands communicate visually in a world flooded with generative content. So in the episode today, Rebecca's going to join me and break down why AI has somehow bulldozed its way into visual culture, how low quality mass produced imagery is quietly eroding trust, and why audiences are becoming far more skeptical of what they see.
[00:01:19] And we will also talk about sloppification, brand blandification, why transparency now matters more than ever, and how creativity, craft, and consented data can still win in an AI-heavy world. So if you are a brand leader, marketer, creative, or someone that just cares about originality, the creative process, and trust in the age of AI,
[00:01:43] I'm hoping today's conversation will certainly make you think twice before hitting that generate button. Here at the Tech Talks Network, we now have nine podcasts and approaching 4,000 interviews. And that is only possible with some of the great friendships that I've developed over 10 years of podcasting. And a company that I'm proud to call friends of the show is Denodo, because not only have they been on this podcast multiple times, they also help make sense of the AI data chaos that we're seeing now.
[00:02:13] Because the data world is louder than ever. AI hype, lake house complexity, and pressure to deliver more with less. These are things that I talk about every day on this show. But Denodo is helping businesses make sense of it all, because they provide a unified data foundation for trustworthy AI. So if you're ready to unlock real outcomes, simply visit denodo.com today. But now, it's time for today's interview. Let me introduce you to today's guest.
[00:02:41] So a massive warm welcome to the show. Can you tell everyone listening a little about who you are and what you do? My name's Rebecca, and I am the Senior Vice President of the Creative Team at Getty Images. So I have a team of about 90 people who are making visual content every day. And this is such a hot topic right now.
[00:03:08] And I would imagine, as Senior Vice President of Creative at Getty Images, you have a front row seat to how AI is reshaping visual culture. So many different viewpoints on this. But when you look at the current flood of generic AI imagery, aka AI slop, what concerns you most about where brand visuals are heading? I mean, I think the AI slop feels like it's been an interesting experience.
[00:03:34] I kind of feel like we've been in this maelstrom of information about AI. And I'm sure, you know, you've been in the center of it more than I have. But I think in our world where, you know, we're creating photography and film, it's, you know, it started as this is the next big thing. This is the new technology that's going to help us create visuals in a way that we haven't been able to before. It's going to be cheaper. It's going to be more democratized, et cetera, et cetera.
[00:04:04] And it's been really interesting watching the narrative kind of shift, in a very, very short period of time, to "it's slop". I mean, I don't think anybody would like to call their creative output slop or knowingly create slop. It's just a horrible word. It really makes you think literally of the dregs in a bucket. And so I think the issue with slop is that it's too easy to create. Yeah.
[00:04:33] It's cheap. You know, you can do millions of variations at the press of a button. And so then it kind of all starts to look the same. And, you know, there's been this discussion around the blanding of brands and how everything's starting to look the same. And I can totally see that. I mean, I think, you know, I have a team who are visual analysts, so we're always kind of looking very specifically at what images are doing and how we might do them better.
[00:05:02] And I think with the AI content, we, you know, early on we picked up, you know, which model created which image. You can kind of almost identify it because of the aesthetic of what comes out. Now it's becoming a little bit more muddied and a bit just samey. And I think no one asked early on whether it was good. No one said, does this have value? Does this have an aesthetic value?
[00:05:31] Is this beautiful, this imagery? You know, which we ask of art, we ask of photography, we ask of illustration, we ask of many other visuals, movies and TV shows, et cetera. No one said, well, actually, what is this adding to the visual landscape? How are we creating this content in a way that expands the spectrum of aesthetics? And I think it can. I think there are, you know, there are artists who are doing
[00:06:00] some really interesting work with AI. I think, you know, we've seen brands who have used it to, you know, personalise content. But I think that laziness of using it because it's available and because it's cheaper, it's perceived to be cheaper, is not necessarily a reason to use it. It's kind of like, you know, if you come up with an idea and you decide you want to have it hand-drawn
[00:06:30] or you want to film it as a video or it should be photographic, that's the point you're kind of making those decisions around it being, you know, which tools to use to kind of bring your vision to life. I think with AI, it's too easy to just bring visuals to life with no idea behind it. And, you know, that's even before I get started on, you know, standing on the shoulders of all of the creatives that have come before and all of the visual artists
[00:06:57] that have dared to show their content on their Instagram feeds or, you know, on online spaces and whose work is now being scraped and added to these models. I think, you know, and it's really interesting how, and again, you know, I've been in this industry a good number of decades. And I feel now is probably the first time in all of that time I've been in the industry where the audience for content in the marketing space
[00:07:26] is reacting to how something has been made. So, you know, for many, many years it was kind of expected that something was photographic or it was created by a film crew. And then obviously, as camera phones ended up in everybody's pockets and digital photography became a thing that we all did, like literally everybody does, there was an expectation that things were created in a certain way. And it's really interesting,
[00:07:56] that kind of rejection of content because it has been made by AI. The value is fragile: content has value until you realise it's AI, and then the value drops. And we see that, you know, we've seen that with big ad campaigns. We've seen that with brands who have gone to market promoting the fact that they're using AI gen content, which they see as a positive because it's using new technology.
[00:08:26] It shows that they're innovative, blah, blah, blah. But the audience, you know, will pretty much tell you pretty quickly: we don't like it. Yeah, all day long. And it's very short-term thinking as well. I mean, there's always a lot of talk around AI models running short on high quality training data, and the knock-on effect this has on output quality. So from what you're seeing, how has this sloppification shown up in real campaigns?
[00:08:55] And what risk does it create for brands that rely on visuals to signal their credibility when they've almost outsourced their creativity to a machine? Yeah, I mean, I would love to know the conversations that were had internally and the justification that was given for why, you know, certain brands have used AI. I think, you know, one of my favourite examples in the Christmas campaigns,
[00:09:24] as you know, Neil, the Christmas campaigns is a hard-fought territory for advertisers, especially in the UK. And, you know, this year, McDonald's in the Netherlands created an AI campaign. It was actually a really nice idea. It was an idea around, you know, the things that go wrong at Christmas or the things that are really tough at Christmas. But they chose to create the whole ad with AI.
[00:09:52] And their response when the audience reacted badly to it was, well, actually, we spent 7,000 hours prompting this. And it's kind of like, oh, just dig that hole a little bit deeper. Yeah, yeah, yeah. You know, so, to go back to your question, what are we doing here? Why? Just why? Yeah. You know, when the result is,
[00:10:20] maybe internally they felt the result was great, but I'm sure it must have been tested on some audiences before it went live. The audiences did not see through the slop to the idea. They just saw the slop. And so, you know, how are you creating differentiation for your brand? How are you, you know, showing that you care about, you know, community, creativity,
[00:10:50] artistry, craft, you know, all of those things that actually we all care about. And, you know, I think one of the great things about what's happening with AI slop is that there is this kind of shift towards the appreciation of what is good, what is kind of highly crafted. And, you know, Getty Images has a reputation of being kind of, you know, premium imagery. And so, you know, we have a very strong position in the marketplace,
[00:11:18] but there are a lot of businesses who are struggling because they're kind of falling in the middle. And, you know, that's worrying, I think, for the future of creativity. And to go back to your point around the data sets, exactly to your point, where is the data of the future going to come from? We, you know,
[00:11:46] these AI tools have been lucky enough to scrape 25 years of digital imagery from websites. And, you know, some great imagery has obviously gone into those sets, but also a lot of really bad imagery. And so we're seeing that already in the models, and to redress that balance needs a lot more high quality,
[00:12:19] non-pornographic content. I can't think of a better way of saying it, I'm sorry. Non-pornographic, you know, imagery that hasn't objectified both men and women. And, you know, how do you redress that balance? And we've seen Grok has come under fire this week around the nakeification ability.
[00:12:48] It's almost like AI's just proving how low humans could go. It really is. I mean, when mobile phones and app stores first came out, we had all that technology, and the first thing we did was, oh, I can turn my phone into a pint of beer or a chainsaw. You know, it's insane when you think how little we thought about it. We need to think bigger, for sure. And I think audiences have always craved authenticity. And before AI,
[00:13:17] there was a lot of talk around Photoshop and the damage it was doing on magazine covers, the harmful retouching of female celebrities, for example. And research has even more recently shown that 90% of consumers want to know whether an image has been altered using AI or not. So how should brands think about transparency without overwhelming audiences or, most importantly, damaging trust? Because people are quite tech savvy out there now
[00:13:47] and they can tell straight away, and you could blow up for the wrong reasons very quickly. Yeah. I think, again, that's the most interesting shift that we have seen: we have got to this point where even people who are not doing this work in a professional sense, you know, the general public, your mum and dad, et cetera, come to imagery with suspicion and a lack of trust, because there is
[00:14:17] this need to understand whether it's real or not. Now, obviously, in our editorial business that's a big issue. You know, we've been doing a lot of work around ensuring that our content has been verified, that it is traceable, and, you know, we don't take in any content that has been AI manipulated or AI generated. But I think, in general,
[00:14:47] there's this kind of need to almost prove that your content is not AI gen, and so, you know, really focusing in on authenticity and realness. And we've obviously seen this shift towards people imagery, because AI still doesn't do people imagery particularly well, and video, you know, there are moments of genius, but in general human interaction and body language is not something that AI can replicate, because it doesn't
[00:15:16] understand the meaning of it. And so, yeah, I think in the long term great imagery will continue to win the day and there will still be, you know, the kind of slop around because you could argue that there was slop around before AI that, you know, there's plenty of content that was kind of like visual wallpaper but I think it's just become more prevalent
[00:15:45] at the moment. And so, yeah, I mean, I think a lot of the work that we're doing is around just ensuring that we are, you know, thinking very carefully about who we're shooting, who's behind the camera, all of those types of things that are more important, I think, when you've got, you know, synthetic people running amok on the internet. And again,
[00:16:15] it's really interesting. You know, I work in the side of the business that has the stock collections, and, you know, for the entire history of our company, and we turned 31 this year, there wasn't a huge amount of concern about who the people are who make this content or how this content gets made. It's like, this content appears on our site and I can license it if I want to, or I can view it. And the need now to show
[00:16:44] how stuff is made, who are the people that come together to create this content, how do they make it work, and what's the final output of that, building that kind of value into the content that's being created via what's traditionally been seen as behind-the-scenes content, is really interesting. It means more work for us, of course, but actually it's the fun stuff that we're already doing, and you see it in other media as well,
[00:17:13] you know, the Stranger Things finale at Christmas, the amount of behind the scenes content that was put out on Instagram and YouTube and TikTok during that time was amazing and not that, you know, there was any concern that Stranger Things was AI generated but I think that's a sign of where we've got to kind of visually in terms of our literacy and our sophistication around how we read images that,
[00:17:43] you know, we want to actually understand more about what's behind the content than we ever have before. So that, I think, again, is a great way to combat AI slop, because, you know, in the same way that McDonald's in the Netherlands argued that they spent 7,000 hours, you're not going to video someone for 7,000 hours at their computer doing prompting. That would probably be, yeah, it would be interesting.
[00:18:13] That could be an art piece actually, someone just sitting at their computer. How about Tate Modern? Yeah, that would be brilliant. How many weeks? But actually showing someone running around making snow or wrapping presents or whatever is interesting and I think we as humans are more drawn to that than to technological wizardry in that sense. This month I'm partnering with Alcor
[00:18:42] and if you've ever tried to hire engineers in another country, you probably know just how painful it can be: different laws, patchy support, and partners who don't truly understand engineering roles. Alcor approaches this from a different point of view. They specialise in Eastern Europe and Latin America, and they're able to combine EOR capabilities with recruiting. So you get one partner handling everything, and they help you choose the best
[00:19:11] location for your stack, find developers with the right depth of experience, and run proper assessments so they can onboard people quickly. And they also give you a model that respects both transparency and margin. Most of your spend goes directly to your engineers, and the fee decreases as the team expands. You can even transition everyone in-house when you're ready, without having to worry about a penalty. And that structure is why a mix of early stage
[00:19:40] and unicorn stage companies use them as they scale. So if you want to take a look, visit alcor.com slash podcast or tap on the link in the show notes. But now, on with today's show. And I've been to so many tech conferences, and many of the AI vendors, they're not interested in reducing workload; it just seems to be increasing more than ever. You can have somebody wearing a fleece and say, hey, we can create 100 images now of him outside a tent, in the mountains, at a festival, and various other things.
[00:20:10] As a result, many marketing teams are now under more pressure than ever to move faster and cheaper and embrace generative AI. So where do you see brands crossing the line from smart experimentation, as an assistant or a tool, into lazy shortcuts that quietly erode confidence day by day? Yeah, I mean, you know, these are the conversations that are happening, especially in the advertising industry, right now. You know, the advertising agencies are really having to think about what it is
[00:20:40] they offer, because the expectation of what you get from your agency, your creatives, whatever, your marketing team, is so much higher, and there is an expectation that you could do many, many versions. But do you need to? Does the audience want those versions? On my Instagram feed I keep being sent these stained glass lamps
[00:21:09] of animals, like a chick or a duck or a crocodile. At what point have I said I need a whole collection of stained glass lights in different shapes? And they're obviously all AI generated, and it's that kind of thing that I think will become really tiresome over time. Do you think you'll give in at some point and end up purchasing one of these? Yeah, next time you see me they'll all be behind me. That's the problem with social media,
[00:21:39] it gets you, it makes you think you need it eventually. Yeah, no, and I think that's where, to give marketers a little bit of leeway, I think there's a lot of experimentation happening at the moment, seeing what does work and what doesn't work. We're very fresh in the AI journey, you know, the most recent iteration of AI, and so I suspect
[00:22:08] that we will see a lot of flailing around in that space until we all settle down and determine what it actually does and what works for a brand. And, you know, AI gen, in the same way as any type of imagery that's used for visual branding, it's like, well, what purpose is it serving?
[00:22:38] Yeah. And what is it saying about your brand if you're using AI generative content? I think those are the questions that are not being asked as much as they need to be. And obviously there's a spectrum there as well: there's AI gen, there's AI modification, there's AI retouching. There are different ways that it can be used. And it kind of brings me on to one of the other questions and discussions that's happening in our industry. It really pisses me off,
[00:23:08] these technology companies just come up with all this stuff and then we have to do all this extra work, but anyway, that's a whole other conversation. The topic is detection and labeling. You know, I think for us, we made a decision not to allow AI gen and AI modification on our site. We now have to check every single piece of content that comes onto our site, and you're looking at, you know, 60,000-plus images and videos
[00:23:38] every single day. So that then creates more work. I think, you know, the media companies, companies like ourselves, and the tech companies to some degree are also looking at labeling. How do you label content? We're part of an organization called the C2PA, which is working on a more robust labeling of content, so that as it travels through time and space, if it's not
[00:24:08] AI generated, or it is AI generated, or it's been modified, that label will stay with that piece of content. You raised such a great point a moment ago as well, talking about the voice of the brand, and whether that be images or text, it's so important for a brand, any brand, to have a unique voice. And if you go on LinkedIn, for example, you'll see so many posts clearly all generated by the same LLM, and it's the same
[00:24:38] words. There's no personality there for a brand. It's almost criminal, isn't it, to outsource your voice and not have a voice, and just be the same as everything else out there? That must frustrate you as a creative as well. It does, because it's the most fun, it's the best bit. Why would you do that? Outsource spreadsheets, finance stuff and all that, the processes. Do the
[00:25:07] creative stuff that's really fun and cool, and actually you can probably come up with something that no AI tool would be able to help you find, because it's never been thought of before. And I don't know whether it's a fear or a lack of comfort or what. I think, again, I really hope that we all come to our senses around that. One of the really interesting
[00:25:37] things last year, actually, which I think is so incredibly telling: ChatGPT, which is essentially the AI tool, let's face it, created, last year, both video and stills shot on film. So it was analog photography, absolutely beautifully shot, you know, lifestyle content around how
[00:26:07] ChatGPT can fit into your life. So you've got the biggest AI tool in the world using old photographic techniques, and I just thought, so the only people who are thinking about how sloppy this stuff is are actually the tool houses themselves. I missed that. Oh, I'll send you a link, because I think you'll get a kick out of that. Yeah, oh man.
[00:26:37] If we take a peek behind the curtain there, commercial safety and consented training data are often overlooked in this rush to adopt AI tools. So I don't want to be the boring IT guy here, but what are the legal and reputational risks for brands that use models trained on questionable data sources? Because this can come back and bite brands later on too, right? Yeah, and I
[00:27:17] think they've become so big, and obviously have got some spare cash, so if they do get sued then they can support the litigation. I think, from our point of
[00:27:47] view, we wanted to protect our creators' rights. And, you know, we have trained our own model on content that has been released by the people that are in the images, by the places where it's shot. And we have obviously got permission from the creators for that content to be in the models. And then we obviously pay back every year to those creators for that content being in there. That is not the case with any other model. And so, you know,
[00:28:17] we are a very tiny little fish in this pond, where you've got these massive models, where I'd say most people who use them don't care. But when it comes to businesses and brands, where there is an expectation that that kind of due diligence is done, I think, yeah, there is a risk to brand trust. And just in general, you know,
[00:28:45] I think most people want to protect the creative community and want to think that brands are protecting the creative community. And so, you know, more and more questions are being asked around where this content came from. Was it ethically sourced? And then you have the conversations around the environmental impact of these big tools as well, which, I think, is healthy to have as a question in that regard.
[00:29:15] And certainly we will always argue for kind of ethical AI, if that's what you want to call it, where you have an understanding of what the model has been trained on. And a question many people might be quietly asking is whether AI is a threat to a business like yours, that's built on such high quality content. So from your perspective, do you see AI as an existential risk, or do you think that reputation damage, or the risk of it
[00:29:42] caused by low quality AI, will actually push big enterprises, big brands, back towards those more trusted sources where they know they're getting the quality? Yeah, I think it's a bit of both. I think we will lose the bottom end, the businesses that are less concerned about where content comes from, what models have been trained on, whether it's good content or not; it's just a piece of content,
[00:30:12] which has always been the case, to be honest. And so, you know, we're not going to go down there. We're not going to scrape around in the bottom of the bucket, to use my earlier analogy, but we are going to keep heading upwards and producing, you know, higher quality, better, more authentic, more creative content, and offer that as an alternative to AI slop.
[00:30:42] And, you know, I think a lot of the big businesses that we work with won't use AI generative models in their ads. Similar to us, they've made a decision not to go down that route. So there will always be a market for the type of imagery that we're creating. And obviously, when it comes to news photography, celebrity, red carpet, sports, live sports, et cetera, that will never be replaced by AI.
[00:31:11] Yeah, great to hear that too. And I always try to give everyone listening a valuable takeaway. So for creative and brand leaders listening who want to use generative AI, but want to use it responsibly, what principles or guardrails would you suggest so they can benefit from the technology without sacrificing trust, originality, creativity, or long-term brand value? Is it possible, and if it is, what should they be doing? Well, you're asking the wrong person, I think.
[00:31:41] I think, look at your model in the same way that you might look at your suppliers: who your suppliers are, who they work with, whether they treat their staff well, however you do that when you're signing up a new supplier. Your AI model is a supplier. And so where's the content coming from that's in that model?
[00:32:11] And, you know, how is it being fed? And therefore, can you trust the outputs? And then I think, in terms of brand trust, just be transparent about using the content. It's much better to label your content and for people to be aware of what they're looking at
[00:32:37] than for people to find out later and not be happy with you. You know, we hate being hoodwinked. We hate being lied to, especially by businesses of any kind. So, you know, I think it's a dangerous world to be in if you're using AI generative content and hoping to get away with it. Awesome.
[00:33:03] And finally, I always like to give my guests a soapbox of sorts to bust any myths and misconceptions that frustrate them and lay them to rest once and for all. So what do most people misunderstand about your industry? Are there any myths about your job or your field of expertise that we can finally lay to rest today? The floor is yours. What are you going to go with? I think the thing that has frustrated me over the years is that there's this kind of expectation
[00:33:31] that we're just this big black box and that we are this massive corporate company because our name turns up in, you know, in publications, et cetera. We're actually not that big. We just are, you know, a company that's very passionate about the content that we create rather than passionate about technology. And so, you know, we every single day we are covering events.
[00:33:57] We cover 160,000 events every day, every year, every day. That would be impressive, wouldn't it? Every year. And we're shooting, you know, millions and millions of images and videos for our creative collections every year. And, you know, and we have a, like I say, we have a team that are spread across the world, across Asia and Europe and the Middle East and North and South America, who are working with creators to create this content.
[00:34:23] And so, I think you could almost say that our business, or other businesses like ours, has been seen as almost like an AI model that's just churning out content all the time. But there's a lot of love and creativity and passion that goes into that content. And we think very, very intensely, and I'd like to think intellectually, about what we're creating.
[00:34:52] We're analyzing content. We're testing content on the general public. We're always looking for the next thing rather than, to use the AI model analogy again, taking from the past. We're always looking to the future and thinking about what's the next iteration of something that's pretty standard. You know, if I use a business scenario of people sitting in an office, how do we do that better?
[00:35:20] How do we make that resonate with an audience in 2026, 2027? And I would love for more of that to be visible and for people to at least be aware of how we work in that regard. Well, I've absolutely loved chatting with you today. So much gold in your answers. And so many light bulb moments that I can hear going off around the world.
[00:35:49] And for people listening that would like to either continue this conversation or to stay up to speed with how you're adopting technology or how you're talking about some of the problems we raised today, where would you like to point everyone? I think probably the best place to go is where we place our research, and that is visualgps.com. You can also click through to it from the gettyimages.com site.
[00:36:14] But on that site is where we publish our research and share our insights into what we're seeing. And there's a lot of chat around technology on there as well. We created a report at the end of 2024, beginning of 2025, around authenticity in the age of AI, which very much is part of what we've been talking about.
[00:36:43] And we'll be renewing that this year again because, you know, things are changing. Well, I will add links to everything there. I would urge anyone listening to go check that out, because just listening to you today about how brands can avoid some of those negative impacts of the AI sloppification of visual content, and we've all seen perfect examples of that out there, is invaluable. So hopefully we've woken a few brands up to the danger of that. But more than anything, just thank you for sharing your story today. Real pleasure. Thanks for having me.
[00:37:13] I've enjoyed talking about sloppification. Appreciate it. So if this conversation made you laugh, nod, or even wince a little bit, it's probably because we've all seen AI slop creeping into places that it doesn't belong. And I loved that story Rebecca shared about how even OpenAI were using cameras to capture video rather than using their own content. If they're not eating their own dog food, or drinking their own Kool-Aid, but using more old-school, traditional methods, I think that is quite telling.
[00:37:43] And Rebecca shared why creativity still matters, why audiences are far more visually literate than brands might give them credit for, and how those lazy shortcuts can quietly drain confidence in your brand over time. You also heard why authenticity is becoming a competitive advantage, why knowing how content is made now matters as much as the content itself, and why high-quality, consented data isn't just a nice-to-have.
[00:38:12] It's foundational. And there was also a powerful reminder running through this episode. Yep, technology can help, but it can't replace taste, judgment, or imagination. And the most memorable brands won't be the ones that generate the most images. They'll be the ones that make people feel something and can stand behind how their work was created. So hopefully this episode might have sparked a rethink about how your brand uses AI.
[00:38:40] And please share it with someone who's feeling pressure to move faster and cheaper right now, creating more AI slop instead of anything with real meaning or personality, just generic content. But over to you: in a world full of instant visuals, how will your brand make sure it stands out for all the right reasons and keeps its voice? Let me know. TechTalksNetwork.com, TechBlogWriterOutlook.com, socials just at Neil C. Hughes. Nice and easy to find.
[00:39:09] This is a dialogue, not a monologue, so I encourage you all to get in touch with me. I'm really curious about your takeaways here. But I will return again tomorrow, bright and early, with another guest. Hope you enjoyed today's episode. Big thank you to Rebecca for bringing it all to life. And I will speak with you all again tomorrow. Bye for now.

