Is seeing still believing? How brands and content creators are navigating AI in imagery and video
By Nikki Scrivener
Nikki quizzes Rob Fowell, director at All Caps Media, on the growing influence of AI in video and imagery – and how brands can best navigate it.
Q. Has the pace of change surprised you when it comes to the use of AI in visual content creation?
A. A few years ago, there was a generative AI video of Will Smith eating spaghetti that went viral. It looked fairly ridiculous – but if you watch it now and let it sink in just how recent that was, it’s incredible how far things have come. What we’re seeing now is already a million times better. The speed at which this technology is moving is striking for everyone working in creative industries.
Q. Is this progress a good or a bad thing?
A. There’s no harm in admitting that it’s impressive. Some of the short films and videos that people are creating require a huge amount of skill. To get the best results, people are having to meticulously prompt scene by scene. That in itself is a talent. But it does open up ethical questions, especially because AI is fundamentally drawing on other people’s work – and in some cases creatives are being completely ripped off.
"The speed at which this technology is moving is striking for everyone working in creative industries."Rob Fowell Director, All Caps Media
Q. What are your main ethical concerns around AI-generated imagery and video?
A. A lot of it comes down to where your own personal line is. I’m aware of companies that have no problem using it to generate all their visual content. There are others with budgetary constraints who feel it’s their most cost-effective route. Other brands forbid its use entirely and insist that their suppliers don’t use it either.
It’s crucial that clients and their creative agencies have transparency and trust here. Content creators need to be clear on whether they’re using AI for inspiration, rough drafts or reference images, for example. An absolute red line is using it to produce final work and not communicating that to the client. It’s important that this conversation happens at the beginning of the relationship or project.
There are also wider ethical issues with certain AI models, such as Grok. Brands should be asking their creatives to be open about the tools they are using – to ensure they’re aligned.
Q. Do you think AI is an existential threat to visual content creators?
A. I do think it’s changing our world. Realistically, clients can struggle to justify costs like location shoots, photographers and lighting when they can type in a prompt and generate an image in minutes.
This has particularly impacted illustrators and designers – with clients generating logos or branding themselves.
But you’re already starting to see an awful lot of logos and illustrations that look very similar, and it’s clear the work has either been inspired by or blatantly copied from something else.
I do think we’re getting to a point where originality of thought is becoming increasingly important again.
"Content creators need to be clear on whether they're using AI for inspiration, rough drafts or reference images."Rob Fowell Director, All Caps Media
Q. How are you advising your clients to use AI responsibly?
A. From a visual content perspective, the AI-generated version of an asset should never be the final product. It can be useful for inspiration, storyboarding or throwing in a sketch-style AI image to show a direction of thought, for example.
It can also be useful as a companion tool – for pulling hex codes from a logo, identifying fonts or upscaling assets when a company doesn’t have a full brand pack. To me, that’s a handy and genuine use of the technology. The problem starts when AI is doing the work for you and you’re presenting it as your own. Firstly, someone will eventually spot that. Secondly, there are credibility issues to consider.
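For readers curious about the hex-code example, that kind of small job can also be scripted rather than handed to a generative model. Below is a minimal sketch, not anything Rob describes, which assumes a local file called logo.png and the Pillow imaging library; it reduces the logo to a handful of palette colours and prints them as hex codes. The file name, colour count and choice of library are all assumptions for illustration.

from PIL import Image

def dominant_hex_colours(path: str, count: int = 5) -> list[str]:
    # Load the logo and flatten it to plain RGB.
    img = Image.open(path).convert("RGB")
    # quantize() reduces the image to `count` palette colours –
    # a simple way to approximate the main brand colours.
    quantised = img.quantize(colors=count)
    palette = quantised.getpalette()[: count * 3]
    # The palette is a flat [r, g, b, r, g, b, ...] list; format each triple as hex.
    return [
        "#{:02x}{:02x}{:02x}".format(*palette[i : i + 3])
        for i in range(0, len(palette), 3)
    ]

if __name__ == "__main__":
    # Hypothetical usage: point it at a local logo file.
    print(dominant_hex_colours("logo.png"))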
Q. Is AI imagery and video ever good enough to replace original work?
A. Of course I’m going to say no! I do feel we’re still in the novelty phase with it though. Written content seems to have come through that hype cycle, and I think most people would now agree that AI doesn’t do written content anywhere near well enough. I think the same will happen with visual content.
Real people, real locations and real production still matter, especially for reputation. If a brand is caught putting out a high-profile visual campaign that has clearly been generated using AI, the implications for trust could be significant.
Don’t get me wrong, if that Will Smith video is anything to go by, the quality will be in a very different place in a few years’ time. But it will always be important for brands to differentiate themselves – by disrupting the status quo, doing things differently and standing out from their competitors. It’s impossible to do this by drawing on what’s been done before. In that sense, genuine creativity will become even more valuable.
"Real people, real locations and real production still matter, especially for reputation."Rob Fowell Director, All Caps Media
Q. Finally, can you give us any insider tips on spotting fake imagery or videos?!
A. Some of the old tell-tale signs, like hands looking wrong, have largely been ironed out. But there are a few things you can look out for. Text in video is often ‘off’ – you’ll know what I mean if you look closely at that! Eyes can also look glazed over – in both images and videos. Movement in video tends to be too smooth – we don’t move quite like that in real life. Images can also look overly smooth or polished (think no signs of skin pores or creases in clothes). Also, if you have an underlying feeling that you’ve seen the person in the image a thousand times before – you probably have!
It is getting harder to spot though, spurring industry-wide conversations and even a subreddit dedicated to figuring out what’s real and what isn’t! Check out Is this AI for some examples.