10/29/25

Rep Report

The Nepean Rep Report: If I Could Turn Back Time

Who’s Zoomin’ Who?

Three signposts

Rage against the (AI) machine

A million hits on Spotify is a milestone most musicians dream of. For those who do make it, it's often built on years of croaking through half-empty pub sets, paid in warm lager. 

For The Velvet Sundown, though, it came without lifting a plectrum. That’s because Velvet Sundown don’t exist, or at least not in the physical sense. The group is AI-generated, releasing songs written, performed and produced by algorithms. 

Earlier this year, news of this ‘fakehousery’ hit the headlines. What’s unsettled the industry most is not so much the rise of artificial bands, but the absence of any legal duty on Spotify (or any other platform) to guard against them, and its attempt to wash its hands of responsibility. 

Pressed on its role, the company insisted: “All music on Spotify, including AI-generated music, is created, owned and uploaded by licensed third parties.” In other words, it’s nothing to do with us. 

The tactic is a familiar one. Anyone who recently watched Swiped, the story of Bumble founder Whitney Wolfe Herd, will know that, in its early days, Tinder attempted to fend off criticism over harassment by insisting responsibility rested solely with users. 

But platforms rarely stay neutral for long. Whether it’s dating or streaming music, expectations eventually shift. You built the environment; you must also police it, or someone else might build a new one that does.

For the music streaming industry, policing means grappling with what counts as ethical use of AI in music, how to label AI tracks, and how to protect listeners and hard-working human artists from an invisible flood of content produced without the usual blood, sweat and tears. Avoiding the issue will only harden suspicions that the platform cares more about streams than stewardship, and risk driving real artists to stop bothering with it altogether. 

Spotify may wish to avoid apologising, but the longer it waits, the bigger the reckoning. The apology always comes, even if it takes years. The question is whether it sounds real or fake. 

Hard to say I’m sorry?

We all know that sorry can be the hardest thing to say, but is it the best strategy?

In November 2021, the first Partygate story about Boris Johnson and illegal COVID-era gatherings was published in the Daily Mirror. The official No.10 response was that “COVID restrictions were followed at all times”. The rest is history. Eventually, the political cost of gaslighting the electorate became one of the final straws that broke the camel’s back.

What many people forget is that this was not the only partygate in town. Gavin Williamson and the DfE’s own gathering was reported concurrently, but the response was quite different: “Looking back, we accept it would have been better not to have gathered in this way.” Not quite a full apology but certainly some contrition. No further coverage or speculation required.

We often hear arguments against apologies – no admission of guilt, the need for the full facts, or ongoing investigations, for example – but these ignore the emotional and reputational consequences of delays. In a remorse vacuum, opposition, protests and additional information pile up. Contrition is an oft overlooked but critical reputational factor.

Other recent examples of the ‘delay then eventually pay’ strategy, built on the hope that things will simply quieten down, include the CEOs of the Post Office, NatWest and BP, or – in the world of celebrity – Russell Brand and Phillip Schofield (see also: their parent organisations delaying their departures, also at significant reputational cost).

The list goes on. In each case, organisations and individuals suffer significant reputational harm during the scrutiny process, and still fail to avoid what quickly becomes an inevitable end.

Short-term pain can be a route to long-term gain. Not least in this new era of the fake apology – with phony statements increasingly being circulated online in the wake of high-profile transgressions. If you don’t say sorry, someone else might do it for you. Better to rip off the plaster, soak up the story and own the way forward.

Sorry seems to be the easiest word

Football is no stranger to cliché. Tune into Match of the Day and bathe in the warm comfort of familiar phrases: ‘a real six-pointer’, ‘a game of two halves’, ‘one game at a time’… The press conferences write themselves.

Also apparently writing themselves, though, are today’s heartfelt player apologies and goodbyes, player hires and manager appointments. ChatGPT is running rife in the world of football – much to fans’ annoyance.

Former Getafe midfielder Christantus Uche sparked fan fury when one eagle-eyed supporter noted some surprising similarities between his teary-eyed farewell and Fabio Silva’s, penned on departure from Wolverhampton Wanderers just days earlier. Whether AI was to blame, as many fans suspect, or it was simply a return to some more retro plagiarism, it’s a case study in how a lack of authenticity disengages already disaffected audiences.

Other examples abound: Nottingham Forest’s recent announcement of Sean Dyche’s arrival as manager, or Norwich City’s Shane Duffy apologising to fans following a public and insult-riddled X meltdown.

What both have in common is a barely disguised reliance on ChatGPT. Tonally and grammatically, both are a significant departure from the norm, but the real giveaway is the prevalence of the lesser-spotted em-dash.

The reactions to all these posts are clear. Suspicion that AI has been used breeds anger and erodes trust. Social media managers and communicators more broadly would be wise to take note: use ChatGPT at your peril or, at the very least, brush up on your hyphens, en-dashes and em-dashes.