
Marketing consultant loses job because he doesn't understand generative AI

Image: via YouTube (screenshot)

A marketing consultant lost his job after using ChatGPT to "research" historical film reviews for a movie trailer. The incident highlights a widespread misunderstanding of how generative AI works.

According to Deadline, marketing consultant Eddie Egan was fired for using an AI tool such as ChatGPT to generate review quotes for a trailer for the movie "Megalopolis." The trailer included harshly critical quotes about director Francis Ford Coppola's earlier films that turned out to be AI-generated fabrications.

Egan's goal was to argue that "Megalopolis," like Coppola's previous films, would initially face harsh criticism but ultimately be recognized as a masterpiece. The trailer quoted renowned film critics such as Pauline Kael of The New Yorker and Andrew Sarris of The Village Voice, who supposedly called classics like "The Godfather" a "sloppy, self-indulgent movie" and "Apocalypse Now" an "epic piece of trash."

In reality, these scathing reviews never happened. On the contrary, the critics praised these films, as reported by Vulture magazine. As a result, production company Lionsgate apologized for the mistake, removed the trailer and terminated Egan's contract.


AI models generate words, not facts

This case demonstrates how easy it is to be misled by ChatGPT and similar systems if you don't understand their underlying mechanisms. The large language models (LLMs) powering these tools generate words based on probabilities, steered by the user's prompt. The resulting sentences can be either accurate or "soft bullshit": these models have no built-in fact-checking capabilities. If you ask for critical reviews, they will generate some.
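The mechanism described above can be sketched in a few lines of code. This is a deliberately toy illustration, not a real language model: the candidate tokens and their scores are invented for demonstration. The point is that the model samples the next word from a probability distribution shaped by the prompt, with no step that checks whether the result is true.

```python
import math
import random

def softmax(scores):
    """Convert raw model scores (logits) into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def sample_next_token(candidates, scores, rng):
    """Pick one candidate token at random, weighted by its probability.

    There is no fact-checking here: the token is chosen purely by weight.
    """
    probs = softmax(scores)
    return rng.choices(candidates, weights=probs, k=1)[0]

# Hypothetical continuations after a prompt like 'Critics called The Godfather a ...'
# A prompt that asks for criticism would shift weight toward negative tokens,
# which is why asking for scathing reviews produces scathing-sounding text.
candidates = ["masterpiece", "classic", "sloppy, self-indulgent movie"]
scores = [2.0, 1.5, 0.5]  # made-up scores, not from any real model

rng = random.Random(0)
print(sample_next_token(candidates, scores, rng))
```

Whatever token comes out, it is a statistically plausible continuation, not a retrieved fact, which is exactly how a trailer can end up quoting reviews that were never written.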

Others have fallen for the chatbots' plausible-sounding prose: attorney Steven A. Schwartz used ChatGPT for legal research, unaware that the system could generate false content. In another case, attorneys cited supposed precedent cases found via ChatGPT that turned out to be AI inventions.

These examples show that many people do not yet understand how generative AI works, and that its output should not be used unchecked. Even OpenAI produced a factual error in its first SearchGPT demo.
