Using ChatGPT to write an email? Sure. But an obituary?
When his grandmother died about two years ago, Jebar King, his family's designated writer, was tasked with drafting her obituary. But King had never written one before and didn't know where to start. The grief wasn't helping either. "I was just like, there's no way I can do this," the 31-year-old from Los Angeles says.
Around the same time, he’d begun using OpenAI’s ChatGPT, the artificial intelligence chatbot, tinkering with the technology to create grocery lists and budgeting tools. What if it could help him with the obituary? King fed ChatGPT some details about his grandmother — she was a retired nurse who loved bowling and had a lot of grandkids — and asked it to write an obituary.
The result provided the scaffolding for one of life’s most personal pieces of writing. King tweaked the language, added more details, and revised the obituary with the help of his mother. Ultimately, King felt ChatGPT helped him commemorate his grandmother with language that adequately expressed his emotions. “I knew it was a beautiful obituary and it described her life,” King, who works in video production for a luxury handbag company, says. “It didn’t matter that it was from ChatGPT.”
Generative AI has drastically changed the manner in which people communicate — and perceive communication. Early on, its uses proved relatively benign: Predictive text in iMessages and Gmail offered suggestions on a word-by-word or phrase-by-phrase basis. But after the technological advances heralded by ChatGPT's public release in late 2022, the applications of the technology exploded. Users found AI helpful when writing emails and recommendation letters, and even for sprucing up responses on dating apps, as the number of chatbots available for experimentation also proliferated. But there was also backlash: If a piece of writing appears insincere or stilted, recipients are quick to claim the author used AI.
Now, the AI chatbot content creep has gotten increasingly personal, with some leveraging it to craft wedding vows, condolences, breakup texts, thank-you notes, and, yes, obituaries. As people apply AI to considerably more heartfelt and genuine forms of communication, they run the risk of offending — or appearing grossly insincere — if they are found out. Still, users say, AI isn’t meant to manufacture sentimentality, but to provide a template onto which they can map their emotions.
A gut check
As anyone who’s been asked to give a speech or console a friend can attest, crafting the perfect message is notoriously difficult, especially if you’re a first-timer. Because these communications are so personal and meant to evoke a specific response, the pressure’s on to nail the tone. There’s a thin line between an effective note of support and one that makes the recipient feel worse.
AI tools, then, are…
