The editorial’s perspective on recent additions to the dictionary: a lighthearted game that can illuminate a frightening truth.
When the Cambridge dictionary named “hallucinate” its word of the year this week, it was referring not to the traditional definition, the human experience of perceiving something that is not real, but to the tendency of AI systems to fabricate false information. The idea is itself somewhat surreal, as Naomi Klein has pointed out. Why, she asked, are these mistakes called hallucinations rather than algorithmic errors or glitches? By borrowing the language of psychology, psychedelics and mysticism, Klein argued, those behind generative AI cast themselves as midwives to the birth of a sentient intelligence that they want us to believe will be a great leap forward for humanity.
The “word of the year” is a curious tradition, part game and part marketing opportunity, in which dictionary editors around the world eagerly take part. Anyone who recalls the Oxford dictionary’s choice for 2022 knows how unconventional the selections can be. That year the public was invited to vote for its own word of the year, and “goblin mode” won overwhelmingly with 318,956 votes, 93% of the total. Though the term (which essentially means being lazy and slovenly) has been in use for more than a decade, its first appearance in a British newspaper, according to the news database Factiva, was in The Observer in February of last year.
In most cases, words of the year are chosen by analysing search data on dictionary websites. This reveals not only the issues on people’s minds but also how they are trying to make sense of them. The adoption of “hallucination” for AI errors exemplifies the attribution of human characteristics to machines, a habit that academics have extensively documented as a source of oversimplification and misunderstanding. Yet metaphors that humanise machines have long been present in literature, dating back to the creation of Frankenstein’s monster in early 19th-century fiction.
Such literary techniques are not confined to writing about technology. They are on display in a recent work of nonfiction about one of the greatest issues facing society: global warming. John Vaillant’s Fire Weather, which won the Baillie Gifford prize for nonfiction last Thursday, tells the story of the devastating wildfire that ravaged the Canadian town of Fort McMurray in May 2016 and forced 90,000 residents to evacuate.
Mr. Vaillant’s interviewees compare the horror of the blaze to the Balrog, the fire demon in Tolkien’s The Lord of the Rings, and to the asteroid strike in the film Armageddon. One firefighter describes the fire as a moving animal, seeking out untouched areas to burn. Mr. Vaillant insists such descriptions are not exaggerated. He explains, “Being in the presence of the fire felt like facing a determined and ravenous opponent, with the sole intention of causing chaos.” Intention is not a quality usually ascribed to fire.
Mary Shelley’s novel was published as “Frankenstein; or, The Modern Prometheus”, and was written at a time of rapid advances in science and technology. The subtitle invokes the Titan of Greek myth who was punished for stealing fire and giving it to humans. Confronting today’s challenges, from a rapidly changing climate to out-of-control artificial intelligence, demands new and creative language. The particular words we choose deserve debate and scrutiny.