Artificial intelligence (AI) is likely to play an important role on Valentine’s Day, according to a report released by security software firm McAfee. The firm conducted a survey which revealed that one in four adult respondents (26 percent) plan to use generative AI tools to write a love letter for their partner or potential love interest. Interestingly, the study also found that more than two-thirds of the adults (67 percent) could not differentiate between a love letter written by AI and one written by a human.
The findings are part of McAfee's new research report titled Modern Love, which examines the role of AI and the Internet in transforming modern love and relationships. The study surveyed 5,000 people across nine countries. Its biggest highlight was that more than one in four respondents were already planning to use tools such as OpenAI's ChatGPT, Google Gemini, and Microsoft Copilot to help them profess their love to their dates and partners.
The most common reason for using an AI-powered ghostwriter was that it would make the sender appear more confident (27 percent), as per the study. Lacking the time to write the letter personally and lacking inspiration were cited as the joint second most popular reasons, at 21 percent each. Another 10 percent said using AI would simply be quicker.
While many of the respondents felt they would not get caught, almost half of the surveyed adults (49 percent) said they would be offended if they received a love letter written by a generative AI tool. Yet when presented with a love letter, a whopping 67 percent failed to identify whether it was written by a human or with the help of a machine.
Generative AI tools based on large language models (LLMs) are capable of producing text that appears to have been written by a human. Most such tools let users add prompts to control and customise the writing style, flow, structure, tone, and more. Further, ChatGPT Plus, Copilot Pro, and other paid AI assistants let users create custom chatbots trained on their own writing, producing responses that more closely resemble their personal style.
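To make the prompting step concrete, below is a minimal sketch of how such a request might look through the OpenAI Python SDK, with a system prompt steering style and tone. The model name, prompt wording, and parameter values are illustrative assumptions, not McAfee's methodology or any product's exact behaviour.

```python
# Minimal sketch: asking an LLM to draft a short letter in a specified style.
# Assumes the OpenAI Python SDK (v1.x) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # picks up the API key from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-capable model would do
    messages=[
        {
            "role": "system",
            # The "prompt" that controls style, tone, and structure
            "content": "You write short, warm love letters in a playful, informal tone.",
        },
        {
            "role": "user",
            "content": "Write a three-sentence Valentine's note that mentions our shared love of hiking.",
        },
    ],
    temperature=0.9,  # higher values make the wording more varied
)

print(response.choices[0].message.content)
```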
The McAfee study highlights that this close resemblance to human writing can be exploited by cybercriminals to pull off romance scams. Romance scams are planned crimes in which scammers prey on vulnerable people through false promises of love and relationships. The study found that 51 percent of surveyed individuals admitted to having been catfished (talking to or meeting strangers online who were pretending to be someone else). The firm also urged people to remain vigilant during this period and never to entertain requests from a stranger (or even someone they know) to send money or sensitive information online.