Shriya Saran Sounds Alarm on AI-Generated Deepfakes: 'Almost Every Day I See a Morphed Picture That's Not Me'
Actor Shriya Saran has become the latest celebrity to fall victim to AI-generated deepfakes and impersonation. The 41-year-old actress took to social media to describe her disturbing experience of being impersonated by an unknown individual who had been sending suspicious messages to her colleagues. Shriya explained that the impersonator had been using her phone number, one she has had for nearly two decades, to deceive people she knows. She initially chose not to address the issue publicly, but the impersonator's actions escalated, prompting her to act. She reported the incident to cybercrime authorities and publicly clarified that the messages were not from her, though she admitted it is difficult to tell whether the impersonation has stopped.

The actress also voiced concern about the wider spread of AI-generated deepfakes, citing past instances where people have used her name or profile picture. She noted how difficult it has become to distinguish fake images from real ones, saying that even her family has been fooled by AI-generated pictures.

Shriya's experience is not an isolated incident. Many celebrities have faced similar situations, including Aditi Rao Hydari, who recently called out a fake account impersonating her. Shriya emphasised that there is little an individual can do to prevent such incidents, and that even her team's ability to stop them is limited. She urged people to stay cautious and not give such cases undue importance, while stressing the need to address the issue when it affects others.

The growing use of AI-generated deepfakes has raised concerns about their potential impact on individuals and society. As the technology continues to evolve, finding ways to curb its misuse and protect people from falling victim to deepfakes becomes ever more essential.