VIP deepfakes promote miracle cures on social media

Every day, sponsored content promoting miracle cures and supplements with extraordinary properties is posted on Facebook, Messenger, and Instagram, often accompanied by AI-generated images, video, and audio deepfakes of celebrities and health experts. This emerges from research published by Bitdefender on health-themed fraud campaigns, which shows that fake ads on social media are far more widespread than we might think, putting our safety and, above all, our health at risk.

The research

The scams target topics close to users’ hearts, namely wellness and health: cybercriminals lure victims to fake websites promoting drugs, treatments, and cures that promise guaranteed results at extremely low prices. Bitdefender collected data over three months (March-May 2024). Deepfake images, videos, and audio of celebrities such as Ben Carson, Brad Pitt, Cristiano Ronaldo, Denzel Washington, and Sophia Loren are used, and some of the fake social media pages exceed 350,000 followers. Moreover, the problem is widespread throughout the world, including Europe.

“Bitdefender advises users to be wary of offers that use celebrity endorsements. These scams are evolving rapidly and becoming increasingly difficult to detect as artificial intelligence improves,” the company says. In short, when it comes to health care, it is always best to consult your doctor. DIY cures, especially when recommended on social media, risk being extremely dangerous.


How to protect yourself from deepfakes in healthcare

If something seems too good to be true, it probably is. Be wary of videos or audio recordings that look suspicious or come from unreliable sources.

Verify information: before believing something you have seen or heard online, check it against reputable sources, such as the website of a recognized healthcare organization or a trusted doctor.

Pay attention to the source: does the video or audio recording come from a reputable healthcare organization or a credible news outlet?

Check for manipulation: deepfakes can be very sophisticated, but some clues can help you spot them. Look for audio or video inconsistencies, such as unnatural mouth movements or lip-sync problems.

That said, deepfakes could also be used to improve health communication and education. For example, they could be used to create realistic videos explaining complex medical procedures or to simulate first-aid scenarios for training first responders. Deepfakes could also personalize health education by creating videos tailored to each individual’s specific needs and interests. It is important, however, to use deepfakes responsibly and ethically in healthcare. Guidelines need to be developed for creating and using deepfakes in this field, ensuring they benefit patients rather than harm them. Educating the public about the risks of deepfakes and how to identify them is equally important.

Antonino Caffo has been involved in journalism, particularly technology, for fifteen years. He is interested in topics related to the world of IT security but also consumer electronics. Antonino writes for the most important Italian generalist and trade publications. You can see him, sometimes, on television explaining how technology works, which is not as trivial for everyone as it seems.