
FRIDAY | SEPT 19, 2025


Spike in use of nudify deepfakes raises concern

o Academic warns current detection tools remain ineffective

BY FAIZ RUZMAN
newsdesk@thesundaily.com

PETALING JAYA: The rapid spread of “nudify” deepfakes, AI-generated sexualised images that digitally strip victims of their clothing, has sparked alarm among experts, who warn of their dangerous ease of use, devastating impact and the weak safeguards against abuse.

Universiti Malaysia Kelantan Institute for AI and Big Data director Dr Muhammad Akmal Remli said the latest “nudify” tools make exploitation disturbingly simple.

“AI-based nudify applications, also known as synthetic non-consensual explicit AI-created imagery, allow anyone to generate sexualised images without needing graphic skills. In the past, it required editing software, such as Photoshop. Now, even someone without training could do it. Victims may not even realise their pictures are being misused.”

He said the algorithms could fabricate entirely new images with “realistic textures, skin tones, shadows and lighting”, making them almost impossible to detect with the naked eye.

Akmal explained that distribution often begins in the shadows. “Images uploaded to the internet could be taken and processed into nudify deepfakes. At first, they might circulate on dark web forums, but once they go viral, they appear on open social media platforms, sometimes even through paid ads. The spread is driven by pornography, the shaming of individuals and business models that profit from such services.”

He warned that current detection tools remain limited. “Some innovations are being developed, such as digital watermarks and machine learning analysis to spot subtle patterns, but these are not foolproof. Technology companies must also strengthen safeguards, such as identity checks and keyword filters to reduce abuse.”

Universiti Teknikal Malaysia Malacca AI and cybersecurity expert Prof Dr Azah Kamilah Muda said nudify deepfakes are especially insidious because they alter only part of an image. “They usually change only one part of the photo, such as removing clothes, and leave everything else untouched. Advanced tools blend the fake parts so smoothly that the edits look natural. Once the picture is uploaded, compression makes the remaining clues even harder to see. Most detection systems are designed for face-swaps, not this type of edit.”

She noted that the manipulated content could spread at lightning speed. “These services are promoted through Telegram bots, affiliate links or ads on Instagram and Facebook. Once created, the images are quickly shared on social media and messaging apps. Stopping this requires cooperation between companies, governments and the community.”

Beyond the technology, the emotional fallout for victims is severe, said Universiti Kebangsaan Malaysia psychologist and Malaysian Psychological Association president Prof Dr Shazli Ezzat Ghazali. He urged families to prepare children early by teaching them that fake images could look real, while partners and relatives must offer reassurance and open communication. For long-term recovery, he said, cognitive behavioural therapy, resilience training, digital detox periods and community support are recommended.

The Communications Ministry recently revealed the scale of the challenge. Between 2022 and August this year, 42,399 misleading AI-related posts were removed by social media platforms at the request of the Malaysian Communications and Multimedia Commission. The ministry stressed that under the Communications and Multimedia Act 1998, offences involving false or fraudulent content are punishable by a maximum fine of RM500,000, up to two years’ jail or both.

‘First 48hrs crucial for victims’

PETALING JAYA: The first 48 hours after discovering a sexualised deepfake could decide whether victims suffer lasting trauma or begin recovery, experts say.

Malaysian Psychological Association president and psychologist Prof Dr Shazli Ezzat Ghazali said victims often experience an immediate wave of shock, fear and shame. “The trauma could resemble post-traumatic stress, where victims feel watched or haunted by the images,” he explained.

He stressed that dismissing or minimising the victim’s reaction could worsen the damage. “In the first 48 hours, the key step is to validate the victim’s emotions. Avoid saying things such as ‘ignore it’ or ‘do not think about it’ as this denies their reality. Instead, acknowledge their pain and reassure them that it is normal.”

He recommended that victims take a break from social media to reduce exposure to possibly triggering content. In cases of severe distress, including panic attacks, prolonged crying or difficulty breathing, he said immediate referral to a counsellor or psychologist is crucial.

He added that family members play a critical role. “Partners need reassurance and open communication to prevent mistrust. Children should be taught early that technology could create fake images that look real. Family support is the best protection.”

Shazli also warned that without early intervention, victims risk developing chronic anxiety, depression or social withdrawal. The growing misuse of “nudify” deepfakes, AI tools that digitally strip clothing from images, has already affected thousands globally. While the legal and technological response remains limited, mental health specialists say recognising the human cost is just as important.

How nudify deepfakes spread and why they are hard to stop

What is a nudify deepfake?
Digitally manipulated images that remove the clothing of subjects. Different from face-swap deepfakes → more invasive.

Why it is harder to detect
Edits are localised - only clothes are removed, with the rest of the photo unedited, providing fewer visual clues.
AI fills in details - advanced inpainting creates realistic skin tones, shadows and textures.
Compression erases traces - once reshared on social media, tiny tampering signals vanish.
Detection blind spots - most detection tools are built for face-swaps, not body edits.

The spread pathway

Created on demand - via Telegram groups, underground sites or cheap AI apps.
↓
Rapid circulation - spreads in private groups first, then shared on social media.
↓
Easy targets - popular figures such as influencers make easy prey.
↓
Monetised networks - subscription models, affiliate schemes and crypto payments drive growth.
↓
Weak enforcement - perpetrators are encouraged when they occasionally evade consistent and strict enforcement action by government agencies.

Human impact (psychology)
Psychologists note that victims of sexualised deepfakes often suffer initial trauma such as shame, anger, fear and loss of control. The first 48 hours are critical for validating emotions, avoiding triggers and seeking counselling. For long-term recovery, therapy, resilience training, digital detox and community support are key.

Monitoring and reporting
Fact-check resources: Sebenarnya.my and the Malaysian Communications and Multimedia Commission (MCMC) AI chatbot help verify and report harmful content.

theSun graphics by Faiz Ruzman

“We cannot treat this as only a tech or legal issue. At its heart, it is a human violation that leaves scars. The way we respond in the first hours could mean the difference between recovery and long-term trauma,” he said. – BY FAIZ RUZMAN

Legal action: Victims can lodge complaints with MCMC. Under the Communications and Multimedia Act 1998, spreading false or fraudulent content can lead to fines of up to RM500,000, two years’ jail or both.

Safety support: The Online Safety Act 2025 requires app and network providers to act on harmful content, giving users more rights to demand removals.

All linked to immigration counter-setting case probed

SEPANG: The Malaysian Anti-Corruption Commission (MACC) is examining the financial records of family members and close associates of those arrested in a counter-setting scandal, said Chief Commissioner Tan Sri Azam Baki.

He said the move is part of efforts to trace the money trail behind the counter-setting activities, with the probe extending to all those linked to the suspects in custody.

“We consider them as witnesses and we need to obtain full clarification as some have claimed that the money came from other people (sources). The money trail is not extensive since it does not involve a corporate entity, but they still need to account for the source of the funds, and anyone linked would be called in.”

Azam added that MACC would look into how those arrested managed to buy luxury cars, run businesses and amass cash. “They would be issued notices to declare their assets and all of this would be scrutinised, including how the money was acquired and its sources.”

He said MACC has also launched an investigation under the Anti-Money Laundering, Anti-Terrorism Financing and Proceeds of Unlawful Activities Act 2001, Bernama reported.

He confirmed that no new arrests have been made but did not rule out the possibility of further arrests, depending on the investigators’ findings.

On Sept 10, MACC said it had arrested 27 individuals, including 18 enforcement agency officers, for suspected involvement in a counter-setting syndicate, which facilitates the entry of foreigners into the country without following prescribed procedures.

According to sources, all 27 suspects, 19 men and eight women aged between 20 and 50, were detained in operations conducted across Selangor, Malacca, Kuala Lumpur and Negeri Sembilan.

Boy killed by crocodile in Sarawak

KUCHING: A 12-year-old boy was mauled to death by a crocodile while fishing in a river in Samarahan district, southern Sarawak, yesterday.

The victim, identified as Arif Fahmi Aiman, was casting fishing nets from a boat near the riverbank at Kampung Empila, about 30km from Kuching, when he was dragged into the water.

The Sarawak Fire and Rescue Department said it received an emergency alert from the public at 8am. A search and rescue team from the Kota Samarahan station was swiftly deployed to the scene.

“Witnesses at the scene said they saw the boy standing inside a boat near the riverbank. He was casting fishing nets into the water. Witnesses said they heard the boy screaming before he went missing,” the department said in a statement.

The team combed the river and surrounding banks before finding the boy’s body floating about 10m from where he had disappeared. “The body has crocodile bite marks,” the department added. The remains were handed over to police for further action. – BY JOSEPH PETER
