Cybercriminals are exploiting the popularity of AI tools by creating fake AI-themed platforms to lure users into downloading malware such as the Noodlophile stealer. These campaigns often use convincing social media posts and fake websites to trick users into unwittingly installing malicious files that harvest sensitive data.
Affected: Users downloading fake AI tools, social media platforms, and infected systems.
Key Points
- Threat actors are using fake AI-powered tools and social media campaigns to distribute the Noodlophile information stealer malware.
- Cybercriminals create convincing AI-themed platforms and advertise them via legitimate-looking Facebook groups and viral posts.
- Users are prompted to download malicious ZIP files, which contain executables that initiate the malware infection chain.
- The infection chain ultimately deploys a Python-based payload that delivers the Noodlophile stealer, which targets sensitive data such as credentials and cryptocurrency wallet information.
- The developer of Noodlophile is believed to be Vietnamese, associated with a cybercrime ecosystem targeting Facebook users.
- The campaigns exploit public interest in AI, impersonating popular tools and services such as ChatGPT and CapCut AI.
- CYFIRMA reports reveal that other malware families, such as PupkinStealer, also use simple techniques to steal data with minimal detection risk.
Read More: https://thehackernews.com/2025/05/fake-ai-tools-used-to-spread.html