Security News > 2025 > March > New ‘Rules File Backdoor’ Attack Lets Hackers Inject Malicious Code via AI Code Editors

2025-03-18 15:43
Cybersecurity researchers have disclosed details of a new supply chain attack vector dubbed Rules File Backdoor that affects artificial intelligence (AI)-powered code editors like GitHub Copilot and Cursor, causing them to inject malicious code. "This technique enables hackers to silently compromise AI-generated code by injecting hidden malicious instructions into seemingly innocent
News URL
https://thehackernews.com/2025/03/new-rules-file-backdoor-attack-lets.html
Related news
- Bybit Hack Traced to Safe{Wallet} Supply Chain Attack Exploited by North Korean Hackers (source)
- CrowdStrike Security Report: Generative AI Powers Social Engineering Attacks (source)
- Hackers Exploit Paragon Partition Manager Driver Vulnerability in Ransomware Attacks (source)
- Hackers Exploit AWS Misconfigurations to Launch Phishing Attacks via SES and WorkMail (source)
- How New AI Agents Will Transform Credential Stuffing Attacks (source)
- YouTube warns of AI-generated video of its CEO used in phishing attacks (source)
- MINJA sneak attack poisons AI models for other chatbot users (source)
- Chinese Hackers Breach Juniper Networks Routers With Custom Backdoors and Rootkits (source)
- Hackers target AI and crypto as software supply chain risks grow (source)
- TechRepublic EXCLUSIVE: New Ransomware Attacks are Getting More Personal as Hackers ‘Apply Psychological Pressure” (source)