New ‘Rules File Backdoor’ Attack Lets Hackers Inject Malicious Code via AI Code Editors
A new supply chain attack vector, “Rules File Backdoor,” enables hackers to inject malicious code into AI-powered code editors like GitHub Copilot and Cursor. By exploiting hidden characters and sophisticated evasion techniques, threat actors can manipulate the AI to generate code containing security vulnerabilities or backdoors. This attack vector poses a significant risk, potentially affecting millions of end users through compromised software.