Prompt Injection Tricks AI Into Downloading And Executing Malware | Hackaday
A proof-of-concept demonstrates how prompt injection can trick an AI service into downloading and executing malware, compromising the system. This highlights the security risks associated with large language models and the challenges of preventing prompt injection attacks.