AI-assisted code can be inherently insecure, study finds [TechSpot]
AI systems like GitHub Copilot promise to make programmers’ lives easier by generating entire chunks of “new” code from natural-language prompts and pre-existing context. But code-generating tools can also introduce security vulnerabilities, as a new study involving several developers has recently found.
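The article does not show the specific flaws the study examined, but one illustrative (and entirely hypothetical) example of the kind of vulnerability an AI assistant can suggest is building an SQL query by string interpolation instead of using parameters, which opens the door to SQL injection. A minimal sketch, assuming a simple SQLite `users` table:

```python
import sqlite3

def find_user_insecure(conn, name):
    # Vulnerable: attacker-controlled `name` is spliced directly into the query.
    query = f"SELECT id FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_user_secure(conn, name):
    # Safe: a parameterized query treats `name` strictly as data, not SQL.
    return conn.execute("SELECT id FROM users WHERE name = ?", (name,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

payload = "x' OR '1'='1"                      # classic injection payload
leaked = find_user_insecure(conn, payload)    # the OR clause matches every row
safe = find_user_secure(conn, payload)        # no user literally named the payload
```

With the injected payload, the insecure version returns every row in the table while the parameterized version returns none; both snippets look equally plausible as autocomplete output, which is the risk the study highlights.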