Microsoft AI Employee Accidentally Leaks 38TB of Data

From PC Mag: A misconfigured link exposed access to 38TB of internal Microsoft data, creating the potential for attackers to inject malicious code into the company's AI models.

The finding comes from cloud security provider Wiz, which recently scanned the internet for exposed storage accounts. It found a software repository on Microsoft-owned GitHub dedicated to supplying open-source code and AI models for image recognition.

On the affected GitHub page, a Microsoft employee had shared a URL that let visitors download AI models from an Azure storage container. “However, this URL allowed access to more than just open-source models,” Wiz said in its report. “It was configured to grant permissions on the entire storage account, exposing additional private data by mistake.”
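The kind of URL described here is an Azure Shared Access Signature (SAS) link, whose query parameters encode how broad the grant is. As a rough illustration only (the URLs below are hypothetical, not Microsoft's actual link), a few lines of Python can distinguish an account-wide token from one scoped to a single container or blob:

```python
from urllib.parse import urlparse, parse_qs

def describe_sas(url: str) -> str:
    """Classify an Azure SAS URL by its signed-scope query parameters.

    Account SAS tokens carry 'ss' (signed services) and 'srt' (signed
    resource types); service SAS tokens carry 'sr' (signed resource,
    e.g. 'c' for a container or 'b' for a single blob).
    """
    qs = parse_qs(urlparse(url).query)
    perms = qs.get("sp", [""])[0]  # signed permissions, e.g. "rwdl"
    if "srt" in qs or "ss" in qs:
        return f"account SAS (perms={perms}) - grants access across the storage account"
    resource = qs.get("sr", ["?"])[0]
    scope = {"c": "container", "b": "single blob"}.get(resource, resource)
    return f"service SAS scoped to {scope} (perms={perms})"

# Hypothetical example URLs for illustration:
broad = ("https://example.blob.core.windows.net/models"
         "?sv=2021-08-06&ss=b&srt=sco&sp=rwdl&se=2051-10-06&sig=XXX")
narrow = ("https://example.blob.core.windows.net/models/model.bin"
          "?sv=2021-08-06&sr=b&sp=r&se=2023-10-06&sig=XXX")

print(describe_sas(broad))   # account SAS (perms=rwdl) - grants access across the storage account
print(describe_sas(narrow))  # service SAS scoped to single blob (perms=r)
```

A token like the first one, with write/delete permissions and a far-future expiry, is the pattern Wiz warns about: it exposes the whole account rather than the one container of model files the author intended to share.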

View: Full Article