Microsoft’s AI research division accidentally exposed 38 terabytes of private data through a Shared Access Signature (SAS) link published in a public GitHub repository, according to a report by Wiz Research. The team had been sharing open-source training data and AI models for image recognition with other researchers, but the SAS token behind the Azure Storage link, accidentally disclosed by one of Microsoft’s employees, was overly permissive and exposed far more than the intended files.

The exposure began in July 2020 and persisted until Wiz’s white-hat researchers discovered the link and reported it, leaving Microsoft to clean up a major data leak and, by its own account, to learn an important lesson about overly permissive SAS tokens.
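For contrast with the overly permissive token at the center of the incident, the snippet below is a minimal sketch, using the Azure SDK for Python (azure-storage-blob), of issuing a read-only SAS that covers a single blob and expires after an hour. The storage account, container, blob, and key values are hypothetical placeholders for illustration, not details from the incident.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Hypothetical storage account, container, and blob names used only for illustration.
ACCOUNT_NAME = "airesearchstore"
CONTAINER_NAME = "public-models"
BLOB_NAME = "image-recognition/model.ckpt"

# The account key would normally come from a secret store, never from source control.
ACCOUNT_KEY = "<storage-account-key>"

# Issue a SAS that is read-only, limited to a single blob, and valid for one hour,
# rather than an account-wide token with broad permissions and a distant expiry.
sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER_NAME,
    blob_name=BLOB_NAME,
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

# The shareable URL is the blob URL with the SAS token appended as a query string.
sas_url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/"
    f"{CONTAINER_NAME}/{BLOB_NAME}?{sas_token}"
)
print(sas_url)
```

A user delegation SAS signed with Azure Active Directory (Entra ID) credentials, or a SAS bound to a stored access policy, goes a step further by making tokens easier to revoke than ones signed directly with the account key.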