Microsoft’s AI research division accidentally leaked a large amount of sensitive data while contributing open-source AI models to a public GitHub repository.
Cloud security firm Wiz discovered that a Microsoft employee had mistakenly shared a URL pointing to a misconfigured Azure Blob Storage account, exposing the data.
Wiz researchers found that, in addition to the open-source models, the internal storage account unintentionally granted access to 38TB of additional private data.
The exposed data included backups of personal information belonging to Microsoft employees: passwords for Microsoft services, secret keys, and an archive of over 30,000 internal Microsoft Teams messages from 359 employees.
Microsoft issued an advisory on Monday stating that “no customer data was exposed and no other internal services were at risk during the incident”.
Wiz reported the incident to the Microsoft Security Response Center (MSRC) on June 22, 2023. Microsoft revoked the SAS token to block external access to the storage account, and the issue was resolved on June 24, 2023.
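A shared access signature (SAS) token grants delegated access to Azure Storage, and the risk lies entirely in how it is scoped: a token covering the whole account with broad permissions and a distant expiry turns a share link into a standing backdoor. As a contrast, here is a minimal sketch of issuing a narrowly scoped, short-lived, container-level SAS with the azure-storage-blob Python SDK; the account and container names are hypothetical, not those involved in the incident:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Hypothetical names for illustration only.
ACCOUNT_NAME = "examplestorageacct"
CONTAINER = "opensource-models"
ACCOUNT_KEY = "<storage-account-key>"

# A narrowly scoped token: read/list only, a single container, 24-hour lifetime.
sas_token = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER,
    account_key=ACCOUNT_KEY,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=24),
)

# Anyone holding this URL can read the one container until the token expires.
share_url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}?{sas_token}"
print(share_url)
```

One reason such leaks are hard to contain is that a SAS token signed with the account key cannot be individually revoked; short of an expiry date, the only remedy is rotating the account key itself, which invalidates every token issued from it.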
Sources:
BleepingComputer
Times of India
TechCrunch