Privacy activists have lodged a complaint against OpenAI, the maker of ChatGPT, alleging that the chatbot disseminates false information about individuals, a practice termed “hallucination.” The complaint, filed by Vienna-based nonprofit noyb (none of your business), accuses OpenAI of violating Europe’s General Data Protection Regulation (GDPR), which is considered the strictest privacy and security law globally.
The complaint centers on ChatGPT’s alleged dissemination of inaccurate information about a public figure, including misinformation about the individual’s birth date. According to noyb, OpenAI has refused to correct or erase this false information, despite requests to do so. The complaint highlights OpenAI’s offer to block or filter results based on the figure’s name, which would effectively hide all information about them but would not remove the incorrect data from OpenAI’s systems.
Noyb also criticizes OpenAI for failing to disclose relevant information about the person, such as details regarding the data that was processed, its sources, and who it was shared with. The group argues that companies must comply with GDPR’s legal obligations when processing data about individuals, including providing accurate and transparent information.
OpenAI, which has acknowledged the problem of AI hallucinations in the past, has not publicly responded to the complaint. It adds to the legal challenges the company faces over the use and training of its AI models, including copyright lawsuits and other privacy concerns.
GDPR allows regulators to impose fines of up to 4% of a company’s global annual turnover. Noyb expects the matter to be dealt with at an EU level, potentially raising the stakes for OpenAI. Founded in 2017, noyb has been influential in European data protection, bringing numerous cases that have resulted in substantial fines.
The complaint against OpenAI underscores the challenges that AI developers face in ensuring that their systems provide accurate and reliable information. As AI technology continues to advance, regulators and activists are likely to scrutinize how these systems handle data and whether they comply with privacy and security laws.