After triggering a spike in VPN service downloads following a temporary ban about a year ago, OpenAI faces troubles in the European Union once again. The culprit this time? ChatGPT's hallucination problems.
The popular AI chatbot is infamous for making up false information about individuals, something that OpenAI is admittedly unable to fix or control, experts say. That's why Austria-based digital rights group Noyb (stylized as noyb, short for “none of your business”) filed a complaint with the country's data protection authority on April 29, 2024, alleging that OpenAI breaks GDPR rules.
The organization is now urging the Austrian privacy watchdog to investigate how OpenAI verifies the accuracy of citizens' personal data. Noyb is also calling on authorities to impose a fine to ensure GDPR compliance in the future.
ChatGPT's misinformation: a GDPR problem
We already discussed how ChatGPT and similar AI chatbots will probably never stop making stuff up. That's quite worrying considering that “chatbots invent information at least three percent of the time—and as high as 27 percent,” the New York Times reported.
Sure, we might learn to deal with AI-generated misinformation by training ourselves to spot fake facts before falling for them. However, experts now argue that these “AI hallucinations” are also bad for our privacy.
“Making up false information is quite problematic in itself. But when it comes to false information about individuals, there can be serious consequences,” said Maartje de Graaf, data protection lawyer at noyb. Worse still, ChatGPT's inaccuracy effectively puts it at odds with EU law.
Under Article 5 of the GDPR, personal data about individuals in the EU must be accurate, and Article 16 requires inaccurate or false data to be rectified. Article 15 then gives Europeans the “right of access,” requiring companies to show which data they hold on individuals and what its sources are.
Yet, when noyb's founder, Max Schrems, decided to challenge the AI giant's compliance over ChatGPT repeatedly getting his date of birth wrong, OpenAI admitted it could do neither of the above. Instead of rectifying or erasing the data, the firm said it could only filter or block the information from appearing in response to certain prompts. However, that would have been possible only by filtering out all information about him altogether.
According to de Graaf, this is a clear sign that tech firms currently cannot build AI chatbots that comply with EU law. “The technology has to follow the legal requirements, not the other way around,” she said. “It seems that with each ‘innovation’, another group of companies thinks that its products don’t have to comply with the law.”
🚨 noyb has filed a complaint against the ChatGPT creator OpenAI. OpenAI openly admits that it is unable to correct false information about people on ChatGPT. The company cannot even say where the data comes from. Read all about it here 👇 https://t.co/gvn9CnGKOb (April 29, 2024)
After first launching in November 2022, ChatGPT quickly went mainstream. Throughout 2023, the AI chatbot race dominated the tech world, with the biggest players all developing their own versions. From students, doctors, and lawyers to artists and even cyber attackers, everyone seems to use OpenAI's products or similar apps.
As with all tech innovations, the public has been divided between those enthusiastic about AI's potential and those concerned about its power. Some experts wonder whether ChatGPT is the ultimate privacy nightmare, too.
These concerns eventually turned into real-life problems for OpenAI in Europe. The troubles began in March 2023, when Italy temporarily blocked ChatGPT for improperly collecting and storing Italians’ data. After that, other EU countries including France, Germany, and Ireland began investigating the matter, and a ChatGPT task force was set up to coordinate national efforts. Yet, experts at noyb feel that authorities' efforts have been largely fruitless so far.
“For now, OpenAI seems to not even pretend that it can comply with the EU’s GDPR,” the group argues.
This is why noyb decided to take matters into its own hands. The group is asking the Austrian data protection authority (DSB) to investigate how OpenAI handles people's data and to bring its processing in line with the GDPR.