The Italian Data Protection Authority found OpenAI used personal data to train its AI without “an adequate legal basis” for doing so.
Italy’s data protection watchdog has fined OpenAI €15 million after concluding an investigation into how the company’s artificial intelligence chatbot, ChatGPT, collects personal data.
The Italian Data Protection Authority (Garante) said OpenAI used personal data to train ChatGPT “without having an adequate legal basis and violated the principle of transparency and the related information obligations towards users”.
OpenAI also failed to provide an “adequate age verification system” to prevent users under 13 years old from being exposed to inappropriate AI-generated content, the investigation found.
The Italian authority is also asking OpenAI to launch a six-month campaign in local media to raise awareness of how the company collects personal data.
“ChatGPT users and non-users should be made aware of how to oppose the training of generative artificial intelligence with their personal data and, therefore, be effectively placed in the position to exercise their rights under the General Data Protection Regulation (GDPR),” the report read.
OpenAI brands decision ‘disproportionate’
Garante had previously imposed a temporary ban on ChatGPT over privacy concerns while it investigated a possible data breach in 2023.
In an emailed statement, OpenAI called the decision “disproportionate” and said it would appeal.
An OpenAI spokesperson said the fine was “nearly 20 times” the revenue it made in Italy during the same year.
The company added that it remained “committed to working with privacy authorities worldwide to offer beneficial AI that respects privacy rights”.
Regulators in the US and Europe have been examining OpenAI and other companies that have played a key part in the AI boom. Governments around the world, meanwhile, have been drawing up rules to protect against the risks posed by AI systems, led by the European Union’s AI Act, a comprehensive rulebook for artificial intelligence.