
EU draft bill sets copyright requirements for AI models like ChatGPT

A new draft of European Union legislation proposes that developers of artificial intelligence (AI) tools like ChatGPT must disclose copyrighted material used in building their systems.

Members of the European Parliament have reached a preliminary agreement on the EU's AI Act, adding new provisions that will require generative AI developers to disclose any copyrighted material used to build their models.

The requirement could give publishers and content creators a way to seek a share of profits when their work is used in AI-generated content. The EU bill is leading the global push to regulate AI and is expected to be finalized and passed later this year.

Generative AI models are trained on billions of existing works, and they have drawn the ire of content creators, who say they should be compensated when their work is used.

EU legislators considered an outright ban on the use of copyrighted material in training AI models but settled instead on a transparency requirement, which has been praised as a compromise that regulates AI without stifling innovation.

The EU began drafting its AI Act in 2021, initially focusing on classifying AI tools according to the level of risk they pose, from low to unacceptable. The strictest rules are reserved for the highest-risk applications, such as biometric surveillance or the spread of misinformation.

The focus shifted to generative AI after the viral success of OpenAI's ChatGPT, which was released in November 2022.

Italy temporarily banned ChatGPT over privacy concerns, and governments including the US, the UK, and China are exploring AI regulation of their own. Rules made in the EU capital, Brussels, often set legal precedents worldwide.
